Speaking at Siggraph 2024 – an annual tech conference – alongside Nvidia CEO Jensen Huang, Meta CEO Mark Zuckerberg said he thinks AI-powered smart glasses are going to be the next big thing.
Specifically, Zuckerberg said he believes “display-less AI glasses at the $300 price point are going to be a really big product that tens of millions of people, or hundreds of millions of people, eventually are going to have.” And when you look at his company’s most recent attempt at this tech, you can see where he’s coming from.
The Ray-Ban Meta Smart Glasses could do with better cameras and open-ear speakers, but their built-in microphones are solid, the Meta AI functionality is surprisingly neat – even in beta – and at the very least they’re super-stylish sunglasses (it’s the height of summer here in the UK, and I wear my Ray-Ban Meta shades all the time outside, even when they’re switched off). If you’re looking to buy an AI wearable these are hands-down the best I’ve tried, and they’re among the best smart glasses out there full stop.
But while Meta has started to crack the code of what makes a good pair of smart glasses, there are lessons it and its rivals need to learn before smart glasses achieve anything close to the mainstream success Zuckerberg predicts.
Just work
Whether it’s apps you have to install through third-party sites rather than official app stores, a confusing collection of adapters and compatibility criteria you need to wrap your head around (and buy) to get good use from your specs, or software that simply doesn’t work as intended, in my experience smart glasses aren’t always the most user-friendly gadgets out there.
Companies like Xreal have taken this criticism on board and launched devices such as the Xreal Beam Pro as the perfect companions to their AR smart glasses. The Beam Pro is an extremely affordable smartphone-like device with spatial cameras, two USB-C ports (letting you charge the system while using Xreal’s wired glasses), and access to a bevy of Android apps via the Play Store, so you can finally buy a complete, not-confusing smart glasses system.
Others, like the Even Realities G1 glasses, unfortunately aren’t so simple.
The private heads-up display offered by Even Realities’ glasses is a neat concept, but in practice the tools mostly haven’t worked for me. The on-glasses controls that should let me talk to the AI or dictate a note to myself won’t function, and I can’t get the app’s navigation mode to appear on the glasses. I could use the teleprompter and translation tool, though the latter isn’t that useful: when the speaker is talking at a reasonable (ie, natural) pace, the glasses only have time to show what they’ve said in the original language rather than the translation, meaning I have to rely on my phone screen, which defeats the point of the smart glasses.
I have reached out to Even Realities to check I’m not doing something wrong, but if I (a professional tech tester who has used a lot of different smart glasses) can’t work out how to get these core tools to function in a week of testing, how is a regular person meant to fare?
And if my issues are because the features just don’t work – like the AI wearables we’ve seen from Rabbit and Humane – that’s a whole other problem, but I’ll need to wait and see what my contact suggests I try.
Either way, to see mainstream success, smart glasses need to become simpler and more reliable.
Style is important…
Then, as Zuckerberg himself touched on in his Siggraph discussion, it’s important to remember that glasses are a fashion item just like any other piece of clothing. Yes, they need to be functional (more on that below), but they also need to be fashionable.
Meta and the likes of Lucyd demonstrate their understanding of this by offering smart glasses in a range of frame designs with a variety of lenses (clear, prescription, colored, and shaded), so you can pick up a pair that matches your personal style.
But one design tweak we need in future iterations is the option to easily swap lenses, as currently I can only use my shaded Ray-Ban Meta Smart Glasses when it’s sunny outside – which isn’t often here in the UK. If I’m spending even $299 / £299 / AU$449 on tech, I don’t want to be so limited in how often I can use it.
One interesting alternative I’ve seen from some brands – like Xreal with its Air 2 Pro, and Chamelo with its smart glasses – is electrochromic dimming, which lets you change how much light the lenses block via an electrical charge. It’s more convenient than swapping lenses, though the lenses are always at least slightly shaded, and the dimming stops working when the glasses are switched off – which limits its usefulness if you run out of charge, or simply want some assured privacy.
… but substance is key
Lastly, we need to have a more consistent baseline of what features smart glasses should have.
In this piece I’ve talked about several smart glasses, and they’re all different. Some have displays, some have AI abilities, some are wireless, some have cameras, some have speakers. None has all of these features.
Some differences are to be expected, but without some consistency between specs and pricing it’s incredibly difficult to judge them against one another and decide which smart glasses you should buy.
Zuckerberg’s focus is on display-less glasses, so with that in mind I think a reasonable baseline should be that they’re wireless, and have open-ear speakers, cameras, and AI tools powered by a connection to your smartphone and the internet.
I’d also like to see improvements in these areas – higher-megapixel camera sensors, speakers with more oomph that leak less audio, and more reliable AI – but these upgrades might take time, especially if smart glasses stay at the $299 / £299 / AU$449 price point the Ray-Ban Meta Smart Glasses hit (and others should aim for).
Wearables are in a super exciting place right now. Smartwatches have hit their stride, smart rings are entering the fray in full force, and smart glasses are developing rapidly. Mark Zuckerberg’s AI-powered glasses dream isn’t coming next week, or likely even next year, but I agree it’s a matter of when, not if, they’ll take off.
That said, if Zuckerberg has something exciting to show off at Meta Connect 2024 in September, perhaps smart glasses will hit the mainstream sooner than I expect.