When the Ray-Ban Meta smart glasses launched they did so without many of the impressive AI features we were promised. Now Meta is finally rolling out these capabilities to users, but they’re still in the testing phase and only available in the US.
At its Meta Connect 2023 announcement, Meta told us the follow-up to the Ray-Ban Stories smart glasses would get some improvements we expected – namely a slightly better camera and speakers – but also some unexpected AI integration.
Unfortunately, when we actually got to test the specs out, their AI features boiled down to very basic commands. You can instruct them to take a picture, record a video, or contact someone through Messenger or WhatsApp. In the US you could also chat with a basic conversational AI – like ChatGPT – though this was still nothing to write home about.
While the glasses’ design is near-perfect, the speakers and camera weren’t impressive enough to make up for the lackluster AI. So overall, in our Ray-Ban Meta Smart Glasses review, we didn’t look too favorably on the specs.
(Image credit: Meta)
Our perception could be about to change drastically, however, as two major promised features are on their way: Look and Ask, and Bing integration.
Look and Ask is essentially a wearable, voice-controlled Google Lens with a few AI-powered upgrades. While wearing the smart glasses you can say “Hey Meta, look and…” followed by a question about what you can see. The AI will then use the camera to scan your environment so it can provide a detailed answer to your query. According to the official FAQ, possible questions include “What can I make with these ingredients?”, “How much water do these flowers need?”, or “Translate this sign into English.”
To help the Meta glasses provide better information when you’re using their conversational and Look and Ask features, the specs can also now access the internet via Bing. This should mean the specs can source more up-to-date data, letting them answer questions about sports matches that are currently happening, or provide real-time info on which nearby restaurants are the best rated, among other things.
Still not perfect
(Image credit: Meta)
It all sounds very science fiction, but unfortunately these almost magical capabilities come with a catch. For now, the new features – just like the existing conversational AI – are in beta testing.
So the glasses might have trouble with some of your queries and provide inaccurate answers, or not be able to find an answer at all. What’s more, as Meta explains in its FAQ, any AI-processed pictures you take while part of the beta will be stored by Meta and used to train its AI. So your Look and Ask snaps aren’t private.
Lastly, the Ray-Ban Meta smart glasses beta is only available in the US. So if, like me, you live somewhere else, you won’t be able to try these features out – and probably won’t be able to until 2024.
If you are in the US and happy with the terms of Meta’s Privacy Policy, you can sign up for the Early Access program and start testing these new tools. For everyone else, hopefully these features won’t be in beta for long – or at least won’t be US-exclusive – otherwise we’ll be left continuing to wonder why we spent $299 / £299 / AU$449 on smart specs that aren’t all that much better than dumb Ray-Ban Wayfarers at half the cost.
You might also like
Through Ray-Ban Meta glasses, I stared into the city, and the city stared back at me
The Meta Quest 3 is here, and I think it’s the best VR headset yet
Apple Vision Pro 2 leak reveals what’s coming next for Apple’s headset