Meta is rolling out software update Version 2.0 to its Ray-Ban Meta Smart Glasses, and it’s set to deliver some very welcome quality-of-life improvements to image and audio quality.
Firstly, the smart glasses’ cameras will get some low-light performance upgrades, with reduced noise and improved auto exposure leading to sharper images – or so Meta promises; we haven’t been able to test this yet. You should also find your videos are sharper and have more dynamic range.
As for the audio, Meta is adding a master volume control so you can adjust how loud the sounds from your glasses are. Once update Version 2.0 has hit your Ray-Ban Meta Smart Glasses you’ll be able to control the volume of music, voice commands, and other sounds by swiping up and down on the glasses’ touchpad, located by your temple.
Lastly, Meta’s official changelog says the update will deliver “Security and stability improvements” – though it hasn’t explained in detail what this means for the smart specs.
The update takes a few minutes to install, and if you don’t have auto-updates turned on you’ll need to open up the Meta View app on your phone, tap on the picture of your glasses in the top-left corner, tap on ‘Glass & privacy’ in the menu, select ‘Your glasses’, then ‘Updates’, and finally ‘Install update’ if your device finds the Version 2.0 update.
If it doesn’t find it, don’t worry – it usually takes time for updates to roll out to everyone. Make sure to check back later, or turn on auto-updates so that you don’t have to keep checking if the update is ready yet.
Also be sure to turn on your Ray-Ban Meta Smart Glasses and connect them to your phone via Bluetooth; otherwise, they won’t be able to install the Version 2.0 software.
Looking and Asking for more
Unfortunately, while the Version 2.0 update includes some camera and audio improvements the specs needed, it doesn’t include the long-anticipated rollout of the Ray-Ban Meta Smart Glasses’ Look and Ask AI recognition tools to people outside the US-exclusive beta.
For many, this feature was the standout tool of Meta’s launch presentation at Meta Connect 2023 back in September last year. Using the glasses’ camera, the Meta AI can scan your environment to respond to your questions, like a combination of Google Lens and ChatGPT.
You start with “Hey Meta, look and…” then follow up with something like “What can I make with these ingredients?” or “How much water does this plant need?” or “Translate this into English.” The AI can then search its data banks and the internet (via Bing) to source an answer to your question, using its camera to help it identify food, flowers, or French as required.
The software updates Meta has been rolling out have helped to improve the Ray-Ban Meta Smart Glasses, but until these promised AI tools are available to everyone the specs won’t feel complete. We’ll just have to hope they arrive in Version 3.0, and that the next update isn’t too many months away.