Meta has introduced several new features to its Ray-Ban Meta glasses, including live translation, live AI, and Shazam song identification.
Meta’s smart glasses, designed and produced in partnership with eyewear giant Ray-Ban, remain a focus for the company and were highlighted at Meta Connect earlier this year.
The v11 software update began rolling out today (December 17). Members of Meta’s early access program (available in the US and Canada) can try the new features first, before they roll out to the wider user base.
Live AI is one of the key new features, adding Meta AI to the glasses. Users can start an AI chat session and converse easily with the glasses’ AI “more naturally than ever before” while the AI can “see what you see continuously”. Some of the use cases suggested by Meta include: “Get real-time, hands-free help and inspiration with everyday activities like meal prep, gardening, or exploring a new neighborhood.”
Crucially, during a session the AI remembers what has already been discussed, so users can ask follow-up questions and even interrupt the AI to do so.
Meta’s goal is to develop the live AI tool to “Eventually […], at the right moment, give useful suggestions even before you ask,” which is equal parts ominous and intriguing.
According to Meta, users will be able to use about 30 minutes of live AI on a single charge of the device.
Welcome to the future of travel

The next feature, live translation, is the kind of capability that makes a device like smart glasses worthwhile. Though still very limited in scope, it could well become a flagship feature for the specs. The glasses translate Spanish, French, or Italian into English in real time, and vice versa. When you speak with someone using one of those three languages, the glasses deliver the translation either as speech through their speakers or as a transcript displayed on your phone screen.
Users will have to download language pairings in advance, and also specify what language each speaker is using.
What’s playing?

A fun and useful third feature is the new Shazam integration. If you hear a song you want to identify, simply ask “Hey Meta, what is this song?” and Shazam will provide the information you seek.
Currently, Shazam on Meta glasses is only available in the US and Canada. There’s no information as to when it will be more widely available to users outside North America.
Featured image credit: Meta/Ray-Ban
The post Meta’s smart glasses now have live translation, AI, and Shazam appeared first on ReadWrite.