Meta’s Ray-Ban Smart Glasses Get AI Reminders and Translation Features
This time, there are no significant hardware updates, just some improved Transitions lenses.
Enhanced AI for Meta’s Ray-Ban Smart Glasses
Meta's latest update for its Ray-Ban smart glasses is generating buzz. The AI assistant, a hallmark of the second-generation glasses, has received a significant upgrade to its capabilities.
Expanded Features
When the glasses launched last fall, the AI assistant's abilities were relatively limited. The later addition of real-time information and multimodal capabilities opened up a range of new possibilities, and today Meta is expanding the glasses' AI capabilities even further.
Live Reminders and Translations
New features include live reminders and translations. Users can now look at an object in their environment and ask Meta AI to set a reminder about it, for example, “Hey Meta, remind me to buy this book next Monday.” The glasses can also scan QR codes and call phone numbers in your field of view.
Enhanced Environmental Interaction
Meta has also integrated video support into Meta AI, so the glasses are better equipped to analyze your surroundings and answer questions about what’s around you. Other improvements are subtler: previously, you had to start a command with “Hey Meta, look and tell me” to get an answer based on what you were looking at, but now Meta AI can respond to questions about what’s in front of you without that preamble.
Longer and Live Translations
In my initial testing of Meta AI’s multimodal capabilities on the glasses last year, I found that Meta AI could translate some text snippets but struggled with sentences longer than a few words. Now, Meta AI should be able to translate longer text passages. Additionally, the company plans to add live translation capabilities for English, French, Italian, and Spanish later this year, potentially making the glasses even more useful as a travel accessory.
Improved Real-Time Information Processing
Although I haven’t fully tested the new capabilities of Meta AI on its smart glasses, it already appears to have a better grasp of real-time information than what I observed last year. During a demo with Meta, I asked Meta AI who the Speaker of the House of Representatives was—a question it had previously answered incorrectly—and this time, it responded correctly on the first try.