Meta’s Smart Glasses just got smarter with the help of AI!
Highlights:
- Meta announces multimodal "Meta AI with Vision" updates for its Ray-Ban smart glasses.
- The latest features let users ask the AI questions about what they are seeing.
- The AI features are currently rolling out in beta to users in the US and Canada.
AI Features Coming to Meta Smart Glasses
While there are new frames, the real story is the big software update, which brings generative AI features to the glasses.
Meta is adding multimodal Meta AI with Vision to its Ray-Ban smart glasses. With the Meta AI assistant now built into the glasses, they can respond to questions about what the user is viewing in addition to plain speech input, giving users access to real-time information.
Just a few days ago, Meta released Llama 3 publicly across several platforms and via API. Now it is converging its highly capable AI assistant with its most advanced hardware, taking its AI game even further by enhancing the smart glasses.
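For readers who want to try the newly released model directly, here is a minimal sketch of chatting with the open Llama 3 8B Instruct checkpoint through the Hugging Face transformers library. It only illustrates the public Llama 3 release mentioned above, not the assistant that actually runs behind the glasses; the model ID is the public Hub identifier, and access requires accepting Meta's license on the Hub.

```python
# Minimal sketch: chatting with the openly released Llama 3 8B Instruct model
# through Hugging Face transformers. This illustrates the public Llama 3
# release, not the assistant that runs behind the Ray-Ban glasses.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Meta-Llama-3-8B-Instruct"  # gated: accept Meta's license on the Hub first

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [{"role": "user", "content": "How many tablespoons are in a cup?"}]

# apply_chat_template wraps the conversation in Llama 3's chat format.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Stop on either the end-of-text or the end-of-turn token.
terminators = [tokenizer.eos_token_id, tokenizer.convert_tokens_to_ids("<|eot_id|>")]

output = model.generate(input_ids, max_new_tokens=64, eos_token_id=terminators, do_sample=False)

# Decode only the newly generated tokens, i.e. the assistant's reply.
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```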
1) Ask About What’s in Front of You
The multimodal AI upgrade allows users to ask their Ray-Ban glasses questions about what they're seeing, and the glasses can now respond with insightful, practical answers.
In the video below, the wearer asks for more information about the butterfly she is looking at.
Ray-Ban Meta smart glasses just got a massive Multimodal upgrade – Meta AI with Vision
It doesn't just take speech input, it can now answer questions about what you are seeing.
Here are 8 features that is now possible
1. Ask about what you are seeing pic.twitter.com/IJQ3WuZMAJ
— Min Choi (@minchoi) April 24, 2024
Imagine visiting somewhere new where you have no idea what you are looking at. Ask the glasses about it and Meta AI will help you out with accurate information about what you're seeing.
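Meta hasn't published the vision pipeline behind the glasses, so as a rough illustration of the image-plus-question pattern, here is a sketch that answers a question about a photo with the openly available LLaVA 1.5 model via Hugging Face transformers. The model choice, the butterfly.jpg file name, and the prompt are all assumptions for demonstration, not Meta's actual stack.

```python
# Illustrative sketch only: answering a question about a photo with an open
# vision-language model (LLaVA 1.5). This is NOT the model behind Meta AI with
# Vision; it just demonstrates the image + question -> answer pattern.
import torch
from PIL import Image
from transformers import AutoProcessor, LlavaForConditionalGeneration

model_id = "llava-hf/llava-1.5-7b-hf"
processor = AutoProcessor.from_pretrained(model_id)
model = LlavaForConditionalGeneration.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

image = Image.open("butterfly.jpg")  # hypothetical snapshot from a camera
prompt = "USER: <image>\nWhat type of butterfly is this? ASSISTANT:"

inputs = processor(text=prompt, images=image, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=60)
print(processor.decode(output[0], skip_special_tokens=True))
```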
2) Translate Texts in Real-Time
The multimodal AI capabilities, which cover text, vision, and speech inputs, significantly upgrade the Ray-Ban smart glasses. With the help of Meta AI, the glasses will let you translate what you see, including text on any sort of physical or digital display.
No need to pull out Google Translate, type out the text, or snap a photo yourself: the glasses do that for you. In the following video, the wearer asks the glasses to translate the text in front of her.
2. Ask to translate what you are seeing pic.twitter.com/q8SJ7foj4A
— Min Choi (@minchoi) April 24, 2024
So now you can travel hassle-free without worrying about language gaps, as the Ray-Ban glasses will act as your personal translator.
3) Ask Meta AI Anything
Meta AI is an intelligent assistant that lets you ask your smart glasses just about anything, from everyday questions such as "How many tablespoons are in a cup?" to vision-based questions like "What type of butterfly is that?".
To ask a question, just begin with "Hey Meta," and then ask away. Voice commands let you operate the glasses hands-free and pull up real-time information through Meta AI.
Other New Features Added
There are several more new capabilities coming to the glasses; here are some to note:
- Multi-Platform Video Calling: Meta's multi-platform integration lets you share what the glasses see on video calls in WhatsApp and Messenger. You can start a video call directly from your smart glasses and show friends or family what's in front of you, or ask them for advice.
- Play Music: The Ray-Ban smart glasses also let you play music with ease. Again, just begin with "Hey Meta," and then say "play some music".
- Take Photos and Videos: You can also capture photos and videos hands-free. Start with the "Hey Meta" command and then say "take a photo". The smart glasses come with integrated audio and an ultra-wide 12 MP camera.
- Livestreaming: Lastly, the glasses let you livestream video. Imagine attending a concert or watching a live football game: you can stream the event straight to Meta's platforms, where millions of users can watch.
3. Livestreaming pic.twitter.com/rB8kwiegs7
— Min Choi (@minchoi) April 24, 2024
Remember that these advanced AI features are available only to users in the US and Canada for now. The rollout is still in beta, so everyone else will have to wait a little longer before they can enjoy these features first-hand.
Conclusion
Meta’s Ray-Ban smart glasses are a perfect example of what hardware and AI can do in combination, and their multimodal capabilities make them highly desirable. Only time will tell how big a success smart glasses become in the long run!