{"id":4075,"date":"2024-04-24T10:24:20","date_gmt":"2024-04-24T10:24:20","guid":{"rendered":"https:\/\/favtutor.com\/articles\/?p=4075"},"modified":"2024-04-24T10:25:54","modified_gmt":"2024-04-24T10:25:54","slug":"meta-smart-glasses-ai-features","status":"publish","type":"post","link":"https:\/\/favtutor.com\/articles\/meta-smart-glasses-ai-features\/","title":{"rendered":"3 New AI Features Coming to Meta&#8217;s Smart Glasses"},"content":{"rendered":"\n<p>Meta&#8217;s Smart Glasses just got smarter with the help of AI!<\/p>\n\n\n\n<p><strong>Highlights:<\/strong><\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Meta announces multimodal Meta AI with Vision updates to its Ray-Ban smart glasses.<\/li>\n\n\n\n<li>The latest features allow users to ask the glasses questions about what they are seeing.<\/li>\n\n\n\n<li>The AI features are currently rolling out in beta to users in the US and Canada.<\/li>\n<\/ul>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>AI Features Coming to Meta Smart Glasses<\/strong><\/h2>\n\n\n\n<p>While there are new frames, what we need to focus on is the big software update, which now brings generative AI features.<\/p>\n\n\n\n<p><strong>Meta is adding Multimodal Meta AI with Vision to its Ray-Ban Smart Glasses.<\/strong> With the <a href=\"https:\/\/about.fb.com\/news\/2024\/04\/new-ray-ban-meta-smart-glasses-styles-and-meta-ai-updates\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Meta AI Assistant<\/a> now part of the smart glasses, they can respond to questions about what the user is viewing, in addition to speech input. This gives users access to real-time information.<\/p>\n\n\n\n<p>Just a few days ago, <a href=\"https:\/\/favtutor.com\/articles\/meta-llama-3-benchmarks\/\">they released Llama 3 publicly<\/a> across several platforms and via API. Now Meta is converging its highly capable AI assistant with its most advanced hardware. 
The company continues to take its AI game even further by enhancing its smart glasses.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>1) Ask About What&#8217;s in Front of You<\/strong><\/h3>\n\n\n\n<p>The multimodal AI upgrade allows users to ask their Ray-Ban glasses questions about what they&#8217;re seeing. The glasses can now respond with insightful, practical answers.<\/p>\n\n\n\n<p>In the following video, the person asks for more information about the butterfly she is looking at.<\/p>\n\n\n\n<div align=center><blockquote class=\"twitter-tweet\" data-media-max-width=\"560\"><p lang=\"en\" dir=\"ltr\">Ray-Ban Meta smart glasses just got a massive Multimodal upgrade &#8211; Meta AI with Vision<br><br>It doesn&#39;t just take speech input, it can now answer questions about what you are seeing.<br><br>Here are 8 features that is now possible<br><br>1. Ask about what you are seeing <a href=\"https:\/\/t.co\/IJQ3WuZMAJ\" target=\"_blank\">pic.twitter.com\/IJQ3WuZMAJ<\/a><\/p>&mdash; Min Choi (@minchoi) <a href=\"https:\/\/twitter.com\/minchoi\/status\/1782978639589454032?ref_src=twsrc%5Etfw\" target=\"_blank\" rel=\"noopener\">April 24, 2024<\/a><\/blockquote> <script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/div>\n\n\n\n<p>Imagine visiting somewhere new, with no idea what you are looking at. Ask your Ray-Ban glasses and they will help you out with accurate information about what you\u2019re seeing.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>2) Translate Texts in Real-Time<\/strong><\/h3>\n\n\n\n<p>The multimodal AI capabilities, which allow for text, vision, and speech-based inputs, have significantly upgraded the Ray-Ban smart glasses. 
<strong>Now with the help of Meta AI, these glasses will let you translate what you see, including text on any sort of physical or virtual display.<\/strong><\/p>\n\n\n\n<p>No need to pull out Google Translate and type out the text or snap a photo; the glasses will do that for you. In the following video, the person asks the glasses to translate the text in front of her.<\/p>\n\n\n\n<div align=center><blockquote class=\"twitter-tweet\" data-media-max-width=\"560\"><p lang=\"en\" dir=\"ltr\">2. Ask to translate what you are seeing <a href=\"https:\/\/t.co\/q8SJ7foj4A\" target=\"_blank\">pic.twitter.com\/q8SJ7foj4A<\/a><\/p>&mdash; Min Choi (@minchoi) <a href=\"https:\/\/twitter.com\/minchoi\/status\/1782978641367826806?ref_src=twsrc%5Etfw\" target=\"_blank\" rel=\"noopener\">April 24, 2024<\/a><\/blockquote> <script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/div>\n\n\n\n<p>So now you can travel hassle-free anywhere without worrying about language barriers, as the Ray-Ban glasses will be your personal translator.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>3) Ask Meta AI Anything<\/strong><\/h3>\n\n\n\n<p>Meta AI is an intelligent assistant that lets you ask your smart glasses any question, from everyday queries such as <em>\u201cHow many tablespoons are in a cup?\u201d<\/em> to general vision questions like <em>\u201cWhat type of butterfly is that?\u201d<\/em><\/p>\n\n\n\n<p>To ask a question, simply begin by saying <em>\u201cHey Meta,\u201d<\/em> followed by your question. 
With voice commands, you can operate the glasses and even obtain real-time information through Meta AI.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Other New Features Added<\/strong><\/h2>\n\n\n\n<p>There are many more advancements coming to the glasses; here are some to note:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Multi-Platform Video Calling<\/strong>: Meta&#8217;s multi-platform integration will let you share the glasses&#8217; footage on video calls in WhatsApp and Messenger. You can now initiate a video call directly from your smart glasses and connect with, or ask advice from, family and friends.<\/li>\n\n\n\n<li><strong>Play Music<\/strong>: The Ray-Ban smart glasses will also allow you to play music with ease. All you have to do is again begin by saying <em>\u201cHey Meta,\u201d<\/em> and then say <em>\u201cPlay some music\u201d<\/em>.<\/li>\n\n\n\n<li><strong>Take Photos and Videos<\/strong>: You can also take photos with the smart glasses. Similarly, begin with the <em>\u201cHey Meta,\u201d<\/em> command and then say <em>\u201ctake a photo\u201d<\/em>. The smart glasses come with integrated audio along with an ultra-wide 12 MP camera.<\/li>\n\n\n\n<li><strong>Livestreaming<\/strong>: Lastly, the glasses will also let you livestream video. Imagine attending a concert or watching a live football game: you can livestream the event on Meta\u2019s platforms, where millions of users can watch.<\/li>\n<\/ul>\n\n\n\n<div align=center><blockquote class=\"twitter-tweet\" data-media-max-width=\"560\"><p lang=\"en\" dir=\"ltr\">3. 
Livestreaming <a href=\"https:\/\/t.co\/rB8kwiegs7\" target=\"_blank\">pic.twitter.com\/rB8kwiegs7<\/a><\/p>&mdash; Min Choi (@minchoi) <a href=\"https:\/\/twitter.com\/minchoi\/status\/1782978643297280358?ref_src=twsrc%5Etfw\" target=\"_blank\" rel=\"noopener\">April 24, 2024<\/a><\/blockquote> <script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/div>\n\n\n\n<p>Note that these advanced AI features are currently available only to users in the US and Canada. The rollout is still in the beta phase, so other users will have to wait a little longer before they can enjoy these features first-hand.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Conclusion<\/strong><\/h2>\n\n\n\n<p>Meta\u2019s Ray-Ban smart glasses are a perfect example of what hardware and AI can do in combination. Their vast multimodal capabilities make them highly desirable. Only time will tell how big of a success the &#8220;smart glasses&#8221; become in the long run!<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Find out about the new features added to Meta Ray-Ban Smart Glasses after integration with Meta AI 
Assistant.<\/p>\n","protected":false},"author":15,"featured_media":4090,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"jnews-multi-image_gallery":[],"jnews_single_post":null,"jnews_primary_category":{"id":"","hide":""},"footnotes":""},"categories":[57],"tags":[56,59,200,205,179],"class_list":["post-4075","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-ai","tag-ai","tag-generative-ai","tag-meta-ai","tag-meta-smart-glasses","tag-whatsapp"],"_links":{"self":[{"href":"https:\/\/favtutor.com\/articles\/wp-json\/wp\/v2\/posts\/4075","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/favtutor.com\/articles\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/favtutor.com\/articles\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/favtutor.com\/articles\/wp-json\/wp\/v2\/users\/15"}],"replies":[{"embeddable":true,"href":"https:\/\/favtutor.com\/articles\/wp-json\/wp\/v2\/comments?post=4075"}],"version-history":[{"count":12,"href":"https:\/\/favtutor.com\/articles\/wp-json\/wp\/v2\/posts\/4075\/revisions"}],"predecessor-version":[{"id":4095,"href":"https:\/\/favtutor.com\/articles\/wp-json\/wp\/v2\/posts\/4075\/revisions\/4095"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/favtutor.com\/articles\/wp-json\/wp\/v2\/media\/4090"}],"wp:attachment":[{"href":"https:\/\/favtutor.com\/articles\/wp-json\/wp\/v2\/media?parent=4075"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/favtutor.com\/articles\/wp-json\/wp\/v2\/categories?post=4075"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/favtutor.com\/articles\/wp-json\/wp\/v2\/tags?post=4075"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}