Meta is rolling out live AI and Shazam integration to its smart glasses

The Ray-Ban Meta Smart Glasses already worked well as a head-mounted camera and pair of open-ear headphones, but now Meta is updating the glasses with access to live AI without the need for a wake word, live translation between several different languages, and access to Shazam for identifying music.

Meta first demoed most of these features at Meta Connect 2024 in September. Live AI lets you start a “live session” with Meta AI that gives the assistant access to whatever you’re seeing and lets you ask questions without having to say “Hey Meta.” If you need your hands free to cook or fix something, Live AI is supposed to keep your smart glasses useful even while you concentrate on the task in front of you.

Live translation lets your smart glasses translate between English and French, Italian, or Spanish. If live translation is enabled and someone speaks to you in one of the selected languages, you’ll hear what they’re saying in English through the smart glasses’ speakers or read it as a typed transcript in the Meta View app. You’ll have to download a specific model for each language pair, and live translation needs to be enabled before it’ll actually act as an interpreter, but it does seem more natural than holding out your phone to translate something.

With Shazam integration, your Meta smart glasses will also be able to identify whatever song is playing around you. A simple “Meta, what is this song?” will prompt the smart glasses’ microphones to figure out what you’re listening to, just like using Shazam on your smartphone.

All three updates baby-step the wearable toward Meta’s end goal of a true pair of augmented reality glasses that can replace your smartphone, a goal its experimental Orion hardware previews in real life. Pairing AI with VR or AR seems to be an idea multiple tech giants are circling, too. Google’s newest XR platform, Android XR, is built around the idea that a generative AI like Gemini could be the glue that makes VR or AR compelling. We’re still years away from any company being willing to actually alter your field of view with holographic images, but in the meantime, smart glasses seem like a moderately useful stopgap.

All Ray-Ban Meta Smart Glasses owners will be able to enjoy Shazam integration as part of Meta’s v11 update. For live translation and live AI, you’ll need to be a part of Meta’s Early Access Program, which you can join right now at the company’s website.

This article originally appeared on Engadget at https://www.engadget.com/ar-vr/meta-is-rolling-out-live-ai-and-shazam-integration-to-its-smart-glasses-192602898.html?src=rss   
