Meta's Ray-Ban Smart Glasses now boast the ability to describe their surroundings using Meta AI

Apr 26, 2024 - 13:18

Meta has unveiled new features for its Ray-Ban smart glasses, introducing multimodal capabilities. Now, alongside audio interactions, the glasses can also analyze and interpret visual data from the built-in camera, providing users with enhanced insights about their surroundings.

With the latest AI capabilities, users can now command the glasses to translate text, recognize objects, or provide contextual information, all without using their hands. Additionally, wearers can seamlessly share their perspectives during video calls on WhatsApp and Messenger, enabling hands-free, real-time sharing experiences. Meta has announced that the multimodal AI upgrade will be accessible as a beta feature to users in both Canada and the US.

Meta has also announced an expansion of the Ray-Ban Meta smart glasses collection with fresh styles, along with the rollout of Meta AI with Vision, which lets wearers ask questions about their surroundings and receive informative answers directly from their glasses.

The second generation of the smart glasses was developed in partnership with eyewear brand EssilorLuxottica. Meta also disclosed plans to broaden the Ray-Ban Meta collection with new styles designed to fit a wider range of face shapes. The glasses come in a hundred custom frame and lens combinations on the Ray-Ban Remix platform, letting users mix and match to their liking. The latest styles are available in 15 countries, including the US, Canada, Australia, and across Europe.

With Meta AI, users can control the glasses entirely by voice; saying 'Hey Meta' initiates a prompt. Following a beta test of the multimodal AI updates that began in December last year, the functionality is now rolling out across the US and Canada.

The introduction of multimodal capabilities into smart glasses by the company marks a significant advancement, transforming wearables into potent, context-aware smart assistants.
