Meta Unveils Advanced AI Features for Ray-Ban Smart Glasses

At the Meta Connect event on September 25, 2024, in Menlo Park, California, Mark Zuckerberg introduced significant upgrades to the Meta Ray-Ban smart glasses, enhancing their multimodal AI capabilities. These improvements aim to facilitate more natural interactions and provide users with unique features such as the ability to 'remember' information, making the glasses a cutting-edge tool for everyday tasks.

The new AI allows the glasses to 'see' and 'hear' in real time, reducing how much context users need to supply. One notable feature is the glasses' ability to recall specific details, such as parking locations: a user can simply say, 'Remember where I parked,' and query the glasses for that information later.

Additionally, the glasses will gain a live translation feature, demonstrated during the event, supporting languages such as Spanish, French, and Italian. The feature enables real-time conversation translation, showcasing the glasses' potential for bridging language barriers.

Meta also announced a new AI tool for Instagram Reels that translates audio into English while synchronizing the speaker's mouth movements, currently available in Spanish. This innovative approach reflects Meta's commitment to integrating AI into their products.

In terms of accessibility, the glasses will assist visually impaired users by allowing them to connect with volunteers who can help interpret their surroundings through live video. This feature exemplifies the potential of AI to improve the quality of life for individuals with disabilities.

The new Meta Ray-Bans start at $300 and come in various designs, including a limited-edition transparent style. These advancements position the glasses as a versatile option for consumers seeking both functionality and style.

Source: zdnet.com
