Ray-Ban Meta Glasses Get Shazam Integration, Enabling Hands-Free Music Identification on the Go

Meta Platforms on Monday announced several new features for the Ray-Ban Meta Glasses. While Live AI with real-time video processing and live translation between supported languages were the standout additions powered by artificial intelligence (AI), the company also introduced integration with Shazam, Apple's music identification app, in select regions. This enables Ray-Ban Meta Glasses users to identify songs on the go via voice prompts.

Shazam Integration on Ray-Ban Meta Glasses

Meta detailed the new features arriving on the Ray-Ban Meta Glasses in a newsroom post. The Shazam integration arrives as part of the v11 software update, which is now rolling out to eligible devices. However, it is currently limited to Canada and the US.

It offers hands-free music recognition via voice prompts. Users can ask, "Hey Meta, what is this song?" and the Ray-Ban Meta Glasses will identify the track using Shazam. Meta says the feature comes in handy when a great song is playing, such as in a store or a cafe, letting users learn the track or artist's name without missing out.

Notably, the company introduced support for Apple Music earlier this year, adding the ability to stream music via the Apple app without touching the phone. It leverages the wearable's voice commands to play a song, playlist, album, station, or artist.

Other New Features

In addition to the Shazam integration, Meta also rolled out a Live AI feature. Similar to ChatGPT's Advanced Voice with vision, it grants Meta AI access to the Ray-Ban Meta Glasses' cameras to process the video feed in real time. The chatbot can continuously see the user's surroundings and answer questions about them. Users can invoke Meta AI without the "Hey Meta" command and can even ask follow-up questions.


Further, live translation has been added to the smart glasses. It offers real-time speech translation between English and Spanish, French, or Italian. Users can hear the translated audio through the open-ear speakers and even get a transcription of it.
