Back in September at Meta Connect, the company previewed a new capability for its smart glasses lineup called Conversation Focus. The feature, which amplifies the voices of people around you, is now starting to roll out in the company’s latest batch of software updates.
When enabled, the feature is meant to make it easier to hear the people you’re speaking with in a crowded or otherwise noisy environment. “You’ll hear the amplified voice sound slightly brighter, which will help you distinguish the conversation from ambient background noise,” Meta explains. It can be enabled either via voice commands (“hey Meta, start Conversation Focus”) or by adding it as a dedicated “tap-and-hold” shortcut.
Meta is also adding a new multimodal AI feature for Spotify. With the update, users can ask their glasses to play music on Spotify that corresponds with what they’re looking at by saying “hey Meta, play a song to match this view.” Spotify will then start a playlist “based on your unique taste, customized for that specific moment.” For example, looking at holiday decor might trigger a similarly themed playlist, though it’s not clear how Meta and Spotify will translate more abstract scenes into themed playlists.
Both updates are starting to roll out now to Meta Ray-Ban glasses (both Gen 1 and Gen 2 models), as well as the Oakley Meta HSTN frames. The update will arrive first for those enrolled in Meta’s early access program, and will reach everyone else “gradually.”
Meta’s newest model of smart glasses, the Oakley Meta Vanguard shades, is also getting some new features in the latest software update. Meta is adding the option to trigger specific commands with a single word, rather than having to say “hey Meta.” For example, saying “photo” will be enough to snap a picture, and saying “video” will start a new recording. The company says the optional feature is meant to help athletes “save some breath” while on a run or bike ride.