At its annual developer event today, Snap announced a series of updates for the Snapchat app, including Lens voice search, a machine learning model update to Lens Studio, and a geography-specific AR system that will turn public Snaps into spatial data (via TechCrunch).
The company said that it will soon roll out the ability to sort through Lenses via voice, a feature that could help Snapchat transition from entertainment-only AR to a platform based around utility.
The new voice search will allow Snapchat users to ask the app to surface Lenses that do something specific. For example, “Hey Snapchat, make my hair pink.”
Snap also announced a new Local Lenses feature that will allow Snapchat developers to create geography-specific Lenses that interact with a wider swath of physical locations.
“Today, augmented reality is changing how we talk with our friends,” Snap co-founder and CTO Bobby Murphy said in a press briefing. “But in the future, we’ll use it to see the world in all-new ways.”
“[W]e’re taking it a step further, by enabling shared and persistent augmented reality in much larger areas, so you can experience AR with your friends, at the same time, across entire city blocks,” Murphy said.
The company also revealed that it is partnering with AR startup Wannaby to give developers access to its foot-tracking tech, enabling Lenses that let users virtually try on sneakers.