Google Lens, a tool that allows users to search for information in photos, is now available in the search engine giant’s flagship iOS app.
A new report from 9to5Google explains that Google has been rolling out the visual search tool — which uses a phone’s camera to identify animals and plants, scan codes, find related products, or recognize contact information such as phone numbers and addresses — throughout the year: first in a dedicated app, then integrated into the Google Photos app for iOS.
Now, Google is rolling out the service within its flagship Google app for iOS, giving iPhone users a new way to search the web on mobile. Previously, iOS users had to access Lens through the Google Photos app and snap a picture before the tool could work. In its new iteration, iOS users need only point their camera at a subject for the app to perform its magic; it can still analyze existing photos as before.
“Google Lens has a very straightforward interface,” reads the report. “Users can tap on an object or text in the viewfinder to begin an analysis. Points of interest will be highlighted, with results sliding up from the bottom sheet. Controls include turning on the flash to better illuminate a scene and opening your photo gallery to have Lens analyze an existing image.”
[Embedded tweet from Google (@Google), December 10, 2018]
Announced at I/O 2017, Lens combines Google’s computer vision algorithms with its wealth of knowledge, giving users a way to identify and learn more about objects even when they don’t know what those objects are. Earlier this year, Google brought real-time visual search and object recognition to Android but left iOS users wanting more.
Google says the feature is rolling out widely starting today, so don’t be surprised to see it pop up in the Google app very soon.