Google today announced that ARCore 1.0 is out of preview, letting developers publish AR apps on the Google Play Store that can understand your environment and place virtual objects in it. According to VentureBeat, the company also announced plans to expand the Google Lens feature to Android and iOS devices.
For those who aren’t familiar, ARCore is an Android software development kit (SDK), similar to ARKit for iOS, that brings AR capabilities to existing and future Android phones without requiring additional sensors or hardware. Google is now partnering with Android manufacturers including Samsung, Huawei, Motorola, Sony, and Xiaomi to ensure that their devices launching this year will support ARCore.
Meanwhile, Google Lens, a feature currently available on Google Pixel devices that uses computer vision to quickly recognize objects, businesses, and other things around you, is set to roll out to both Google Assistant and Google Photos in the next few weeks.
Google Photos for Android and iOS will get Lens built in, so that when you take a picture, you get more information about what’s in your photo. The feature will be limited to English-language users; Android users will need the latest version of the app, while iOS users will also need iOS 9 or newer.
Google Assistant for Android will get live camera-based Lens functionality on “compatible flagship devices” from Samsung, Huawei, LG, Motorola, Sony, and HMD/Nokia.
To find out more about ARCore 1.0, hit up this link.