At Google I/O, Google announced new Search features that use Augmented Reality (AR) and Google Lens.
The features combine computer vision, the camera, and AR to overlay content and information onto one's surroundings.
Soon, users will be able to view and interact with 3D objects as AR features roll out in search results. Users will be able to search for a 3D model on their smartphone and place it in their physical surroundings using the viewfinder.
Have a look at this example, which shows how you can view an object in 3D and AR.
This technology has several practical uses. For instance, retailers can use AR in their apps to give users a feel for a product before buying it.
Wayfair, a well-known retailer and one of several partners working directly with Google, is leaving no stone unturned to have its 3D models rank higher in search results.
Other partners are:
Google Lens features
Google Lens, accessible from the search bar in Google's mobile app, is getting an upgrade to provide more visual answers to visual questions.
A few updates to Google Lens that will roll out soon are:
Automatic translations: Point Google Lens at text to translate it into more than 100 languages.
Restaurant menu searches: Point Google Lens at a menu to get information about dishes, such as photos and reviews.
Text-to-speech: Have text read out loud to you by pointing Google Lens at it.
Google will release the text-to-speech feature in the Google Go app before bringing it to the main app.