Google Search Adds New AI-Based Tools for Enhanced User Experience
Google Search has added several new AI-based tools to enhance user experience.
During its Search On virtual event on October 15 (via VentureBeat), Google announced that it’s bringing in a host of improvements that include better understanding of spelling inputs from users, indexing individual passages from webpages, dividing broader searches into subtopics, and dividing videos into segments.
Google has introduced a new spelling algorithm to improve the search engine’s understanding of misspelled words and reduce mistakes. According to Google, the spelling algorithm uses a “deep neural net” to better recognise wrong spellings and find the right results within three milliseconds. The California-based software giant claims that the new AI spelling update “makes a greater improvement to spelling than all of our improvements over the last five years.”
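Google has not published details of its neural speller, but the basic idea of matching a misspelled query term against known vocabulary can be sketched with simple edit-distance matching. The vocabulary and threshold below are illustrative assumptions, not anything Google has disclosed:

```python
import difflib

# Toy vocabulary standing in for a query-term index (illustrative only;
# Google's actual speller is a deep neural net and is not public).
VOCAB = ["exercise", "equipment", "augmented", "reality", "reservation"]

def correct(word: str, vocab=VOCAB) -> str:
    """Return the closest known word, or the input unchanged if nothing is close."""
    matches = difflib.get_close_matches(word.lower(), vocab, n=1, cutoff=0.7)
    return matches[0] if matches else word

print(correct("exersise"))   # → exercise
print(correct("equipmant"))  # → equipment
```

A production system would rank candidates with a learned model over far more context than a single word; this sketch only shows the candidate-matching step.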
Google Search will also now be able to index specific passages from webpages rather than just ranking entire pages against a query. If a user searches, “How does augmented reality work?” Google will now surface the specific section of a webpage that addresses that exact query, rather than return results based only on entire webpages about AR.
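Google has not described its passage-ranking model, but the general shape of the idea can be illustrated with a toy scorer that splits a page into paragraphs and picks the one with the most query-term overlap. The example page and scoring function are assumptions for illustration only:

```python
import re

def tokens(text: str) -> set:
    """Lowercased word tokens, punctuation stripped."""
    return set(re.findall(r"[a-z]+", text.lower()))

def best_passage(page_text: str, query: str) -> str:
    """Return the paragraph sharing the most terms with the query.
    A toy stand-in for passage indexing; real ranking is far richer."""
    q = tokens(query)
    passages = [p.strip() for p in page_text.split("\n\n") if p.strip()]
    return max(passages, key=lambda p: len(q & tokens(p)))

page = (
    "Augmented reality has a long history in research labs.\n\n"
    "Here is how augmented reality works: software overlays digital "
    "imagery on a live camera feed.\n\n"
    "Many phones now ship with AR toolkits."
)
print(best_passage(page, "how does augmented reality work"))
```

The point of the sketch is the unit of retrieval: the scorer returns one passage, not the whole page, which mirrors the behaviour the update describes.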
In another new feature, Google Search now surfaces subtopics based on the user’s query. For instance, if someone searches “home exercise equipment,” Google will show subtopics such as budget equipment, premium equipment, or small space ideas.
Google Search has also added a Live View feature that provides essential information about a business before you visit it in person. The update will show you how busy a business is right now to help you maintain social distancing easily. COVID-19 safety information will also be shown on Business Profiles across Google Search and Maps, letting you know if a business requires you to wear a mask or make reservations in advance.
Finally, users can now hum, whistle, or sing a melody to Google via the mobile app by tapping the mic icon and saying, “What’s this song?” or by tapping the “Search a song” button. Humming for 10-15 seconds gives Google’s machine learning algorithm enough signal to match the song. The feature is currently available in English on iOS, and in around 20 languages on Android, with more languages coming to both platforms in the future, Google said.
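Google says the feature reduces a hum to a number-based sequence representing the melody’s shape. One classic way to do that, shown here purely as an illustrative sketch (the song database and pitch values are made up, and Google’s actual model is not public), is a Parsons-code contour: each note is reduced to up, down, or repeat relative to the previous one, so an off-key hum can still match:

```python
import difflib

def contour(pitches: list) -> str:
    """Reduce a pitch sequence to its Parsons-code contour (U/D/R)."""
    return "".join(
        "U" if b > a else "D" if b < a else "R"
        for a, b in zip(pitches, pitches[1:])
    )

# Tiny illustrative database of melodies as MIDI pitch numbers.
SONGS = {
    "Twinkle Twinkle": [60, 60, 67, 67, 69, 69, 67],
    "Ode to Joy":      [64, 64, 65, 67, 67, 65, 64],
}

def match(hummed_pitches: list) -> str:
    """Return the song whose contour is most similar to the hum."""
    hum = contour(hummed_pitches)
    return max(SONGS, key=lambda name: difflib.SequenceMatcher(
        None, hum, contour(SONGS[name])).ratio())

# A hum a whole step off-key still matches, because only the shape counts.
print(match([62, 62, 68, 68, 71, 71, 69]))  # → Twinkle Twinkle
```

Matching on contour rather than absolute pitch is what lets the feature tolerate out-of-tune humming; a real system would also extract pitches from raw audio, which this sketch skips.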