Google has announced several new features for its search engine as well as for Google Maps at this year’s ‘Search On’ event. The event, which took place on October 15, focused on under-the-hood updates that the company is rolling out across its G Suite platforms. One of the notable features is a set of new indicators on Google Maps aimed at showing how busy a location is amid the COVID-19 pandemic. Additionally, Google Search is getting improvements that let it display more accurate results even when there’s a typo in the search query. There’s also a ‘hum to search’ feature in the pipeline.
Starting with the indicators on Google Maps, the company says the platform has been helping users identify congested areas in light of the coronavirus pandemic. In the coming weeks, Google Maps will put indicators such as ‘Usually as busy as it gets’ and ‘Usually not too busy’ under location names to help users make more informed decisions and maintain social distancing. Explaining the feature, Google said, “To calculate busyness insights, we analyse aggregated and anonymised location history data from people who have opted to turn this setting on from their Google Account. This data is instrumental in calculating how busy a place typically is for every hour of the week.”
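To illustrate the idea behind hour-of-week busyness, here is a minimal toy sketch: it buckets anonymised visit timestamps into the 168 hours of a week and labels the current hour relative to the location’s peak. All function names and thresholds are hypothetical assumptions for illustration, not Google’s actual pipeline.

```python
from collections import Counter
from datetime import datetime

def hour_of_week(ts: datetime) -> int:
    """Map a timestamp to one of 168 hour-of-week buckets (0 = Monday 00:00)."""
    return ts.weekday() * 24 + ts.hour

def busyness_label(visits: list[datetime], now: datetime) -> str:
    """Toy busyness indicator: compare the current hour's visit count
    against the busiest hour of the week for this place.
    Thresholds (0.9 and 0.4) are arbitrary illustrative choices."""
    counts = Counter(hour_of_week(t) for t in visits)
    peak = max(counts.values(), default=0)
    if peak == 0:
        return "No data"
    ratio = counts[hour_of_week(now)] / peak
    if ratio >= 0.9:
        return "Usually as busy as it gets"
    if ratio <= 0.4:
        return "Usually not too busy"
    return "Usually a little busy"
```

A real system would of course aggregate across many users with privacy protections (for example, noise and minimum-count thresholds) before producing any per-place statistic; the sketch only shows the hour-of-week bucketing idea.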
Coming to Google Search, the company claims that one in 10 queries on the platform every day is misspelt. To tackle this issue, Google Search is leveraging a new spelling algorithm that uses a “deep neural net” to improve its ability to decipher misspellings. Additionally, Google is applying “neural nets” to understand subtopics, which will surface more specific results even for a broad query. For instance, if a user searches “home exercise equipment,” the platform will use AI to identify relevant subtopics, such as budget equipment, premium picks, or small space ideas, and show a wider range of content. This feature will roll out later this year.
Google Search is also aiming to make video search results more accurate by displaying chapter-like moments in results. This essentially means that when searching for a video on Google, the AI will pinpoint the moments within the video that are most relevant to the user. Additionally, users will soon be able to ‘hum to search’ in case they cannot remember the lyrics of the song they’re searching for. The availability details of this feature remain unclear.