Google will deploy its proprietary Multitask Unified Model (MUM) to make Search smarter and enrich the user experience. The California-based tech giant announced several updates, including a redesigned Search page that will use MUM and artificial intelligence for deeper results, during its Search On event on Wednesday.
Google has also announced Address Maker, which uses open-source Plus Codes to provide functioning addresses at scale. Google will also integrate Lens into the iOS app and update Chrome with Google Lens support.
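Plus Codes are built on Google's open-source Open Location Code system, which turns a latitude/longitude pair into a short, address-like string. As a rough illustration of the idea, the core pair-encoding step can be sketched in Python. This is a simplified version covering only standard 10-digit codes; the official open-source libraries also handle input clipping, other code lengths, and shortening codes against a reference location.

```python
# Simplified Open Location Code (Plus Code) encoder.
# Illustration only: handles standard 10-digit codes and assumes
# inputs are already within valid latitude/longitude ranges.

ALPHABET = "23456789CFGHJMPQRVWX"  # the 20-symbol OLC digit set

def encode_plus_code(lat: float, lng: float) -> str:
    # Shift coordinates so both values are non-negative.
    lat += 90.0
    lng += 180.0
    digits = []
    resolution = 20.0  # degrees covered by the first digit pair
    for _ in range(5):  # five pairs -> 10 code digits
        lat_digit = int(lat / resolution)
        lng_digit = int(lng / resolution)
        digits.append(ALPHABET[lat_digit])
        digits.append(ALPHABET[lng_digit])
        lat -= lat_digit * resolution
        lng -= lng_digit * resolution
        resolution /= 20.0  # each pair refines the grid cell 20x
    code = "".join(digits)
    return code[:8] + "+" + code[8:]  # '+' separates the final pair

# Example: coordinates in Zurich encode to "8FVC9G8F+6X".
print(encode_plus_code(47.365590, 8.524997))
```

Because each successive digit pair narrows the grid cell by a factor of 20, a 10-digit code pins a location down to roughly a 14 x 14 metre area, which is what lets Plus Codes serve as addresses for places without street addressing.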
MUM will allow users to find results with both text and visuals simultaneously. Through MUM, users can add text queries to Google Lens to search for visuals. “By combining images and text into a single query, we’re making it easier to search visually and express your questions in a more natural way,” Google said.
Google is still experimenting with the new search capabilities. The tech giant, however, said they would be rolled out to users early next year.
The Alphabet-owned company has also announced that the redesigned Search page would use MUM and AI for more natural results. The redesign will introduce a Things to Know section for deeper results on new topics. The section will display links to content that would otherwise not appear in regular search results.
The redesigned Google Search will also have features that enable users to broaden and refine their searches. While the company did not announce a specific timeline for the features’ release, it said users would start seeing the changes in the next few months.
Google Search will also display visually rich pages with images, articles, and videos available on a single page. This new page is already live.
For more in-depth results without multiple queries, Google will introduce a related-videos section that identifies related topics and links to them. “Using MUM, we can even show related topics that aren’t explicitly mentioned in the video, based on our advanced understanding of information in the video,” Google said.
The existing About This Result category will also be updated with additional insights such as source information. This update will be rolled out in the US in the coming weeks.
Google is also bringing Lens mode to its native iOS app, similar to the integration in its Android app. This update, which will initially be limited to the US, will enable users to search for shoppable visuals. The company will also introduce the Lens integration to the desktop Chrome browser for users across the globe within a few months.