As more people turn to Google Search to find information on a wide range of topics, the company says it aims to make the search for information more natural and intuitive than ever before. “It turns out that people have an insatiable curiosity about all kinds of information. Every day, 15% of the searches that we see on Google are ones that we have never seen before. The future of search won’t be just about matching the right keywords but about having the most helpful, visually rich, useful information available that matches just what the searchers want,” Pandu Nayak, vice president of search at Google, said at the Cannes Lions International Festival of Creativity, 2022.
According to Nayak, advances in artificial intelligence (AI) are playing a critical role in Search. “Imagine a future where you can search any way and anywhere and find helpful information about what you see, hear and experience in ways that are most intuitive to you, whether that’s with your voice or camera, by typing a query, or a combination. This is our vision for the future of Search. It’s one we have already taken a step towards,” he added.
Google claims that its ‘hum to search’ feature is being used more than 100 million times every month. Furthermore, Google Lens, which lets users search what they see using their camera right from the search bar, is now being used more than eight billion times a month.
“We have also launched Multisearch in the Google App. This allows one to take a picture and ask a question at the same time. Multisearch is available now in the US in English and we look forward to bringing it to other countries and languages,” Nayak stated.
The company also says it will soon take Multisearch even further by adding the ability to search locally. Multisearch Near Me will be available later this year in English globally and will come to more languages over time.
“The Multisearch technology that’s available today recognises and identifies objects captured in a single frame. But in the future, with an advancement that we are calling Scene Exploration, you will be able to use Multisearch to pan your camera, ask questions and instantly get information about multiple objects overlaid on the scene right in front of you,” Nayak explained.
It is critical that we continue to build for everyone, Nayak said. The company claims to have recently expanded Google Translate’s capabilities to 24 new languages. “We are working to make products like Google Assistant more helpful for those with speech impairments,” he added.
The company also aims to better reflect the world’s diversity in Google Search results. “We are using the Monk Skin Tone Scale, developed by Harvard professor Ellis Monk, to help us build more inclusive products across Google. In Search, we now offer skin tone filters in image results for beauty searches so people from all kinds of backgrounds can find more relevant information. In the coming months, we will also be developing a standard way for creators, brands and publishers to label their content with skin tone and hair type,” Nayak explained.
Towards the end, Nayak spoke about the importance of responsible data practices. “Our research has found that users are happy to share personal information with companies they trust as long as they know how it will be used and what they will get in return. Responsible data practices are a great step forward in building that trust. This involves clearly communicating why and how you are collecting data, showing the benefits people can expect, and giving people transparency and control in managing their data preferences,” he elaborated.