From Apple to Meta, the tech giants are racing to dominate artificial intelligence, busily experimenting with and launching new generative AI features. In step with this trend, Google has introduced a line of new AI-powered features across its services. The most prominent addition is Google Lens's ability to recognise skin conditions. Alongside Lens, Google Maps, Google Search, and Gmail are all set to roll out new AI-driven features.
Google Lens
With the new feature, users will be able to search for skin conditions that look similar to what they see on their own skin. After taking a picture of the skin and cropping to the affected area, they will see results in a carousel displayed above the visual matches. However, Google notes that the search results are for informational purposes only and not a diagnosis; users are advised to consult a medical professional for further advice.
The feature is not limited to patches of skin: it also works for a bump on the lip, hair loss on the head, or even a line on a nail. Google introduced it because an odd mole or rash can be difficult to describe in words alone, and a picture often explains more.
Google Maps
Google previewed Glanceable Directions earlier this year to bring real-time travel progress to the directions or route-overview screen and the lock screen. Without actually starting navigation, users will receive up-to-date ETAs, reroutes, and instructions on which way to turn, and the same information will appear on the lock screen of both Android and iPhone devices. Glanceable Directions are set to roll out this month for walking, cycling, and driving on both Android and iOS.
Immersive View, introduced at I/O 2022, lets users hover over a location they are researching and view it under various lighting and weather conditions. Amsterdam, Dublin, Florence, and Venice will receive it soon, and Google is expanding it to "over 500 iconic landmarks around the world, from Prague Castle to the Sydney Harbour Bridge." The Recents sidebar will now save the locations users are browsing rather than clearing them automatically.
Google Search
Search is gaining the Search Generative Experience (SGE), which provides the user with an AI-powered snapshot when they ask a detailed question about a place, such as a heritage site, a restaurant, or a café. The snapshot draws on information from the web, including user-submitted reviews, photos, and business-profile details.
Google Search: Shopping
When shopping online, the question of how a particular piece will look on you is always looming. Search's new virtual try-on tool is the answer. It uses generative AI to show how clothes will "drape, fold, cling, stretch, and form wrinkles and shadows on a diverse set of real models in various poses," all from a single clothing image. Google has selected models of different shapes and sizes, ranging from XXS to 4XL, who represent different skin tones on the Monk Skin Tone Scale as well as different body shapes, ethnicities, and hair types.
The feature will initially be available in the U.S. for women's tops from brands such as Anthropologie, Everlane, H&M, and Loft. The user selects the model who looks most similar to them, then finds and taps the "Try On" badge. According to reports, the feature will extend to men's clothing in the coming year.
Further, another Search feature will let users filter clothes by colour, style, or pattern. Google will use machine learning and "new visual matching algorithms" to bring this feature to different retailers and stores.
Gmail
With the public testing of generative AI features in Gmail and Google Docs, enrolled iOS and Android testers can try out the "Help me write" feature in Gmail. The tech giant has started rolling out Workspace Labs to a larger pool of beta testers, enabling users enrolled in the Workspace Labs program to use the tool.
Reports suggest that beta testers who compose emails on their phones will see an introductory prompt for the generative AI. After that, they can find the "Help me write" button in the bottom-right corner, enter their desired prompt, and tap the create button to see the feature at work.
The generative AI will offer an "Insert" or "Replace" option for the recommended text, and a "Refine" menu will give users further options: Formalise, Elaborate, Shorten, I'm Feeling Lucky, and Write A Draft.