Microsoft’s Bing has joined Google, Pinterest and a few other tech companies in upgrading its image search capabilities. While Google remains the dominant search engine, the competition has just dropped a pretty impressive new feature: Bing now lets its users search for images within images. This essentially means the company is expanding its image search toolset with a new product that lets users search for specific items shown within a larger image. The new tool, called Bing Visual Search, is part of the search engine’s existing image search tools, appears to be quite easy to use and performs impressively too.
The new feature is available now on Bing.com (on both PCs and smartphones) and in the Bing mobile app. Bing is also making visual search available to developers via its image search APIs. The search engine has said that its new visual search tool will work with existing internet images as well as new user-taken photos. An initial check showed that the new feature was really impressive at finding certain objects on the web, but it still could not recognise a few items. Some of Bing’s rivals have also released or announced similar ‘search within an image’ tools recently. Here’s a look at some new search engine features you may not know about:
Google ‘Similar Items’
Google’s image search results on the mobile web and in the Android Search app now show “similar items” – so, if you are looking at ‘lifestyle’ images and click on one that you like, Google may show you additional product images from places where you can buy the product. Julia E, the product manager on Google Image Search, announced on the Google search blog that you need to use schema.org product metadata on your pages and schema.org/Product markup to make sure your products are eligible for inclusion in these image results. The scheme was actually introduced last December, but Google never announced it.
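To give a sense of what that markup looks like, here is a minimal sketch of a product page snippet using schema.org/Product microdata; the product name, image URL and price below are placeholder values, not taken from Google’s announcement:

```html
<!-- Hypothetical product snippet marked up with schema.org/Product microdata.
     All values (name, image URL, price) are placeholders for illustration. -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Leather Handbag</span>
  <img itemprop="image" src="https://example.com/handbag.jpg" alt="Leather handbag">
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <meta itemprop="priceCurrency" content="USD">
    <span itemprop="price" content="89.00">$89.00</span>
    <link itemprop="availability" href="http://schema.org/InStock">In stock
  </div>
</div>
```

Pages carrying this kind of structured metadata are what Google says it draws on when deciding whether a product image can appear in the “similar items” carousel.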
At its recent I/O 2017 keynote, Google touted its advancements in machine learning and artificial intelligence – one of which is a new visual search tool called Google Lens. Google CEO Sundar Pichai described Lens as “a set of vision-based computing capabilities that can understand what you’re looking at and help you take action.” The tool, he said, will initially be available in Google Assistant and Google Photos (both of which received several other updates, too).
At first glance, Lens is reminiscent of a 2009 technology called Google Goggles, which offered the ability to do searches based on a smartphone photo. But Goggles was mostly just for identifying something; Lens not only identifies what’s in the photo but also gives added context – for example, a restaurant’s phone number, its rating score and so forth.
Pinterest: Phone camera becomes search bar
The social-network-slash-search-engine introduced Lens, a mobile app feature that uses the phone’s camera to recognise physical objects and pull up related items from Pinterest. It appears to be Pinterest’s version of Google’s neglected Goggles app. (Bing also has a similar feature built into its app, added last year.) Based on some GIFs provided by Pinterest, Lens focuses on objects appearing within a circular window – likely to cut down on visual noise – and then presents tags that it associates with the object, as well as relevant pins from its platform.
Instagram: Shoppable tags
Instagram wants its users to shop without always having to interrupt their scrolling with a browser window. That’s why it started showing shoppable tags on photos from 20 retail brands like Kate Spade and JackThreads to iOS users in the US. The retailers tag products in their photos, which are hidden behind a ‘Tap to view products’ button. After selecting their product of choice, users see an in-app details page with a specific product’s price, description, additional photos, and a ‘Shop Now’ button to buy it on the web.