Google will now let you search for things you can’t even describe

Google will now let people search for things that they can’t even describe.
The new feature, called Google Lens Multisearch, lets people combine an image with text to explain what they are looking for.
If someone has a picture of a dress they like in yellow but want it in green, for instance, they can upload that picture and tell Google to look for “green”. If they have a plant they want to know how to care for but don’t know its name, they can take a picture of it and add “care instructions” to find out how it should be treated.
Google describes the tool as “an entirely new way to search”. It said that it is part of an attempt to get people to “go beyond the search box and ask questions about what you see”.
The feature is powered by Google’s latest artificial intelligence, which the company says will make it easier for people to find what they are looking for. In the future, the feature could be improved by “MUM”, or Multitask Unified Model, a new technology that Google says will make searching much easier.
Google has been working on MUM for some time, and announced a number of changes built around it at a “Search On” event last year.
The feature is rolling out in beta now, and can be found in the iOS and Android versions of the app. It will only be available to users in the US.
To find it, users can just open up the app, where they will be presented with the option to use their camera or voice as well as text.