Google has launched a beta version of the Google Lens Multisearch feature at its Search On event. While it is still a US-only beta feature, it’s rolling out in the Google app on iOS and Android. Multisearch lets users combine text and images in a single query, using an image to help explain what they are looking for.
For instance, say you come across a dress that you like but would prefer in green rather than the yellow it comes in. You can upload that picture and tell Google to search for “green”. The same goes for a plant you want to research but don’t know the name of: take a picture and add “care instructions” to find out how it should be treated.
While the feature is mostly aimed at shopping initially, it’s not limited to such queries. Google Search product manager Belinda Zeng and the company’s search director Lou Wang suggest it could do a lot more than that. “You could imagine you have something broken in front of you, don’t have the words to describe it, but you want to fix it… you can just type ‘how to fix,’” says Wang.
Zeng adds that, in fact, it might already work with some broken bicycles. She also learned about nail styling by taking a screenshot of beautiful nails on Instagram and typing the keyword “tutorial,” which surfaced the kind of video results that wouldn’t normally pop up on social media.
Google has described the new tool as “an entirely new way to search”. The company has confirmed that the creation of the new tool is part of an attempt to get people to “go beyond the search box and ask questions about what you see”. “We want to help people understand questions naturally,” says Wang, explaining how Multisearch will expand to more videos, images in general, and even the kinds of answers you might find in a traditional Google text search.
The new feature is expected to be powered by updates to Google’s artificial intelligence, which the company says will make it easier for people to find what they are looking for. Multisearch marks Google’s latest effort to make search more flexible and less bound to words on a screen; Google has long offered an image search engine.
In the future, the feature could be improved by “MUM”, or Multitask Unified Model, a new technology that Google says will make searching much easier. Google is hoping AI models can drive a new era of search, and there are big open questions about whether context — and not just text — can take it there. This experiment seems limited enough (it doesn’t even use its latest MUM AI models) that it probably won’t give us the answer. But it does seem like a neat trick that could go fascinating places if it became a core Google Search feature.
However, it won’t work with everything, just as your voice assistant doesn’t work with everything. There are infinite possible requests, and the tech giant is still trying to figure out user intent, according to The Verge. Should the system pay more attention to the picture or your text search if they seem to contradict each other? Good question. For now, you do have one additional bit of control: if you’d rather match a pattern, such as the leafy print on a notebook, get up close so that Lens can’t see it’s a notebook. Remember, Google Lens is trying to recognize your image: if it thinks you want more notebooks, you might have to tell it that you actually don’t.
For now, the Google Lens Multisearch feature is rolling out in beta only, in the iOS and Android versions of the Google app, and is available to users in the US only. To use Multisearch in the app, tap the camera icon on the right side of the search bar, which pulls up Google Lens. Take or upload a picture, then tap the little bar containing a plus sign and the phrase “add to your search.” This lets you type words to better explain what you want.