Google recently revealed a new version of Google Lens for Android mobile devices at a demonstration in Paris. Google will make it possible for anyone using Lens on Android to search everything that is shown on the screen “in the coming months.” During the presentation, Google stated: “If you can see it, you can search it.”
Users will be able to search for a building’s name, a recipe, a car model, or any other on-screen image that might contain searchable information. You won’t have to leave the screen to run a Lens search, and it will work across apps and websites.
At the same event, Google unveiled an update to its Multisearch tool, which lets users run an image search and refine the results with text. You can now add text to an image to describe the kind of results you want.
In the tweet above, Google shows how you can run an image search, use Lens to narrow the results, and then add text, such as a different style, to refine them further.
Google also used the presentation to debut Bard, a ChatGPT competitor that will soon be integrated into Search. Microsoft, for its part, recently revealed AI-powered improvements for Bing and the Edge browser.