Google will make your searches less reliant on keywords and more based on images and questions. On Wednesday (28), at its annual Search On event, the company showed some of the tools that will be available. They should arrive soon, both in the search engine and in apps like Google Lens and Google Maps.
Google executives who spoke to The Verge say that the internet has changed a lot in recent decades, and that search can no longer depend on the user typing exactly the right terms.
In fact, advances in artificial intelligence make this less and less necessary, allowing the language people use in their questions to be processed more accurately.
Google is also seeing the emergence of some unlikely competitors. A portion of young people and teenagers say they search on Instagram and TikTok. Social networks tend to be the most engaging sources for topics like cooking, fashion and tips, where the best answers often come from other users.
Multisearch is like pointing and asking, says Google
One of the new features is called Multisearch. With it, you can use images and words in the same search.
Google gave two examples: taking a picture of a shirt and typing “tie” to find ties with that pattern, and choosing an image of a green dress and typing “purple” to find it in that color.
The company says this works “the way you point at something when you want to ask a question about what you’re seeing.”
Multisearch is already available in English and should arrive in Brazil, in Portuguese, in the coming months. The feature will eventually support more than 70 languages.
Another way to use Multisearch is to choose an image, or take a photo, and then type the term "near me". Google then shows where to find that dish or product in your area.
This function is not yet available; it should arrive in the US later this year.
Google Lens Translator gets prettier
One of the most interesting features of Google Lens is translating text captured by the camera. The tool delivers what it promises and you can understand what is written, but the result is not always pretty, since the app blurs the background image to overlay the translation.
From now on, the app will erase the original text using artificial intelligence and overlay the translation on the image, as if the photo had been taken with the translated text already in place.
Responses generated by other users
Even with so many new features, some people still prefer to search by typing. Google will now suggest questions that may be relevant to the term being typed.
In the example shared by the company, when typing “Mexico”, the search engine can suggest “best cities in Mexico for families”.
By selecting a city, you will be able to see short videos posted by other users, as well as relevant information such as transportation and activities.
These features will roll out in the US, in English, in the coming months.
Google Maps will also show the “vibe” of a particular neighborhood or region with community-generated information such as photos, tips and recommendations. The app will gain this option in the coming months, both on Android and iOS.
These tools seem to be a response to the aforementioned preference of younger people for Instagram and TikTok.
Those who use these networks are not necessarily looking for precise information; they want to explore suggestions, discover interesting things and see what other users recommend.
With these options, Google wants to serve these users and win back the small portion of the audience that currently turns to social media platforms to search.