Google announced several new features for its services, moving toward the use of Artificial Intelligence (AI) in more contextual and image-based searches.
The company plans to use AI to offer more complex answers. For queries that have what Google calls NORA (No One Right Answer), the search engine will return a richer, multi-part response instead of a single result.
Users will also receive links and follow-up questions so they can explore the researched topic in more depth. There is still no date for the feature to reach the search engine.
Multisearch launched globally
Introduced last year, the Multisearch tool will be available to the public starting this Wednesday (8), in all countries and languages where Google Lens is supported.
Multisearch lets the user capture an image with Lens and then type something specific. For example, taking a photo of a dress and typing “green” returns options for that dress in green. Another novelty presented by Google is the option to add “near me” to a search to find local businesses.
Search through Google Lens
According to Google, “in the coming months” users will be able to search based on videos or images displayed on their phone screen, without needing to switch applications.