For several years now, mobile phones have been called smartphones because of the many capabilities and functions they offer. Even so, these devices have always acted only on the commands and gestures their users give them.
But with the arrival of Artificial Intelligence, the concept of a smart device is about to change, and Google's Project Astra is one of the first examples to demonstrate it.
The company has shown some of the capabilities its universal AI assistant prototype will offer, whether integrated into a smartphone or into smart glasses: it will be able to answer any user question while taking into account the surrounding environment and the user's actions in real time.
In the case of Project Astra on a Pixel smartphone, the user interacts with the assistant through voice commands and the camera, so it can understand both real-time images of the environment and the comments and requests the user makes out loud.
To do this, the user only has to point the camera at whatever they want information about, for example a monument on the street, a plate of food or a household appliance, and ask related questions at the same time. The assistant can then offer relevant information about the monument, detail the ingredients used to prepare the dish, or explain which washing machine program is best suited for a particular item of clothing.
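Although Project Astra itself is not publicly available, the underlying pattern, asking a spoken or written question about a camera frame, can be approximated with Google's publicly available Gemini API, which underpins Astra. The snippet below is a minimal sketch under stated assumptions: a hypothetical still frame saved as frame.jpg and a placeholder API key, whereas the real assistant works continuously on live video and audio rather than single snapshots.

```python
# Minimal sketch: ask a question about a single camera frame using the
# Gemini API (google-generativeai). Project Astra is not a public API;
# this only illustrates the image-plus-question pattern described above.
import google.generativeai as genai
from PIL import Image

genai.configure(api_key="YOUR_API_KEY")  # placeholder key (assumption)
model = genai.GenerativeModel("gemini-1.5-flash")

frame = Image.open("frame.jpg")  # hypothetical snapshot, e.g. a street monument
question = "What monument is this, and can you give me a short history of it?"

# The model receives the image and the question together, mirroring the
# camera-plus-voice interaction the article describes.
response = model.generate_content([frame, question])
print(response.text)
```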
The assistant's responses are displayed both as text on the smartphone screen and as voice replies, allowing a fluid conversation with the user.
Likewise, the assistant can also perform searches or make recommendations related to, for example, a list of restaurants written down on a sheet of paper, simply by pointing the camera at that list, and it can remember information to offer it back to the user quickly and easily later on.