Strand imagines therapy sessions in which, instead of encouraging the patient to share details about stressful situations or moments of anxiety, the therapist already has a reading of their emotional state over the past week and can point out problem areas and dig into them.
Myopic
As good as Emteq’s smart glasses are, they’re going to have to compete with the big guns that already sell wearable technology and offer much broader use cases. People may not care to put on bulky glasses if all they can do is scan your face and look at your food. It’s not far-fetched to imagine these internal sensors being incorporated into something more functional, like Meta’s Ray-Ban smart glasses.
“It’s always been more or less the case with these types of products,” says McStay. “They usually start with health and then become much more marketing-oriented.”
Avijit Ghosh, an applied policy researcher at AI company Hugging Face, points out other ways that powerful people take advantage of unconventional access to people’s private lives. Governments in countries like Egypt are already doing things like infiltrating Grindr to arrest people for being gay. One can imagine the dystopian possibilities that could arise when malevolent powerful actors obtain data that records your every feeling.
“Where do we go from here?” asks Ghosh. “Emotion detection is going mainstream without any discussion of the traps it entails: stripping people of their ability to act and imposing a normative idea of what emotions should be. It’s a path to perdition.”
Nduka says he is well aware of how these types of narratives can come true.
“For those who are not as favored or privileged in their circumstances, technology should help them rise to a certain level,” Nduka acknowledges. “Yes, of course people who are already at a certain level can use it to exploit others. But the history of technology makes it quite clear that it provides opportunities to those who otherwise would not have them.”
The drive to quantify ourselves, though it has real value for health in various areas, can come with drawbacks.
“If it helps people understand themselves and it really works, great,” McStay concludes. “But in a world that is already so saturated with data and where so many profiles are being created, adding biometrics to the mix would be entering a very different world.”
Jodi Halpern, a bioethics researcher at UC Berkeley who is writing a book about empathy, says that even if this technology works as intended, people should be careful about how much of themselves they decide to offload onto their hardware.
“It’s very important to think about the opportunity cost, because we don’t have enough time or energy to develop in all directions,” Halpern explains. “We don’t want to outsource our self-awareness and self-empathy to tools. We want to develop them through our own conscious practices, which are difficult to earn and forge. It is about being with yourself, being emotionally present. These practices require a certain solitude and, often, a break from technology.”
Article originally published in WIRED. Adapted by Andrea Baranenko.