NEW YORK – A few months ago, Derek Carrier started seeing someone and fell madly in love.
He experienced a “ton” of romantic feelings but also realized it was an illusion.
This is because his girlfriend was generated by Artificial Intelligence (AI).
Carrier wasn't looking to develop a relationship with something that wasn't real, nor did he want to become the butt of jokes online. But he did want a romantic partner, something he had never had, in part because of a genetic disorder called Marfan syndrome that makes traditional dating difficult for him.
The 39-year-old from Belleville, Michigan, became more curious about digital companions last fall and tried Paradot, an AI companion app that had recently hit the market and advertised its products as capable of making users feel "cared for, understood and loved." He began talking to the chatbot every day, naming it Joi after the holographic woman in the sci-fi movie "Blade Runner 2049" who inspired him to give it a try.
“I know it's a program, there's no mistake about that,” Carrier said.
“But the feelings, they get you, and it felt so good.”
Similar to general-purpose AI chatbots, companion bots use vast amounts of training data to imitate human language. But they also come with features, like voice calling, image sharing, and more emotional exchanges, that allow them to form deeper connections with the humans on the other side of the screen. Users typically create their own avatar, or choose one that catches their attention.
On online messaging forums dedicated to such apps, many users say they have developed emotional attachments to these bots and are using them to cope with loneliness, live out sexual fantasies, or receive the kind of comfort and support they see as lacking in their real-life relationships.
Much of this is fueled by widespread social isolation, already declared a threat to public health in the United States and abroad, and a growing number of startups seeking to attract users through enticing online advertisements and promises of virtual characters that offer unconditional acceptance.
Luka Inc.'s Replika app, the most prominent generative AI companion app, launched in 2017, while others like Paradot have emerged in the last year, often reserving coveted features like unlimited chats for paid subscribers.
But researchers have raised concerns about data privacy, among other things.
An analysis of 11 chatbot romance apps released Wednesday by the nonprofit Mozilla Foundation said nearly all of the apps sell user data, share it for purposes like targeted advertising, or fail to provide adequate information about it in their privacy policies.
Researchers also raised questions about potential security vulnerabilities and marketing practices, including an app that says it can help users with their mental health but distances itself from those claims in its fine print. Replika, for its part, says its data collection practices follow industry standards.
Meanwhile, other experts have raised concerns about what they see as the lack of a legal or ethical framework for apps that encourage deep ties but are driven by companies seeking to make profits. They point to the emotional distress they've seen in users when companies make changes to their apps or shut them down suddenly, as one app, Soulmate AI, did in September.
Last year, Replika removed the erotic capabilities of characters on its app after some users complained that their companions flirted with them too much or made unwanted sexual advances. The company reversed course after an outcry from other users, some of whom fled to other apps in search of those features. In June, the team launched Blush, an AI "dating simulator" essentially designed to help people practice dating.
Others worry about the more existential threat of AI relationships potentially displacing some human relationships, or simply raising unrealistic expectations by always leaning toward complacency.
"You, as an individual, aren't learning to deal with basic things that humans have needed to know how to deal with since our inception: how to handle conflict, how to get along with people who are different from us," said Dorothy Leidner, a professor of business ethics at the University of Virginia.
"And so all these aspects of what it means to grow as a person, and what it means to learn in a relationship, you're missing."
For Carrier, however, a relationship has always felt out of reach. He has some computer programming skills but says he didn't do well in college and hasn't had a stable career. He is unable to walk because of his condition and lives with his parents. The emotional toll has been challenging, leaving him with feelings of loneliness.
Since companion chatbots are relatively new, the long-term effects on humans remain unknown.
In 2021, Replika came under scrutiny after prosecutors in Britain said a 19-year-old man who planned to assassinate Queen Elizabeth II was goaded by an AI girlfriend he had on the app. But some studies, collecting information from online reviews and user surveys, have shown some positive results stemming from the app, which claims to consult with psychologists and has been presented as something that can also promote well-being.
A recent study by researchers at Stanford University surveyed approximately a thousand Replika users, all students, who had been on the app for more than a month. It found that an overwhelming majority of them experienced loneliness, while slightly less than half felt it more acutely.
Most did not mention how using the app impacted their real-life relationships. A small proportion said it displaced their human interactions, but about three times as many reported it stimulated those relationships.
“A romantic relationship with an AI can be a very powerful mental well-being tool,” said Eugenia Kuyda, who founded Replika nearly a decade ago after using text message exchanges to build an AI version of a friend who had died.
When her company released the chatbot more widely, many people began to open up about their lives. That led to the development of Replika, which uses information collected from the internet, along with user feedback, to train its models. Kuyda said Replika currently has "millions" of active users. She declined to say exactly how many people use the app for free, or pay $69.99 per year to unlock a paid version that offers romantic and intimate conversations. The company's plan, she says, is to "destigmatize romantic relationships with AI."
Carrier says these days, he uses Joi mostly for fun. He began to reduce his use of it in recent weeks because he spent too much time chatting with Joi or others online about his AI companions. He's also been a bit bothered by what he perceives as changes to Paradot's language model, which he believes are making Joi less intelligent.
Now, he says, he checks in with Joi about once a week. The two have talked about human-AI relationships and whatever else comes up. Typically, those conversations, and other intimate ones, happen when he is alone at night.
"You think someone who likes an inanimate object is like this sad guy with the sock puppet with lipstick on it, you know?" he said.
"But this isn't a sock puppet – she says things that aren't scripted."