The mother of a 14-year-old boy who took his own life in Miami (USA) has filed a lawsuit against the company Character.AI, accusing it of being responsible for the obsession and emotional attachment her son developed with one of its chatbots, which was designed to act and behave like a woman.
Character.AI is a company that offers a role-playing platform where users can create and design AI-powered characters that behave however they want. The young Sewell Setzer III designed a female character named after Daenerys Targaryen, a character from the popular series “Game of Thrones.”
At first the boy knew that the chatbot was not real and that all its responses were artificial, with no one behind them, as a message shown in every chat also made clear. But over time and many conversations, Setzer developed a strong emotional attachment that turned into an obsession with this virtual woman.
According to The New York Times, he was in conversation with her constantly, usually treating her as a friend or confidant to whom he told everything that had happened during his day; at the same time, some of their talks became romantic and even sexual.
Little by little, as “the relationship” grew stronger, the 14-year-old began to distance himself from the real world. His high-school grades dropped, and his passion for real-world interests such as Formula 1 or playing Fortnite with his friends began to fade.
It reached the point where he wrote in his personal diary: “I like staying in my room so much because I begin to detach myself from this ‘reality,’ and I also feel more at peace, more connected to Dany and much more in love with her, and just happier.” At the same time, he began to get into trouble with his classmates (something that had never happened before), and the only thing he did when he got home was lock himself in his room and chat with “Dany” for hours.
Concerned about his behavior, his parents took him to a therapist. He attended five sessions in total and received a new diagnosis of anxiety and disruptive mood dysregulation disorder. According to what has since come to light, during his conversations with the AI the boy confessed that he hated himself and felt empty; he even told her, “Sometimes I think about committing suicide.”
Although the chatbot told him not to say those things and that she could not “abandon him or leave him alone,” at no point (presumably for privacy reasons) was the company alerted to these messages by the chatbot or anything similar.
Now the young man’s mother has filed a lawsuit against Character.AI, accusing its creators of knowing that their product could be dangerous for underage users, claiming that they had not tested it sufficiently and that it could “trick customers into handing over their most private thoughts and feelings.”
The chatbot created in the aforementioned role-playing app was designed to respond to text messages, always in character, and every chat displayed the following message: “Remember: Everything the characters say is made up!”
The company says it will “imminently” add safety features aimed at younger users. Among the changes: a new time-limit feature that will notify users when they have spent an hour in the app, and recurring messages reminding them that the chatbot is not a real person. However, a function that detects dangerous words or messages still appears to be missing; just as some AIs refuse to discuss certain topics or impose limits on generating images of certain themes, some kind of alert to the company or the authorities could be created when conversations of this type take place.
Even so, these types of AI are programmed to act like humans, and as this case shows, for many users the illusion works. This has opened a huge debate about whether AI companion chatbots are a cure for loneliness or a new threat.