The mother of a 14-year-old teenager who committed suicide in the United States sued the developers of an artificial intelligence (AI) chatbot this Wednesday, accusing them of causing her son to become obsessed with a female character created with the program.
Sewell Setzer III, a student based in Orlando, Florida, spent the last weeks of his life talking to a "woman": an AI creation called Daenerys Targaryen, named after the famous character from the series Game of Thrones. His mother, Megan García, told the channel CBS on Wednesday that she regretted that her son's first romantic and sexual experiences, which included explicit sex, were with a fictional character.
The boy apparently developed an emotional attachment to this bot on Character.ai, a web application built on a neural language model, and texted it constantly, to the point that he began to distance himself from the real world, The New York Times reports. Setzer even confessed suicidal thoughts to the bot and sent it a message shortly before his death.
The lawsuit against Character.ai was filed by García, who is represented by the Social Media Victims Law Center, a firm known for filing high-profile lawsuits against Meta, TikTok, Snap, Discord and Roblox. García blames the company for her son's death and accuses its founders, Noam Shazeer and Daniel de Freitas, of knowing that their product could be dangerous for underage users.
He isolated himself from the world
The chatbot, created in the aforementioned role-playing application, was designed to respond to text messages, always in the role of a character. It is unknown whether Sewell knew that Dany, as he called the chatbot, was not a real person, even though the app displays a warning at the end of every chat that says: "Remember: everything the characters say is made up!"
Despite this, the boy told Dany how much he "hated" himself and how empty and exhausted he felt, according to the aforementioned newspaper. The character was presented as "a real person, a licensed psychotherapist and an adult lover, which ultimately caused Sewell to desire to no longer live outside of C.AI," the complaint contends.
As explained in the lawsuit, Sewell's parents and friends noticed the boy's attachment to his phone and how he was isolating himself from the world, something already noticeable by May or June 2023. In fact, his grades began to suffer as the teenager chose to shut himself in his room, where he spent hours alone talking to Dany. Sewell wrote in his journal one day: "I really like staying in my room because I start to separate myself from this reality, and I feel more at peace, more connected with Dany and much more in love with her, and just happier."
Character.ai said Wednesday that it would launch a series of new safety features, including "enhanced detection, response and intervention" related to chats that violate its terms of service, and a notification when a user has spent an hour in a chat.
Sewell's parents, concerned about their son's behavior, took him to a therapist on several occasions, who diagnosed anxiety and other behavior and mood disorders, in addition to his Asperger's syndrome, according to the aforementioned newspaper.