A new report sheds disturbing light on the use of generative artificial intelligence, which is now widespread in the videogame sector as well. Specifically, a so-called “AI girlfriend” service has been hacked, revealing that many users were using it to simulate child abuse.
What is an AI girlfriend? Exactly what it sounds like: a chatbot that pretends to be a virtual girlfriend. The outlet 404 Media reported that Muah.ai, a site that lets users create chatbots to exchange explicit messages with, suffered a massive data theft that exposed, among other things, a large number of user prompts. That is how the kinds of interactions users were attempting came to light: according to what emerged, many tried to get the chatbot to play the role of a child.
Assuming the worst…
The hacker behind the breach described the site as structurally weak: it is essentially a patchwork of open source projects forced to work together. After noticing the various vulnerabilities, he began to poke around and uncovered a scene of appalling squalor.
He then contacted 404 Media to report what he had found: some of the prompts concerned the sexual abuse of infants, complete with a request for an orgy involving babies and young children. 404 Media could not confirm whether Muah.ai actually gave pedophiles what they asked for, but the mere existence of such requests suggests that many people genuinely want to exploit artificial intelligence for this purpose.
Harvard Han, the administrator of Muah.ai, said the service has a team of moderators who suspend and delete any chatbot in the card gallery (the collection of user-created chatbots) that involves minors. In reality, the matter appears more troubling. Muah.ai allows the creation of explicit conversations and images, and in theory all content featuring minors is banned; yet when two people posted an underage AI character on the site’s Discord server, they were simply told to exchange that kind of material in private. This suggests that whoever runs the site knows how it is being used, and the suspicion is that he is merely trying to keep the matter from surfacing to avoid repercussions, rather than actually intervening.