United States | “Girls just cry and cry” – Nude images created by artificial intelligence have led to scandals

According to experts, legislation has not kept up with technology.

When 14-year-old Ellis woke up one October morning, she found she had received several calls and messages. They were all about the same thing: nude pictures of Ellis circulating on social media.

Ellis hadn't taken nude pictures of herself, but that didn't seem to matter. The so-called deepfake images created with artificial intelligence looked so real.

As artificial intelligence has flourished, so has deepfake pornography. Realistic-looking pictures and videos can be created with very little effort and money. This has led to scandals and bullying in many US schools. Similar fakes have also come to light in Europe and in Finland.

It is difficult to intervene, because US federal law does not prohibit the making of deepfake images.

The photos of Ellis, who lives in Texas, and her friend were taken from their Instagram accounts. Their faces were cut out of the images and placed on AI-generated nude bodies.

The same had also been done to several other students. All the victims were girls. The edited photos were shared with classmates on the Snapchat messaging app.

“It looked real, the bodies looked real. I remember being very, very scared,” says Ellis.

Ellis's mother, Anna Berry McAdams, was astonished at how real the pictures looked.

“The girls just cried and cried. They were very ashamed. They didn't want to go to school,” McAdams recalls.

At the end of November, a similar incident happened in the US state of New Jersey.

“This kind of thing will happen more and more often,” says Dorota Mani, the mother of a 14-year-old victim.

Mani points out that pornographic deepfake images can spread on the internet without the victim having any knowledge of them.

“Many victims don't even know about the pictures and therefore can't protect themselves.”

According to experts, legislation has not kept up with technology, even though cruder versions of so-called fake porn have been made for years. Such videos and pictures have usually targeted public figures, but nowadays anyone who has published a picture of themselves online, for example on LinkedIn, can become a victim.

In October, US President Joe Biden signed an executive order on artificial intelligence. It urges the federal government to develop ways to prevent the creation of child sexual abuse material and of intimate images of real people made against their will.

Although it is often difficult to trace the creators of individual images, the artificial intelligence companies and social media platforms behind them should bear responsibility, says Hany Farid, a professor of computer science at the University of California.

Renee Cummings, a professor at the University of Virginia, points out that deepfake images can create a conflict with the law.

“Even though your face is put on a body, the body is not really yours,” she says.

According to Cummings, this is why it can be argued that laws against sharing sexual images without consent do not apply to deepfakes.

She stresses that deepfake porn can ruin a person's life, pointing to victims who have suffered anxiety, depression and post-traumatic stress disorder.

In Texas, the police and school staff spoke with Ellis, but no real help seemed to be available. The student who made the deepfake photos was temporarily suspended.

Ellis has asked to change schools. She says she is constantly anxious.

“I don't know if someone saved the pictures and forwarded them. I don't know how many pictures he made,” says Ellis.

“So many people have been able to get them.”

Ellis's mother, for her part, fears that the pictures may resurface.

“This can affect them (the victims) for the rest of their lives,” she says.

