Through tears, after listening to their deceased relatives, several women expressed their astonishment on the television show El Hormiguero. “It felt super real and I needed it, I really needed it,” sobbed a young woman in front of the cameras. “The voice is amazing; I'm very happy with the experience,” added another, wiping the tears from her face. Pablo Motos' program had used artificial intelligence to recreate, from real audio, the voices of dead people. Reproducing a voice this way is technically simple, and it has already caused misinformation problems through deepfakes of the voices of Joe Biden and the leader of the British Labour Party, Keir Starmer. The generated audio posed leading questions to the participants (“did we leave any conversation pending?”) in this “real experience,” as the program billed it, bringing to prime time an emerging market: recreating the deceased with artificial intelligence (AI). Psychologists warn that this can interfere with the natural adaptation to grief and make its most painful phases chronic.
The death of someone close is like losing a part of yourself. It is a source of deep emotional distress, and many people would do anything to alleviate that feeling of overwhelming loss, even talk face to face with the loved one, if it were possible. It sounds like science fiction, but companies like HereAfter, StoryFile and Replika are doing it, and there is nothing supernatural about it. Using interviews and other content, they create digital versions of deceased individuals that the living can interact with through chat, voice or video. In China this business is already growing, with several companies claiming to have created thousands of these digital personas, or “ghost bots.” Some even claim they can do it with just 30 seconds of audiovisual recording of the deceased.
The American company StoryFile interviews people during their lifetime on video, asking a series of questions about key experiences, such as their childhood, their wedding or their biggest challenge, along with any others the interviewee decides to add. From the responses, and with the use of artificial intelligence, a conversational video is generated that children, parents, friends and relatives can interact with in the future. According to the company, approximately 5,000 people have already created profiles on the platform. The cost of the service ranges from 40 to 450 euros, depending on the number of questions included. There is also a free trial.
Stephen Smith, co-founder of StoryFile, explains that the company was born a decade ago with the aim of preserving the memory of Holocaust survivors. But it was at the end of 2021 that the platform became what it is today: a place where anyone can record videos with a webcam, from home or in a studio.
The co-founder emphasizes that the platform does not invent content, but rather “recovers something that was pre-recorded,” something that already exists. But it is possible to go further and add information from other formats. “We have done it using the conversational archive methodology. It means using content from the person's life, like a video from which we can clone the voice and then have them say things they said during their lives. For example, you could use an email and have it read to you in that voice. If someone wants that to happen, it is possible,” he tells EL PAÍS via videoconference.
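StoryFile has not published the details of its pipeline, but the kind of cloning Smith describes, taking a short sample of someone's recorded voice and having it read new text such as an old email, can be approximated with publicly available tools. Below is a minimal sketch, assuming the open-source Coqui TTS library and its XTTS v2 voice-cloning model; the file names and the email text are hypothetical placeholders, not anything from StoryFile.

```python
# A minimal voice-cloning sketch using the open-source Coqui TTS library
# (pip install TTS). This illustrates the general technique described in
# the article, not StoryFile's actual system. File names are hypothetical.
from TTS.api import TTS

# Load XTTS v2, a multilingual model that can clone a voice
# from a few seconds of reference audio.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# Text the cloned voice will read, e.g. an email the person once wrote.
email_text = "Dear all, I just wanted to say how proud I am of you."

# Generate speech that imitates the voice in the reference recording.
tts.tts_to_file(
    text=email_text,
    speaker_wav="grandmother_sample.wav",  # short clip of the real voice
    language="en",
    file_path="cloned_reading.wav",
)
```

That a convincing clone needs only a short reference clip is precisely what makes both these grief-tech services and the Biden and Starmer audio deepfakes mentioned above possible.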
The danger of getting hooked
Perhaps the most disturbing element is that some people could become dependent on, or even addicted to, talking with virtual avatars, because these generate a false feeling of closeness with the dead, as the Antena 3 program showed. The women who volunteered spoke directly to the voice (“I would tell you…”, “I miss you”) as if that synthetic recreation were their grandmother, who had died a year earlier.
“At first, there is relief. But then an addiction, a dependency, arises,” warns José González, a psychologist specializing in grieving processes. “If the AI literally reproduces what the person was like, there is a great danger of chronification, especially with very intense bonds. It is easy to slip into the fantasy that they are not dead. It can freeze the mourner in the denial phase,” he continues.
The expert, who has worked with more than 20,000 grieving people over 25 years, agrees that conversational videos can be useful for keeping memories alive, telling anecdotes or passing information between generations with emotion. They could also replicate some of the techniques used in therapy to close pending issues that could not be resolved in conversation. “I ask some questions about the bond with the person who has died, for example, ‘what I liked most about you’ or ‘when you disappointed me the most.’ With those answers, the mourner writes a letter and reads it to an empty chair,” he describes. In his view, AI could be applied to dynamics like this, in specific cases, as long as it is closely supervised by a professional.
González points out that there is also a risk associated with what is expressed in these recordings. Farewell messages can be very powerful and help alleviate suffering, because they are the moment in which you tell your family how much you love them; they free the bereaved from guilt and make grieving much more bearable. However, without expert supervision, even the best of intentions can have an adverse effect. “Imagine that I am the father of an only daughter and I tell her: ‘I leave you the vital goal of taking good care of your mother.’ It can be very beautiful, but it can also be a sentence if the mother becomes extremely sick,” he says by way of example. At that point, a professional would recommend that the father phrase things differently to avoid creating an emotional burden. Without such supervision, the probability of misunderstandings increases.
An ethical problem
To what extent can an avatar be faithful? Who owns it? What type of data can be used to create it? These are just some of the questions raised by this topic. For Gry Hasselbalch, an ethicist at the European Research Council, the implications extend to an existential sphere: “Every technology that is based on the fact or the idea that it can compete with humans raises the question of what it means to be human, what our limits are, and whether it can be used to overcome a limit.”
Hasselbalch, who is also co-founder of the Danish think tank DataEthics.eu, believes that the proliferation of avatars of the deceased represents a dilemma that goes beyond data, consent or rights. “It could change the identity of humanity and the human being, because it questions the very idea of mortality,” she says.
Among several potential problems, the AI ethics expert highlights the possibility of a tool that collects not only the content of a deceased person's social networks, emails and mobile messages, but also their internet search patterns. This could reveal unknown hobbies or interests of the person, from a passion for an animal or a sport to, in the worst case, a dark secret.
If artificial intelligence combines this information with other elements of a person's identity, but gives greater weight to certain aspects, the result could be an avatar or bot that bears little or no resemblance to what that person was like in real life. It is a scenario in which “control would be lost,” she warns. And it is not the only one. “How easily could you be manipulated if a loved one you miss tells you to vote a certain way or buy specific things? We don't know what companies will emerge behind this,” she reflects.
'Deepfakes' and 'copyright'
One of StoryFile's clients was the late Sam Walton, founder of the retail giant Walmart. “We worked with his company's archive. We reviewed many hours of material, transcribed his speeches and videos, and created 2,500 answers to questions he had answered during his life, using exactly the same words he used,” describes Alan Dranow, an executive at the company. The result is a digital recreation with Walton's face and voice, presented as a life-size hologram. How realistic is it? “People who knew Sam get misty-eyed because of how realistic he is,” Dranow says. The businessman's family gave their consent for this production, but other famous people have had their faces and words recreated by AI without any such agreement.
This is the case of the American comedian George Carlin, who died in 2008 and whose voice and style were cloned to create the podcast special “George Carlin: I'm Glad I'm Dead,” posted on YouTube at the beginning of January. Last week, a lawsuit was filed in federal court in Los Angeles requesting that Dudesy, the company behind it, immediately remove the audio special. Carlin's daughter, Kelly Carlin, had already criticized the production, in which a synthesis of the artist's voice comments on current events. “My father spent a lifetime perfecting his craft from his humanity, brain and imagination. No machine will ever replace his genius. These AI-generated products are clever attempts to recreate a mind that will never exist again. Let the artist's work speak for itself,” she said.
According to StoryFile, the service that incorporates the most advanced version of this technology is offered only to a select group. “We do not offer it as a product on our website at this time, but rather to private clients. We do not want our technology to be used to create a deepfake of someone else,” Smith clarifies.
However, there are alternatives that allow exactly that. The company HeyGen, for example, lets users generate videos with voice cloning, lip synchronization and different speaking styles. Unless you look very closely, it is almost impossible to tell that the result is an artificial creation. Although the platform is marketed as a solution for personalizing and translating corporate content, in practice it can be used for any purpose of this kind, from saying goodbye to a loved one to making money.