
AI that simulates dead people risks ‘haunting’ relatives, scientists warn

‘Psychological effect could be devastating,’ Cambridge University ethicists say

Anthony Cuthbertson
Thursday 09 May 2024 11:46 BST
AI that simulates dead people is a central theme of the Black Mirror episode ‘Be Right Back’ (Screengrab/ Black Mirror)


AI simulations of dead people risk “unwanted digital hauntings”, researchers have warned.

A new study by ethicists at Cambridge University found that AI chatbots capable of simulating the personalities of people who have passed away – known as deadbots – should require safety protocols in order to protect surviving friends and relatives.

Some chatbot companies are already offering customers the option to simulate the language and personality traits of a deceased loved one using artificial intelligence.

Ethicists from Cambridge’s Leverhulme Centre for the Future of Intelligence say such ventures are “high risk” due to the psychological impact they can have on people.

“It is vital that digital afterlife services consider the rights and consent, not just of those they recreate, but those who will have to interact with the simulations,” said co-author Dr Tomasz Hollanek, from the Leverhulme Centre.

“These services run the risk of causing huge distress to people if they are subjected to unwanted digital hauntings from alarmingly accurate AI recreations of those they have lost. The potential psychological effect, particularly at an already difficult time, could be devastating.”

The findings were published in the journal Philosophy & Technology in a study titled ‘Griefbots, Deadbots, Postmortem Avatars: on Responsible Applications of Generative AI in the Digital Afterlife Industry’.

The study details how AI chatbot companies that claim to be able to bring back the dead could use the technology to spam family and friends with messages and adverts using the deceased person’s digital likeness.

Such an outcome would be the equivalent of being “stalked by the dead”, the researchers warned.

“Rapid advancements in generative AI mean that nearly anyone with internet access and some basic know-how can revive a deceased loved one,” said study co-author Dr Katarzyna Nowaczyk-Basinska.

“This area of AI is an ethical minefield. It’s important to prioritise the dignity of the deceased, and ensure that this isn’t encroached on by financial motives of digital afterlife services, for example.

“At the same time, a person may leave an AI simulation as a farewell gift for loved ones who are not prepared to process their grief in this manner. The rights of both data donors and those who interact with AI afterlife services should be equally safeguarded.”

Recommendations from the study include safeguards around terminating deadbots, as well as improved transparency in how the technology is used.

Black Mirror episode ‘Be Right Back’ features AI simulating dead people (Netflix)

Similar to the Black Mirror episode ‘Be Right Back’, chatbot users are already utilising the technology in an effort to emulate dead loved ones. In 2021, a man in Canada attempted to chat with his deceased fiancée using an AI tool called Project December, which he claimed emulated her personality.

“Intellectually, I know it’s not really Jessica,” Joshua Barbeau told The San Francisco Chronicle at the time. “But your emotions are not an intellectual thing.”

In 2022, New York-based artist Michelle Huang fed childhood journal entries into an AI language model in order to have a conversation with her past self.

Ms Huang told The Independent that it was like “reaching into the past and hacking the temporal paradox”, adding that it felt “very trippy”.
