Communicating with the Deceased Through AI
If a loved one were to pass away tomorrow, would you want to continue communicating with them? Not merely through memories or saved messages, but via artificial intelligence—a chatbot that utilizes their texts, emails, and voice notes to respond in their unique tone and style.
A growing number of technology companies now provide such services within the "digital afterlife" industry, valued at over £100 billion. Some individuals use these AI tools as a means to cope with grief.
Cardiff University's Dr Jenny Kidd has led research on these so-called deathbots, published in the Cambridge University Press journal Memory, Mind and Media. She described the findings as both "fascinating and unsettling".

Historical Context and Technological Advances
Attempts to communicate with the dead are not a new phenomenon. Practices such as séances and spiritualist mediums have existed for centuries. However, with technological advancements, AI has the potential to make these interactions more convincing and scalable.
In 2024, James Vlahos shared with the BBC how, after learning his father was dying from cancer, he recorded his dad's voice and created an AI chatbot. He described the experience as wonderful for preserving his father's memory, stating that while it did not alleviate the pain of loss, "I have this wonderful interactive compendium I can turn to."

Perspectives on Deathbots and Grief
The Workplace Bereavement support group noted that while deathbots are not yet widely used, there is growing curiosity about them. Jacqueline Gunn, the group's founder, emphasized the limitations of these AI tools:
"These deathbots and AI tools are only as good as the information they are given. They don't grow or adapt in the way grief does. For some they may offer a stepping stone, but they cannot be the destination. Grief is a deeply personal human response to death, needing time, understanding and human connection."

Research into AI Deathbots
Dr Kidd collaborated with Eva Nieto McAvoy from King's College London and Bethan Jones from Cardiff University to examine how these technologies operate in practice. Their research focused on AI systems designed to imitate the voices, speech patterns, and personalities of deceased individuals by analyzing their digital traces.
Although these AI deathbots are often marketed as sources of comfort and connection, the researchers argue that they rely on a simplified understanding of memory, identity, and relationships.
Kidd's interest in this subject began during the Covid-19 pandemic, when AI-generated animated photographs became prevalent on social media. People uploaded old photographs of ancestors and watched them blink, smile, and move their heads as software "reanimated" their loved ones.
"These things were really creepy, but really quite interesting as well," Kidd said.
"All of a sudden they were everywhere and millions of people were sharing them. That was us stumbling into this kind of work of AI revival."

Testing Deathbots: Personal Experiments
The research team decided to test several commercial deathbot platforms themselves, exploring four different services. Kidd described the experience as "weird interacting with ourselves in that way but largely unsatisfying because of the technical limitations of these platforms at the moment."
In one experiment, Kidd used her own voice data to create a chatbot. She noted,
"It didn't sound like me, in fact it sounded quite Australian."
While Kidd believes the technology will improve, she remains skeptical about the emergence of a large market for AI deathbots.
"We already have a lot of established rituals and traditions around death," she said.
"The fact that there hasn't really been a take-off technology in this space maybe indicates there isn't much of a market for it."

Researchers' Views on Digital Afterlife
When asked whether they would want their own families to recreate them digitally after death, the researchers expressed mixed feelings.
Kidd stated,
"My initial gut reaction is if they want to do that and it's kind of playful, that's fine. But if there's any sense to which, certainly in the future, the persona continues to evolve or says things that I would never say, or has allegiances I would never have, and this begins to mangle people's actual recollections of me and my values then I think I would have a big problem."
Dr Eva Nieto McAvoy shared a different perspective, saying,
"I'm not very religious and I don't have strong thoughts about the afterlife, once I'm dead, who cares? If it helps them, you know... but it can be misconstrued for sure. And do I want my family to pay for a service... I don't know, it's complex."