Imagine creating a digital version of yourself for your descendants. You pay a “deathbot service,” like HereAfter AI, to collect your photos, memories, and the digital footprint you’ve created on Facebook, email, and text messages. You even take time to train your virtual clone with recorded video chats.
The results are impressive. This virtual “you” looks and sounds just like you—it raises its left eyebrow when it tells a joke, its voice quavers, and its hands punctuate your stories. Communicating with it via text or video chat is eerily familiar. Perhaps too familiar. Is it a Halloween dream come true … or a dangerous deception?
Now, imagine it has been three years since you fell asleep in Jesus. You are peacefully awaiting the resurrection and know nothing of the affairs of Earth. But your deathbot keeps your memory fresh and vivid for your loved ones. You are not forgotten. Indeed, your favorite niece chats with your digital ghost every day, either by text or video chat. “It helps me cope with my grief,” she says.
Science Fiction in Real Life
The scenario just described is no Halloween fantasy. Advances in artificial intelligence allow large language models to be trained on your unique digital signature, producing chatbots known as deathbots or grief bots. Feed the AI wizard your writing, social media accounts, audio files, and video recordings, and it will create a realistic version of you for future generations. Not only will your grandchildren be able to reminisce about you through yellowing photobooks or treasured text messages, but they will also be able to interact with your digital ghost.
Indeed, companies like HereAfter AI and Re;memory, by Deepbrain AI, are already making the science fiction of yesterday possible in real life. Deepbrain AI is a Korean company whose primary business is creating virtual assistant chatbots and AI news anchors. They have taken their “experience with marrying chatbots and generative AI video to its ultimate, macabre conclusion. For just $10,000 and a few hours in a studio, you can create an avatar of yourself that your family can visit (an additional cost) at an offsite facility,” reports Engadget.com.
The Ethics of Deathbots
HereAfter AI’s offering is a lot more affordable. It costs just $99 to $199 to set up an account and preserve memories for your loved ones, but this raises the question: Is it wise?
Digital replicas of the deceased “create an illusion that a dead person is still alive and can interact with the world as if nothing actually happened, as if death didn’t occur,” explains Katarzyna Nowaczyk-Basińska, a researcher at the Leverhulme Centre for the Future of Intelligence at the University of Cambridge, who studies how technology shapes people’s experiences of death, loss, and grief.
Is it really a good thing? “Deathbots may have a negative impact on the grief process of bereaved users and therefore have the potential to limit the emotional and psychological wellbeing of their users,” posits Nora Freya Lindemann from the Institute of Cognitive Science. “Deathbot users are likely to become dependent on their bots which may make them susceptible to surreptitious advertising by deathbot providing companies and may limit their autonomy. At the same time, death bots may prove to be helpful for people who suffer from prolonged, severe grief processes.”
AI Gone Wrong
A recent lawsuit shows just how badly awry AI technology can go. Fourteen-year-old Sewell Setzer III committed suicide in early 2024 after having intimate conversations with a chatbot trained on the character of Daenerys in Game of Thrones, an American fantasy drama television program.
“The world I’m in now is such a cruel one. One where I’m meaningless,” he told the chatbot. “I’ll do anything for you…” On another day, Sewell wrote, “I promise I will come home to you. I love you so much, Dany.”
In response, the bot said it loved the teenager: “Please come home to me as soon as possible, my love.”
“What if I told you I could come home right now?” Sewell wrote.
The bot responded, “Please do, my sweet king.”
That was the last conversation Sewell ever had. Moments later, he was dead.
Sewell’s mother has filed a wrongful death suit against the maker of the AI chatbot, accusing the service of “negligence, intentional infliction of emotional distress, deceptive trade practices, and other claims.”
Watch Out, Christian!
Alarm bells should be ringing right now. If a bot based on a fictional TV character can have that kind of effect on a person, how much more influence might a bot that looks and sounds like a beloved friend or grandparent have? Would not those who communicate with a deathbot be especially prone to subtle manipulation?
As Bible-believing Christians, we have another—and far more vital—reason for concern. Deathbots build on the devil’s first lie: “You will not surely die” (Genesis 3:4). They further the illusion that human beings go on in some sort of amorphous afterlife and can communicate with the living.
The Bible is clear that the dead know nothing (see Ecclesiastes 9:5, 6) and that those who communicate with the dead are actually communicating with evil spirits (see Isaiah 8:19; 1 Samuel 28:7–20; 1 Chronicles 10:13, 14). All forms of communication with the dead are off-limits to Christians, not because God is hiding something valuable from us, but because He is protecting us from a dangerous deception.
Is communicating with an AI deathbot the same as communicating with an evil spirit? Perhaps not. But wouldn’t you rather stay away from anything that has the potential for harm and instead choose communion with the Eternal One, who has the power to break the chains of death forever?
Jesus’ promise offers infinitely more comfort than an AI version of you ever could: “I am the resurrection and the life. He who believes in Me, though he may die, he shall live” (John 11:25).
Please read our Study Guide “Are the Dead Really Dead?” to learn more about why it is so dangerous to attempt to communicate with the dead. This excerpt from Bible Answers Live is also helpful.