Can AI Bring Back the Dead? A Christian Perspective on Griefbots
Recently, several companies have begun to promote and market “grief tech,” promising consumers an AI-powered method to ease the pain of losing a loved one. Grief tech is an umbrella term that encompasses applications like programs that design memorial videos, prepare end-of-life documents, or plan funerals. But, of course, generative AI can do much more than execute tasks; it can mimic human function and behavior with startling accuracy. So it may come as no surprise that companies are beginning to promote a bold new grief-tech application: AI-powered video chatbots in the form of their deceased loved ones—griefbots.[1] The marketing strategy is simple: by speaking to lifelike avatars of deceased family members or friends, users can ease the pain of grief. But at least one company goes even further, claiming their application offers people the chance to “reunite with your loved ones.”[2]
How should Christians think about such claims? What are some key things we need to consider in the event that chatting with video bots of dead loved ones becomes mainstream? And most fundamentally, should Christians use griefbots at all?
To sort through the issue, I’ll use a four-part framework, considering (1) the nature of the act itself, (2) the effect on the actor, (3) the aim of the act, and (4) the effect on our associations. For a better understanding of this framework, read this article by Dan Trippie: Think Christian When Answers Are Not Entirely Clear.
Consider the Act
The companies promoting this technology want users to create a video chatbot in the likeness of their deceased loved one, and then interact with it as if it actually is that person. More bluntly, we are asked to reduce our loved one to a set of physical characteristics to maintain our contact with them after they die.
But is it actually possible to capture the essence of a person in a video bot? I don’t think so; to understand why, consider what a bot actually is. The most sophisticated griefbots use a Large Language Model (LLM)—a form of AI that mimics human conversation by providing understandable and appropriate responses to user prompts. Imagine a son—call him Jim—wants to create a griefbot in the likeness of his dad, Bob. To create it, Jim would provide the system with a large amount of personal data, including text, pictures, audio samples, and biographical “information about the deceased interests, traits, preferences, and character.”[3] Each type of data is tokenized into a numerical form that represents its semantic meaning and relational patterns. The more detailed the information, the more accurately avatar-Bob can replicate real-Bob’s actions, speech patterns, expressions, and reactions when prompted. So, for example, when Jim does something as simple as say, “I love you,” the system processes each of his words and the surrounding image through its neural network, and then generates an output that seems appropriate for Bob: “I love you too, son.”[4]
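For readers curious about the mechanics, here is a purely illustrative Python sketch of the shape of that process (text in, tokens, output back). Real LLMs use learned subword vocabularies and neural networks with billions of parameters; the tiny vocabulary and canned response below are hypothetical stand-ins, not an actual griefbot implementation:

```python
# Toy illustration: text -> numeric tokens -> generated output.
# Real systems learn these mappings from data; here they are hard-coded.

def tokenize(text, vocab):
    """Map each word to a numeric token ID, assigning new IDs as needed."""
    ids = []
    for word in text.lower().split():
        if word not in vocab:
            vocab[word] = len(vocab)
        ids.append(vocab[word])
    return ids

def generate_response(token_ids, responses):
    """Look up a canned response for the input tokens -- a stand-in for a
    neural network's learned mapping from prompt to output."""
    return responses.get(tuple(token_ids), "I'm not sure what to say.")

vocab = {}
prompt_ids = tokenize("I love you", vocab)               # [0, 1, 2]
responses = {tuple(prompt_ids): "I love you too, son."}  # hypothetical pairing
print(generate_response(prompt_ids, responses))          # prints: I love you too, son.
```

The point of the sketch is the point of the paragraph above: at every step the machine is only transforming numbers into other numbers, however lifelike the final output sounds.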
And while that output seems appropriate at first blush, scratch a little deeper and you’ll recognize it’s not an appropriate response in any thicker sense. When humans think of a proper response to love, we mean something more than words. We mean something that flows from the essence of one’s being. We mean something fueled by emotion, sensory affection, and cognitive awareness. But the griefbot, no matter how well constructed, is not capable of that. It is a machine programmed to give an output. It can’t feel, consider, or care; it computes data and generates data. In other words, a bot and a human are completely ontologically distinct, even though we sometimes use humanlike words to describe machine operations. But when we do, we do so by metaphor or analogy.[5] While the bot may look and sound like Bob, it’s not actually him.
In contrast, the Bible teaches that human beings are not merely physical things; we are given “the breath of life” (Gen. 2:7) by God himself. There is a difference between real life and a simulation of it. Unlike human creators, God is uniquely capable of creating sentient beings and endowing them with real capacities of soul, such as reason, will, and relationality.
The Effect on the Actor(s)
Every decision we make contributes to our character formation in some way. So if we create a griefbot in our loved one’s likeness, what type of people will we become? What virtues will be developed or thwarted in us?
Genuine human relationship requires two persons with the same capacities of soul. When we talk with a living person, we are interacting with a self-aware being with an embodied soul who has real emotions, real agency, and a real will. And the effects of those relationships are positive. We are challenged, encouraged, and inspired. But a griefbot is not the same type of being. When a videobot produces an unpleasant groan after processing a frown image and a sad statement from a user, it’s not the same as human emotion. Likewise, when that same videobot responds to a heartfelt statement of love by saying, “I love you,” in a comforting tone, it is not expressing genuine love. As a computer program, it has no will, awareness, or relationality, and therefore it is incapable of challenging, encouraging, inspiring, or reciprocating love as a human would.
But it can be tricky, because a bot’s simulated emotions can evoke real emotions in the user. If, for example, a bot correctly processes a user’s cries and generates a comforting output, he may feel comforted. And similarly, if the bot mimics familiar forms of intimacy, he may even experience feelings akin to love.
Here lies a moral problem, though, because by using avatar-Bob to satisfy his own emotional needs, Jim must objectify real-Bob. This is not relationship; it’s dependency. And over a prolonged period of time, the fruit that develops is selfishness (not selflessness) and addiction (not love).
The Aim of the Decision
The main aim of this project is to ease the pain of grief and loneliness. And of course, seeking to comfort the grieving and lonely is appropriate (Ps. 34:18; Rom. 12:15; 1 Thess. 4:13-18), but Christians must wonder if creating a griefbot of a deceased loved one is an appropriate way to do that. Moreover, does it even accomplish its stated aim?
The emotional pain of grief when someone dies is evidence of the depth of our love and value for him or her. It’s also an implicit confession that there’s something wrong about death. We grieve because our relationship is lost and we must move on with life without our loved one here on earth. But as Christians, we also confess that physical death is not the final end, for Jesus promised, “Whoever believes in me, though he die, yet shall he live” (John 11:25). So even as we grieve, we do so with hope, looking forward to a reunion with our loved ones in Christ (1 Thess. 4:13-18).
But griefbots threaten all of that. Attempting to maintain an ongoing relationship with our loved ones here on earth even after their physical death doesn’t ease grief—it is an attempt to avoid it altogether. And instead of instilling a long-term, heavenly perspective, it overemphasizes the earthly present.
Associational Impact
This technology could be very attractive to our society. And if it does go mainstream, real relationships with real loved ones might be jeopardized. Here are just two ways it could affect us on a societal level:
Time spent with loved ones won’t be as consequential as it once was, and therefore, the priority we give to loved ones will decrease relative to other things. If we can continue relationships even after physical death, then why should we rush through work to get home on a Friday night?
Physical death won’t be understood as tragic or sad. If griefbots can closely mimic loved ones—and relationship can continue—then why mourn the failure of the physical body?
In conclusion, Christians should resist the temptation to create a griefbot. Rather than honoring our loved one, it objectifies them. Instead of aiding the grieving process, it attempts to bypass it altogether. And we have something much better to look forward to than a digital resurrection anyway, for Jesus himself promises a bodily resurrection and a real reunion for those who love him (John 6:40).
Notes and Works Cited:
[1] Here are just a few: HereAfter AI, Replika, YOV, and re;memory
[2] This is an explicit claim of re;memory on their home page.
[3] Kwan Yiu Cheng, “The Law of Digital Afterlife: The Chinese Experience of AI 'Resurrection' and 'Grief Tech',” International Journal of Law and Information Technology 33 (January 2025), 4.
[4] Jerry Kaplan, Generative Artificial Intelligence: What Everyone Needs to Know (Oxford, UK: Oxford University Press, 2024), 30-63.
[5] John R. Searle, “Minds, Brains, and Programs,” in The Philosophy of Artificial Intelligence, ed. Margaret A. Boden (Oxford: Oxford University Press, 1990), 71-72.