Quick Answer: Digital reincarnation uses AI to simulate deceased individuals through their digital data: texts, emails, voice recordings. While offering grief relief to some, it raises profound ethical questions about consent, psychological harm, identity rights, and the commodification of death. Navigating this requires clear legal frameworks, psychological guidance, and personal boundaries.
The dead are speaking again, and this time through chatbots.
In 2021, Joshua Barbeau made international headlines when he used the GPT-3-powered platform Project December to hold conversations with a simulation of his late fiancée, Jessica Pereira, who had died of a rare liver disease. He fed the system her old text messages, social media posts, and written memories. The result was a bot that replied in her voice, referenced their shared history, and even cracked her signature jokes. Barbeau described the experience as simultaneously healing and heartbreaking.
This wasn't science fiction. It was an ordinary Tuesday in 2021, and it signaled something fundamentally new about how humanity confronts grief in the digital age.
What Is Digital Reincarnation?
Digital reincarnation refers to the reconstruction of a deceased person's personality, voice, or conversational patterns using artificial intelligence trained on their digital footprint. This includes:
- Text-based avatars: Chatbots trained on emails, social media archives, and message histories
- Voice clones: AI-generated audio replicating the deceased's vocal characteristics
- Visual deepfakes: Video avatars recreating a person's appearance and mannerisms
- Composite entities: Full multimodal avatars combining voice, video, and conversational AI
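To make the text-based case concrete, here is a minimal sketch of how such an avatar is typically assembled: archived messages are sampled as few-shot style examples inside a persona prompt, which is then sent to a general-purpose language model. The function and variable names (load_archive, build_prompt, PERSONA_TEMPLATE) are illustrative assumptions, not any vendor's actual API.

```python
# Minimal sketch: turn a message archive into a persona prompt for a
# text-based avatar. Hypothetical names throughout; no real vendor API.
import json
import random

PERSONA_TEMPLATE = """You are simulating {name}. Reply in their voice.
Examples of how {name} actually wrote:
{examples}

Conversation so far:
{history}
{name}:"""

def load_archive(path: str) -> list[str]:
    """Load archived messages from a JSON file containing a list of strings."""
    with open(path, encoding="utf-8") as f:
        return json.load(f)

def build_prompt(name: str, archive: list[str], history: list[str], k: int = 5) -> str:
    """Sample k archived messages as few-shot style examples for the model."""
    sampled = random.sample(archive, min(k, len(archive)))
    examples = "\n".join(f"- {msg}" for msg in sampled)
    return PERSONA_TEMPLATE.format(
        name=name, examples=examples, history="\n".join(history)
    )

# The completed prompt would be passed to a language model; its completion
# is what the bereaved user reads as the loved one's "reply".
```

The structure matters ethically: the "personality" is statistical imitation of surface style drawn from private data, which is exactly why the consent questions below cut so deep.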
Companies like HereAfter AI, StoryFile, Eternos, and Soul Machines have commercialized versions of this technology. South Korean broadcaster MBC aired a documentary in 2020 in which a mother used VR to meet a digital avatar of her seven-year-old daughter who had died of leukemia. The segment was viewed over 35 million times online within weeks.
The Grief Landscape: Where Technology Meets Loss
Grief is not a linear process. The Kübler-Ross model of five stages (denial, anger, bargaining, depression, acceptance) is widely referenced, but modern grief psychology has evolved considerably. Researchers like George Bonanno (Columbia University) have shown that grief trajectories are highly individualized, with many people naturally resilient and others experiencing prolonged grief disorder (PGD), a condition now formalized in the DSM-5-TR.
For individuals with PGD, estimated to affect 7-10% of bereaved people globally according to a 2021 World Psychiatry meta-analysis, even minor stimuli can trigger destabilizing episodes. This is precisely the population most likely to turn to digital avatars, and also the most vulnerable to their misuse.
The question isn't simply whether we can build these systems. It's whether we should, and under what conditions.
The Core Ethical Tensions
1. Consent and Posthumous Identity Rights
Did the deceased consent to being digitally reconstructed?
This is the foundational problem. Current legal frameworks in most jurisdictions offer no explicit right to control one's posthumous digital identity. California's Celebrities Rights Act protects posthumous publicity rights, and the EU's GDPR leaves the data of the deceased to member-state law; neither was designed with conversational AI in mind.
A person's WhatsApp archive may legally belong to the platform. Their personality, however (their irony, their hesitations, their characteristic way of expressing love), is something far more intimate. Training an AI on that data without prior written consent arguably violates the spirit of personal autonomy, even if it doesn't violate current law.
Advance digital directives, legal documents specifying what can and cannot be done with one's data after death, are emerging as a practical solution. Sweden and the UK have begun informal discussions at the policy level. Individuals can draft such directives today through estate attorneys familiar with digital asset law.
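As a thought experiment, the permissions such a directive would need to enumerate can be expressed as structured data. The schema below is entirely hypothetical; no statutory format exists, and the field names are illustrative only.

```python
# Hypothetical schema for an advance digital directive; not a legal form.
from dataclasses import dataclass, field

@dataclass
class DigitalDirective:
    person: str
    allow_text_avatar: bool = False       # chatbots trained on messages
    allow_voice_clone: bool = False       # synthetic audio in their voice
    allow_visual_likeness: bool = False   # video or deepfake recreation
    permitted_parties: list[str] = field(default_factory=list)  # who may use it
    expires_years_after_death: int | None = None  # optional sunset clause

# Example: permit a text avatar for a spouse only, lapsing after five years.
directive = DigitalDirective(
    person="Jane Doe",
    allow_text_avatar=True,
    permitted_parties=["spouse"],
    expires_years_after_death=5,
)
```

Even this toy version makes the hard questions visible: who enforces the sunset clause, and what happens to an already-trained model when the directive expires?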
2. Psychological Risk: Comfort or Complication?
The clinical evidence is nascent but cautionary.
A 2023 study published in Death Studies by researchers at the University of Melbourne examined 40 bereaved individuals who engaged with text-based AI simulations of deceased loved ones. Key findings:
- 62% reported short-term emotional comfort
- 38% reported increased difficulty with acceptance
- 21% reported intrusive thoughts that worsened over the following month
The study was small, but it points toward a bifurcation: for some, brief, structured engagement with an avatar may ease the transition. For others, particularly those predisposed to PGD, it may freeze the grief process, replacing acceptance with a digital dependency.
Licensed grief therapists are increasingly advising clients to treat AI grief tools the way one treats alcohol at a wake: situationally acceptable in small doses, potentially destructive when they become a substitute for real processing.
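If a platform wanted to operationalize that "small doses" guidance, one hypothetical guardrail would be a weekly time cap plus a cooling-off period between sessions. The sketch below is a design illustration, not any existing product's policy; the class name and thresholds are assumptions.

```python
# Hypothetical guardrail: cap weekly avatar time and enforce a cooldown.
from datetime import datetime, timedelta

class SessionGuardrail:
    def __init__(self, max_minutes_per_week: int = 30, cooldown_hours: int = 24):
        self.max_minutes_per_week = max_minutes_per_week
        self.cooldown = timedelta(hours=cooldown_hours)
        self.sessions: list[tuple[datetime, int]] = []  # (start time, minutes used)

    def may_start(self, now: datetime) -> bool:
        """Allow a new session only if under the weekly cap and past cooldown."""
        week_ago = now - timedelta(days=7)
        recent = [(t, m) for t, m in self.sessions if t >= week_ago]
        minutes_used = sum(m for _, m in recent)
        last_start = max((t for t, _ in recent), default=None)
        if last_start is not None and now - last_start < self.cooldown:
            return False  # still in the cooling-off window
        return minutes_used < self.max_minutes_per_week

    def record(self, start: datetime, minutes: int) -> None:
        """Log a completed session."""
        self.sessions.append((start, minutes))
```

Whether users would accept, or simply circumvent, such limits is an open design question; the point is that "structured engagement" can be engineered, not just recommended.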

