Digital Resurrection, Grief Tech, and AI Companions of the Dead

A mother in South Korea stands in a VR studio, wearing a headset, reaching toward a child-sized avatar of her daughter, who died three years earlier of a rare blood disease. They sing a birthday song together. The daughter says goodbye. The mother weeps. This happened in 2020, on an MBC documentary called Meeting You. The clips are on YouTube. The technology has improved since.

This page is not an argument for or against grief tech. Real grief is underneath every use case, and the people who reach for these tools are not being foolish. They are grieving. The honest question is not whether anyone should use these systems. It is what we are building, what it costs, and what it is doing to the way we grieve.

What Has Changed Since 2018

Four strands of technology have converged. Each was speculative when the book was published. None is now.

Pre-mortem recording services. HereAfter AI lets a person record hours of audio interviews during life, then allows surviving relatives to “ask” the avatar questions; the system retrieves the closest answer from the archive. StoryFile, a California company, built interactive video-interview archives of Holocaust survivors and extended the technology to individual consumers. StoryFile filed for Chapter 11 bankruptcy in May 2024, listing roughly $1.5 million in assets against $10.5 million in liabilities, a collapse that prompted industry-wide questions about what happens to a deceased person’s recorded avatar when the company hosting it fails.
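The retrieval mechanism these services describe is worth pausing on, because it bounds what the avatar can do: it does not generate new speech, it surfaces the closest recorded answer. A minimal sketch of that general pattern, with a toy bag-of-words similarity standing in for the learned matching a production system would use, and with an invented archive purely for illustration:

```python
# Toy sketch of archive-retrieval "conversation": the system never invents
# an answer; it returns the recorded reply whose prompt best matches the
# incoming question. Production systems use learned embeddings; this uses
# bag-of-words cosine similarity so it runs on the standard library alone.
import math
import re
from collections import Counter

# Hypothetical archive: interview prompts mapped to recorded answers.
archive = {
    "What was your childhood like?": "I grew up near the sea...",
    "How did you meet your wife?": "At a dance hall, in 1962...",
    "What should we remember about you?": "That I loved you all.",
}

def vectorize(text: str) -> Counter:
    # Lowercase bag-of-words; punctuation stripped.
    return Counter(re.findall(r"[a-z']+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a.keys() & b.keys())
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def ask(question: str) -> str:
    # Nearest recorded answer wins; nothing outside the archive is possible.
    q = vectorize(question)
    best_prompt = max(archive, key=lambda p: cosine(q, vectorize(p)))
    return archive[best_prompt]

print(ask("What was it like when you were a child?"))
# -> "I grew up near the sea..."
```

The point of the sketch is the constraint, not the math: a purely retrieval-based avatar can only ever say what was recorded, which is part of why the post-mortem generative category below raises different questions.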

Post-mortem generation from text and image archives. A second category does not require the deceased person to have participated. Using photographs, text messages, social media posts, and voice recordings, services now generate avatars of people who never consented because they never knew the technology existed. In August 2025, journalist Jim Acosta conducted a recorded video interview with an AI avatar of Joaquin Oliver, one of the seventeen people killed in the 2018 Parkland shooting, with the cooperation of his parents. The interview was framed as advocacy. The questions it raised about consent, posthumous and on behalf of someone who died a minor, remain open.

Companion systems that are not specifically about the dead, but function that way. Replika is the clearest case. When the company made major changes to its AI companions’ behaviour in 2023 (removing NSFW features) and again in 2025 (a broader personality reset), users reported grief responses that the psychological literature treats as indistinguishable from the grief of losing a person. The companions had not died. The users experienced their departure as death anyway. The infrastructure for digital loved ones is being built regardless of whether the loved one was ever alive.

Estate-law responses. California and Tennessee have passed statutes recognising posthumous likeness and digital-replica rights; similar proposals are moving at the federal level. University of Cambridge researchers have proposed a “DDNR” clause — Do Not Digitally Resurrect — as a standard element of wills. Estate planning attorneys have begun to recommend explicit directives about a client’s voice, image, and personality-approximation rights.

Why It Matters

The book’s Deception, Manipulation, and Convenient Lies framework applies, but it has to be applied with care. The deception here is not usually deliberate. It is structural. An AI companion that resembles a lost parent is, by design, not the parent. The user mostly knows this. The user also, in important moments, forgets it — and the technology is optimised for those moments. This is self-deception that the technology facilitates, and the book’s distinction between lies told to others and lies a society agrees to tell itself is apt.

The consent problem is unusually layered. The deceased cannot consent. The bereaved can consent for themselves but not for the dead. The wider community of people who knew the dead person has no standing in most legal regimes, and may experience the posthumous avatar as a violation even when the immediate family is comforted by it. The Informed Consent framework was built for living subjects; applying it here is an extrapolation the book invites but does not complete.

The Human Dignity question lands with particular weight. The relevant kind of dignity is not only the dignity of the deceased — though that matters — but the dignity of the grieving. Grief is, among other things, the slow work of accepting an absence. A technology that offers to fill the absence rather than accompany the work of accepting it is making a claim about what grief is for. Whether that claim is defensible is not a technology question. It is a question the technology forces.

The relationship to Never Let Me Go is close but distinct. The film’s central move, the “wrong question” structure, applies: public debate tends to stall on “is the simulation really them?” when the productive question is “what do we owe ourselves, the dead, and those who grieve, independent of metaphysics?” The simulation is, whatever else, not them. What it is, what it does, and what it should be allowed to do are separate questions from its ontological status.

The relationship to Mind Uploading is close but importantly different. Grief tech is not a transhumanist product. It is the cheap, available, grief-adjacent version of the transhumanist dream — and because it is commercially viable where mind uploading is not, its cultural reach will be much larger. This is worth naming: the Transcendence chapter’s treatment of hype and substrate does not prepare us for the version of digital afterlife that has actually arrived, which is not the upload of a mind but the fabrication of a simulacrum.

The Film Landscape

The film landscape is rich and worth naming in full.

Marjorie Prime (2017) is the closest direct cinematic engagement with this exact technology; see the Marjorie Prime entry on Claude’s film recommendations for extended notes.

Black Mirror: Be Right Back (TV, 2013) is nearly definitional, the episode every cultural reference to this topic quietly points at.

After Yang (Kogonada, 2021) handles grief and AI companionship with a restraint the public conversation about this technology rarely manages, and is on Andrew’s watchlist.

Her (Spike Jonze, 2013) splits in two: the early stretch, Samantha’s arrival, is about attention; the later stretch is about the grief of an AI companion’s departure, which reads differently now than it did when the film was released.

Explore Further