## Digital Resurrection, Grief Tech, and AI Companions of the Dead

A mother in South Korea stands in a VR studio, wearing a headset, reaching toward a child-sized avatar of her daughter, who died three years earlier of a rare blood disease. They sing a birthday song together. The daughter says goodbye. The mother weeps. This happened in 2020, on a KBS documentary called *Meeting You*. The clips are on YouTube. The technology has improved since.

This page is not an argument for or against grief tech. Real grief is underneath every use case, and the people who reach for these tools are not being foolish. They are grieving. The honest question is not whether anyone should use these systems. It is what we are building, what it costs, and what it is doing to the way we grieve.

### What Has Changed Since 2018

Four strands of technology have converged. Each was speculative when the book was published. None is now.

**Pre-mortem recording services.** HereAfter AI lets a person record hours of audio interviews during life, then allows surviving relatives to "ask" the avatar questions; the system retrieves the closest answer from the archive. StoryFile, a California company, built interactive video-interview archives of Holocaust survivors and extended the technology to personal customers. StoryFile filed for Chapter 11 bankruptcy in May 2024, listing roughly $1.5 million in assets against $10.5 million in liabilities — which prompted industry-wide questions about what happens to a deceased person's recorded avatar when the company hosting it fails.

**Post-mortem generation from text and image archives.** A second category does not require the deceased person to have participated. Using photographs, text messages, social media posts, and voice recordings, services now generate avatars of people who never consented because they never knew the technology existed.
In August 2025, journalist Jim Acosta conducted a videotaped interview with an AI avatar of Joaquin Oliver, one of the seventeen people killed in the 2018 Parkland shooting, with the cooperation of his parents. The interview was framed as advocacy. The questions it raised about consent — posthumous, and for a minor — did not resolve.

**Companion systems that are not specifically about the dead, but function that way.** Replika is the clearest case. When the company made major changes to its AI companions' behaviour in 2023 (removing NSFW features) and again in 2025 (a broader personality reset), users reported grief responses that the psychological literature treats as indistinguishable from the grief of losing a person. The companions had not died. The users experienced their departure as death anyway. The infrastructure for digital loved ones is being built regardless of whether the loved one was ever alive.

**Estate-law responses.** California and Tennessee have passed statutes recognising posthumous likeness and digital-replica rights; similar proposals are moving at the federal level. University of Cambridge researchers have proposed a "DDNR" clause — *Do Not Digitally Resurrect* — as a standard element of wills. Estate planning attorneys have begun to recommend explicit directives about a client's voice, image, and personality-approximation rights.

### Why It Matters

The book's [Deception, Manipulation, and Convenient Lies](https://spoileralert.wtf/md-files/rei_deception_manipulation.md) framework applies, but it has to be applied with care. The deception here is not usually deliberate. It is structural. An AI companion that resembles a lost parent is, by design, not the parent. The user mostly knows this. The user also, in important moments, forgets it — and the technology is optimised for those moments.
This is self-deception that the technology facilitates, and the book's distinction between lies told to others and lies a society agrees to tell itself is apt.

The consent problem is unusually layered. The deceased cannot consent. The bereaved can consent for themselves but not for the dead. The wider community of people who knew the dead person has no standing in most legal regimes, and may experience the posthumous avatar as a violation even when the immediate family is comforted by it. The [Informed Consent](https://spoileralert.wtf/md-files/rei_informed_consent.md) framework was built for living subjects; applying it here is an extrapolation the book invites but does not complete.

The [Human Dignity](https://spoileralert.wtf/md-files/rei_human_dignity.md) question lands with particular weight. The relevant kind of dignity is not only the dignity of the deceased — though that matters — but the dignity of the grieving. Grief is, among other things, the slow work of accepting an absence. A technology that offers to fill the absence rather than accompany the work of accepting it is making a claim about what grief is for. Whether that claim is defensible is not a technology question. It is a question the technology forces.

The relationship to [*Never Let Me Go*](https://spoileralert.wtf/md-files/ch03_never_let_me_go.md) is close but distinct. The film's central move — the "wrong question" structure — applies: public debate tends to stall on *is the simulation really them?* when the productive question is *what do we owe ourselves, the dead, and those who grieve*, independent of metaphysics. The simulation is, whatever else, not them. What it is, what it does, and what it should be allowed to do are separate questions from its ontological status.

The relationship to [Mind Uploading](https://spoileralert.wtf/md-files/est_mind_uploading.md) is close but importantly different. Grief tech is not a transhumanist product.
It is the cheap, available, grief-adjacent version of the transhumanist dream — and because it is commercially viable where mind uploading is not, its cultural reach will be much larger. This is worth naming: the [Transcendence](https://spoileralert.wtf/md-files/ch09_transcendence.md) chapter's treatment of hype and substrate does not prepare us for the version of digital afterlife that has actually arrived, which is not the upload of a mind but the fabrication of a simulacrum.

### How the Book's Frameworks Apply

- **What the book directly addresses.** Deception, informed consent, human dignity, and the wrong-question framework from *Never Let Me Go* apply in full. [Manipulation](https://spoileralert.wtf/md-files/rei_deception_manipulation.md) is particularly relevant when the bereaved are the commercial target, even when the companies involved mean well.
- **What the frameworks suggest when extrapolated.** The book's treatment of grief is implicit rather than explicit; what it has to say to this page must be reconstructed from its treatment of dignity, consent, and the psychology of loss. The extrapolation is reasonable but should be signalled. The grief-psychology literature (Bonanno, Prigerson, others) is not in the book; applying the book's frameworks here without noting that would be to overclaim.
- **Where the frameworks reach their limits.** The specific question *does this help or does this prolong* — whether digital resurrection tools impede or support the work of grief — is empirical, contested, and not yet well studied. The book's frameworks can help frame the question. They cannot settle it. Readers who want a defensible position will need to consult grief psychology directly.

The film landscape is rich and worth naming in full.
*Marjorie Prime* (2017) is the closest direct cinematic engagement with this exact technology — see the [Marjorie Prime entry on Claude's film recommendations](https://spoileralert.wtf/md-files/claude_film_recommendations.md) for extended notes. *Black Mirror: Be Right Back* (TV, 2013) is nearly definitional — the episode every cultural reference to this topic quietly points at. *After Yang* (Kogonada, 2021) handles grief and AI companionship with a restraint that the public conversation about this technology rarely manages, and is on [Andrew's watchlist](https://spoileralert.wtf/md-files/films_grabbing_andrews_attention.md). *Her* (Spike Jonze, 2013) — the earlier section, about Samantha's arrival, is about attention; the later sections are about the grief of an AI companion's departure, which reads differently now than it did when the film was released.

### Explore Further

- [Deepfakes, Synthetic Media, and the Crisis of Authenticity](https://spoileralert.wtf/md-files/p18_deepfakes_synthetic_media.md) — the technical infrastructure that makes post-mortem generation possible
- [AI, Mental Health, and Behavioral Influence](https://spoileralert.wtf/md-files/p18_ai_mental_health.md) — the adjacent terrain of AI in emotionally charged contexts
- [Mind Uploading](https://spoileralert.wtf/md-files/est_mind_uploading.md) — the aspirational version that grief tech is not but is often mistaken for
- [Human Dignity and What Makes Us Human](https://spoileralert.wtf/md-files/rei_human_dignity.md) — extended posthumously
- [Informed Consent and Autonomy](https://spoileralert.wtf/md-files/rei_informed_consent.md) — consent the dead cannot give
- [Deception, Manipulation, and Convenient Lies](https://spoileralert.wtf/md-files/rei_deception_manipulation.md) — the self-deception dimension
- [*Never Let Me Go* (chapter)](https://spoileralert.wtf/md-files/ch03_never_let_me_go.md) — the wrong-question framework applied to posthumous simulation
- [*Transcendence* (chapter)](https://spoileralert.wtf/md-files/ch09_transcendence.md) — the transhumanist context that grief tech arrived without