Open-access Metabioethics and ChatGPT-based posthumous memories at the service of palliative medicine

Dear Editor,

The biocomputing literature has speculated on the application of ChatGPT to post mortem interactions, i.e., the interlocution between living people and supervening “memories” of already deceased persons.1,2 Within the generative artificial intelligence (AI) research spectrum, such speculations raise a bioethical dilemma concerning the violation of the dignity of the dead, who neither consented to nor authorized such an invasive appropriation and exploitation of the memories and ethos built during their lifetimes. In a sense, our perspective presupposes and proposes a new question: is a bioethical justification of this use of ChatGPT possible? The relevance of such an approach relies on the metaphysical insight that individual autonomy contains a post mortem projection: perhaps this is a core statement of metabioethics (or bioethical metaphysics).

Any approach seeking a bioethical justification for applying ChatGPT technology to post mortem interactions has a potentially significant impact on palliative medicine. Indeed, it can lead terminally ill patients to strengthen their own autonomy by encouraging them to customize, train, or personalize ChatGPT so that the tool could enable the post mortem interlocution of their memories (entropic audiovisual records on key issues) with friends and relatives. This conjecture might trigger positive effects on end-of-life perception/consciousness as a relevant existential value.3 This is a clear unfolding of the essay “Castling Against Death: A Chess-Based Insight into the Paradox of Physician-Assisted Suicide”.4

Nonetheless, how can this be accomplished in terminally ill patients who retain limited autonomy due to neurological impairments (e.g., dementia or Asperger’s syndrome, the latter being a form of autism spectrum disorder5)? We might deduce at least one hypothetical implication (emerging brain-reading medical technology linked to AI for palliative purposes) related to memory backup, widening and re-signifying the intervention horizon of palliative care physicians. A brain-computer interface would be modeled on a device6,7 (e.g., electrode grids implanted in the cortex) that uses algorithms for tracking, scanning, translating, and converting neuronal data into pictogram/pixel sets (non-verbal5 ludic communication) adapted to architecting display memories through a custom-made ChatGPT. Such a system could be coupled to an immersive virtual space (metaverse) featuring the holographic avatar of the terminally ill patient (Figures 1 and 2), or to an interactive cyberspace (multiverse) through a phenotypically similar android, exhibiting a morphology similar to that of anthropomorphic sex robot8 prototypes (Figure 3).
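To make the data flow described above more concrete, the following minimal Python sketch illustrates, under purely illustrative assumptions, how decoded neural frames might be translated into consented pictogram "display memories" feeding a personalized chatbot corpus; every name in it (NeuralFrame, Pictogram, MemoryCorpus, decode_to_pictograms) is a hypothetical construct introduced here, not part of any existing brain-computer interface or ChatGPT API.

# Hypothetical sketch: conceptual data flow from a brain-computer interface
# to a pictogram-based "display memory" corpus for a custom chatbot.
# All names here are illustrative assumptions, not an existing system or API.

from dataclasses import dataclass, field
from typing import List
import random

@dataclass
class NeuralFrame:
    """One window of (simulated) cortical electrode readings."""
    timestamp_ms: int
    channels: List[float]  # e.g., voltages from an electrode grid

@dataclass
class Pictogram:
    """A non-verbal, ludic communication unit decoded from neural data."""
    symbol: str            # e.g., an emoji-like token
    confidence: float      # decoder confidence, 0..1

@dataclass
class MemoryCorpus:
    """Patient-sanctioned 'display memories' used to personalize a chatbot."""
    entries: List[Pictogram] = field(default_factory=list)

    def add(self, pictogram: Pictogram, consent_given: bool) -> None:
        # Bioethical gate: nothing enters the corpus without explicit consent.
        if consent_given and pictogram.confidence >= 0.8:
            self.entries.append(pictogram)

def decode_to_pictograms(frame: NeuralFrame) -> Pictogram:
    """Stand-in for a real decoding algorithm (tracking/scanning/translating)."""
    vocabulary = ["family", "home", "music", "garden", "gratitude"]
    # A real system would run a trained decoder; here we only simulate output.
    return Pictogram(symbol=random.choice(vocabulary),
                     confidence=random.uniform(0.5, 1.0))

if __name__ == "__main__":
    corpus = MemoryCorpus()
    for t in range(5):
        frame = NeuralFrame(timestamp_ms=t * 100,
                            channels=[random.gauss(0, 1) for _ in range(64)])
        corpus.add(decode_to_pictograms(frame), consent_given=True)
    print(f"Curated {len(corpus.entries)} consented memory pictograms.")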

Figure 1
In China, a young man uses AI/chatbot to develop a “Matrix” audiovisual interaction with the avatar of his late grandmother (AI-generated persona)

Figure 2
The same young man uses AI/chatbot to perform a textual dialog with the avatar of his late grandmother

Figure 3
Sex android Denise

Brain-computer interfaces are already commonly implanted as assisted-living devices for individuals with behavioral, language, sensorimotor, or cognitive disabilities. They belong to neuroprosthetics, a multidisciplinary field at the interface of neuroscience and biomedical engineering that aims to replace parts of the nervous system compromised by neurological disorders or injury.9 Recent proof-of-concept studies suggest that electrical neuromodulation strategies could also be useful in alleviating/palliating certain cognitive and memory deficits, particularly in dementia.6,9

In relation to the above-mentioned “dead chat” experiment (Figures 1 and 2), the engagement of terminally ill patients in designing and calibrating a personalized chatbot offers several important comparative advantages:

  • higher-level bioethical legitimacy: due to prior consent and the greatest possible degree of autonomous participation by terminally ill patients;

  • increased authorial credibility: due to the greater verisimilitude of post mortem feedback (resulting from an algorithmic consciousness trained and sanctioned by the terminally ill);

  • improved regulatory transparency: due, presumably, to the scrutiny of medical associations and research ethics committees, or of a forthcoming National Palliative Care Authority (a health policy at the interface of medtech and society), especially when, for example, the planning of chatbots precedes physician-assisted suicide;

  • wider palliative range: due to the reliability/versatility acquired in palliating post-traumatic grief, mainly among the relatives of terminally ill patients.

Perpetuating the simulated memory (with higher accuracy or authenticity over time, by optimizing data curation, toxicity filters, etc.)10-12 as the family legacy of terminally ill patients could become a smart and organic property/challenge of the semantic corpus of chatbots, akin to self-taught, continuously flowing synapses. In addition, it frames a long-term journey toward a care-driven digital ancestry ecosystem (generative-AI “grandparents” would still manage to scour the web for everything from clinical trial outcomes to fake news/deepfakes involving their living descendants). All of this would require the prior and direct consent of terminally ill patients, or indirect13 authorization via their legal representatives.
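As a purely illustrative aside on the data curation and toxicity filtering mentioned above, the short Python sketch below shows one simple way maintenance of a simulated-memory corpus could be organized; the blocklist, the keyword-based filter, and the function names (is_toxic, curate) are assumptions made here for illustration, not any established method or library.

# Hypothetical sketch: periodic curation of a simulated-memory corpus,
# combining a keyword-based toxicity filter with simple de-duplication.
# The blocklist and filtering rule are illustrative assumptions only.

from typing import List

TOXIC_TERMS = {"insult", "slur", "threat"}   # placeholder blocklist

def is_toxic(text: str) -> bool:
    """Flag an entry if it contains any blocklisted word."""
    words = set(text.lower().split())
    return bool(words & TOXIC_TERMS)

def curate(memories: List[str]) -> List[str]:
    """Keep unique, non-toxic entries; order of first appearance is preserved."""
    seen, curated = set(), []
    for entry in memories:
        key = entry.strip().lower()
        if key and key not in seen and not is_toxic(entry):
            seen.add(key)
            curated.append(entry.strip())
    return curated

if __name__ == "__main__":
    raw = ["Grandmother's garden in spring",
           "grandmother's garden in spring",      # duplicate, dropped
           "A threat remembered from childhood"]  # filtered as toxic
    print(curate(raw))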

REFERENCES

  • 1 Mobius G. Chat with Chat GPT on Life and Death. Zenodo; 2023 Feb 25, version 3 [cited 2025 Apr 24]. Available from: https://doi.org/10.5281/zenodo.7684286
  • 2 Yan A. Dead chat: Shanghai man uses AI technology to "resurrect" late grandmother by creating virtual version to talk to, triggering controversy in China. South China Morning Post, Shanghai; 2023 Apr 17 [cited 2025 Apr 24]. Available from: https://www.scmp.com/news/people-culture/trending-china/article/3216945/dead-chat-shanghai-man-uses-ai-technology-resurrect-late-grandmother-creating-virtual-version-talk
  • 3 Chambaere K, Vander Stichele R, Mortier F, Cohen J, Deliens L. Recent trends in euthanasia and other end-of-life practices in Belgium. N Engl J Med. 2015;372(12):1179-81.
  • 4 de Araújo AF. Letter to the Editor: Castling Against Death: A Chess-Based Insight into the Paradox of Physician-Assisted Suicide. J Palliat Med. 2023;26(7):892.
  • 5 Tunç B, Yankowitz LD, Parker D, Alappatt JA, Pandey J, Schultz RT, et al. Deviation from normative brain development is associated with symptom severity in autism spectrum disorder. Mol Autism. 2019;10(1):46.
  • 6 Moses DA, Metzger SL, Liu JR, Anumanchipalli GK, Makin JG, Sun PF, et al. Neuroprosthesis for Decoding Speech in a Paralyzed Person with Anarthria. N Engl J Med. 2021;385(3):217-27.
  • 7 Flesher SN, Downey JE, Weiss JM, Hughes CL, Herrera AJ, Tyler-Kabara EC, et al. A brain-computer interface that evokes tactile sensations improves robotic arm control. Science. 2021;372(6544):831-6.
  • 8 Araújo A. Anthropomorphic sex robots across the genitalia-computer interface: AI-generated lover persona, infopower feminist bioethics, and Alexa-style humanity. AI Ethics. 2025;5:3383-6.
  • 9 Gupta A, Vardalakis N, Wagner FB. Neuroprosthetics: from sensorimotor to cognitive disorders. Commun Biol. 2023;6(1):14.
  • 10 The AI writing on the wall [editorial]. Nat Mach Intell. 2023;5(1):1.
  • 11 Thorp HH. ChatGPT is fun, but not an author. Science. 2023;379(6630):313.
  • 12 Jakesch M, Hancock JT, Naaman M. Human heuristics for AI-generated language are flawed. Proc Natl Acad Sci USA. 2023;120(11):e2208839120.
  • 13 Minssen T, Vayena E, Cohen IG. The Challenges for Regulating Medical Use of ChatGPT and Other Large Language Models. JAMA. 2023;330(4):315-6.

Publication Dates

  • Publication in this collection
    18 July 2025
  • Date of issue
    2025

History

  • Received
    04 Jan 2025
  • Accepted
    17 Apr 2025