Artificial intelligence is widely believed to be the next frontier — an uncharted future, primed for exploration and new life — but since learning about ChatGPT, I can only think about the after-life. Will the way we view death transform with technology? It’s a question I’ve been asking myself for over 10 years.
My recognition of a new type of graveyard began when I was a teenager, when I understood that a young person’s death meant they left behind more than just family or belongings — it also meant Facebook pages with entire personalities frozen in time.
A profile of a deceased classmate would hold 300+ photos of a night buying Slurpees at 7-Eleven, a badge proclaiming they “liked” baby sloths, and comments from friends saying they’d see them tomorrow. We were part of generation “me,” which made the sudden absence of “them” feel even more haunting in the digital realm.
At that time, Facebook was also not prepared for the fact that its “friends” would die and their profiles would remain. After users started getting notifications prompting them to “add” their dead loved ones, the company updated its profile settings and created a process by which users can control a deceased connection’s “digital legacy.” An explainer video from Facebook asks: “We live in a digital world, what happens when we leave it behind?”
When my dad left his behind in 2019, I couldn’t bear to visit his Facebook page, but knowing it existed made it hard to avoid. There were too many comments from people I didn’t know, all of whom professed to love my dad. It felt like a virtual tombstone that my family did not create. I didn’t know what was worse — the heartfelt messages from strangers or knowing that he’d never respond.
I investigated how to get the page taken down so I wouldn’t be tempted to look at it again, but gave up on the online form required to do so when Facebook asked me to provide a death certificate. I thought, You want me to throw my dad’s death certificate into my at-home scanner to prove you should delete his page? When I think about it now, it’s probably a fair ask.
One might wonder, then, after years of avoiding his digital poltergeist, why I’d hope to get ChatGPT to speak like my dad. The concept came to me around the time of my wedding, when I contemplated revealing to our guests my life’s biggest regret: that my husband and dad had not gotten a chance to meet. I wanted to use my vows to admit my failure to bring them together, a sort of public, emotional self-lashing, but that would have been a total downer.
Having been exposed to ChatGPT in my work, I saw AI as my out. If I could train the bot to follow the speech patterns of my father, maybe he and my husband might finally be able to have a conversation.
I knew this task would be challenging, as ChatGPT requires very specific prompts to produce optimal results. I would need to provide the bot with a full profile of my dad’s speaking characteristics for it to copy, and maybe even feed it past examples from our text conversations. For added flair, I could ask the bot to interrupt every now and then with a cough, as my dad frequently did from smoking cigarettes all his life.
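For the curious, here is a rough sketch of what I mean, using the OpenAI Python SDK. The profile details and the example exchange below are placeholders, stand-ins for the real descriptions and texts I would have fed it, not my dad’s actual words.

```python
# A sketch of a persona prompt: a profile of his speaking style as the
# system message, plus past exchanges as examples to imitate.
# (Placeholder details only; assumes the OpenAI Python SDK and an API key
# in the OPENAI_API_KEY environment variable.)
from openai import OpenAI

client = OpenAI()

persona_profile = """You are role-playing a retired documentary producer.
He replies in one or two short sentences, jumps between topics mid-thought,
quotes song lyrics the listener doesn't know, and drops the names of people
they've never met. Every few replies, interject a small cough, written as
*cough*, from a lifetime of smoking."""

# Placeholder exchange standing in for real text conversations.
example_exchange = [
    {"role": "user", "content": "Hey Dad, how was your week?"},
    {"role": "assistant",
     "content": "*cough* Saw Lenny from the edit bay. Week was fine."},
]

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": persona_profile},
        *example_exchange,
        {"role": "user", "content": "It's nice to finally meet you."},
    ],
)
print(response.choices[0].message.content)
```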
After working out lines of basic description, like his age and the number of words he’d usually respond with, I realised how difficult the rest of this process was going to be. Chats with my dad were often extremely random, and I’d usually walk away from them wondering what it was we were even talking about. This was not because he didn’t know how to have a discussion, but because he was likely thinking of numerous things at once.
As a documentary producer who worked in his field for 40 years, my dad was very creative and could turn any conversation into a story. You could argue he was always producing your very interaction. He might begin singing the lyrics to a song you didn’t know or drop the names of people you’d never met, and then it was over. That was him. You never knew what you would get. So, could I get a bot to be him?
The more I tried, the more my excitement waned. I spent hours researching the proper prompts and wrote long soliloquies dissecting my father’s persona. But every time the bot offered a response that sounded nothing like my dad — like when it called me a “youngster” — I could see the mask slip.
In the midst of this private experiment, I began working with a creative who goes by Shxpir (pronounced Shakespeare). Even before we met, I knew we would have a lot in common, but I didn’t think we’d both be thinking about death. As we got to talking on the set of an ad campaign, he revealed that he had also lost his father and was using tech to work through those emotions. In contrast to my guilt, his stemmed from not having enough photographs of his father. He told me he’d been feeling his dad’s physicality in this world slipping from his memory with each passing day.
Now, he’s working to render his father virtually, and in 3D, from the images he does have. His project will be completed in a few months and will allow him to create new images of his father from every angle. In awe of his dedication, I asked Shxpir how he finds the strength to continue the project, as I continually stumble over emotional roadblocks — memories, pain, the occasional cry — that keep me from trying again.
“Sometimes I don’t know if I can handle it,” he admitted to me.
Back in 2021, Shxpir began working on a proposal for a business that would digitally recreate deceased loved ones; he’s now also actively working on his own Metaverse funeral for whenever he’s no longer on this planet. With a deep understanding of art and years of digital rendering experience, Shxpir knew that the business of death and mourning, like that of fashion and cosmetics, could be thrust into the digital space at any moment. Although it’s become more of a side project, rendering his dad has remained an important mission for him.
“At the end of the day,” he told me with a resolve I could feel through the phone, “as an artist, it’s an experiment for my personal growth.”
After some time, I stopped my experiment with ChatGPT and logged into an AI video generator for the first time. I typed “older man talking on the phone” into the generator bar and waited to see what it would return to me.
For a long time, I wanted to create art about a vivid dream I had after my dad died. I even sought advice from a dream analyst to help interpret it. In the dream, I was on the phone with my dad and suddenly realised his voice had started to sound different, as if someone had been impersonating him the whole time. It felt so horrifyingly vivid that I woke up thinking I should call my dad and make sure he was OK.
After the AI digested my prompt, a face appeared on the screen: a brunette older man, slightly balding, wearing large glasses and hunched over a desk while talking on the phone. I was surprised. Wow, there he is, I thought. He had about four extra fingers, a feature AI tends to trip up on, but he looked a lot like my dad.
I went on from there, creating small one-second video scenes using the AI, until I had compiled a full visual retelling of my dream. The process of acquiring the “footage” felt strangely peaceful in its impassiveness. The AI did not ask why I wanted to create this rendering; it just provided scenes in response to my prompts.
I added words and phrases like “film noir style” to get a darker effect. I could calmly say “try again” if it did not give me the render I wanted. The graphics it output looked human, but also alien — part dream, part nightmare. The shapes of people were not fully formed. The colours moved on the screen like ink in water. I thought one of the brunette women it rendered looked like me too.
I finished the piece by adding my own audio narration and titled it “The Call.” When I watched the final product, tears flowed from the bottomless well of grief. But I also breathed a sigh of relief. Communicating with AI — creating this piece — gave me a sense of closure I hadn’t felt before.
Still, I wondered if this whole process was healthy, or if I had just fallen for an illusion of feeling understood.
I reached out to my friend and colleague Marni Blank for her perspective. Marni is a death doula — a person who guides others through the end-of-life process, whether their own or a loved one’s. Her response to my project made me feel better.
“Some people need to go through the stage of revisiting memories of the person or photos, over and over again — or some people need distance,” she told me. “I think it’s just about what that person needs at that moment.”
Social media has also reprogrammed our brains to want to keep a white-knuckled grasp on our memories.
“We live in an age where it’s really hard to let go of anything,” Marni said. “Pre-Instagram, you could completely forget someone you broke up with existed.”
When I asked her if she would seek to have her social media accounts deleted after she passed, she sounded surprised by my question.
“I think I’d want to leave it up to the people who are grieving … I think to myself, who is going to be affected? Because it won’t be me.”
My thoughts snagged on her words — her focus on “the people who are grieving.” That was me. I had only been focused on my dad, and how he could be real again. I thought AI was the answer, but it proved to be a mirror.
Months after my ChatGPT experiment, I realised that my struggle with the AI bot was not that it could not “be” my dad. It was the deep sadness that came with facing the truth that he would never again be able to do something new — have a new thought, start a new conversation — like the one I’d hoped he would have with my husband. Only when I, myself, sought to create something new, like “The Call,” did AI truly give something valuable back to me.
There’s irony in the fact that my nightmare of my dad’s voice not really being his voice was so similar to what I was experiencing while attempting to give AI his personality. I think back to Shxpir and his quest for “personal growth” and how I hadn’t realised he’d given me the answer I had been searching for all along.
Though I thought AI would be a fitting match for powerful emotions like sorrow and grief, which seem to transcend the physical world, it turns out AI’s power may lie in providing pathways to better understanding ourselves. Technology will never be able to replace the strength we need to go deep inside ourselves and make something of our darkest sorrows. The will to keep moving forward with our grief in tow will never be rendered for us by machines. While so many of us are trying to decipher how AI will play a role in our lives, we must keep ourselves at the centre of that search. Because the scariest thing about AI might be realising that it can’t do it all — especially when it comes to curing the pain of losing someone real.
Diana Weisman is a creative director at Bustle Digital Group and has spent her career exploring emerging platforms and formats fuelled by new technology. She has worked with multidisciplinary collaborators, including engineers, programmers, and 3D and VFX artists, and recently produced two ChatGPT-powered experiences. Last year, she led a multimedia series titled “bhad dreamz”, which explored the psychology behind common nightmares. In her free time, she is an avid gamer and is always thinking of ways to weave her love of gaming and world-building into her writing and creative work. You can follow her on Instagram @dweismantrophy.