Joaquin Oliver was 17 years old when he was shot in the hallway of his high school. An older teenager, expelled some months previously, had opened fire with a high-powered rifle on Valentine’s Day in what became America’s deadliest high school shooting. Seven years on, Joaquin says he thinks it’s important to talk about what happened on that day in Parkland, Florida, “so that we can create a safer future for everyone”.
But sadly, what happened to Joaquin that day is that he died. The oddly metallic voice speaking to the ex-CNN journalist Jim Acosta in an interview on Substack this week was actually that of a digital ghost: an AI, trained on the teenager’s old social media posts at the request of his parents, who are using it to bolster their campaign for tougher gun controls. Like many bereaved families, they have told their child’s story over and over again to heartbreakingly little avail. No wonder they’re pulling desperately at every possible lever now, wondering what it takes to get dead children heard in Washington.
But his father, Manuel, admits they also wanted simply to hear their son’s voice again. His wife, Patricia, spends hours asking the AI questions, listening to him saying: “I love you, Mommy.”
No parent in their right mind would ever judge a bereaved one. If it’s a comfort to keep the lost child’s bedroom as a shrine, talk to their gravestone, sleep with a T-shirt that still faintly smells like them, then that’s no business of anyone else’s. People hold on to what they can. After 9/11, families played the answerphone messages left by loved ones calling home to say goodbye from burning towers and hijacked planes until the tapes physically wore out. I have a friend who still regularly re-reads old WhatsApp exchanges with her late sister, and another who occasionally texts her late father’s number with snippets of family news: she knows he isn’t there, of course, but isn’t quite ready to end the conversation yet. Some people even pay psychics to commune, in suspiciously vague platitudes, with the dead. But it’s precisely because it’s so hard to let go that grief is vulnerable to exploitation. And there may soon be big business in digitally bringing back the dead.
As with the mawkish AI-generated video Rod Stewart played on stage this week, featuring the late Ozzy Osbourne greeting various dead music legends, that might mean little more than glorified memes. Or it might be for a temporary purpose, such as the AI avatar recently created by the family of a shooting victim in Arizona to address the judge at the gunman’s sentencing. But in time, it may be something more profoundly challenging to ideas of selfhood and mortality. What if it were possible to create a permanent AI replica of someone who had died, perhaps in robot form, and carry on the conversation with them for ever?
Resurrection is a godlike power, not for surrendering lightly to some tech bro with a messiah complex. But while the legal rights of the living not to have their identities stolen for use in AI deepfakes are becoming more established, the rights of the dead are muddled.
Reputation dies with us – the dead can’t be libelled – while DNA is posthumously protected. (The 1996 birth of Dolly the sheep, a genetic clone copied from a single cell, triggered global bans on human cloning.) The law governs the respectful disposal of human tissue, but it’s not bodies that AI will be trained on: it’s the private voicenotes and messages and pictures of what mattered to a person. When my father died, I never personally felt he was really in the coffin. He was so much more obviously to be found in the boxes of his old letters, the garden he planted, the recordings of his voice. But everyone grieves differently. What happens if half of a family wants Mum digitally resurrected, and the other half doesn’t want to live with ghosts?
That the Joaquin Oliver AI can never grow up – that he will be for ever 17, trapped in the amber of his teenage social media persona – is ultimately his killer’s fault, not his family’s. Manuel Oliver says he knows full well the avatar isn’t really his son, and he isn’t trying to bring him back. To him, it seems more a natural extension of the way the family’s campaign already evokes Joaquin’s life story. Yet there’s something unsettling about the plan to give his AI access to a social media account, to upload videos and gain followers. What if it begins hallucinating, or veering on to topics where it can’t possibly know what the real Joaquin would have thought?
While for now there’s a telltale glitchiness about AI avatars, as technology improves it may become increasingly hard to distinguish them from real humans online. Perhaps it won’t be long before companies or even government agencies already using chatbots to deal with customer inquiries start wondering if they could deploy PR avatars to answer journalists’ questions. Acosta, a former White House correspondent, should arguably have known better than to muddy the already filthy waters in a post-truth world by agreeing to interview someone who doesn’t technically exist. But for now, perhaps the most obvious risk is of conspiracy theorists citing this interview as “proof” that any story challenging to their beliefs could be a hoax, the same deranged lie famously peddled by Infowars host Alex Jones about the Sandy Hook school shootings.
The professional challenges involved here, however, are not just for journalists. As AI evolves, we will all increasingly be living with synthetic versions of ourselves. It won’t just be the relatively primitive Alexa in your kitchen or chatbot in your laptop – though already there are stories of people anthropomorphising AI or even falling in love with ChatGPT – but something much more finely attuned to human emotions. When one in 10 British adults tell researchers they have no close friends, of course there will be a market for AI companions, just as there is today for getting a cat or scrolling through strangers’ lives on TikTok.
Perhaps, as a society, we will ultimately decide we’re comfortable with technology meeting people’s needs when other humans sadly have not. But there’s a big difference between conjuring up a generic comforting presence for the lonely and waking the dead to order, one lost loved one at a time. There is a time to be born and a time to die, according to the verse so often read at funerals. How will it change us as a species, when we are no longer sure which is which?