Technology is, inevitably, encroaching on the human experience of death, both in popular culture and real life. On screen, Smallville’s Clark Kent gets advice on how to become Superman from an AI version of his deceased father, and in “Be Right Back,” a haunting episode of the sci-fi series Black Mirror, a woman copes with grief by ordering an android simulation of her deceased partner.

In the real world, people are experimenting with griefbots. Muhammad Aurangzeb Ahmad, principal data scientist at KenSci, is often the center of attention when the topic of griefbots comes up.

He created a chat-style program that allows him to have conversations with a simulation of his father, who passed away a few years ago. Muhammad took several philosophy classes with me when he was an undergraduate at Rochester Institute of Technology, and over the years we’ve become friends.

Muhammad’s experiences bring into stark relief the pain that loss can bring, and he has publicly taken up the cause of trying to spark conversations about the benefits and dilemmas involved with creating simulations of the deceased. Read Muhammad’s reflections in “How the Dearly Departed Could Come Back to Life — Digitally” and “After Death: Big Data and the Promise of Resurrection by Proxy.”

I asked Muhammad about his motivations. “The main inspiration behind the simulation is to give people a way to interact with their loved ones after they are gone,” he said. “Software agents, which are becoming increasingly sophisticated, have been used as stand-ins for people for many years. Most of these systems have been used for commercial purposes to date. With this project, I wanted to explore the human element of this technology and how software simulations of a deceased loved one can potentially give us closure and expand the range of possible experiences for us.”

There’s a more personal motivation too, Muhammad explained. “I am calling this project the Mushtaq Ahmad Mirza Project, after my father. The idea literally came to me when my brother called me one autumn almost five years ago to inform me that the doctor had given a week or so notice for our father’s death. At that time, I was not married and didn’t have any kids. One of the first thoughts that came to my mind was that my potential kids would be deprived of knowing what a wonderful man their grandfather was and wouldn’t have any way to interact with him.”

I had a hard time imagining what Muhammad’s conversations are like, and I suspect most of us would. Apart from fiction, there aren’t too many frames of reference for this kind of thing.

Muhammad Aurangzeb Ahmad (Sonu) chatting with a simulation of his father, Abu Jani.

It’s hard not to be moved by the exchange between Abu Jani (which means “dear father” in Urdu) and Sonu (one of Muhammad’s nicknames). The dadbot does what dads do best: remind their kids to take care of themselves. Indeed, Muhammad has been so taken aback at times that he’s found himself needing to close his computer and go out for a walk.


Humans Aren’t Always Creatures of Habit

A simulation of a person will differ from an actual person in many ways. Technology hasn’t advanced to a point where replicas are perfect copies. Frankly, I doubt that it ever will, though Elon Musk might disagree. The billionaire CEO of SpaceX and Tesla believes that the odds heavily favor our current reality being a simulation (like the one depicted in The Matrix) that people mistakenly perceive as real life.

If Musk is right — and I really hope he’s not — griefbots are a mere simulation within a simulation. They’re a dream within someone else’s dream, and the tears we shed over them are more like shadows than water.

Muhammad is under no illusion that he’s speaking with the dead. On the contrary, he’s quick to point out that the simulation he created works well when generating scripts of predictable answers but has difficulty relating to current events, like a presidential election. In Muhammad’s eyes, this is a feature, not a bug.

Muhammad said that “out of good conscience” he didn’t program the simulation to be surprising, because that capability would deviate too far from the goal of “personality emulation.”
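Muhammad hasn’t published the code behind his simulation, and nothing below comes from it, but the behavior he describes maps onto a simple retrieval pattern: the bot only serves up answers its subject actually gave, and it stays quiet about anything, like an election, that the source material never covered. Here is a minimal, hypothetical sketch of that pattern in Python, with an invented three-line “script” standing in for a real corpus.

```python
# A minimal, hypothetical sketch of retrieval-style "personality emulation":
# the bot only returns answers its subject actually gave, and it declines
# when a prompt falls outside the recorded material (which is why questions
# about current events stump it). None of this code or data is Muhammad's.
import difflib

# Hand-collected question/answer pairs; in practice these might come from
# letters, interviews, or chat logs. These three entries are invented.
SCRIPT = {
    "how are you": "Alhamdulillah, I am well. Are you eating properly?",
    "what should i do when i feel lost": "Pray, be patient, and call your mother.",
    "tell me about your childhood": "We had little money but a very big family.",
}

def reply(prompt: str, cutoff: float = 0.6) -> str:
    """Return the closest scripted answer, or decline if nothing is close enough."""
    key = prompt.lower().strip("?!. ")
    match = difflib.get_close_matches(key, list(SCRIPT), n=1, cutoff=cutoff)
    if match:
        return SCRIPT[match[0]]
    # No recorded answer comes close: stay quiet rather than invent one.
    return "I don't have anything to say about that."

if __name__ == "__main__":
    print(reply("How are you?"))           # returns the scripted answer
    print(reply("Who won the election?"))  # nothing close enough, so it declines
```

Declining to answer, rather than guessing, is the conservative design choice Muhammad describes: fidelity to a recorded personality over fluent improvisation.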

This constraint fascinates me. On the one hand, we’re all creatures of habit. Without habits, people would have to deliberate before acting every single time. This isn’t practically feasible, so habits can be beneficial when they function as shortcuts that spare us the paralysis of overanalysis.

On the other hand, people don’t have to be caricatures of themselves. We can learn and grow. We can fight the urge to become old dogs that refuse to learn new tricks, dogs that remain satisfied barking out the same old “bowwows” over and over again.

Habits run deep because we’re all shaped by the environments we live in and the attitudes that people express all around us. This is why intergenerational conversations can be so tricky. Conversations between parents and children, for example, can put the prejudices of a time period and place on full display.

And yet no matter how inflexible someone is, there’s still a danger of freezing her in time, of imagining that the way she once was is the way she always would have to be.

On a recent episode of the television show Ellen, peanut farmer Nathan Mathis brought the audience to tears by recounting why he publicly protested against Alabama Chief Justice Roy Moore’s homophobic statements.

Earlier in his life, Mathis was aghast when his daughter came out and naively thought her homosexuality could be cured. Her subsequent suicide, however, led Mathis to change his mind, regret his earlier bias, and become a gay rights advocate. Mathis’ mind opened and so did his heart.

If a simulation were made of Mathis that drew only from things he said earlier in life, it probably would end up saying things that he currently finds extremely offensive.

Interestingly enough, things went wrong for the simulation of Superman’s father for exactly the opposite reason. The Jor-El bot was supposed to be an upgrade over the original because it was programmed to “lack the regret, pride, and guilt of his creator.”

Many of the Smallville plots revolve around the negative consequences of this makeover. The dadbot turned out to be much more severe and far less empathetic than intended.


Talking to Empty Chairs

The easiest way to misunderstand a new use of technology is to believe it’s so innovative that no historical parallel exists. In the case of griefbots like the one Muhammad uses, a parallel can be drawn to the empty chair technique.

During the 2012 Republican National Convention, Clint Eastwood famously yelled at an empty chair that he used as a proxy for President Obama. For many, the spectacle was as horrifying as it was meme-able.

Far from being a carefully planned attention grabber, the speech was the result of Eastwood’s spontaneous decision to go with the rhetorical ploy of furniture-venting. The idea came to him after he happened to hear Neil Diamond’s song “I Am…I Said,” which is filled with the melancholy lyrics of loneliness: “And no one heard at all, not even the chair.” I’m not drawing any comparisons to that debacle.

The empty chair technique that I’m referring to was popularized by Friedrich Perls (more widely known as Fritz Perls), a founder of Gestalt therapy. The basic setup looks like this: Two chairs are placed near each other; a psychotherapy patient sits in one chair and talks to the other, unoccupied chair. When talking to the empty chair, the patient engages in role-playing and acts as if a person is seated right in front of her — someone to whom she has something to say. After making a statement, launching an accusation, or asking a question, the patient then responds to herself by taking on the absent interlocutor’s perspective.

In the case of unresolved parental issues, the dialog could have the scripted format of the patient saying something to her “mother,” and then having her “mother” respond to what she said, going back and forth until something that seems meaningful happens. The prop of an actual chair isn’t always necessary, and the context of the conversations can vary. In a bereavement context, for example, a widow might ask the chair-as-deceased-spouse for advice about what to do in a troubling situation.

It would be interesting to see studies done on whether griefbots can be effectively used to accomplish similar goals.

Since I’m not a therapist, I thought it would be useful to talk to one about Muhammad’s project. I deployed my mom, Sherry Schachter (who, helpfully, happens to be executive director emerita of bereavement services at Calvary Hospital), and asked if she thought people could benefit from using griefbots.

Her positive response emphasized research that has been conducted on different styles of grief. Mom said, “Instrumental grievers tend to be more cognitive in the way they express their grief. They tend to be problem solvers and not as open to attending support groups as intuitive grievers. Those individuals, primarily women, flock to support groups and are more comfortable crying and showing and sharing emotions.”

“I’m thinking that instrumental grievers,” Mom remarked, “might be more comfortable using Muhammad’s technique since it’s very private.”


The Sins of the Child

Back in ancient Greece, the philosopher Aristotle grappled with how hard it is to determine if someone has lived a good life.

Reversals of fortune happen all the time, and people who are thriving can become miserable after the rug gets pulled out from under them. If we think about the full force of contingency — the ever-present possibility that forces outside our control can undo our efforts or that things that appear stable turn out to be short-lived — we end up with a terrifying thought. Our happiness is so fragile that it can be subverted even after we die. People we care about, like our kids, can honor or dishonor our good names without, as Aristotle recognized, our being able to do a damned thing about it.

Griefbots pose many challenges for the people who use them and for the reputations of those who are deceased. But technology presents us with new opportunities to share memories and cope with loss. I once went to an online funeral, and it was nothing like I expected it to be. Until you try something like the program Muhammad is using, it’s best to keep an open mind.