Clapping can be weird, especially when we’re at the movies. Some people cheer for characters who fall in love, or jeer for villains who get their comeuppance. People often applaud when a movie ends and the credits roll, even though the cast and crew aren’t there to bask in the outpouring of love. We do this, despite knowing that the screen is a one-way form of communication, because it’s a joyful, fleeting moment of community and solidarity, a sense that we share something with the room.
So why would we clap for a performing hologram? As I sat in the Jorgensen Center at the University of Connecticut, waiting for a hologram of the deceased opera star Maria Callas to appear on stage, I wondered how we interact with a virtual human, and how we might feel interacting with one in the form of someone who has died. To believe the program, you’d think the company behind this performance had created the second coming of Maria Callas.
When this “Holo-Callas” took the stage, it was light-years away from the danger zone of the uncanny valley. It had a strong physical likeness and convincingly expressive gestures with realistic, graceful movements. Holo-Callas was accompanied by real humans: Eímear Noone, an award-winning conductor and composer, and the Symphony NH. It followed a realistic script programmed to make it behave, as much as possible, like its diva predecessor.
During a scene from Georges Bizet’s Carmen, Holo-Callas foresees her own death and her lover’s in a deck of cards, and throws the cards into the air. They freeze before hitting the ground, momentarily motionless as if time itself stood still. This wasn’t the derivative excellence of a copy degrading the original. It was magnificent poetry in motion, a hologram coming into its own as a special effects machine. And though the rest of the audience applauded, let me explain why I refused.
The audience wasn’t applauding the technical work of the engineers who toiled behind the scenes. They weren’t celebrating the spectacle. Instead, they were reacting to a simple and well-designed prompt. At perfectly timed moments, Holo-Callas made carefully choreographed gestures, including bows, that were optimized with symbolic grace. The solicitous body language signaled to the audience that we should respond with adoration, the two-step dance of stimulus and trained response.
The talented Noone enhanced the spectacle by playing a central role in the charade. Sticking to the playbook that called for her to reinforce the fantasy, Noone acted as if she were in the presence of a great artist (not artistic greatness!), her visual cues directing us to follow her lead and do her bidding, just like the orchestra she authoritatively conducts. Keeping up this facade until the very end, Noone took the fiction as far as it could go, and with a grand gesture, she tried her best to make it look like she gifted Holo-Callas a rose.
In reality, of course, Holo-Callas can’t hold real flowers, and the simulation didn’t register anyone’s affection. Holo-Callas isn’t embodied or conscious, and it can’t perceive anything, whether praise or slight. While human singers, musicians, actors, and dancers can connect with their audiences by feeling and responding to the positive energy, Holo-Callas is as moved by other people’s emotions as a rock, despite what its sleight-of-hand eyes and lips suggest.
We know this, and yet if you’re in the audience, willingly suspending your disbelief, the only way to avoid falling for these tricks is to actively resist the seduction. After all, the powers of suggestion and habit are great. “How far was I willing to suspend reality?” asked Tom Huizenga at NPR. “When we eagerly begged for that encore — an age-old musical transaction between performer and audience — who were we really imploring? Callas the dead diva, Callas the hologram, or the technology that created her? What’s wrong with wanting to feel like you’re actually at a Callas recital?”
The Wall Street Journal’s drama critic Terry Teachout described being moved to tears, seemingly connecting emotionally with the hologram itself. “It wasn’t until the first encore, Puccini’s ‘Vissi d’arte,’ which comes from ‘Tosca,’ one of the operas with which the soprano is most closely identified, that I connected on an emotional level with the virtual Callas. Tears came to my eyes without warning, and I thought this must be how it felt to have really seen her on stage.”
Let’s get one thing straight: Callas has not returned. We’re living in a time where the line between humans and simulations is blurring in contexts where it can be problematic to lose sight of the differences. Artificial assistants like Siri are designed to sound like conversational humans, and thanks to natural evolution, it doesn’t take much for us to anthropomorphize them. Robotics scholars have identified several profound dangers that can arise when machines with ever-greater intelligence are designed to appear human-like:
- Humans become so emotionally connected to these machines that they’re manipulated by them: nudged to waste resources like money, to adopt ideologically tainted preferences and beliefs, or to give up their privacy.
- Humans become so emotionally connected to these machines that they begin to see them as worthy of ethical care, perhaps even rights, and as a result end up with fewer resources to allocate to other humans in need.
- Humans become so emotionally connected to these machines that they waste social capital that would be better spent on pro-social, human interactions. (Non-legally binding hologram marriages have already taken place.)
- Humans develop bad habits when engaging with these machines and transfer this behavior to their interactions with other people.
- Humans fail to recognize how their interactions with these machines reinforce prejudiced stereotypes.
Now that we’re living in what some call a “post-fact society,” we need to scrutinize our relationships with things that simply aren’t what they seem. Deceptive digital tools like deepfakes pose ever-greater threats to democracy, and holograms could become politically weaponized.
During the 2014 campaign for Prime Minister in India, Narendra Modi used hologram technology to “address rallies throughout the country simultaneously as a hologram.” In order to give the impression that he was undaunted by the demands of campaigning to a population of 1.3 billion, Modi ostensibly became the first politician to spread his message through “holographic doppelgangers.” Some confusion, if not deception, occurred. “Many poorly educated voters had stayed behind after rallies to check behind the dais to see if he was really there, officials said.”
More recently, the Reagan Presidential Library and Museum rolled out a hologram of a 1984 version of the Gipper. As a thought experiment, imagine President Trump’s team sending out holograms of him and Reagan, both wearing MAGA hats and seemingly holding simultaneous rallies across the country. Meanwhile, the real President would be holed up in Mar-a-Lago watching ego-bolstering highlights of his rallies from bed, occasionally glancing up to catch glowing coverage on Fox News. Reagan’s presence, albeit virtual, would give the false impression that he endorses Trump, something that isn’t possible because the 40th President of the United States died in 2004. If choreographed well, the image could be persuasive propaganda.
So, would Callas herself have endorsed Callas in Concert? After seeing Holo-Callas appear to be moved by our applause, we’re encouraged to think she might. Yet there are plenty of reasons to be skeptical. It’s fine to be impressed by a performing hologram, and even to shed a tear at the emotional sights and sounds. But there’s a lot to lose if we allow ourselves to give in fully and clap for a hologram of the deceased as if it’s a living person.