Deepfakes and Digital Resurrections

The Ethical Perils Are No Longer Fiction

FaithTech
FaithTech Institute
8 min read · Jan 11, 2022


Editor’s Note: This essay is the 2nd Place winner in FaithTech’s 2021 Writing Contest! See past winners here!

When Star Wars: Rogue One was released in 2016, it was a prequel to movies that were by then nearly 40 years old. The events in the film immediately precede those of A New Hope, which was released in 1977. However, for Rogue One, the creators faced a conundrum. The original actors were 40 years older, and some were vital to the world of the story. But that wasn’t their biggest problem.

While many actors were still available, one actor, Peter Cushing, who played Grand Moff Tarkin, had passed away in 1994. Rather than giving up on having him in the movie, the filmmakers behind the lucrative franchise decided to resurrect him, digitally recreating his appearance using conventional CGI.

At the time, some people wondered about the ethical issues of using the likeness of someone who was no longer around to approve or object (although it was done with the approval of his estate). In an interview with the New York Times, John Knoll, the visual effects supervisor for Rogue One, denied that it was the start of a slippery slope toward using more dead actors in movies:

It is extremely labor-intensive and expensive to do. I don’t imagine anybody engaging in this kind of thing in a casual manner.

On the contrary, deepfake technology has already reduced the labour and expense of inserting someone’s likeness in this way, and it seems like precisely the sort of thing where the experience curve and economies of scale will rapidly make it even more accessible — not just to movie producers but even to grieving family members and opportunistic impersonators. Who should call the shots on this type of “digital resurrection”, and what limits should be respected?

Rights For The Dead

Twelve years earlier, in 2004, author Chris Walley published a two-part novel titled The Shadow and the Night. The first in his Lamb Among the Stars trilogy, the story is set in a distant future inspired in roughly equal measure by post-millennial eschatology and science fiction. In Walley’s fictional universe, the righteous citizens of the 1,600-planet Assembly of Worlds are governed by, among other laws and doctrines, the “Technology Protocols”, a set of boundaries around the use of technology. The preface to these protocols states:

The Assembly therefore solemnly covenants that the only technology that will be accepted is that which can be shown will not lead to the loss or damage of individuality or personality. (p. 81)

One of the more obscure Technology Protocols of the Assembly is number six, which states that “the rights of an individual to be protected from direct or indirect technological abuse are not extinguished by death.” At first glance, this seems like something that would rarely become an issue, since most people would leave the dead to rest in peace, but a transgression of this boundary is one of the first signs that something is going wrong in the idyllic Assembly of Worlds.

One of the implications of this protocol is that singers and actors could leave their voice (digitally re-created) for posterity but have the right to place limits on how it would be used, in line with their artistic vision or any other concerns they may have. In an incident early in the story, a character is preparing a digital arrangement of a Christmas carol and uses the voice of a long-dead singer; however, he “enhances” her voice so she can sing outside of her natural alto range, in violation of her will. As the novel progresses, other more blatant and serious violations surface for almost every clause in the Technology Protocols.

Protocol Six is the only one directly cited in the novel; in a demonstration of effective world-building, the contents of the others are left implicit. For example, when a troubling discovery upends their accepted approach to technology, one character explains that “all our machines proclaim that they are machines”. From comments like this, it is clear that, besides Protocol Six, the other Protocols also guard against abuses that blur the boundaries between machines and persons, humanity and animals, or life and death.

Deepfakes for Good?

In the years since the novel was published, machine learning techniques have pushed pattern recognition and synthesis to an unprecedented level of sophistication. As a result, convincingly fabricated audio-visual content is now commercially available. These fabrications are commonly labelled “deepfakes”, which Wikipedia defines as “synthetic media in which a person in an existing image or video is replaced with someone else’s likeness.” But deepfakes aren’t the only form this technology is taking. There is a redemptive edge as well.
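For readers curious about what sits under the hood, the classic face-swap recipe is conceptually simple: one shared encoder learns a common representation of faces, and a separate decoder is trained for each identity. The sketch below illustrates that idea in PyTorch; it is a minimal, illustrative outline, and every layer size and variable name is an assumption for demonstration, not the code of any particular tool.

```python
# Minimal sketch of the shared-encoder / per-identity-decoder autoencoder idea
# behind early face-swap deepfakes. All names and layer sizes are illustrative
# assumptions, not the API of any real deepfake tool.
import torch
import torch.nn as nn


class Encoder(nn.Module):
    """Compresses a 64x64 RGB face crop into a shared latent code."""

    def __init__(self, latent_dim: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(),     # 64 -> 32
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(128, 256, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(256 * 8 * 8, latent_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


class Decoder(nn.Module):
    """Reconstructs a face for one specific identity from the shared code."""

    def __init__(self, latent_dim: int = 256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 256 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(256, 128, 4, stride=2, padding=1), nn.ReLU(),  # 8 -> 16
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 16 -> 32
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        return self.net(self.fc(z).view(-1, 256, 8, 8))


# In training, the shared encoder sees faces of both people, while each decoder
# learns to reconstruct only its own person. The swap happens at inference time
# by routing person A's encoded face through person B's decoder. The modules
# below are untrained and exist only to show the data flow and tensor shapes.
encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()

face_of_a = torch.rand(1, 3, 64, 64)     # stand-in for a real, aligned face crop
swapped = decoder_b(encoder(face_of_a))  # "A's expression rendered as B's face"
print(swapped.shape)                     # torch.Size([1, 3, 64, 64])
```

Routing one person’s encoded expression through the other person’s decoder is what produces the swap; that the core idea fits in a few dozen lines is part of why the technique has spread so quickly beyond research labs.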

A more recent example is that of actor Val Kilmer. In 2017, the actor who played the iconic “Iceman” in Top Gun lost his voice after surgery to treat throat cancer. Then in 2021, an AI-driven system digitally restored his natural voice. An article in the Washington Post describes the effort and gives voice to some potential ethical issues:

But the technology also sparks legal, ethical, and economic concerns, particularly among voice actors concerned about their livelihood drying up. Deepfake technology has [also] been used to make videos of politicians such as Donald Trump and Barack Obama, spotlighting the dangers of technology designed to make it appear as if people are saying things they never said.

Whether it’s positive examples like Kilmer’s, troubling abuses like the ones Walley imagines, or approved uses like Rogue One, the mere ability to create deepfakes will have societal consequences. In the field of entertainment, as these examples show, the technology poses challenges for people who are established in the industry, such as loss of creative control (they can be inserted into works without their consent; fan fiction will likely lead the way here). For people who are trying to get established, it may mean lost job opportunities, since they are now also competing with popular figures who may be retired or even deceased. In fields like news and politics, the challenges go further, posing a risk to our already-frayed societal epistemology. In Walley’s novels, the challenges go further still: characters come to doubt whether their video calls are really with the person they see on the other end or have been intercepted and altered. Although real-time alteration of high-bandwidth channels is still science fiction at this point, the possibility is already easy to imagine.

Deep Pranks and Acceptable Deceptions

Within a Biblical worldview, it should be obvious that using deepfake technology to bear false witness against someone is wrong. Among other possibilities, this could take the form of putting damaging words in someone’s simulated mouth. Likewise, Scripture warns against spreading gossip or rumours, even as a prank:

Like a madman who throws firebrands, arrows, and death, so is the man who deceives his neighbor and says, “Was I not joking?” For lack of wood the fire goes out, and where there is no whisperer, contention quiets down. Like charcoal to hot embers and wood to fire, so is a contentious man to kindle strife. The words of a whisperer are like dainty morsels, and they go down into the innermost parts of the body. (Proverbs 26:18–22)

While this essay is mainly focused on the ethics of producing deepfakes, the last verse of this passage is also a good reminder for media consumers to watch their information “diet”, a very timely warning in the unfolding media landscape. The book of Proverbs has much more to say on the topics of truthfulness and reputation, and it will be as relevant as ever in a world shaped by emerging technologies like deepfakes.

Opposing deceptive uses of deepfakes is something that most religions and ethical frameworks would agree on. A concern more specific to Christian theology surrounds the use of this technology to re-animate the dead (pun intended) in a sort of ersatz resurrection. Using human effort and ingenuity to deny the reality of death, or striving to elude it in some way, challenges some of Jesus Christ’s central claims. No doubt there will be some debate about where to draw the line, but deepfakes of the dead should certainly draw scrutiny from believers.

While the examples from The Shadow and the Night and Rogue One do not pretend the deceased person is still alive, other applications are coming to market that try to offer a more interactive experience with late friends and relatives.

Daniel Reynolds is a director and producer at a company called Kaleida that makes holograms for “digital resurrections”. Despite the company’s purpose, Reynolds said, in an interview with The Face, “I wouldn’t want my family using video and bringing me back. I would put it in my will: you’re not allowed to do this.”

So even some of the people involved in creating deepfake resurrections are not fully comfortable with them. And while dealing with the issue in a will clarifies the legality, it does not fully address the morality.

Faith After Deepfakes

Bringing it all together: the technology for making convincing fakes of audio-visual footage is reaching a level that was the purview of science fiction merely a decade or two ago. People of faith (such as author Chris Walley) have valuable insights into the ethics of these kinds of emerging technologies, and science fiction is an underutilized avenue for exploring them. Stories like Walley’s can help us imagine the consequences of future technologies before they become live issues and work through the ethical dilemmas proactively.

Deepfakes certainly have some positive uses, as in the case of Val Kilmer, and perhaps in some creative endeavours where the participants consent. But as I’ve shown, deepfakes also enable numerous applications that are questionable, if not unethical, ranging from works of fan fiction to deliberate deceit. Using the likenesses of people who are no longer alive will be especially fraught. Deepfakes will not only be something we see on the big screen; readers could well face these end-of-life issues for themselves or their loved ones. Our faith teaches that God has placed a longing for eternal life inside each one of us. Digital resurrections are a poor substitute for the genuine thing, and the way that we as Christians choose to approach this emerging technology will demonstrate where our real hope lies.

Daniel Scott lives in eastern Canada. He is recently married and even more recently the father of a baby girl. For almost ten years he’s been working as a process engineer on industrial wastewater treatment projects. Before that, he studied at the University of New Brunswick and the University of Waterloo.

Learn more about FaithTech at faithtech.com.

Want to join the movement? Start here.
