Manipulation rebranded.

Leila Ismailova · Published in The Startup · Aug 15, 2019

Deepfake’s Ethical Dilemma

In late 2017, people were introduced to a new term: ‘deepfake.’ News outlets raised the alarm, warning of an impending threat. The term is so new that Microsoft Word’s spell-checker still marks it with a red squiggly line as a nonexistent word. Yet all signs indicate that deepfakes are not going anywhere.

The word ‘deepfake’ marks an improvement in the technology for manipulating visual images, not the novelty of the phenomenon itself. Politicians of the past also understood that the course of history could be changed by retouching photos or cropping film. During the Soviet era, people who fell out of Stalin’s favor were “cut out” of photographs. This is how Nikolai Yezhov, Mikhail Kalinin, Grigory Petrovsky, and other former associates disappeared. Shortly after the dictator’s death in 1953 and Nikita Khrushchev’s speech “On the Cult of Personality and Its Consequences,” photographs of Stalin himself began to disappear, deleted by censors. There is nothing new in manipulating images and video.

Let’s look at examples in cinema. In 1994, director Robert Zemeckis placed Forrest Gump in a historical shot with John F. Kennedy: Gump appears on television in the White House and shakes hands with the President. Another striking example: actor Paul Walker died in a car accident in 2013, before shooting of the blockbuster “Furious 7” had finished. To complete Walker’s scenes, the filmmakers used a digital copy of the actor’s face. In the fifth season of Game of Thrones, a nude Cersei Lannister walks through King’s Landing. In fact, the actress was replaced by 27-year-old model Rebecca Van Cleve; to add authenticity to the scene, Cersei’s emotional face was “attached” to Rebecca during the “walk of shame.”

Every year, the technology for falsifying video improves. Instead of using Photoshop or a video editor, modern designers and programmers train neural networks to create fake videos from photos. Today, anyone can create video fakes using the appropriate software. For example, the Pinscreen mobile app offers the option to “take a selfie and automatically generate a 3D avatar in seconds” or to “display a dense three-dimensional grid of your face.” The free FakeApp program is also available, which lets you generate realistic videos with face replacement.

Deepfake is a technology powered by artificial intelligence that is used to produce or modify video content, plausibly depicting something as real when it is not.
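The core idea behind face-swap tools in the FakeApp family can be sketched in a few lines. Roughly, one shared encoder learns to compress any face into a compact latent vector, while each person gets a dedicated decoder trained to reconstruct only that person; swapping happens by decoding person A’s latent with person B’s decoder. The toy below uses untrained random linear layers purely to illustrate the data flow — real systems train deep convolutional autoencoders on thousands of frames, and every name and dimension here is an illustrative assumption, not FakeApp’s actual code.

```python
import numpy as np

rng = np.random.default_rng(0)

# One shared encoder compresses any 64x64 face into a 64-dim latent vector;
# each person has a personal decoder trained to reconstruct only their face.
ENCODER = rng.standard_normal((64, 4096))
DECODER_A = rng.standard_normal((4096, 64))  # reconstructs person A
DECODER_B = rng.standard_normal((4096, 64))  # reconstructs person B

def encode(face):
    """Compress a flattened face image into the shared latent space."""
    return ENCODER @ face

def swap_a_to_b(face_a):
    # The trick: encode a frame of person A, but decode it with person B's
    # decoder. The result keeps A's pose and expression, rendered as B's face.
    return DECODER_B @ encode(face_a)

frame_a = rng.standard_normal(4096)  # a flattened 64x64 grayscale frame of A
fake = swap_a_to_b(frame_a)
print(fake.shape)  # -> (4096,): a full fake frame, same size as the input
```

Because only the decoder is person-specific, collecting enough footage of the target is the main cost; the shared encoder is what lets expression transfer across identities.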

If deepfake is used for entertainment, a harmless prank, or a simulated greeting supposedly from a famous person, there is nothing objectionable in it. On the other hand, fake videos can seriously damage the reputation of many people, not only famous ones, and significantly affect the sympathies and convictions of the electorate. One can already see how Barack Obama, Angela Merkel, and Donald Trump are made to look ridiculous in fake videos; other celebrities, such as actors and models, have been victimized by fakes that portray them advertising sex toys or engaging in orgies.

How can Deepfake be applied?

Deepfake can be applied in as many fields as creativity stretches. The most reasonable and legal application of this AI-infused technology is in cinematography, where it may reduce production and postproduction costs, time, and risk. Deepfake can help “finish” an actor’s job when he or she is unable to do it; the example of Paul Walker proves the concept.

Oddly enough, deepfake began in pornography, where it spawned a new form of abuse: revenge porn.

Deepfake is a threat to political institutions. In 2018, American filmmaker Jordan Peele and BuzzFeed together published a video address by former US President Barack Obama in which he calls Donald Trump an “asshole.” In fact, Obama said nothing of the kind. The video was created using the FakeApp program and the graphic editor Adobe After Effects. The journalists and the director wanted to show what fake news would look like in the future.

The legal system is under threat as well. Federal Rules of Evidence 902(13) and 902(14), which went into effect at the end of 2017, concern authentication, specifically of electronically stored information (ESI). However, these new amendments are not a magic bullet for keeping deepfakes out of evidence. Riana Pfefferkorn, the associate director of surveillance and cybersecurity at Stanford Law School’s Center for Internet and Society, was one of the first legal scholars to put deepfakes on the radar. In an interview with Law.com, she shared:

“I definitely don’t think litigators and judges are thinking about these issues sufficiently yet. But we should be getting ready, while we still have a little lead time before deepfakes start cropping up everywhere. That’s where I’m planning to go next in my work on deepfakes: developing practical guidelines and suggestions for how courts should go about the task of rooting out deepfake evidence, what the do’s and don’ts are for litigators as they’re collecting evidence for their case, and maybe also the role of expert witnesses.

“Experts are yet another part of the picture of deepfake in the courtroom. This is such a cutting-edge issue that there are only a few people who right now are qualified enough to give expert opinions as to whether or not something is a deepfake. If deepfakes come up in enough cases, then a handful of individuals are going to be in very high demand. So, in addition to the need for lawyers and judges to prepare, I also foresee an issue with the expert pipeline,” Pfefferkorn shared.
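The “digital identification” contemplated by FRE 902(14) is commonly a hash comparison: a qualified person certifies that a copy’s cryptographic digest matches the original’s, making the copy self-authenticating. The sketch below shows why that helps with ordinary copies yet not with deepfakes — a hash proves a file is an unaltered copy of some source, not that the source itself depicts reality. The file contents here are stand-in bytes for illustration; real practice also requires a written certification, not just matching digests.

```python
import hashlib

def digest(data: bytes) -> str:
    """Return the SHA-256 hex digest of a piece of ESI."""
    return hashlib.sha256(data).hexdigest()

original = b"frame-by-frame video bytes..."
copy = bytes(original)            # an exact forensic copy
tampered = original + b"\x00"     # a single altered byte breaks the match

print(digest(original) == digest(copy))      # True: copy can be certified
print(digest(original) == digest(tampered))  # False: alteration is detectable
```

Note the gap Pfefferkorn points at: a deepfake hashed at the moment it was fabricated would pass this check perfectly, which is why authentication rules alone cannot root fakes out of evidence.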

Deepfake is a threat to business. Imagine a CEO caught making a racist comment publicly before closing a round of investment; chances are high that investors would back out. While it may be possible to prove the CEO’s innocence, the damage to his or her image and to the company would already have been done.

Personal communication is under threat, too. While you may feel confident that you are chatting with a friend, you could actually be chatting with a bot posing as your friend and manipulating you into doing something.

The applications of deepfake technology are practically limitless, and judging by how much it has evolved since its emergence in 2017, fake videos are getting progressively more realistic.

I spoke to Sergey Gonchar, the founder of MSQRD, the face-filter company acquired by Facebook whose technology powers Instagram and Facebook filters. MSQRD created the first deepfake filter of Leonardo DiCaprio.

Gonchar explained how such videos are created, how face-replacement technology works, and why they look and sound so realistic:

“Fake videos can do a lot of harm before anyone checks them. Therefore, we believe it is very important to warn everyone about modern technological capabilities so that we have the right idea and are critical of what we see.

Unfortunately, technologies for creating deepfakes are developing faster than technologies for exposing them. Today, no service or technology can reliably expose a deepfake.”

The U.S. Department of Defense is working on artificial intelligence that learns to recognize deepfakes.

There are several telltale signs of a deepfake: people in such videos rarely blink, and you may notice unnatural head movements or inconsistent eye color.
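The blinking cue can be made concrete. One simple measure used in face-analysis work is the eye aspect ratio (EAR) of Soukupová and Čech: from six eye landmarks, the ratio of the eye’s vertical openings to its width drops sharply when the eye closes, so a video whose EAR never dips suggests a subject that never blinks. The landmark coordinates below are invented toy geometry; a real pipeline would take them from a face-landmark detector such as dlib.

```python
import numpy as np

def eye_aspect_ratio(p):
    """p: six (x, y) eye landmarks p1..p6 in the standard ordering:
    p1/p4 are the horizontal corners, p2/p3 the upper lid, p5/p6 the lower."""
    p = np.asarray(p, dtype=float)
    vertical = np.linalg.norm(p[1] - p[5]) + np.linalg.norm(p[2] - p[4])
    horizontal = np.linalg.norm(p[0] - p[3])
    return vertical / (2.0 * horizontal)

# Toy landmark sets: a wide-open eye and a nearly shut one.
open_eye   = [(0, 0), (1, 1), (2, 1), (3, 0), (2, -1), (1, -1)]
closed_eye = [(0, 0), (1, 0.1), (2, 0.1), (3, 0), (2, -0.1), (1, -0.1)]

print(round(eye_aspect_ratio(open_eye), 2))    # -> 0.67
print(round(eye_aspect_ratio(closed_eye), 2))  # -> 0.07
```

Tracking this ratio frame by frame and counting how often it dips below a threshold gives a crude blink rate, which can then be compared against a plausible human range.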

While scientists work on countermeasures to the most advanced manipulation tools, it is the responsibility of the media and of legal professionals to implement strict guidelines for verifying the authenticity of videos before they are published. Some may think this will limit freedom of speech, but more likely it will prevent the tragic consequences that a single deepfake video could cause. Until scientists find a reliable and effective way to detect and immediately block deepfake videos, increased control over published video content seems like the only rational solution to the problem.
