Deepfakes and the Spread of Misinformation

Olivia Harkin
Published in Encode Justice
May 3, 2021

“Yellow journalism” is a phrase coined during the Spanish-American War to describe reporting based on dramatic exaggeration and, sometimes, flat-out fallacies. With its sensational headlines and taste for the theatrical, yellow journalism, or as it’s better known now, “fake news,” is particularly dangerous to those who regard it as the absolute truth. In the days of Hearst and Pulitzer, it was much easier to know what was fabricated and what was fact. However, in this age of technology, yellow journalism isn’t so “yellow” anymore. It’s the jarring blue glow emitted from millions of cell phones aimlessly scrolling through social media; it’s the rust-colored sand blowing against the Silicon Valley complexes where out-of-touch tech giants reside; it’s the uneasy shifting of politicians’ black suits as their bodies fail to match up with their words; it’s the curling red lips of innocent women, their faces pasted onto figures entwined in unpleasant acts.

It’d be redundant to say the internet has allowed misinformation to spread, but it’s more necessary than ever to examine the media you’re consuming. Deepfakes, artificial images and videos made by superimposing someone’s face (usually a famous public figure’s) onto another body, so the person can be manipulated into appearing to say or do anything, have begun to surface more and more recently. Behind deepfakes is artificial intelligence, or AI, in which a machine exhibits human-like intelligence by mimicking our behavior: recognizing faces, making decisions, solving problems, and of course, driving a car, as we’ve seen with the emergence of Teslas. AI has been particularly eye-opening in revealing just how much trust we put into mere machines, and deepfakes are a perfect demonstration of how easily that trust can be shattered.

When you search “deepfakes,” some of the first results you get are websites where you can make your own. That’s how easy it is. The accessibility of such technology has long been seen as an asset, but now it’s as if Pandora’s box has been opened. Once people realize virtually anything is possible, there’s no end to the irresponsible uses of the internet. Legally, however, many deepfake scandals can be considered defamation.

A case recently came to light in Bucks County, PA, where a jealous mother created deepfakes of her daughter’s teammates, intended to frame them for inappropriate behavior. Police were first informed of this when one teammate’s parents reported that their daughter had been harassed with messages from an unknown number. The messages included “pictures from the girl’s social media accounts which had been doctored to look as if she was naked, drinking alcohol, or vaping.” These photos were intended to get the girl kicked off the team. Fortunately, police were able to trace the IP address and arrested the perpetrator. She now faces three misdemeanor counts each of cyber harassment of a child and harassment, proving that just because an act is done “anonymously” via the internet doesn’t mean you can’t get caught. In fact, the internet provides just as many opportunities for conviction as it does narrow escape. As technology becomes more and more apt to cause damage, cyber harassment is treated as a serious crime; if convicted, the mother faces six months to a year in prison. Pennsylvania has active anti-cyberbullying legislation in place that emphasizes authorities’ right to intervene in instances that occur off school property, and the state makes cyber harassment of a child a third-degree misdemeanor, punishable through a diversionary program.

Women have frequently been the victims of sexual violence via the use of deepfakes. For example, “pornographic deepfakes exist in the realm of other sexually exploitative cybercrimes such as revenge porn and nonconsensual pornography.” According to the Fordham Law Review, one journalist described deepfakes as “a way for men to have their full, fantastical way with women’s bodies,” emphasizing that this is still a sexually abusive act, as it demeans and reduces women to nothing but fantastical objects. Additionally, with the uncertainty around how much of this new technology works, it’s easy for these videos to be leaked and for a woman to have her reputation ruined over something she herself never did. Deepfakes have been used to intimidate and invalidate powerful women as well; men who find themselves threatened by a woman’s rise in authority may see this as a means to bring her down.

In 2018, Rana Ayyub, a successful, budding journalist in Mumbai, fell under scrutiny after a deepfake of her face superimposed on a porn video came into circulation. The stress from the endless harassment sent Ayyub to the hospital, and she withdrew from public life, abandoning her aspirations of working in media. Forty-eight states as well as D.C. have laws against “revenge porn,” but there are still limitations on prosecuting the websites that distribute this content. Section 230 of the Communications Decency Act is a federal law that shields websites from prosecution for content posted by third parties. Luckily, this immunity goes away if the website or webmaster actively takes part in distributing the content. Additionally, most states impose a fine and/or a prison sentence for the distribution of nonconsensual porn by a citizen. Federal legislation to address deepfake pornography, the Malicious Deepfake Prohibition Act of 2018, was introduced that year. Unfortunately, it didn’t advance, proving there’s still a long way to go in administering justice to victims of this heinous crime.

Most detrimental to American life as a whole, especially given our fiercely divided nation, is the use of deepfakes to spread political misinformation. With former President Trump’s social media presence considered a hallmark of his presidency, and the majority of citizens having access to presidential briefings on TV, our elected officials’ ideals are more available than ever. However, America has always allowed itself to be swept up in illusions. In the very first televised debate, Nixon versus Kennedy in 1960, Kennedy was widely believed to have been given an automatic edge because of his charisma and good looks. In this day and age, though, it’s crucial that our country looks more than skin deep. A video of President Biden sticking his tongue out, and another of him making a statement that was proven to be fake, were both created from intricately spliced and edited clips. The second clip, reposted by one of Bernie Sanders’ campaign managers, showed Biden apparently saying “Ryan was right,” a reference to former Speaker of the House Paul Ryan’s desire to go after Social Security and Medicare. Even within the Democratic Party itself, fake media was being used to drum up support for a particular candidate, creating harmful disunity. However, change is on the horizon: the National Defense Authorization Act for Fiscal Year 2020 included deepfake legislation with three provisions. The first requires a comprehensive report on the foreign weaponization of deepfakes. The second requires the government to notify Congress of foreign deepfake misinformation being used to target U.S. elections. The third establishes a “Deepfake Prize” competition to incentivize the development of better deepfake recognition technologies.

In a world where media is so easily manipulated, it’s up to citizens to be smart consumers. By reading news from a variety of sources and closely examining the videos you’re watching, you have a better chance of not being “faked out” by deepfakes. Some tips for identifying deepfakes include: unnatural eye or mouth movement, lack of emotion, awkward posture, unnatural coloring, blurring, and inconsistent audio. Many people worry that in a world where anything can be fake, nothing is real. But there will always be journalists committed to reporting the facts and promoting justice rather than perpetuating lies. When the yellowed edges of tabloids crumple to dust and the cell phone screens fade to black, truth, in its shining array of technicolor, will snuff out the dull lies.
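For readers who want a concrete sense of what checking for “blurring” or visual inconsistency might look like in practice, here is a minimal, illustrative sketch in Python. It assumes the OpenCV library and a hypothetical local file named clip.mp4, and the “suspicious” threshold is arbitrary, chosen purely for demonstration. It is not a real deepfake detector; it simply measures how sharp each frame is and flags clips whose sharpness swings unusually between frames, one of the artifacts listed above.

```python
# Toy sketch: flag clips whose frame sharpness fluctuates suspiciously.
# Not a real deepfake detector; thresholds and file name are illustrative.
import cv2
import statistics

def sharpness_scores(video_path: str, max_frames: int = 300) -> list[float]:
    """Return a sharpness score for up to max_frames frames of the video."""
    cap = cv2.VideoCapture(video_path)
    scores = []
    while len(scores) < max_frames:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Variance of the Laplacian: lower values mean a blurrier frame.
        scores.append(cv2.Laplacian(gray, cv2.CV_64F).var())
    cap.release()
    return scores

def looks_suspicious(scores: list[float], ratio: float = 0.5) -> bool:
    """Flag clips whose sharpness swings widely from frame to frame."""
    if len(scores) < 2:
        return False
    return statistics.pstdev(scores) > ratio * statistics.mean(scores)

if __name__ == "__main__":
    scores = sharpness_scores("clip.mp4")  # hypothetical video file
    if looks_suspicious(scores):
        print("Blur inconsistency detected; inspect this clip more closely.")
    else:
        print("No obvious blur inconsistency found.")
```

Real detection tools rely on trained models rather than hand-written rules like this, but the underlying idea is the same: look for statistical oddities that a casual viewer might miss.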

Sources:

Cheerleading mom Raffaela Spone created deepfake nudes of daughters teammates, police say — The Washington Post

First Federal Legislation on Deepfakes Signed Into Law | WilmerHale

48 States + DC + One Territory NOW have Revenge Porn Laws | Cyber Civil Rights Initiative

Deepfakes, the ‘Liar’s Dividend,’ and the Future of Misinformation (businessinsider.com)

Are Deepfake Videos A Threat? Simple Tools Still Spread Misinformation Just Fine : NPR

The growing threat of political ‘deepfakes’ (yahoo.com)
