Hacker and Instagram story photos modified by author

Someone Might Be Using Your Instagram Stories to Make Deepfake Porn

Jillian Krasusky
Art of the Argument
7 min read · Apr 13, 2022


A 15-second Instagram story contains 450 individual frames (15 seconds at 30 frames per second). That is enough material for deepfake porn creators to superimpose your face onto explicit videos, with or without your consent.

When Noelle Martin was 17 years old, she decided to reverse image search a photo of herself to see where it appeared on the internet. That bout of curiosity unleashed a nightmare: hundreds of deepfake videos of her engaging in explicit sexual activities appeared in her search results. The material used to make them: photos and videos taken without her consent from her various social media accounts.

Horrified, Noelle sought help from the police but was told there was nothing they could do: the videos had been uploaded by anonymous users on sites based overseas. The only action she could take was to contact the webmasters one by one and ask them to remove the videos, a process that was time-consuming and mentally taxing. During her years of battling with webmasters, she was met with few successes and many setbacks. “I had one webmaster respond to me saying he’ll only delete the site if I sent him nude photos of myself within 24 hours,” Martin recalled in her TED Talk.

Deepfake of Nicolas Cage (right) as Amy Adams (left)

The non-consensual deepfake pornography that Noelle Martin fell victim to is being created worldwide. A report by Sensity AI found 14,678 deepfake videos online: 96% of them were non-consensual sexual deepfakes, and of those, 99% featured women. These deepfakes, a form of synthetic media that uses artificial intelligence to swap out the faces of people in videos, make a puppet out of the person featured. Deepfake porn creators steal victims’ control of their own faces, using them for the pleasure of others.

The repercussions of deepfake pornography reach into every facet of victims’ lives. Deepfake pornography devastates a victim’s sense of privacy, induces extreme anxiety and fear, and, because of its online permanency, can hinder their ability to use the internet and find a job. British writer Helen Mort, who was alerted to deepfake videos depicting her engaging in extreme sexual acts, told Vogue that being a victim of deepfake pornography is “…like you’re in a tunnel, going further and further into this enclosed space, where there’s no light.” After her appalling discovery, Helen described feeling exposed whenever she was in public and experiencing frequent panic attacks.

A deepfake of Donald Trump (right) based on talk show host Jimmy Fallon’s impression of the president (left)

And yet, even as deepfakes are pervasively used to exploit women, concerns about employing deepfakes to manipulate elections, jeopardize national security, and threaten democracy have dominated the media attention around deepfake misuse. The current risk that deepfakes pose to politics is often overstated. In 2020, Sensity AI reported that out of the thousands of deepfakes made of public figures, celebrities, and everyday people, only 35 of those targeted were American politicians. And during the 2020 American presidential election, amid widespread concern that deepfakes would disrupt the vote, “Deepfake videos failed to materialize as a threat to democracy in 2020, while other tools and techniques were used massively to spread disinformation,” declared Giorgio Patrini, CEO and co-founder of Sensity.

In a 2020 study, researchers asked, “how does news media characterize the problems presented by deepfakes?” (Gosse and Burkell). Their answer: media outlets disproportionately reported on the potential political misuse of deepfake technology rather than on the harms of sexual deepfakes. Though there are valid concerns about deepfakes in the political arena, the harms of deepfake pornography are equally devastating. This asymmetric reporting demonstrates a disregard for the lasting impact of pornographic deepfakes on women’s lives.

The laws around deepfake misuse tend to parallel this unbalanced media attention. Legislation on deepfakes focuses on protecting politicians, while the exploitation of women continues to go unchecked by those in power. In the United States, several deepfake laws are in the works, but their attention is mainly on election campaigns, thus failing to address 96% of the problem. Outside of a few states, like California and Virginia, deepfake pornography remains legal in the United States.

Deepfake porn request forum

Easily evading media attention and government legislation, deepfake pornography sites and their creators are thriving. VICE writer Evan Jacoby paid just $30 to have a creator make deepfake porn of himself. He privately messaged several creators on a deepfake porn site, and when one responded, he sent them a 13-second video and a link to a Pornhub video. It was that simple.

During his time scouring deepfake porn forums, Jacoby conversed with four creators. Their responses were disturbing, to say the least. When Jacoby asked one creator how they thought people would react if they discovered non-consensual deepfake porn of themselves, the creator answered, “…guys would laugh or take it as a compliment, girls would freak out and scream rape.” Jacoby asked another creator if people should have an expectation of privacy. The creator’s response? “If they upload their life to Facebook then it’s their problem really. Whatever [is] shared willingly is free to use.”

Anything posted to social media is fair game for deepfake porn creators.

Like Noelle Martin, I am a 17-year-old girl with an active presence on social media. The threat of my photos being stolen and used for sexual purposes terrifies me. With no legislation in my state protecting me from this, and too few media outlets calling out the problem, I can’t help but feel fearful every time my family, friends, or I post on social media.

Media outlets are failing to report on, and governments are failing to legislate against, the real issue with deepfake technology: not its overstated threat to politicians, but its use to create extremely harmful sexual deepfakes of women. This violence against women is threatening and non-consensual; it invades their privacy, strips them of control over their own bodies, and violates their sense of security and safety. We must work to stop these violent acts against women; the issue of deepfake pornography cannot be ignored by media outlets and government officials any longer.

Concerned about your social media posts being used to create deepfake porn? Help make deepfake pornography illegal by contacting your local and state legislators. Click this link to learn more: https://openstates.org/

Works Cited

Ajder, Henry, et al. The State of Deepfakes 2019: Landscape, Threats, and Impact. Deeptrace, Sept. 2019, https://sensity.ai/reports/. Accessed 11 Apr. 2022.

Clark, Korey. “‘Deepfakes’ Emerging Issue in State Legislatures.” LexisNexis State Net, 2022, https://www.lexisnexis.com/en-us/products/state-net/news/2021/06/04/Deepfakes-Emerging-Issue-in-State-Legislatures.page. Accessed 12 Apr. 2022.

“Deep Fake Nicolas Cage.” Imgur, 22 Mar. 2018, imgur.com/jKpckyM. Accessed 12 Apr. 2022.

“Deepfakes | The Presidents.” YouTube, uploaded by Derpfakes, 5 Mar. 2019, www.youtube.com/watch?v=rvF5IA7HNKc. Accessed 12 Apr. 2022.

“Deepfake.” Wikipedia, 2022, https://en.wikipedia.org/wiki/Deepfake. Accessed 11 Apr. 2022.

“Educating Image Forensic Experts on Deepfakes.” Sensity, 8 Sept. 2022, https://sensity.ai/blog/deepfake-detection/how-to-spot-a-deepfake-educating-the-public-on-deepfakes/. Accessed 12 Apr. 2022.

Galston, William. “Is Seeing Still Believing? The Deepfake Challenge to Truth in Politics.” Brookings, 8 Jan. 2020, https://www.brookings.edu/research/is-seeing-still-believing-the-deepfake-challenge-to-truth-in-politics/#cancel. Accessed 12 Apr. 2022.

Gieseke, Anne Pechenik. “‘The New Weapon of Choice’: Law’s Current Inability to Properly Address Deepfake Pornography.” Vanderbilt Law Review, vol. 73, 2020, https://cdn.vanderbilt.edu/vu-wp0/wp-content/uploads/sites/278/2020/10/19130851/The-New-Weapon-of-Choice-Laws-Current-Inability-to-Properly-Address-Deepfake-Pornography.pdf.

Gosse, Chandell, and Jacquelyn Burkell. “Politics and Porn: How News Media Characterizes Problems Presented by Deepfakes.” Critical Studies in Media Communication, vol. 37, no. 5, 2020, https://doi.org/10.1080/15295036.2020.1832697. Accessed 11 Apr. 2022.

Hao, Karen. “Deepfake Porn Is Ruining Women’s Lives. Now the Law May Finally Ban It.” MIT Technology Review, 12 Feb. 2021, https://www.technologyreview.com/2021/02/12/1018222/deepfake-revenge-porn-coming-ban/. Accessed 12 Apr. 2022.

Jacoby, Evan. “I Paid $30 to Create a Deepfake Porn of Myself.” VICE, 9 Dec. 2019, www.vice.com/en/article/vb55p8/i-paid-dollar30-to-create-a-deepfake-porn-of-myself. Accessed 11 Apr. 2022.

Krasusky, Jillian. Adapted from https://www.techfunnel.com/wp-content/uploads/2017/12/7-Types-of-Hackers.jpg and https://i.stack.imgur.com/NsgkM.png. Created 11 Apr. 2022.

Martin, Noelle. “Online Predators Spread Fake Porn of Me. Here’s How I Fought Back.” TED, 2022, https://www.ted.com/talks/noelle_martin_online_predators_spread_fake_porn_of_me_here_s_how_i_fought_back/transcript. Accessed 11 Apr. 2022.

“More and More Women Are Facing the Scary Reality of Deepfakes.” Vogue, 16 Mar. 2021, https://www.vogue.com/article/scary-reality-of-deepfakes-online-abuse.

“The Most Urgent Threat of Deepfakes Isn’t Politics.” Still of video. YouTube, uploaded by Vox, 8 June 2020, www.youtube.com/watch?v=hHHCrf2-x6w. Accessed 12 Apr. 2022.

“Open States: Discover Politics in Your State.” Openstates.org, 2022, https://openstates.org/. Accessed 12 Apr. 2022.

Parkin, Simon. “The Rise of the Deepfake and the Threat to Democracy.” The Guardian, 22 June 2019, https://www.theguardian.com/technology/ng-interactive/2019/jun/22/the-rise-of-the-deepfake-and-the-threat-to-democracy. Accessed 12 Apr. 2022.

Pradhan, Prajakta. “AI Deepfakes.” University of Illinois Law Review, 4 Oct. 2020, www.illinoislawreview.org/blog/ai-deepfakes/. Accessed 11 Apr. 2022.

Puutio, Alexander, and David Timis. “Deepfake Democracy: Here’s How Modern Elections Could Be Decided by Fake News.” World Economic Forum, 5 Oct. 2020, https://www.weforum.org/agenda/2020/10/deepfake-democracy-could-modern-elections-fall-prey-to-fiction/. Accessed 12 Apr. 2022.

Sayler, Kelley, and Laurie Harris. Deep Fakes and National Security. Congressional Research Service, 8 June 2021, crsreports.congress.gov/product/pdf/IF/IF11333. Accessed 11 Apr. 2022.

Shao, Grace. “Fake Videos Could Be the Next Big Problem in the 2020 Elections.” CNBC, 15 Oct. 2019, www.cnbc.com/2019/10/15/deepfakes-could-be-problem-for-the-2020-election.html. Accessed 12 Apr. 2022.
