How deepfake porn bans will impact cloud compute providers

Denise Melchin
5 min read · Jul 25, 2019

Multiple states have started to take on the legislative battle against deepfake pornography. In Virginia, a law against revenge pornography was recently amended to cover deepfaked revenge pornography. In California, a bill that would render the non-consensual creation and publication of pornographic material illegal, deepfake pornography included, recently made its way through the legislative process but was blocked.

Deepfake pornography refers to the use of artificial intelligence techniques (mainly deep neural networks) to create videos or pictures of pornographic scenes which never actually took place. In analysing the problem of deepfake pornography, it is tempting to focus on the creators of deepfakes as the main culprits — but major companies like Amazon and Google are key enablers of the creation of non-consensual pornography.

Over the last few years, deepfake pornography has become more widespread. It has mostly targeted women famous for acting or modelling, but is occasionally also used to harass and silence other public figures. While large platforms in the online pornography space like Reddit and Pornhub have banned deepfakes from their websites, there are few laws protecting people from deepfake pornography.

In most countries, pornographic deepfakes themselves are not currently banned, as they are often protected by free speech laws, although some cases of non-consensual deepfake pornography can still be illegal if they meet specific conditions.

It is possibly only a matter of time until the first US law banning non-consensual deepfake pornography comes to fruition, mirroring previous legal successes against revenge pornography. The majority of US states have made distributing revenge pornography a criminal offense over the past few years.

But a law banning deepfake pornography would not only have implications for victims and perpetrators. It would also have implications for companies whose technology is used to create pornographic deepfakes, most notably cloud computing providers. The three biggest cloud providers are Amazon Web Services, Microsoft Azure and Google Cloud.

Cloud computing is often described as using ‘somebody else’s computer’. This is not an entirely inaccurate description. Cloud computing gives companies and individuals the opportunity to do almost anything they might want to do with their own computers, but without having to manage data centres or even servers themselves.

Every time you use Gmail to email your friends without having to install any software, you are using cloud computing. This type of cloud computing, in which you use specific software without having to manage it yourself, is known as software-as-a-service.

There are other types in which customers access more fundamental functions of computers: platform-as-a-service and infrastructure-as-a-service. Both are mostly used by IT professionals and hobbyists. Software developers use platform-as-a-service so they can build and deliver apps to users without having to do maintenance work like hosting their own servers. Infrastructure-as-a-service is mostly used by IT administrators to get access to remote servers, storage capacity and other hardware without having to physically own and maintain them.

Creating deepfake pornography requires a lot of computational power, that is, the computer has to perform a very large number of calculations. Most home computers are not optimised for providing a lot of computational power (with the exception of gaming computers), so creating deepfake pornography can take days to weeks on a home computer.

Here cloud computing can help. Anyone who wants to develop deepfakes but doesn’t have a home computer with sufficient computational power can rent it from cloud computing providers like Amazon Web Services or Google Cloud Platform. This can be done cheaply. The cost of the computational power for one short pornographic deepfake clip depends on the machine learning model used and other details, but is usually only a few dozen dollars.
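
To make the ‘few dozen dollars’ figure concrete, here is a minimal back-of-the-envelope sketch in Python. The hourly rate and training time are illustrative assumptions (roughly in line with single-GPU on-demand pricing around 2019), not figures quoted by any provider; real costs vary with region, instance type and training setup.

```python
# Back-of-the-envelope estimate of the compute cost of training a
# face-swapping model on a rented single-GPU cloud instance.
# All numbers are illustrative assumptions, not provider quotes.

HOURLY_RATE_USD = 3.06  # assumed on-demand rate for a single-GPU instance
TRAINING_HOURS = 12     # assumed training time for one short clip

estimated_cost = HOURLY_RATE_USD * TRAINING_HOURS
print(f"Estimated compute cost: ${estimated_cost:.2f}")  # -> $36.72
```

Even if the assumed training time doubles or triples, the total stays far below the price of buying a comparable GPU outright, which is exactly what makes renting attractive.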

Together with apps that help users create deepfakes without any machine learning expertise, cloud computing is enabling many individuals to create deepfake pornography who otherwise would not have been able to.

So how would cloud providers respond to new laws banning deepfake pornography? All cloud providers require clients to abide by Acceptable Use Policies (AUPs), which tell clients what their platforms can and cannot be used for. Clients need to agree to the AUPs to be able to use the platform. While the most popular cloud providers have different AUPs, they share a lot of common ground. They disallow illegal activities as well as a variety of malicious but not necessarily illegal uses, like distributing spam. The cloud platforms have mechanisms in place to detect violations of the AUPs and can disable clients’ accounts if the clients do not abide by them.

If deepfake pornography becomes illegal, it will no longer be the cloud providers’ responsibility to decide whether deepfake pornography is allowed on their platforms. Given that the cloud providers already disallow illegal content and activities on their platforms, producing deepfake pornography will no longer be welcome. By rendering the creation of deepfake pornography illegal, changing laws will thus remove the providers’ discretion to decide whether deepfake pornography is harmful enough to prohibit its creation on their platforms. This might incentivise them to develop methods to detect the creation of deepfake pornography so they can enforce their Acceptable Use Policies.
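
As a purely hypothetical illustration of what such detection could look like, the sketch below flags accounts for manual review. Every field, signal and threshold is an assumption invented for this example; none of it corresponds to any real provider’s API or enforcement pipeline.

```python
# Hypothetical sketch of an automated AUP-enforcement check.
# All names and signals are invented for illustration.

from dataclasses import dataclass
from typing import List

@dataclass
class WorkloadReport:
    account_id: str
    gpu_hours: float       # recent GPU usage on the account
    abuse_complaint: bool  # e.g. a takedown request was filed
    content_scan_hit: bool # e.g. a classifier flagged hosted output

def aup_review_queue(reports: List[WorkloadReport]) -> List[str]:
    """Return account IDs that warrant manual review by a trust and safety team."""
    flagged = []
    for report in reports:
        # Heavy GPU usage alone is innocuous (it could be any machine
        # learning workload), so only escalate when it coincides with
        # an independent abuse signal.
        if report.gpu_hours > 10 and (report.abuse_complaint or report.content_scan_hit):
            flagged.append(report.account_id)
    return flagged

reports = [
    WorkloadReport("acct-1", gpu_hours=40.0, abuse_complaint=True, content_scan_hit=False),
    WorkloadReport("acct-2", gpu_hours=55.0, abuse_complaint=False, content_scan_hit=False),
]
print(aup_review_queue(reports))  # -> ['acct-1']
```

The design point is that raw compute usage looks identical for legitimate machine learning work, so any detection method would have to combine usage with independent abuse signals before disabling an account.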

In this, cloud platforms are different to software providers. Cloud providers like Amazon Web Services sell a service, which is why they have Acceptable Use Policies in the first place. This results in legal obligations for cloud providers that do not exist for plain software companies, which sell a software product and an accompanying license. Software companies only ask their users to accept an End User License Agreement, and these usually do not ask users to refrain from illegal or malicious activities while using the software. Adobe cannot disable its software if users photoshop naked celebrity pictures. But Amazon provides an ongoing service by renting out compute, and can terminate that service if it detects a user producing deepfake pornography.

A bill against deepfake pornography would be one of the first to make cloud providers’ AUPs explicitly prohibit a malicious use of artificial intelligence. Other kinds of malicious uses of artificial intelligence have been on the rise, and as artificial intelligence grows more powerful, they will likely become a bigger issue over time. Deepfake pornography gives cloud providers the opportunity to prove that they are prepared for the challenge.
