The newest big phenomenon sweeping the internet right now may well be FakeApp.
For those of you who don’t know what FakeApp is, I don’t blame you — these days it’s hard to keep up with whatever the new big internet thing is. Essentially, FakeApp is a desktop program that uses deep learning to replace the face of someone in a video with someone else’s, often convincingly. It’s a simple idea with a lot of reach; let’s get into that.
For starters, you can use the application to make some pretty innocent, funny stuff. One of the first big videos passed around that used FakeApp featured Nicolas Cage’s face transplanted onto Indiana Jones. Sure, the meme potential for FakeApp is pretty great… but, of course, that’s not its main use in practice. Without a doubt, the most widespread use of FakeApp so far has been in pornography.
Users of FakeApp have primarily been using the software to take the faces of famous celebrities and place them onto porn actresses of a fairly similar body type, producing what are known as “deepfakes.” The results are uncanny. FakeApp’s ability to simulate faces is already quite impressive, and for those with enough GPU power, time to kill, and pictures of their desired celebrity (this last requirement is trivial, considering how easy it is to collect images of a celebrity’s face from interviews and movie scenes), the output can look completely real. Even more unsettling is the fact that it doesn’t take a celebrity to make a good deepfake. With a friend and willing participant, I was able to transplant his face into a video (not pornography, of course) at decent quality using only 200 or so images. This means that, for any individual willing to scroll through someone’s social media for pictures, making a deepfake is relatively easy.
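Under the hood, FakeApp-style face swapping is widely reported to train two autoencoders that share a single encoder: the encoder learns a generic representation of “a face,” while each decoder learns to render one specific person. The swap is then just encoding person A’s face and decoding it with person B’s decoder. The sketch below illustrates only that shared-encoder idea, on toy random vectors rather than images; every dimension, learning rate, and variable name here is invented for illustration and is not FakeApp’s actual code.

```python
import random

random.seed(0)
DIM, LATENT, N = 6, 2, 16  # toy "face" size, bottleneck size, photos per person

def matvec(W, v):
    """Multiply matrix (list of rows) by vector."""
    return [sum(w * x for w, x in zip(row, v)) for row in W]

def rand_mat(rows, cols, scale=0.1):
    return [[random.gauss(0, scale) for _ in range(cols)] for _ in range(rows)]

# One shared encoder, one decoder per identity.
W_enc = rand_mat(LATENT, DIM)
W_dec_a = rand_mat(DIM, LATENT)
W_dec_b = rand_mat(DIM, LATENT)

# Stand-ins for each person's training photos.
faces_a = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(N)]
faces_b = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(N)]

def loss(W_dec, faces):
    """Mean squared reconstruction error through the shared encoder."""
    total = 0.0
    for f in faces:
        out = matvec(W_dec, matvec(W_enc, f))
        total += sum((o - x) ** 2 for o, x in zip(out, f))
    return total / len(faces)

def train_step(W_dec, faces, lr=0.02):
    """One SGD pass: each identity's photos update its decoder AND the shared encoder."""
    global W_enc
    for f in faces:
        z = matvec(W_enc, f)
        err = [o - x for o, x in zip(matvec(W_dec, z), f)]
        back = matvec(list(zip(*W_dec)), err)  # gradient flowing back into the encoder
        for j in range(DIM):
            for k in range(LATENT):
                W_dec[j][k] -= lr * err[j] * z[k]
        for k in range(LATENT):
            for m in range(DIM):
                W_enc[k][m] -= lr * back[k] * f[m]

start = loss(W_dec_a, faces_a) + loss(W_dec_b, faces_b)
for _ in range(200):
    train_step(W_dec_a, faces_a)
    train_step(W_dec_b, faces_b)
end = loss(W_dec_a, faces_a) + loss(W_dec_b, faces_b)

# The "swap": encode one of A's faces, decode it with B's decoder.
swapped = matvec(W_dec_b, matvec(W_enc, faces_a[0]))
```

Real implementations apply this scheme to aligned face crops with convolutional networks, which is why the GPU power and the few hundred photos mentioned above matter so much.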
This, of course, creates some interesting legal headaches, mostly surrounding what should be done about the software. It raises notable parallels to the legal controversy that surrounded lolicon, or animated child pornography, when it was first brought to the US from Japan. Crucially, it does not take a real child to make lolicon — and since the initial child pornography laws in the United States were created primarily to prevent harm to children, states were slow to criminalize loli pornography. However, after a number of notable psychologists claimed that individuals who consumed lolicon were more likely to consume real child pornography, legislation began to be passed outlawing its possession. Even so, lolicon has not been banned in every state, and it has not been criminalized at the federal level.
With deepfakes, the controversy becomes less about inherent immorality and more about the reputations on the line. Ever since the infamous August 2014 celebrity nude leaks, it has been difficult for celebrities to keep their image from becoming sexualized. Now, however, a person does not need to have sexual pictures of themselves already available, and they do not need to be a celebrity. This, I predict, will cause legislation against the app to come much more quickly and spread more widely. That said, the software still has legitimate uses, so an outright ban is unlikely; rather, there will probably have to be some reliable way to differentiate a fake video from a real one.
Another big legal concern surrounding FakeApp ties into a much broader problem: the erosion of video evidence’s objectivity. Previously, the legal system treated video and audio evidence as a direct, obvious link between a suspect and a crime; now, however, one can produce a convincing piece of fake evidence with tools such as this. One thing in all of this is for sure: it will be interesting to see how FakeApp’s popularity pans out.