Proof of Truth in an era of Deep Fake Porn

Jeremy Epstein
Feb 22 · 4 min read

tl;dr: the world has been tormented by “fake news” recently. As AI systems get better, the problem is going to get much worse. Blockchains offer a potential solution.

There is artificial intelligence technology available now which can seamlessly superimpose the face of one person on the body of another.

You can imagine how this can be used for nefarious ends. The thing is, there’s no need to imagine it because it’s already happening. Using some AI tech, pictures, and a home computer, people have created fake porn videos of actresses like Gal Gadot and singers like Taylor Swift. And, needless to say, people are using it as a form of revenge and blackmail.

The technique is called “deepfakes.”

It can also make it look like someone said something they never actually said, as in this doctored video of a speech by President Obama.

https://youtu.be/dkoi7sZvWiU

In my research on AI, I have found hundreds of these examples. With neural networks, it is possible to generate thousands of completely fake faces instantly.

All of these below are TOTALLY FAKE.

“I’ll believe it when I see it” sounds like an idiom about to be relegated to the dustbin of history.

How could blockchain help?

To answer that question, first a story.

I was reading Crypto Theses for 2019 by Arjun Balaji recently. In and of itself, it’s excellent, but one thing in it struck me as particularly innovative.

All the way down at #62, he writes:

62) One place we’ve seen little regulatory action is with “crypto influencers” facing fines or other actions, though regulators have started clamping down on celebrities.

Naming names is rude, but this SHA-256 hash has my list of influencers that are more likely to get rekd, with a reveal coming in 2020:

a6c061624f97399d08fb58dbd23801ab9d03a9329128f5147a9873c9daf906a1

https://medium.com/@arjunblj/crypto-theses-for-2019-dd20cb7f9895

At first, I missed the brilliance of it and then it dawned on me.

A SHA-256 hash is a unique, fixed-length digital fingerprint of an input. If any one thing in the input changes, even a single character, the output is completely different.

So, for example, if I list everything I ate for breakfast today

  • banana chia pudding
  • coffee
  • soy milk
  • goji berries

the SHA-256 hash for that will be TOTALLY different from the hash of a list that says:

  • banana chia pudding
  • coffee
  • soy milk
  • 3 goji berries

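You can check this yourself with Python’s standard hashlib library. A quick sketch (the exact formatting of the lists is my own, and any formatting change, even whitespace, would change the hash too):

```python
import hashlib

def sha256_hex(text: str) -> str:
    """Return the hex-encoded SHA-256 digest of a UTF-8 string."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

breakfast_a = "banana chia pudding\ncoffee\nsoy milk\ngoji berries"
breakfast_b = "banana chia pudding\ncoffee\nsoy milk\n3 goji berries"

# The two digests share no resemblance, even though the lists
# differ by a single word.
print(sha256_hex(breakfast_a))
print(sha256_hex(breakfast_b))
```

Each digest is 64 hex characters no matter how long the input is, which is why you can publish one without giving away anything about the list behind it.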
So, when you make a list of predictions and create a SHA-256 hash of it, you can share the hash without sharing the list.

Later, you can go back and share the original list. From there, anyone can independently verify that it is the same list you made, because only that exact list produces that hash. If you try to change your prediction from “I think Clinton will be President” to “I think Trump will be President,” the hash will be different and everyone will know you aren’t sharing the same information.

In this way, you can make a prediction without revealing the prediction but later prove to someone that you were right (or wrong, I suppose). It’s brilliant.
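The whole commit-and-reveal trick fits in a few lines. This is a sketch, not a production scheme: a real commitment would also mix in a random salt so that short, guessable predictions can’t be brute-forced from the hash.

```python
import hashlib

def commit(prediction: str) -> str:
    """Publish this hash now; it reveals nothing about the text."""
    return hashlib.sha256(prediction.encode("utf-8")).hexdigest()

def verify(revealed_text: str, published_hash: str) -> bool:
    """Later, anyone can check the revealed text against the commitment."""
    return commit(revealed_text) == published_hash

commitment = commit("I think Trump will be President")

# The honest reveal checks out...
assert verify("I think Trump will be President", commitment)
# ...but a swapped-in prediction does not.
assert not verify("I think Clinton will be President", commitment)
```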

Plus, if you timestamp the prediction in a blockchain, you can prove not just the accuracy of the hash, but WHEN you made the prediction. Even more bonus points.

and now we get to it…

Ok, so having that as background, here’s where I think we are going to have to go, out of necessity. I’m not at all sure how this takes shape, and I’m sure there are holes in it, but it will look something like this.

Pictures and videos will have to have SHA-256 hashes (or digests from some other cryptographic hash algorithm) AND get timestamped in a blockchain.

Every phone and video camera will come with this capability built-in natively.

Whenever you take a picture, it will get hashed and recorded. Then, if that picture or video is ever doctored and published, it will have a completely different hash.
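Hashing a photo or video works the same way as hashing text: you feed the raw bytes of the file through SHA-256. A minimal sketch (the file names and stand-in image bytes here are hypothetical):

```python
import hashlib
import os
import tempfile

def sha256_file(path: str, chunk_size: int = 1 << 16) -> str:
    """Stream a file through SHA-256 in chunks so large videos fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

# Demo with stand-in "image" bytes: doctoring a single byte
# produces a completely unrelated fingerprint.
with tempfile.TemporaryDirectory() as d:
    original = os.path.join(d, "photo.jpg")
    doctored = os.path.join(d, "photo_edited.jpg")
    data = b"\xff\xd8" + b"pixel data " * 100 + b"\xff\xd9"
    with open(original, "wb") as f:
        f.write(data)
    with open(doctored, "wb") as f:
        f.write(data[:50] + b"X" + data[51:])  # change one byte
    print(sha256_file(original) != sha256_file(doctored))  # True
```

A camera that hashed every capture at the moment it was taken could record that fingerprint somewhere public, and any edited copy would fail the comparison.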

It may not help with the dissemination or the visual PR damage, but it may help separate “fake news,” “fake porn,” and “fake other stuff” from the real stuff.

We’ll need a way for our screens to verify the image or video we are looking at against some sort of proof provided by its creator.

The more I think about it, the more I realize that I have no idea how it will all work.

All I do know is that AI can create fake images, blockchains can provide proof of authenticity, and if we don’t figure out how to merge those two, the implications for society could be dramatic.


Originally Published on https://www.neverstopmarketing.com


Data Driven Investor

from confusion to clarity, not insanity
