How to earn a million dollars in one simple proof: a brief history of P vs. NP

Did this youngling just clickbait me by putting a meme, a million dollars, and a spin on the work of Stephen Hawking in the title? What arrogance! Well, no, but yes ;)

A couple of weeks ago, Prof. Dr. Norbert Blum from the University of Bonn came out with his stab at getting richer: a claimed proof of the P vs. NP problem.

Let’s introduce you to the question: if we can verify the solution to a problem quickly (NP), can we also find a solution to that problem quickly (P)?

That’s the gist of P vs. NP: one of the Clay Mathematics Institute’s Millennium Prize problems. A group of 7 really, really (did I say really?) hard problems, each worth a million dollars.
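To make “verify quickly vs. find quickly” concrete, here’s a minimal sketch in Python (my own toy example, with names I made up) using subset sum: given a list of numbers, is there a subset adding up to a target? Checking a proposed answer takes linear time, but the only obvious way to *find* one is to try all 2^n subsets.

```python
import itertools

def verify(nums, indices, target):
    """The 'NP' side: checking a proposed certificate is fast (linear time)."""
    return sum(nums[i] for i in indices) == target

def search(nums, target):
    """The hard side: no known polynomial-time method, so brute-force
    all 2^n subsets and return the first one that verifies."""
    for r in range(len(nums) + 1):
        for combo in itertools.combinations(range(len(nums)), r):
            if verify(nums, combo, target):
                return combo
    return None

nums = [3, 34, 4, 12, 5, 2]
print(search(nums, 9))  # a tuple of indices whose values sum to 9
```

If P = NP, then `search` could always be made as fast as `verify`, up to a polynomial factor. That gap between the two functions is the whole question.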

I’m not going to get into the beautiful finer details of the problem here, but I do want people to be able to appreciate the history behind how this problem came to be, how people have tried to solve it, and what it means for Computer Science.

In 1955, John Nash (yep, the same dude played by Russell Crowe in the 2001 Oscar-winning movie — A Beautiful Mind) wrote multiple letters to the National Security Agency. If you do want to read the correspondence in its full glory, I’ll link it here, but here’s the good stuff.

In one of his letters, Nash basically came up with the idea of unbreakable encryption. I’ll quote the best part (in my opinion, anyway) of the document here. You can skip reading it, but you won’t believe what he says in the last line. Also, nerdgasm.

The general conjecture is as follows: for almost all sufficiently complex types of enciphering, especially where the instructions given by different portions of the key interact complexly with each other in the determination of their ultimate effects on the enciphering, the mean key computation length increases exponentially with the length of the key, or in other words, with the information content of the key.
The significance of this general conjecture, assuming its truth, is easy to see. It means that it is quite feasible to design ciphers that are effectively unbreakable. As ciphers become more sophisticated the game of cipher breaking by skilled teams, etc. should become a thing of the past.
The nature of this conjecture is such that I cannot prove it, even for a special type of cipher. Nor do I expect it to be proven.

If you haven’t realized it yet: this dude, the dude whose movie won 4 Oscars, just said that he couldn’t prove his conjecture, and didn’t expect it to be proven.

A year later, Kurt Gödel wrote a letter to John von Neumann. Before my inside nerd even spins out of control, get this: Kurt friggin Gödel wrote a letter to John friggin von friggin Neumann. It’s like Albert Einstein chilling on the beach with Isaac Newton, just discussing their thoughts on friggin life. It’s like Stephen Hawking partying with Nikola Tesla. It’s bloody ridiculous, and I love it. Anyway, this is what Gödel said in his letter, popularly called Gödel’s Lost Letter (also, obligatory unfathomable and brain-hurting math and language warning, maybe not, but I warned you anyway)—

.. One can obviously easily construct a Turing machine, which for every formula F in first order predicate logic and every natural number n, allows one to decide if there is a proof of F of length n (length = number of symbols). Let Ψ(F, n) be the number of steps the machine requires for this and let φ(n) = max_F Ψ(F, n). The question is how fast φ(n) grows for an optimal machine. ..

Okay, this stupidus maximus will try to explain what’s going on here, so correct me if I’m wrong. Our boy Kurt here is basically asking the following: given a formula and a number n, a machine can decide whether that formula has a proof of at most n symbols, but how fast does the time it needs grow as n grows?

So von Neumann never responded to Gödel, or if he did, he did not want us to know. Sadly, von Neumann died a year later.

That’s where our story begins. A story that started 60 years ago. It took a while before P vs. NP was formally defined, until Stephen Cook (basically another Computer Science nerd with way too much time on his hands. Also really smart.) came up with the idea of polynomial-time reduction in 1971 and subsequently defined NP-completeness.
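A polynomial-time reduction is easier to see than to define. Here’s an illustrative sketch (my own toy example, not Cook’s actual construction): a graph has a clique of size k exactly when its complement graph has an independent set of size k, so a CLIQUE question can be translated, in polynomial time, into an INDEPENDENT-SET question.

```python
import itertools

def has_independent_set(adj, k):
    """Brute-force: is there a set of k vertices with no edges between them?"""
    nodes = list(adj)
    return any(
        all(v not in adj[u] for u, v in itertools.combinations(combo, 2))
        for combo in itertools.combinations(nodes, k)
    )

def has_clique(adj, k):
    """Reduce CLIQUE to INDEPENDENT-SET: build the complement graph
    (a cheap, polynomial-time step) and hand it to the other solver."""
    nodes = list(adj)
    complement = {u: {v for v in nodes if v != u and v not in adj[u]}
                  for u in nodes}
    return has_independent_set(complement, k)

triangle = {1: {2, 3}, 2: {1, 3}, 3: {1, 2}}
print(has_clique(triangle, 3))           # True: the triangle is a 3-clique
print(has_independent_set(triangle, 2))  # False: every pair is connected
```

The point of NP-completeness is that some problems are “reduction targets” for everything in NP: crack one of them in polynomial time, and the same trick cracks them all.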

Since then, there have been multiple attempts like Prof. Blum’s to inch towards a solution for the problem (many of them listed here). All of them have failed under scrutiny.

So, what would P being equal to NP mean? Well, basically, it would change everything. It would mean that all methods of modern encryption/cryptography would subsequently become worthless (don’t quote me on that) and that all our lives are meaningless and stuff.

Anyway, most people think P ≠ NP. But then there’s Donald Knuth, who thinks that P is, in fact, equal to NP. If you don’t know who Donald Knuth is, let’s just say he’s this pretty smart dude who wrote this 4-volume behemoth called The Art of Computer Programming. He also has a Turing Award, which is kind of a cool thing to own if you enjoy theoretical CS.

Anyway, what do I think? Truthfully, I don’t know. P being equal to NP would be infinitely cooler than the alternative, so I’ll go with that.