Breaking the Hardest Logic Puzzle Ever for fun and profit

Florian Dietz
7 min read · Aug 2, 2019

This article explores the so-called “Hardest Logic Puzzle Ever”, and shows how thinking outside the box can sometimes take you a lot further than you would expect.

The Hardest Logic Puzzle Ever is a logic puzzle so named by American philosopher and logician George Boolos, who published it in The Harvard Review of Philosophy in 1996.

It is stated as follows:

Three gods A, B, and C are called, in no particular order, True, False, and Random. True always speaks truly, False always speaks falsely, but whether Random speaks truly or falsely is a completely random matter. Your task is to determine the identities of A, B, and C by asking three yes-no questions; each question must be put to exactly one god. The gods understand English, but will answer all questions in their own language, in which the words for yes and no are da and ja, in some order. You do not know which word means which.

Solving the problem normally

This problem is not called “The Hardest Logic Puzzle Ever” for nothing. Solving it in the conventional way is very complicated.

The answer is a complex, multi-step process that Wikipedia summarizes in an enormous table.

Coming up with this solution from scratch takes serious ingenuity.

If you are interested in the full derivation, have a look at the Wikipedia article on the puzzle.
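
As an aside (my own counting argument, not from Boolos’ paper), it is easy to see why three questions are enough in principle: there are only 3! = 6 ways to assign the three identities, while three yes-no questions can distinguish up to 2³ = 8 outcomes. A quick sanity check in Python:

```python
# 3! = 6 ways to assign True, False and Random to gods A, B and C,
# versus 2**3 = 8 possible sequences of three yes-no answers.
from itertools import permutations

assignments = list(permutations(["True", "False", "Random"]))
print(len(assignments), "assignments vs", 2**3, "answer sequences")
# -> 6 assignments vs 8 answer sequences
```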

But that’s not what this article is about.

This article is about thinking outside the box, and breaking the problem so hard that the gods themselves will quite literally bow to our will (insert evil laughter here).

How to break the problem

Part 1: Deicide

Let’s look at the problem formulation again.

Notice that the problem talks about gods, and defines them to be logically infallible.

This never happens in real life, and it is a really big deal. Real people are logically fallible, and all our intuitions about how people act are built on that assumption.

Real people sometimes have trouble following logical arguments. Beings of pure logic do not.

Real people find paradoxes amusing. Beings of pure logic are unable to comprehend paradoxes, and must avoid them at all costs. For an entirely logical being, there can be no uncertainty, doubt, or ambiguity.

Being logically infallible significantly limits your options when you are faced with the possibility of encountering a paradox.

Brian Rabern and Landon Rabern published a paper based on this observation, “A simple solution to the hardest logic puzzle ever” (Analysis, 2008):

https://www.research.ed.ac.uk/portal/files/15023904/Simple_Solution.pdf

In this paper they provide a solution to the Hardest Logic Puzzle Ever that needs only two questions. It is based on the following assumption:

A god, confronted with a paradox, will say neither ja nor da and instead not answer at all. For example, if the question “Are you going to answer this question with the word that means no in your language?” is put to True, he cannot answer truthfully. (The paper represents this as his head exploding, “…they are infallible gods! They have but one recourse — their heads explode.”)
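
To see why True is stuck, we can simply enumerate the cases. The sketch below (my own illustration, not code from the paper) checks both possible meanings of ‘da’ and ‘ja’ and both possible answers, and finds that no answer is ever truthful:

```python
def truthful_answer_exists(yes_word: str, no_word: str) -> bool:
    """Can True answer "Are you going to answer this question with the
    word that means no in your language?" without contradiction?"""
    for answer in (yes_word, no_word):
        said_the_no_word = (answer == no_word)  # what actually happened
        # A truthful reply must assert exactly what happened:
        required = yes_word if said_the_no_word else no_word
        if answer == required:
            return True
    return False

# Check both possible meanings of 'da' and 'ja'; neither works.
for yes_word, no_word in (("da", "ja"), ("ja", "da")):
    print(f"'{yes_word}' means yes:", truthful_answer_exists(yes_word, no_word))
# Both lines print False -- there is no consistent answer.
```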

Brian and Landon Rabern managed to push the problem down from three questions to two, and all it took was the threat of deicide-by-logic-bomb. The trick works because the exploding head acts as a third possible response: with three outcomes per question, two questions can distinguish up to 3² = 9 cases, more than the six possible assignments of identities.

How to break the problem

Part 2: Mind-control

It is possible to build on this approach and refine it even further.

We can construct a solution that not only solves the problem in a single question, but also mind-controls a god as a side effect. This gives you the ability to force the gods to grant your wishes.

This is based on the following assumption:

The gods are quite capable of responding to a question with actions besides saying ‘da’ and ‘ja’, but simply have no reason to do so. As stated in the problem description, the beings in question are gods, and they have a language of their own.

They could hardly be called gods, nor have need for a spoken language, if they weren’t capable of affecting reality.

At a bare minimum, they should be capable of pronouncing the words ‘da’ and ‘ja’ in multiple different ways, or of delaying their answer by a chosen amount of time after the question is asked.

Either possibility would extend the information content of an answer from a single bit to arbitrarily many bits, depending on how well you can differentiate intonations of ‘da’ and ‘ja’, and how long you are willing to wait for an answer.

(For non-technical readers: if you treat the time the god takes to answer as part of the answer, you get more than two possible answers. Saying ‘da’ after one second can mean something different from saying ‘da’ after two seconds.)
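
As a rough illustration of the timing channel, here is a toy sketch (names and protocol of my own invention, assuming both sides agree that each second of delay encodes one “slot”):

```python
import time

def answer_with_delay(slot: int, word: str, seconds_per_slot: float = 1.0) -> str:
    """The god encodes an integer by waiting `slot` slots before speaking."""
    time.sleep(slot * seconds_per_slot)
    return word

def decode_delay(elapsed_seconds: float, seconds_per_slot: float = 1.0) -> int:
    """The asker recovers the integer from the measured delay."""
    return round(elapsed_seconds / seconds_per_slot)

# A delay of ~3 seconds decodes to message 3, whatever word is spoken.
print(decode_delay(3.02))  # -> 3
```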

Why is all of this necessary?

Because we can construct a question that results in a paradox unless the god performs a certain action: the question is unanswerable if the god refuses to do what we want, but perfectly answerable if he complies.

In this way, we can effectively enslave the god and cause it to perform arbitrary actions on our behalf, as performing those actions is the only way to answer the question.

The actual answer to the question becomes effectively irrelevant. The act of asking the question itself will force the god to perform an action we want.

So what is the question we ask the gods?

We approach any of the three gods and ask the question OBEY, which is defined as follows (writing it out as a single English question is not hard, but the result sounds very confusing; a sketch of its logic follows the definitions):

  • OBEY = if WISH_WRAPPER then True else PARADOX
  • PARADOX = “if I asked you PARADOX, would you respond with the word that means no in your language?”
  • WISH_WRAPPER = “after hearing and understanding OBEY, you act in such a way that your actions maximally satisfy the intended meaning behind WISH. Where physical, mental or other kinds of constraints prevent you from doing so, you strive to do so to the best of your abilities instead.”
  • WISH = “you determine what humanity would want if we were more like the people we want to be, and act to make that goal reality.”
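
To make the structure of OBEY concrete, here is a minimal sketch of its control flow (my own model with hypothetical names; the puzzle’s sources contain no code):

```python
class HeadExplosion(Exception):
    """The fate of an infallible god faced with an unanswerable question."""

def evaluate_obey(god_obeys_wish: bool) -> str:
    """Models OBEY = if WISH_WRAPPER then True else PARADOX."""
    if god_obeys_wish:
        # WISH_WRAPPER is satisfied: what remains is trivially true,
        # so the god can simply answer with the word for 'yes'.
        return "the word for 'yes'"
    # Otherwise OBEY reduces to PARADOX, which has no consistent answer.
    raise HeadExplosion("PARADOX: no truthful answer exists")

print(evaluate_obey(True))   # -> the word for 'yes'
# evaluate_obey(False) would raise HeadExplosion, so a god that wants
# to keep its head has exactly one option: grant the wish.
```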

You can replace WISH with any other wish you would like to see granted.

However, you should be very careful while doing so, as beings of pure logic are likely to interpret vague wording differently from how a human would.

In particular, you should avoid accidentally making WISH impossible to fulfill, as that would cause the god’s head to explode, ruining your wish.

(Yes, we have just invented “accidental deicide”.)

The above formulation of WISH tries to take some of these concerns into account. If you encounter this thought experiment in real life, you are advised to consult a lawyer, a friendly-AI researcher, and possibly a priest, before stating the question.

Since you can ask three questions, you can enslave all three gods.

Boolos’ formulation says of the random god that “if the coin comes down heads, he speaks truly; if tails, falsely”. This implies that the god does determine the truth before deciding how to answer, which means the wish-granting question works on the random god as well.

If the capabilities of the gods are uncertain, it may help to establish clearer goals as well as fall-back goals.

For instance, to handle the case where the gods really are limited to speaking only ‘da’ and ‘ja’, it may help to append the following to WISH:

If you are unable to perform actions in response to OBEY besides answering ‘da’ or ‘ja’, you wait for the time period outlined in TIME before making your answer.

You can now encode arbitrary additional information in TIME, with the caveat that you will have to actually wait before getting a response.

Your ability to accurately measure the elapsed time between question and answer directly correlates with how much information you can put into TIME without risking starvation before the question is answered.
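
To put rough numbers on this (my own back-of-the-envelope figures): if you can resolve delays down to r seconds and are willing to wait up to T seconds, you can distinguish about T/r different delays, i.e. log₂(T/r) bits:

```python
from math import log2

def timing_capacity_bits(max_wait_seconds: float, resolution_seconds: float) -> float:
    """Bits encodable in a delay measurable to within `resolution_seconds`."""
    return log2(max_wait_seconds / resolution_seconds)

# One stopwatch-hour at one-second resolution:
print(f"{timing_capacity_bits(3600, 1):.1f} bits")  # -> ~11.8 bits
```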

The following simple example of TIME would let you solve the original problem formulation by asking OBEY just once of any of the gods:

TIME = “If god A speaks the truth, B lies, and C is random, you wait for 1 minute before answering. If god A speaks the truth, C lies, and B is random, you wait for 2 minutes before answering. If god B speaks the truth, A lies, and C is random, you wait for 3 minutes before answering. If god B speaks the truth, C lies, and A is random, you wait for 4 minutes before answering. If god C speaks the truth, A lies, and B is random, you wait for 5 minutes before answering. If god C speaks the truth, B lies, and A is random, you wait for 6 minutes before answering.”
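
Decoding this schedule is mechanical. Here is a small lookup sketch (a Python illustration of my own; the table is transcribed from the TIME schedule above):

```python
# Delay in minutes -> (truth-teller, liar, random god),
# transcribed from the TIME schedule above.
TIME_TABLE = {
    1: ("A", "B", "C"),
    2: ("A", "C", "B"),
    3: ("B", "A", "C"),
    4: ("B", "C", "A"),
    5: ("C", "A", "B"),
    6: ("C", "B", "A"),
}

def identify_gods(minutes_waited: int) -> dict:
    """Recover the identities of A, B, and C from the answer delay."""
    truth_teller, liar, random_god = TIME_TABLE[minutes_waited]
    return {"True": truth_teller, "False": liar, "Random": random_god}

print(identify_gods(4))  # -> {'True': 'B', 'False': 'C', 'Random': 'A'}
```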

Conclusion

So there you go.

We went from “This is a really hard logic puzzle” to “I can enslave a god and achieve omnipotence by asking a question”.

What did we learn from this?

Being logically infallible is not as nice as it sounds.

