# Monty Hall Problem

### The Situation

- You’re on a game show.
- There are three doors.
- Behind one of them is a car.
- Behind the other two are donkeys.
- You get to open one door, in hopes of getting the car!

But then, this happens: the host opens one of the two doors you *didn't* pick, reveals a donkey, and offers you the chance to switch to the other unopened door.

What do you do?

### First Approach

Hm. Well, idk what is going on here, but I see two doors in front of me. I have to choose one. One of them has the car, one of them doesn’t. So I have a 50–50 shot of choosing the right door. It doesn’t matter if I switch or not.

I may be the first person to tell you this… but there’s nothing *wrong* with that approach!

Yes, you heard me correctly. The Bayes Police have arrested me, but I stand my ground. They proclaim that there is a 2/3 chance of getting the car if you switch doors, and a 1/3 chance of getting the car if you don’t. But again, I stand my ground — from your perspective, you’re completely justified in your conclusion of there being a 50–50 chance of picking the right door.

I’ll explain why I deviate from the Bayes Police later. First, let me explain where they’re coming from.

### Sherlock

I want you to put on your Sherlock Holmes hat.

I want you to put yourself inside the mind of the game show host — Monty. Monty is a devious fellow. He’s playing mind games with you. See if you could outwit him.

He has to open a door after you choose your door. If you were him, what door would you open?

Well, you’re not allowed to open door A, because that’s what the contestant chose. The idea is that you open one of the doors that the contestant *didn’t* choose. Suppose first that the car is behind door A, the door the contestant picked.

If you open B, you reveal a donkey.

If you open C, you also reveal a donkey.

Either way, it places the contestant in a world where his initial choice was correct.

Now let’s see what happens if the contestant had chosen the wrong door. What if the car is behind door B?

As the host, what do you do?

- You’re not allowed to open door A, because the contestant chose door A.
- You don’t want to open door B, because the car is behind it! If you do, you’d be showing the contestant where the car is, and he’d obviously choose door B.

That leaves one option — open door C.

That places the contestant in a world where his initial choice was incorrect.

Let’s now consider the final possibility — the car is behind door C.

As the host, what do you do?

- You’re not allowed to open door A, because the contestant chose door A.
- You don’t want to open door C, because the car is behind it! If you do, you’d be showing the contestant where the car is, and he’d obviously choose door C.

That leaves one option — open door B.

That places the contestant in a world where his initial choice was incorrect.

### Fuck

Fuck. As the host, 1/3 of the time I’m putting the contestant in the world where switching loses…

… but **2/3** of the time I’m putting the contestant in the world where switching **wins**.

Is there anything I could do about this?

No.

- When the car is behind door A, I *have* to put him in the world I did.
- When the car is behind door B, I *have* to put him in the world I did.
- When the car is behind door C, I *have* to put him in the world I did.
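This lack of freedom is easy to see with a quick enumeration. Assuming the contestant picked door A (the door labels and the little loop here are just my own sketch):

```python
# For each possible car location, list the doors Monty may open.
# Monty can never open the contestant's door or the car's door.
doors = ["A", "B", "C"]
contestant_pick = "A"

for car in doors:
    options = [d for d in doors if d != contestant_pick and d != car]
    print(f"Car behind {car}: host may open {options}")

# Car behind A: host may open ['B', 'C']  <- free choice; switching loses
# Car behind B: host may open ['C']       <- forced; switching wins
# Car behind C: host may open ['B']       <- forced; switching wins
```

Two of the three equally likely car positions leave Monty with exactly one legal door to open.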

What if the contestant knows this???

He’ll know that 2/3 of the time I *have* to put him in the world where switching wins, and 1/3 of the time I *have* to put him in the world where switching loses.

Think about it like this: out of the three equally likely places the car could be, two of them force my hand, and both of those leave the car behind the door the contestant can switch to.

### Lightbulb

We still have our Sherlock hats on. We’ve deduced that 2/3 of the time the host *has* to put us in a world where switching wins, and 1/3 of the time he *has* to put us in a world where switching loses.

So then…

- If we switch, we’ll win 2/3 of the time.
- If we stay, we’ll win 1/3 of the time.

QED!

If you want to run some simulations to see this, check out this link.
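In case that link ever dies, here’s a minimal simulation of my own (the `play` function and the trial count are just illustrative):

```python
import random

def play(switch: bool) -> bool:
    """Play one round of Monty Hall; return True if the contestant wins the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # Monty opens a door that is neither the contestant's pick nor the car.
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Switch to the one remaining unopened door.
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

trials = 100_000
switch_wins = sum(play(switch=True) for _ in range(trials)) / trials
stay_wins = sum(play(switch=False) for _ in range(trials)) / trials
print(f"switch: {switch_wins:.3f}")  # ≈ 0.667
print(f"stay:   {stay_wins:.3f}")    # ≈ 0.333
```

Run it a few times: switching hovers around 2/3, staying around 1/3.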

### Your First Approach Wasn’t Wrong

Now let me explain why I claim that your first approach wasn’t wrong.

- Given the info you had, it was logical.
- Bayesian thinking is about taking an initial belief, gathering evidence, and updating your initial belief based on this evidence.
- In the Sherlock case, you’ve gathered the Sherlock Evidence, and updated your belief from 1/2 to 2/3.
- But without the Sherlock Evidence, you have no reason to update from 1/2 to 2/3!
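For the record, the Sherlock update checks out under Bayes’ rule. Suppose you picked door A and the host opened door C (a sketch with exact fractions; the assumption that the host picks at random between B and C when he has a choice is mine):

```python
from fractions import Fraction

# You picked door A; Monty then opened door C (showing a donkey).
# Prior: the car is equally likely behind any door.
prior = {door: Fraction(1, 3) for door in "ABC"}

# Likelihood that Monty opens C, given each car location:
likelihood = {
    "A": Fraction(1, 2),  # Monty could open B or C; assume he picks at random
    "B": Fraction(1),     # Monty is forced to open C
    "C": Fraction(0),     # Monty never reveals the car
}

evidence = sum(prior[d] * likelihood[d] for d in "ABC")
posterior = {d: prior[d] * likelihood[d] / evidence for d in "ABC"}
print(posterior)  # {'A': Fraction(1, 3), 'B': Fraction(2, 3), 'C': Fraction(0, 1)}
```

Switching (to door B here) wins with probability 2/3 — but only once you’ve gathered the Sherlock Evidence that makes this update possible.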

I don’t know why the Bayes Police were after me. *Sherlock Thinking is HARD!*

It doesn’t make sense to say that the probability *is* 2/3. As Eliezer Yudkowsky eloquently explains — *probability is in the mind*.

- In Sherlock’s mind, the probability is indeed 2/3.
- But in a non-Sherlock mind… *the probability is 1/2!*

Let’s consider what the probability is in some other minds:

- In the mind of someone with x-ray vision, the probability is 1.
- In the mind of someone who’s bribed the host, the probability is about .9 (there’s always a chance that the host screws you over).

Maybe a different example will be helpful. Consider a coin flip:

- In the mind of a normal person, the odds are 50–50.
- In the mind of a person with a superheroic ability to watch the coin flipper, see how much force is applied to the coin at what angle, and quickly perform a calculation — perhaps the odds are 60–40.

### Parting Thoughts

If you hadn’t understood Sherlock Thinking before, hopefully you do now.

If you are part of the Bayes Police, hopefully you’ll stop arresting people!