Intuitively illogical

How mental constructs shape subjective reasoning

David Rosson
Linguistic Curiosities
Nov 22, 2021


There’s a riddle (#16) from Michael Shackleford’s collection of maths problems:

“A box contains two coins.
One is heads on both sides, and the other is heads on one side and tails on the other.
One coin is selected from the box at random, and the face of one side is observed.
If the face is heads, what is the probability that the other side is heads?”

Though listed as an entry-level problem, it received “more emails… than any other”. What’s fascinating is not the problem itself or its solution, but the phenomenon that a large number of well-educated people, many of whom were trained in science, not only arrived at incorrect answers initially, but often remained adamantly convinced of their own alternatives, sometimes even producing elaborate “proofs”.

The same phenomenon occurred with the original Monty Hall problem, where some reactions ran along the lines of: “I have a PhD/teaching post in mathematics, and here’s the proof that my alternative answer is right.”

Intuition, narratives, and logical reasoning

I can only speculate about why this question is so easy to get wrong initially (that is, intuitively). My hypothesis is that when the setup of the experiment is phrased as “a coin is selected at random from the box”, it misleads the mind into constructing a model about drawing coins (rather than drawing entangled sides). Since the box holds two coins, one of each type (one normal, one double-headed), it feels intuitive that the chance of picking either is 50/50.

This construct of drawing coins is so distracting that it prevents people from taking into account that the experiment is really about “observing one side of a coin”, and that the observation itself carries extra information.
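The “drawing sides” model can be checked directly. Below is a quick Monte Carlo sketch (names and structure are my own, not from the riddle's source): we pick a coin, then pick a side to observe, and condition on seeing heads.

```python
import random

# Each coin is modelled as a pair of faces; we draw a coin AND a side at random.
coins = [("H", "H"), ("H", "T")]  # double-headed coin, normal coin

rng = random.Random(42)
trials = 100_000
observed_heads = 0
other_side_heads = 0

for _ in range(trials):
    coin = rng.choice(coins)
    shown = rng.randrange(2)          # which side happens to face us
    if coin[shown] == "H":
        observed_heads += 1
        if coin[1 - shown] == "H":
            other_side_heads += 1

# Three of the four sides are heads, and two of those belong to the
# double-headed coin, so the conditional probability is 2/3, not 1/2.
print(other_side_heads / observed_heads)  # ≈ 0.667
```

The simulation agrees with the counting argument: conditioning on the observed side, not on the drawn coin, is what the intuitive model misses.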

Monty Hall Problem. From Wikipedia.

In the Monty Hall problem, with two goats and one car, the host always has the option of revealing a goat, regardless of what you initially picked, because the host has insider information about where at least one goat you have not picked is.
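The host's insider information is what makes switching pay off, and it shows up cleanly in a simulation. This is a minimal sketch (the door-numbering and variable names are my own):

```python
import random

rng = random.Random(0)
trials = 100_000
stick_wins = 0
switch_wins = 0

for _ in range(trials):
    car = rng.randrange(3)
    pick = rng.randrange(3)
    # The host opens a door that is neither your pick nor the car.
    # (When you picked the car, which goat door he opens doesn't matter.)
    opened = next(d for d in range(3) if d != pick and d != car)
    switched = next(d for d in range(3) if d != pick and d != opened)
    stick_wins += (pick == car)
    switch_wins += (switched == car)

print(stick_wins / trials, switch_wins / trials)  # ≈ 0.333 vs ≈ 0.667
```

Sticking wins only when your first pick was the car (1/3 of the time); switching wins in every other case.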

Imagine the Two Coins problem, only with a different narrative:

There are two pairs of twins, one pair is boy-boy, the other is boy-girl. One child at random stands at the door.

You open the door and see a girl. What is the probability she has a twin brother? Intuitively, 100%, since all three of the other children are boys. Now suppose you see a boy instead: we know he has a twin, and that the twin is one of the other three children. What is the probability that the twin is a boy?
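This rephrasing has the same structure as the coins, and the same simulation answers it. A sketch under the stated setup (one child of the four, chosen uniformly, stands at the door):

```python
import random

# One boy-boy pair and one boy-girl pair; a random twin stands at the door.
pairs = [("B", "B"), ("B", "G")]

rng = random.Random(1)
saw_boy = 0
twin_is_boy = 0

for _ in range(100_000):
    pair = rng.choice(pairs)
    i = rng.randrange(2)              # which twin is at the door
    if pair[i] == "B":
        saw_boy += 1
        twin_is_boy += (pair[1 - i] == "B")

# Three of the four children are boys, and two of them have a brother
# as their twin, so the answer is 2/3, mirroring the two-coins riddle.
print(twin_is_boy / saw_boy)  # ≈ 0.667
```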

Odds and categories

When we ask “What is the probability?”, it follows that we should specify: “the probability of what event, exactly?” Our intuition often misses information embedded in that specificity, because the mind works with mental categories.

Let’s explore what that means. Consider an example:

You see this week’s lottery numbers.

If you buy a ticket with exactly the same set of numbers for next week, are you less likely to win?

Our intuition says it’s nearly impossible for an astronomically rare event to strike twice — whereas logic says each unique combination of numbers has exactly the same chance.
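The logic side of this is a one-line calculation. Assuming a hypothetical pick-6-of-49 lottery (the article doesn't name a format, so this is illustrative):

```python
from math import comb

# Every fixed set of six numbers, including last week's winning set,
# is exactly one combination out of C(49, 6) equally likely ones.
total_combinations = comb(49, 6)
p_any_ticket = 1 / total_combinations

print(total_combinations)  # 13983816: ~1 in 14 million, whichever numbers you pick
```

Last week's draw doesn't remove that combination from the urn; the draw is memoryless, so the repeated set is no rarer than any other.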

Normal dice table, from Wikipedia.

When we roll two six-sided dice, our intuition says that it’s rare to throw a 12 or a 2. This is largely correct, but not because it’s harder to throw a 6 with one die or the other, or indeed with both in a row (as intuition would have us think).

It is equally rare to throw two 6s as to throw a 3 with the first die and a 4 with the second. It’s only when we discard the specificity of the sequence, and account for the sum alone, that the two dice more often add up to sums in the middle.
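Enumerating all 36 ordered outcomes makes the point concrete: each ordered pair is one outcome in 36, but the sums bunch up in the middle because more pairs map to them. A small sketch:

```python
from collections import Counter

# All 36 ordered outcomes of two dice, grouped by their sum.
sums = Counter(a + b for a in range(1, 7) for b in range(1, 7))

for total in range(2, 13):
    print(total, f"{sums[total]}/36")

# (6, 6) is a single outcome, and so is (3, 4) — each 1/36.
# The sum 7, by contrast, can be made six different ways.
```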

We could even say that a coin landing on its edge is no rarer than any one unique configuration of infinitely minute details that we lump into the categories of heads or tails.

Perspective and perception of probability

Consider another example:

There’s a room of random people. What is the probability that two of them share the same birthday?

When you meet someone in real life who shares your birthday, you would say: “Wow, what a coincidence!” Even if you miss by a day or two, you would still think “that’s pretty rare to be so close”.

But that feeling of rarity comes from imagining an encounter with someone who has exactly *your* birthday, or, vicariously from the perspective of a person who experiences such an encounter. The perception is skewed, since it’s no longer just two random people in a room having the same birthday.

When we shift our perspective back to the general room: “If there are 23 people in a room, there is a 50% chance that two people in that room will have the same birthday. With 57 people, the probability climbs to 99%.”
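Those room-level figures come from counting the complement: the chance that no two people share a birthday. A sketch of the standard calculation (365 equally likely days, leap years ignored):

```python
from math import prod

def birthday_collision(n: int) -> float:
    """Probability that at least two of n people share a birthday."""
    # Probability all n birthdays are distinct: 365/365 * 364/365 * ...
    p_all_distinct = prod((365 - k) / 365 for k in range(n))
    return 1 - p_all_distinct

print(birthday_collision(23))  # ≈ 0.507
print(birthday_collision(57))  # ≈ 0.990
```

The count of *pairs* grows quadratically with the room size, which is why the threshold is so much lower than intuition (anchored on "someone matching *me*") expects.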

From quantum to quasar… to quackery

Narratives, perspectives, and the mental constructs with which we internally model a problem all influence the perception of what is being asked and what the answer is. But I wanted to talk about something tangential.

How is the mind so bad at logic?

Or maybe a broader question:

Why do educated people still fall for egregious fallacies, ranging from homeopathy to command economics to religious metaphysics?

** Speaking of homeopathy… surprisingly, or not so surprisingly, it’s a formidable industry in Germany. Sugar pills with no active ingredients are sold in fully licensed pharmacies, in packaging that resembles prescription medicine, and statutory health insurance covers them as a specialist “therapy of your choice”.

Part-time rational

When we act as rational beings, how does that work “mechanically”? It almost seems like we carry out a form of memory-based computation, resuming from whatever state was left over.

If yesterday the mind believed that the Earth was flat, it wakes up the following day and resumes from that point. If, on that day, through a sequence of presentations and lines of reasoning, you become convinced that the Earth is actually not flat, then the next day you start from that new point, which you have accepted.

Furthermore, it seems that these mental states can be compartmentalised in different “areas” of reasoning. Someone could study biology and read a textbook, for example, a chapter on cells, then reason along, understanding one concept or mechanism after another, and nod along: oh, this makes sense, that also makes sense, this is how metabolism works, and so on. The facts are true within the boundaries of that particular rational exercise.

Then the same person (or even a biologist, or logician) could switch contexts on the weekend, and mentally entertain the notion that there is nothing irreconcilable about virgin birth, walking on water, transubstantiation, telepathically conveying thoughts, and eternal consciousness.

Some flat-Earthers have gone out and bought a ring laser gyroscope (a rather expensive precision instrument), having understood and acknowledged the physics of rotation, figured out the engineering setup for gathering data, got the results, stared at the results (which contradicted their beliefs), and nonetheless decided: “Nah, it’s still flat!”

The precarious state of reason

Skills, sophistication, and the capacity to understand science “part-time” appear able to coexist perfectly with being plain bonkers, in mutually isolated “chambers of facts”, the way a trained mechanical engineer could one day pick up dowsing.

Education probably helps; it is at least correlated with reason. Biologists are much less likely than the general population to hold religious beliefs. Yet education clearly does not immunise us against unreason. The chambers of facts rarely get pressure-equalised, and the human mind is not even naturally inclined to think in the way of the scientific method.

It’s easy to think of enlightenment as a one-off event: once an individual acquires the ability to reason, that person is from then on coherent in dealing with all aspects of life rationally, from personal introspection, to professional knowledge, to political and economic persuasions, to beliefs about metaphysics, with no more self-delusion or self-hypnosis in any of them. That idea itself is quite possibly magical thinking.
