
Pure rationality is a myth we should not aspire to

Dionne Lew
Jan 24, 2016 · 9 min read

(Note: pure rationality is a myth, not rationality. Expounding on why this is the case does not diminish the importance of probability, statistics or any other practice that helps us think better.)

‘Be rational,’ people say, as if –

  1. It’s possible and preferable
  2. The counterpart, irrationality, is avoidable and inferior.

In reality –

  1. It’s not possible to be fully rational
  2. We are all irrational to some degree and knowing that won’t prevent it
  3. Rationality is neither good nor bad, nor is irrationality.

This does not absolve us from responsibility for what we do or diminish the importance of methodically assessing quality data. Instead it challenges a tightly held belief that being purely rational is achievable and worth chasing.

Our attachment to the idea of the rational individual is so deeply ingrained that we do not stop to reflect that it is just that, an idea, rooted in ideology from the 1700s. This rationality assumes we are individuals who know what we think and want and act to get it. However, the reality is more complex.

1. It’s not possible to be fully rational

It’s not possible to be fully rational for many reasons including that we can’t know our minds -

  1. We are influenced by cognitive biases
  2. Thinking happens to us, as well as by us
  3. Who we are is also an extension of environment
  4. Even the most basic sensory information exists in context.

1. We are influenced by cognitive biases

We are influenced by cognitive biases of which we are largely unaware. Consequently, where rationality is possible, it is bounded.

The Müller-Lyer illusion below shows that even once we know the horizontal lines are of equal length, we still see one as shorter and the other as longer.

This is caused by amplitude or end-point bias. Although there are varied explanations for why it occurs, it’s widely considered that in a 3-D ‘carpentered world’ we use angles to judge depth and distance, and so the brain overrides the information that both lines are equal in favour of seeing them as near and far corners. Familiarity: we see what we expect to see.

Another example is the Gambler’s Fallacy: believing a previous event influences a future, independent outcome. If you flip a fair coin and get heads ten times in a row, you do not have a better chance of getting tails next time. The outcomes are statistically independent; each flip gives you a 50% chance of heads or tails. However, “after observing a long run of red on the roulette wheel, for example, most people erroneously believe that black will result in a more representative sequence than the occurrence of an additional red.”

Daniel Kahneman, the Nobel laureate famous with his late collaborator Amos Tversky for the study of bias and heuristics, suggested this occurs because people evaluate the probability of an event by assessing how similar it is to events they have experienced before.
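The independence claim above is easy to check empirically. Here is a minimal sketch in Python (my own illustration, not from the article): simulate many fair coin flips and measure how often tails follows a long run of heads. The function name and parameters are my own invention.

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

def tails_rate_after_streak(num_flips=1_000_000, streak_len=5):
    """Flip a fair coin num_flips times and measure how often
    tails immediately follows a run of streak_len consecutive heads."""
    flips = [random.choice("HT") for _ in range(num_flips)]
    after_streak = 0   # how many times we saw a qualifying streak
    tails_after = 0    # how many of those were followed by tails
    run = 0            # current run of consecutive heads
    for i, f in enumerate(flips[:-1]):
        run = run + 1 if f == "H" else 0
        if run >= streak_len:
            after_streak += 1
            if flips[i + 1] == "T":
                tails_after += 1
    return tails_after / after_streak

# The rate hovers around 0.5: the streak of heads gives tails no edge.
print(tails_rate_after_streak())
```

Whatever the streak length, the observed frequency stays near 50%, which is exactly what statistical independence predicts and what the Gambler’s Fallacy denies.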

These are just two of hundreds of mental shortcuts the brain uses to speed up decision-making. We don’t know for certain why biases develop, but the brain has limited processing capacity, and when it comes to survival, timeliness can matter more than accuracy (mistaking a stick for a snake is cheaper than the reverse). Shortcuts that sometimes lead to errors can therefore still be useful. They can also have negative impacts.

For example, the availability heuristic leads us to judge events that are easier to recall as more likely to occur, which can lead us to worry about the wrong risks. If we hear a lot about an issue in the media, we remember it and assume the risk associated with it is higher.

In some cases, knowing about a bias helps overcome it. For example, we can train people to look for in-group bias (liking people who are like us) when recruiting. But being aware of a bias does not automatically enable us to override it. In the above example, educating people about randomness does not reduce the fallacy.

2. Thinking is something that happens to us, as well as by us

Do we think, or are we ‘thunk’? For a long time we’ve assumed that we consciously control what we think, but this assumption has been empirically challenged.

Professor Thomas Metzinger argues that rather than occurring at the personal level ascribed to us as individuals, thinking emerges from subpersonal processes, including bodily ones like breathing and peristalsis. This will probably have heads nodding among advocates of embodied consciousness.

In other words, thinking is largely something that happens to us, rather than by will.

According to Metzinger, research into mind wandering shows we spend two thirds of our waking time occupied with task-unrelated thought, zoning out and losing inner autonomy hundreds of times a day. This is in addition to the uncontrolled workings of the mind during sleep.

The ability to think and act autonomously is at the heart of rationality, yet mind wandering suggests that much (not all) of what we think is involuntary.

3. Who we are is also an extension of environment

The environment strongly influences what we do, deeply challenging the notion of a rational individual who thinks and acts independently.

I have written about this before but by way of example -

  1. The presence of a briefcase rather than a backpack in a room alters how we handle money (priming) — we negotiate more aggressively when there are business-related cues around
  2. People rate strangers as less social and more selfish when they form their first impressions holding a cold drink, and as more likable when holding a cup of warm coffee
  3. Ambient temperature affects mood; rainy weather makes us introspective and improves memory
  4. People are more honest when exposed to an icon from their religion even if they aren’t aware that they have seen it
  5. Judges give more lenient decisions at the start of the day and immediately after a break like lunch.

This is notwithstanding many other influences like genetics, conditioning, personality, culture or even the fight we had with our spouse that morning.

This should not come as a surprise as we’ve known about situational impacts for years.

For example, Zimbardo’s 1971 Stanford prison experiment assigned participants roles as prisoners or guards in a mock prison. During the experiment the guards subjected prisoners to abuse and the prisoners became increasingly compliant. The experiment is widely (but not universally) considered to demonstrate that how people behave depends on the situation in which they find themselves rather than on personality.

Many experiments have shown people will override their perceptions or deliberately draw wrong conclusions in order to obey, belong or feel part of a group.

In the notorious Milgram experiment, ordinary people were ordered to give what they believed were dangerous electric shocks to others, and many followed those orders even when they did not want to.

Asch’s conformity experiments also show how easily people slip into groupthink. Confederates in the experiment agreed to call obviously shorter lines longer, and were often able to convince a genuine participant to go against his own accurate judgment.

While these are important insights, it’s vital to remember that in all of these experiments some participants refused to do what they were told, and that acquiescence was neither immediate nor certain. Zimbardo’s and Milgram’s methodologies have come under fire, and it is rarely reported that Asch also showed how easy it is to snap people out of groupthink (ironically, given the subject of this piece) simply by having someone behave sensibly. People are not sheep; they may follow in certain contexts, but not blindly, although they are heavily influenced by how much trust they place in a person or an authority.

Nevertheless, what we want and how we get it comes not just from personal reflection but also from our interactions with others. Moreover, what we want most, according to Professor Alex Pentland, is based on what our peers (rather than we ourselves) agree is valuable.

Pentland says human behavior is determined as much by social context as by rational thinking or individual desire.

He quotes an example from the Great Recession of 2008 when many houses were suddenly worth less than their mortgages. Researchers found that it only took a few people walking away from their mortgages, something previously thought of as criminal, to convince many of their neighbors to do the same.

His research shows that social network effects often dominate the desires and decisions of individuals. We are collectively rational, and to a lesser extent, individually rational.

4. Even the most basic sensory information exists in context

In a brilliant talk on optical illusions that will challenge anyone’s notion of ‘reality’ Beau Lotto shows that we do not see what is in front of us. By changing the context in which coloured dots are placed, he changes perception. This demonstrates that sensory information has no inherent meaning but rather that even at the most fundamental level, context is everything.

He also shows how quickly our brains learn and redefine normality. First Lotto exposes the audience to two desert scenes that are physically identical. He then asks them to look at a separate image of a dot placed centrally between a red background on one side and a green background on the other. When participants look at the original desert scenes again, the images appear to change, one tinted red and the other green, even though nothing physical has changed. What has changed is the way the brain sees: it has been trained to expect red/green illumination, and this becomes the new normal even when the illumination is no longer present.

The key takeaway is that no one can be an outside observer of nature. We’re not defined by intrinsic properties but by our environment and our interaction with it, which is relative, historical and empirical.

2. We are all irrational to some degree and knowing that won’t prevent it

Knowing is supposed to be ‘half the battle’ but Professors Laurie Santos and Tamar Gendler say knowing is a tiny portion of the battle when it comes to understanding most real world decisions.

“You may know that $19.99 is pretty much the same price as $20.00, but the first still feels like a significantly better deal. You may know a prisoner’s guilt is independent of whether you are hungry or not, but she’ll still seem like a better candidate for parole when you’ve recently had a snack. You may know that a job applicant of African descent is as likely to be qualified as one of European descent, but the negative aspects of the former’s resume will still stand out.”

Knowing about a bias doesn’t eliminate it. Studies have shown that identical CVs are perceived differently depending on whether they are submitted under a European or an African name, or by a man or a woman. Notably, the sciences, often considered more ‘rational’ and ‘objective’ than other disciplines, are no exception.

In a controlled trial at Yale University, female scientists with exactly the same qualifications as their male counterparts were offered lower salaries. Interestingly, both male and female decision-makers showed the same gender bias in assessing candidates.

When asked to explain the harsher assessments, participants readily produced rational-sounding arguments. We can explain away just about anything in our own minds; sounding rational may have little bearing on the facts.

3. Rationality is not implicitly good or bad, nor is irrationality

When we tell someone to be rational, we are usually implying it’s a better process that will lead to a better outcome. But this may not be the case.

Surveying scientific teams at field sites, Professor Kathryn Clancy found that over 60% of respondents had been sexually harassed and 20% sexually assaulted. The majority of perpetrators were senior scientists targeting junior women.

Provided the results are accurate (the methodology was based on self-report surveys), how can this be? Scientists are trained to think and act rationally, and we assume they will behave consistently with that training.

Questions emerge -

  1. What is the relationship between thinking, training and behaving?
  2. Does training in one area like technical expertise have any impact on behaviour in another area, like social relationships?
  3. Was the decision to harass colleagues conscious or did the behaviour emerge in other ways?
  4. Was it rational or irrational?
  5. Given that it was illegal; does it matter?

You cannot attribute any set of values to rational over irrational -

  1. You can be irrational and harm others, but
  2. You can plan in a rational way to harm others. (Harming others is not ipso facto irrational. Research shows cooperation toward collectively harmful ends is as widespread as cooperation for a better society.)

Where to now?

While we’re not wholly rational, we should not assume irrationality is synonymous with wilful ignorance, or that an unbridled lack of self-discipline is okay. Irrational simply means that knowledge and will are not always at our beck and call.

Knowing may not always be half the battle, but in some areas, it is.

We should continue to learn what drives us, take responsibility for our actions, try to make better decisions, seek the best possible information and think critically.

But the myth of rationality and its worship should be put to bed.

Love to connect on Twitter @dionnelew. I blog at theunderneathness, if you like my work you can see more there.
