“The Righteous Mind” or Why We Can’t Get Along, Part 1

Here are some notes and extrapolations from reading The Righteous Mind by Jonathan Haidt. The book explains several interesting aspects of human nature and why “some people” seem evil, idiotic, or both. It is highly relevant today, and its main points will likely endure.

Philosophy and psychology have offered up three different takes on the intuition/emotion vs rationality/reason divide:

  • Plato: reason should govern emotion, though only true philosophers achieve it.
  • Jefferson: reason and emotion are and should be co-rulers.
  • Hume: emotion does and should rule, reason is a servant.

Hume was right, and it’s pretty settled. Ouch.

Intuition → Judgement → Justification

Judgement is an intuitive, instant process, separate from justification, which is a slow, reasoned process built on that earlier judgement. That is, we have instinctive gut reactions that we then post-rationalize. Motivated reasoning is prevalent, and we hold strong opinions that we can’t explain.

If you’re having a strong reaction against this and are currently looking for all the reasons this is wrong, then… well… yeah. You get the point.

This is why we can be well aware of cognitive biases yet still be affected by them, all while believing we’re not.

Affect (a flash of like or dislike that guides behavior) is primary and colors all perception: whenever you see something, you feel something about it, however faint. It’s so compelling that it narrows the options available to later thinking. This is particularly true of social and political judgement, and it reveals subconscious preferences we may not think we have.

It goes further: reasoning requires the passions. The head can’t even do “reasoning tasks” without the heart — it gets overwhelmed. And emotions are not dumb. Usually they’re right, or right enough. The tricky part is when they’re not.

A big factor here is timing. Reason can affect emotion if given time: you can realize that your gut reaction was leading you astray. But this needs to happen in an atmosphere where you’re not under pressure or on the defensive.

To Change Minds, Don’t Provide Rationales. Nudge Intuitions

If you want to influence people, you should therefore influence their intuitions (be friendly instead of confrontational, actually see things from their point of view, nudge), not provide new rationales (which they will automatically reject).

Those nudged intuitions can then revise the person’s judgement, given time.

This goes beyond the traditional “know the opposition’s arguments, be able to make their case” to actually feeling what they feel, which is quite different, and which we usually dismiss too quickly.


We’re All Little Politicians

Humans cooperate beyond kinship, which is extremely unusual in nature. This cooperation is based on accountability (knowing we’ll have to justify our beliefs, feelings, and actions to others), and we’re intuitively good at navigating it.

Exploratory thought = evenhanded consideration of alternative points of view.

Confirmatory thought = a one-sided attempt to rationalize a particular point of view.

Accountability increases exploratory thought only under specific circumstances: we hold no pre-formed opinion, we know we’ll have to present to an audience whose views are unknown, and we believe that audience is well informed and interested in the truth. Otherwise it enhances confirmatory thought: we try to act in ways that can be persuasively justified or excused to others, and sometimes to ourselves.

When exploring the physical world we may seek truth, but in the social world we seek approval (even when we think we don’t): we’re all little politicians trying to get re-elected.

We offer moral reasons for actions and injustices in order to rally others to our cause: we do moral reasoning not to reconstruct the actual reasons why we ourselves came to a judgement, but to find the best possible reasons why somebody else ought to join us in that judgement.

When we want to believe something, we ask “Can I believe it?”, look for something to support it, and stop thinking as soon as we find even a shred of pseudo-evidence. We now have a justification if questioned.

When we don’t want to believe something, we ask “Must I believe it?”, look for something to discredit it, and upon finding a single reason for doubt, we can justifiably dismiss it.

This is why it’s so hard to notice bias in news we agree with and so easy to find it in news we disagree with. Consider steelmanning: engaging with the strongest version of the other side’s argument rather than the weakest.

When we’re not held accountable and there’s an opportunity to cheat, even many honest people do: most people cheat, but only a little.

People are generally groupish, not selfish, and vote accordingly — we ask if the party/policy supports our group, not necessarily ourselves. We reason/argue to support our team, and to show commitment.

Reasoning evolved to help us argue, persuade, and manipulate in discussions with other people, not to help us find truth.

Since it’s so hard to see one’s blind spots, it really helps to have open, civil discussions with trusted people to reason better together.


This was part one in a three-part series. Read the book or follow me on Twitter for the sequels.
