There is a famous Mark Twain saying (probably apocryphal…)
It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.
It is great to be confident in your view of the world, but take care that your confidence doesn't stop you from being open to alternative points of view, and from possibly realising that your confidence is misplaced.
Intellectual humility (IH) is simply "the recognition that the things you believe in might in fact be wrong" (Vox).
In a paper on the psychology of intellectual humility, the authors write:
…people high in IH pay greater attention to the evidentiary basis of their beliefs and spend more time thinking about beliefs about which others disagree. This pattern may reflect the fact that people who recognize that their views are fallible are naturally more motivated to think about the accuracy of their beliefs than people who assume that they are right about most things.
The belief that one’s beliefs are fallible is associated with motives that reflect a proactive, inquisitive approach to knowledge. People who are high in IH score higher in epistemic curiosity, the motivation to pursue new ideas and address holes in one’s knowledge. Their higher curiosity seems to be motivated both by the intrinsic enjoyment of learning new information…
In a wonderful article in Vox, the concept of intellectual humility (IH) is explored.
- In order for us to acquire more intellectual humility, we all, even the smartest among us, need to better appreciate our cognitive blind spots. Our minds are more imperfect and imprecise than we’d often like to admit. Our ignorance can be invisible.
- Even when we overcome that immense challenge and figure out our errors, we need to remember we won’t necessarily be punished for saying, “I was wrong.” And we need to be braver about saying it. We need a culture that celebrates those words.
- We’ll never achieve perfect intellectual humility. So we need to choose our convictions thoughtfully.
Two related concepts tie into this work on intellectual humility: the Dunning-Kruger effect and Philip Tetlock’s foxes and hedgehogs.
The Dunning-Kruger effect refers to the seemingly pervasive tendency of poor performers to overestimate their abilities relative to other people–and, to a lesser extent, for high performers to underestimate their abilities. The explanation for this, according to Kruger and Dunning, who first reported the effect in an extremely influential 1999 article in the Journal of Personality and Social Psychology, is that incompetent people lack the skills they’d need in order to be able to distinguish good performers from bad performers:
“…people who lack the knowledge or wisdom to perform well are often unaware of this fact. We attribute this lack of awareness to a deficit in metacognitive skill. That is, the same incompetence that leads them to make wrong choices also deprives them of the savvy necessary to recognize competence, be it their own or anyone else’s.”
We see the Dunning-Kruger effect all the time, as highlighted in this Aeon article:
it’s typical for people to overestimate their abilities. One study found that 80 per cent of drivers rate themselves as above average — a statistical impossibility. And similar trends have been found when people rate their relative popularity and cognitive abilities. The problem is that when people are incompetent, not only do they reach wrong conclusions and make unfortunate choices but, also, they are robbed of the ability to realise their mistakes.
A word of caution from the [Citation needed] blog:
We should also try to be aware of another very powerful cognitive bias whenever we use the Dunning-Kruger effect to explain the people or situations around us–namely, confirmation bias. If you believe that incompetent people don’t know enough to know they’re incompetent, it’s not hard to find anecdotal evidence for that; after all, we all know people who are both arrogant and not very good at what they do. But if you stop to look for it, it’s probably also not hard to find disconfirming evidence.
Phil Tetlock, in his book Superforecasting, divides decision makers into foxes and hedgehogs. The concept comes from the Greek poet Archilochus, who wrote, “the fox knows many things, but the hedgehog knows one big thing.”
Phil Tetlock argues it’s a way of understanding two cognitive styles:
Foxes have different strategies for different problems. They are comfortable with nuance, they can live with contradictions. Hedgehogs, on the other hand, focus on the big picture. They reduce every problem to one organizing principle.
Steven Johnson writes in “Farsighted” (p. 90):
In Tetlock’s analysis, the foxes — attuned to a wide range of potential sources, willing to admit uncertainty, not devoted to an overarching theory — turned out to be significantly better at predicting future events than the more single-minded experts. The foxes were full spectrum; the hedgehogs were narrowband. When trying to make sense of a complex, shifting situation — a national economy, or technological developments like the invention of a computer — the unified perspective of a single field of expertise or worldview actually appears to make you less able to project future changes. For the long view, you need to draw on multiple sources for clues; dabblers and hobbyists outperform the unified thinkers.
A bad combination is to be a hedgehog suffering from the Dunning-Kruger effect!
Warren Berger outlines several good questions to ask yourself on his “A More Beautiful Question” Q-Cards:
- Do I tend to think more like a soldier or a scout? A soldier’s job is to defend, while a scout’s purpose is to explore and discover.
- Would I rather be right, or would I rather understand? If you place too much importance on being right, it can put you in “defense” mode and close off learning and understanding.
- Do I solicit and seek out opposing views? Don’t ask others if they agree with you — ask if they disagree and invite them to say why.
- Do I enjoy the “pleasant surprise” of discovering I’m mistaken? Finding out you were wrong about something needn’t be cause for shame; it’s a sign of intellectual openness and growth.
As Leo Tolstoy writes:
The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already; but the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he knows already, without a shadow of a doubt, what is laid before him.
Let me know what you think; I’d love your feedback. If you haven’t already, sign up for a weekly dose just like this.
Get in touch… — linktr.ee/Tomconnor
More like this from 10x Curiosity
- Helping people make better choices — Nudge Theory and Choice architecture — Can you make the default option one that nudges people towards better outcomes?
- Communicating to yourself and others — Your Personal User Manual and other great tools — Why not simply tell people directly how you would like them to interact with you?
- The Flywheel Effect — Exploring the power of simple reinforcing loops executed over time
- Looking in the rear view mirror… — Are you aware of the hindsight bias you are applying to your reaction to events that happen in life?
- Achieving Diversity — why being unbiased is not enough — Diversity is something we aspire to, but are we paying more than lip service to it? Are individual good intentions enough to achieve a diverse group setting, at work, socially or through the community?