The perils of intuition

Fall When Hit · Jun 23, 2015

Intuitive decision making is celebrated in many fields, from leaders who effortlessly make decisions to sports stars who just seem to have more time on the ball than anyone else. They raise decision making to an art form, and relegate the rest of us to plodding, uncreative box tickers.

This is particularly true in the British Army, where from early on officers hear terms like coup d’œil (stroke of the eye) or Fingerspitzengefühl (fingertip feeling). The British Army loves its amateurs — and intensely dislikes reading pamphlets.

But there is a real danger in elevating intuitive decision making, and in not recognising its perils.

As an example, British Army Review 158 included an article by Lieutenant Colonel (ret’d) Chris Booth called “Knowledge Based Intuition”. The article celebrated this faculty, and cited its recognition by US and UK doctrine. It is a quick and interesting canter through the literature in the light of our recent operational experience.

Unfortunately, though, the two most critical factors in intuitive decision making were glossed over, and this could have serious consequences for operational success.

First, heuristics should not be regarded simply as clever rules of thumb employed by intuitive decision makers, for a very good reason: they are not always helpful, and they are often downright dangerous. Heuristics are models of how humans actually make decisions. In other words, humans decide in subtle, complicated and largely unconscious ways, and researchers use heuristics to describe those patterns. Unfortunately, the same patterns produce systematic failures of judgment, known as cognitive biases.

For instance:

  • Confirmation bias: people tend to seek facts that support their existing view, rather than facts that would disprove it
  • Belief bias: people tend to weight the strength of a belief more heavily than the logic of the argument behind it
  • Availability heuristic: people overestimate the importance of whatever is easiest to remember

[Image: Daniel Kahneman]

Nobel laureate Daniel Kahneman has done extensive research into cognitive biases, and before anyone gets too excited about intuition they should read his magnum opus, Thinking, Fast and Slow.

Ironically, Kahneman got his start as an Israeli army officer conducting that country’s equivalent of the Army Officer Selection Board. The dominant heuristic was — and is — to find candidates with “officer quality”: confidence, poise, physical stature, athleticism and so on. Over time his research revealed that so-called officer quality was essentially uncorrelated with effectiveness in combat. That heuristic was actually harmful.

[Image: Gary Klein]

Second, we must be clear about how good intuition is formed and when it is appropriate. Booth cites the work of both Gary Klein (a proponent of intuition) and Kahneman (a sceptic). When these two intellectual opponents finally collaborated, in their joint paper “Conditions for Intuitive Expertise: A Failure to Disagree”, they were able to resolve their conflict, agreeing that intuition is useful in the short term, in relatively straightforward environments where cause and effect can be linked.

Decision makers who apply intuition must have extensive experience in the relevant area, and this experience absolutely must involve feedback loops that allow the decision maker to test and improve her skills over many years. Although the 10,000-hour rule popularised by Malcolm Gladwell has been discredited (it was, after all, just a straight average of the hours required to become world class in one particular field), the bottom line is that you need a great deal of experience to develop sound intuitive judgment — and exercises on the Plain against an uninterested opfor that is not allowed to be creative do not count.

A mid-career A&E nurse, for instance, if he has been closing his feedback loops methodically over the years, is ideally placed to use his intuition. A young officer fresh out of Sandhurst is not, and neither is a senior commander arriving in theatre for the first time.

It is always possible to cite examples of people who trusted their instincts and came back alive, as Colonel Booth does. The problem is that we never hear from the people who trusted their instincts and did not come back. Ironically enough, that is itself a cognitive bias: survivorship bias.

As an aside, once you understand survivorship bias you see it everywhere. The most famous example is investing, where a sufficiently large group of stock pickers who cannot, in aggregate, outperform the market will always produce a select few superstars. Gallons of ink are then spilled analysing their methods, when in reality they are simply very lucky. The finance professor Ken French has said it would take more than thirty years of returns to know definitively whether a stock picker is good or lucky. Not surprisingly, survivorship bias is a major theme of Nassim Taleb’s The Black Swan.
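
The arithmetic is easy to check for yourself. The sketch below is a toy simulation, not real fund data: the population size and the 50/50 odds are assumptions chosen purely for illustration. Every “manager” is a coin flipper with no skill at all, and yet a handful still beat the market ten years running.

    import random

    # Toy model of survivorship bias: nobody here has any skill.
    # Each manager beats the market in a given year with probability
    # 0.5, independently of everything else.
    random.seed(42)

    N_MANAGERS = 10_000  # hypothetical size of the stock-picking population
    N_YEARS = 10         # the length of track record we find impressive

    perfect_records = sum(
        all(random.random() < 0.5 for _ in range(N_YEARS))
        for _ in range(N_MANAGERS)
    )

    # Expected count is 10,000 * 0.5**10, i.e. roughly ten managers
    # with a flawless decade produced by pure chance.
    print(f"Skill-free managers who beat the market {N_YEARS} years running: {perfect_records}")

Roughly ten of the ten thousand will boast a flawless decade, and those are the ones whose “methods” get written up; the other 9,990 quietly drop out of the sample.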

A related challenge is understanding how good organisations function. Most famously, Jim Collins’s book Good to Great (a fascinating read) identified a set of companies that outperformed very similar peers, worked out what set the successful ones apart, and became one of the best-selling business books of all time. The flaw in this analysis is that Collins never looked for dis-confirming examples: companies that shared the same traits but failed to succeed. Richard Feynman called this sort of mistake cargo cult science.
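
A toy calculation makes the gap concrete (the numbers are invented for illustration, not drawn from Collins’s study). Suppose some trait appears in nine of ten “great” companies; that looks decisive right up until you check how common the same trait is among the also-rans:

    # Invented counts for illustration only; these are not Collins's data.
    great_with_trait = 9        # of 10 "great" companies that show the trait
    mediocre_with_trait = 90    # of 100 comparable, stagnant companies with it

    p_trait_given_great = great_with_trait / 10         # 0.9: looks decisive
    p_trait_given_mediocre = mediocre_with_trait / 100  # 0.9: identical

    # If the trait is just as common among the failures, it predicts
    # nothing. Only the gap between these two rates carries information,
    # and the gap is invisible unless you also study the failures.
    print(p_trait_given_great, p_trait_given_mediocre)

Collins effectively sampled only the first line; the dis-confirmation lives in the second.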

To make matters worse, many of the companies that Collins identified under-performed afterwards. As Clayton Christensen showed in The Innovator’s Dilemma, many companies feted for their success (by appearing on magazine covers, for instance) have already passed their peak and are having their business models destroyed.

Michael Mauboussin examines this timeless challenge in The Success Equation: Untangling Skill and Luck in Business, Sports, and Investing.

A blog by British Army heretics. Background photo used under UK OGL.