Understanding Systems of Thinking

Roy Steiner
Omidyar Network
Aug 25, 2017

Our brains have two ways of thinking. In “Thinking, Fast and Slow,” the psychologist Daniel Kahneman explains that “System 1” is quick and automatic, operating with little or no effort and no sense of voluntary control (e.g., recognizing a face at a glance), while “System 2” is slower and effortful, associated with deliberate attention and concentration (e.g., computing 17 × 49).

A key point is that we need both systems. I want my daughters to react quickly in dangerous driving conditions, but I want them to be very logical and deliberate when they are choosing an investment strategy.

Better understanding these systems is a useful way to differentiate instinctual, emotional thinking from deliberate, intentional thinking.


Kahneman asserts that System 1 thinking works by associating new information with existing patterns, or thoughts, rather than creating a new pattern for each new experience, and these associations give rise to heuristics (mental models, rules of thumb). This automatic thinking can lead us to make predictable mental mistakes.

Predictable Ways System 1 Thinking Can Lead Us Astray

1. The “anchoring effect” describes our tendency to be influenced by irrelevant numbers. For example, most people, when asked whether Gandhi was more than 114 years old when he died, will give a much higher estimate of his age at death than they would if the anchoring question had referred to death at 35. A good recruiter or VC will often use anchoring to help set expectations about salary or price.

2. The availability (recency) heuristic is a mental shortcut in which people assess probabilities by giving more weight to current or easily recalled information instead of processing all relevant information. For example, because information about the current state of the economy is readily available, researchers have found that investors often rely on it to make decisions and, in doing so, may undermine their own investment success.

3. To explain overconfidence, Kahneman introduces the concept he labels What You See Is All There Is (WYSIATI). This theory states that when the mind makes decisions, it deals primarily with Known Knowns, phenomena it has already observed. It rarely considers Known Unknowns, phenomena that it knows to be relevant but about which it has no information. Finally, it appears oblivious to the possibility of Unknown Unknowns (made famous by a certain Secretary of Defense). Kahneman explains that humans fail to take complexity into account and that their understanding of the world consists of a small and necessarily unrepresentative set of observations. Furthermore, the mind generally does not account for the role of chance and therefore falsely assumes that a future event will mirror a past one.

4. Confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms one’s existing beliefs or hypotheses, while giving disproportionately less consideration to alternative possibilities. The events leading up to the Iraq War illustrate this bias in multiple ways: analysts focused on anything that indicated Iraq had weapons of mass destruction, which turned out not to be the case.

One of the areas where this challenges us most is in knowing whether we are drawing the right conclusions from the data in front of us (statistics, probabilities, trends).

When I worked at the Gates Foundation, I had a ringside seat to one of the more public demonstrations of this misunderstanding of statistics.

A major study my education colleagues commissioned concluded that the most successful schools, on average, are small.

Based on this data, the Gates Foundation invested around $1 billion in the creation of small schools. The conclusion made intuitive sense because it is easy to construct a causal story explaining how small schools provide superior education and thus produce high-achieving scholars, by giving them more personal attention and encouragement than they could get in larger schools.

Unfortunately, the causal analysis is pointless because the facts are wrong. If the statisticians who reported to the Gates Foundation had asked about the characteristics of the worst schools, they would have found that bad schools also tend to be smaller than average. The truth is that small schools are not better on average; they are simply more variable. Even the smartest people in the world can fail to interpret statistics properly. The foundation later found that what actually makes a difference is the quality of teachers.
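To see why small schools show up at both extremes, a quick simulation helps. The sketch below is a hypothetical illustration, not the study’s actual data or method: every school draws its students’ test scores from the same distribution, and only school size varies. The smallest schools still dominate both the top and the bottom of the ranking, simply because small samples produce noisier averages.

```python
import random

random.seed(42)

def school_average(size):
    """Average test score of one school whose students are drawn from the
    SAME score distribution regardless of school size (mean 500, sd 100)."""
    return sum(random.gauss(500, 100) for _ in range(size)) / size

# Hypothetical mix: 500 small schools and 500 large schools
schools = [("small", school_average(random.randint(50, 150))) for _ in range(500)]
schools += [("large", school_average(random.randint(1000, 2000))) for _ in range(500)]

# Rank all schools by average score, best first
ranked = sorted(schools, key=lambda s: s[1], reverse=True)

top = [label for label, _ in ranked[:100]]
bottom = [label for label, _ in ranked[-100:]]
print("small schools among the top 100:   ", top.count("small"))
print("small schools among the bottom 100:", bottom.count("small"))
# Both counts come out high: small schools are over-represented at BOTH
# extremes, even though no school here is actually better than any other.
```

With every school drawing from the same underlying distribution, the only thing separating the “best” from the “worst” schools in this toy model is sampling noise, which is exactly the pattern that was mistaken for a causal effect of school size.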

Becoming more aware of how our brains work and which traps and biases we are prone to is a good step toward making better decisions.

As you think about these ideas consider the following questions:

  1. Do you sometimes see trends when you only have a few data points?
  2. In your last negotiation, did anchoring play a role?
  3. When has confirmation bias led you to make a wrong investment decision?

#AlwaysLearning

Roy

Our Friday Learning Notes series is designed to share insights from Omidyar Network’s journey to become a best-in-class learning organization. Grab a cup of coffee and start your own Friday morning learning journey! *warning: side effects of regular reading may include improved mood, upswing in dinner party conversation, and/or increased desire to cultivate learning for social impact
