Book: Thinking, Fast and Slow

Thinking, Fast and Slow (Daniel Kahneman) is very similar in content to Superforecasting (Philip Tetlock and Dan Gardner), the last book I read. Both titles deal with common pitfalls that undermine rational decision making and cover ways to mitigate their effects. However, Tetlock and Gardner mainly discuss how these findings can be applied to making more accurate predictions about the future, whereas Kahneman primarily focuses on the findings themselves.

Kahneman begins by describing two systems in our mind, System 1 and System 2. System 1 is intuitive: it tells us that 2+2=4, that our boss is angry, or that we need to adjust the steering wheel a couple of degrees to the left to stay on the road, all without any active thinking on our end. System 2 is logical and thoughtful, and helps us do more complex arithmetic, understand a technical presentation, or write a blog post. Kahneman is explicit that this is not an accurate picture of how our brains actually operate, but it is a model that can help us better understand how we think and, as a result, make better decisions.

Unsurprisingly, System 1 takes no effort to engage, and will produce output whether we want it to or not; it is completely automatic. In contrast, engaging System 2 is an effortful process, and people tend to avoid doing so when possible. This “personality difference” between the two systems produces many effects that lead to flawed decision making, but I felt that a few of them were particularly salient.

Substitution

Often, when faced with a difficult question (e.g. “How likely is the president to be reelected?” or “How satisfied have you been with your life over the past year?”), System 1 will substitute an easier question (e.g. “How much do I like the president?” or “How happy am I right now?”) and answer that instead. Sometimes the answer must be converted from one scale to another, for example from how much you like the president to a probability of reelection. The answer to the substituted question is usually relevant to the original question but incomplete on its own: a president you like is somewhat more likely to be reelected, and your current mood does factor into your satisfaction with the past year, but neither gives a good answer to the question actually asked. Substitution is difficult to avoid because it often happens too quickly for us to even notice, but when making important decisions, it’s worth being very explicit about what question you are trying to answer and what information will help you answer it.

Coherence vs. likelihood

When people consider the likelihood of a sequence of events, or of something being true given some relevant information, they often focus on the coherence of the story the facts tell rather than on the actual statistics of the situation. This is an effect of substitution, and it manifests in a variety of ways: pundits claim after the fact that an unexpected event was inevitable (because they can construct a coherent story ending with that event, even though they themselves failed to predict it), and people, upon hearing that someone likes reading and organization, judge that person as more likely to be a librarian than a farmer (forgetting that there are far more farmers than librarians). Accepting the coherence of a story as a valid explanation of its events also leads people to believe they understand the causes of random events, or that they are skilled at something when in reality they were just lucky.
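
The librarian example is really a base-rate problem, and a few lines of Python make it concrete. The numbers below are my own assumptions for the sake of illustration, not figures from the book: suppose there are 20 farmers for every librarian, and the tidy, bookish description fits 90% of librarians but only 10% of farmers.

```python
# Base-rate check for the librarian-vs-farmer judgment. The numbers
# are illustrative assumptions, not figures from the book.
p_librarian = 1 / 21          # prior: P(librarian), 20 farmers per librarian
p_farmer = 20 / 21            # prior: P(farmer)
p_desc_librarian = 0.9        # likelihood: P(description | librarian)
p_desc_farmer = 0.1           # likelihood: P(description | farmer)

# Bayes' rule: P(librarian | description)
evidence = p_desc_librarian * p_librarian + p_desc_farmer * p_farmer
posterior = p_desc_librarian * p_librarian / evidence
print(f"P(librarian | description) = {posterior:.2f}")  # ≈ 0.31
```

Even though the description fits librarians nine times better, the base rate dominates: the person is still more than twice as likely to be a farmer.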

In one example mentioned by Kahneman, the authors of Built to Last studied 18 pairs of competing companies, each with one more successful than the other, and examined the differences within each pair in an attempt to tease out the elements that make up a successful company. However, after the publication of the book, the differences in profitability and stock returns between the two sets of companies shrank dramatically. One can argue that profitability is temporarily depressed by long-term bets, and that it’s easier to grow a small valuation than a large one, but the pattern looks a lot like regression to the mean: the more successful firms had been both good and lucky, and while the skill persisted, the luck did not. It’s hard to believe the authors didn’t downplay the role of luck in their book. Randomness does not come intuitively to us, but it would be wise to appreciate its significance in determining the outcomes of almost everything.
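
To see why shrinking gaps are exactly what luck predicts, here is a toy simulation (my own sketch, not an analysis from the book) in which observed performance is stable skill plus one-off luck:

```python
import random

random.seed(0)
n = 1000
# Observed performance = stable skill + one-off luck, both standard normal.
skill = [random.gauss(0, 1) for _ in range(n)]
period1 = [s + random.gauss(0, 1) for s in skill]
period2 = [s + random.gauss(0, 1) for s in skill]

# Select the top 5% of companies by period-1 performance.
top = sorted(range(n), key=lambda i: period1[i], reverse=True)[: n // 20]
avg1 = sum(period1[i] for i in top) / len(top)
avg2 = sum(period2[i] for i in top) / len(top)
print(f"selected companies, period 1: {avg1:.2f}")  # exceptional
print(f"same companies, period 2:    {avg2:.2f}")   # about half as far above average
```

The selected companies keep their skill in period 2 but draw fresh luck, so roughly half of their measured edge evaporates without anything actually going wrong.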

Loss aversion

Offer someone a bet in which they win $150 or lose $100 with equal probability, and they will, in most cases, turn it down, even though the bet is worth $25 to them in expectation. This illustrates the concept of loss aversion, which in short states that, for most people, the pain of losing something significantly outweighs the pleasure of gaining it. In this bet, people judge that the pain they will feel from losing $100 outweighs the pleasure of winning $150. That seems like a reasonable judgment in isolation, but the isolation is an illusion: a pattern of loss aversion leads people to make many such negative-expectation decisions over the course of their lives (such as purchasing flight insurance), and in aggregate those decisions can cost quite a lot.
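
The arithmetic, plus a quick simulation of what refusing such bets costs over time, is sketched below. The one-bet numbers come from the text; the figure of 100 repeated bets is an assumption chosen purely for illustration.

```python
import random
import statistics

# The bet from the text: win $150 or lose $100, 50/50.
ev = 0.5 * 150 + 0.5 * (-100)
print(f"expected value of one bet: ${ev:.0f}")  # $25

# Suppose (purely for illustration) life offers 100 such bets. Refusing
# them all forgoes 100 * $25 = $2,500 in expectation; taking them all
# rarely ends up behind.
random.seed(0)
totals = [
    sum(random.choice([150, -100]) for _ in range(100))
    for _ in range(10_000)
]
print(f"mean total over 100 bets: ${statistics.mean(totals):,.0f}")  # ≈ $2,500
print(f"chance of a net loss: {sum(t < 0 for t in totals) / len(totals):.1%}")  # ≈ 2%
```

Each refusal feels safe on its own, but across a lifetime of such choices the forgone expected gain is large while the risk being avoided is small.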

Know the limits of intuition

The first step towards improvement is recognizing that improvement is possible, and Kahneman excellently illustrates many ways in which our intuitions can lead us to make poor decisions. Only by acknowledging and learning to recognize these pitfalls can we begin to do better than our immediate intuitions and build our judgments on stronger foundations.