Distrust your Gut: Judgement, Bias, & Frameworks For Decision Making
Books I’ve Read: The Undoing Project & Thinking In Bets
I am obsessed with the science of decision making and find myself often reading about behavioral psychology and decision science, which was the case with the last two books I finished. Oddly enough, the timing couldn't have been better because just as I was wrapping up the second, this tweet by venture capitalist Manu Kumar ignited real sparks among investors and founders on Twitter:
I have a lot of problems with this advice, not least because it comes from an extraordinarily successful seed-stage investor (Twilio, Carta, Lyft, etc.) and, to the undiscerning, can be taken as dogma. When someone says they made a decision on “gut feel,” that conveys to me a lack of awareness of their own biases, value judgments, and subconscious rubrics for evaluating imperfect information. This is especially challenging if the subject has been successful accidentally (not saying this is at all the case with Manu).
To me, “gut” is shorthand for an uncodified values framework that, if dissected properly, can be studied, iterated on, learned from, and shared with the people who work with you. After all, if we can’t explain a decision such that another person could learn from it and repeat it for themselves, how can we be trusted to deliver that result reliably ourselves? The act of self-analysis refines and hones our “gut,” sharpens our intuition, and creates better heuristics for building conviction.
This week, instead of focusing on broad takeaways from the books I’ve read, I’ll summarize them briefly and then delve into the specific biases discussed in both, to help you identify these cognitive traps in others (easier) and in yourself (much harder), toward a goal of better, more accurate decision making and, hopefully in my case, better investing.
“Uncertainty is an uncomfortable position, but certainty is an absurd one.” — Voltaire
The Undoing Project, by Michael Lewis
Lewis always features protagonists who zag when everyone else zigs. In the most meta way, The Undoing Project is about Amos Tversky and Daniel Kahneman, two Israeli behavioral psychologists who formed an unlikely friendship and, between 1971 and 1984, asked why the human mind operates the way it does. Their collaboration has taught generations to mistrust human intuition and to seek answers in data, and they are probably more responsible than anyone for the data- and customer-feedback-driven mindsets that propel innovation in software and technology products today.
Thinking In Bets, by Annie Duke
When rational decisions don’t yield desired outcomes, the real mistake is not understanding which outcomes are driven by skill and which owe themselves to luck. Annie Duke, former World Series of Poker champion turned business consultant, draws on examples from poker, sports, politics, and her own life to help you embrace uncertainty and make better decisions. By shifting from a need for certainty to a goal of assessing what we know and what we don’t, we can train ourselves to be less vulnerable to reactive emotions, cognitive biases, and other destructive decision-making habits.
Six Common Biases & Takeaways:
Improving decision quality is about increasing our chances of good outcomes, not guaranteeing them. — Annie Duke
1. Confirmation Bias & Motivated Reasoning
As we strive for internal psychological consistency, humans have an overwhelming tendency to take in only the information that confirms our existing beliefs and to outright reject whatever challenges them. Motivated reasoning is the act of performing all sorts of mental gymnastics to make data fit our preconceived notions. It’s also why being smart can actually make fighting our biases harder — the smarter someone is, the better they can be at making unrelated data fit a prepared narrative.
To combat this, ask yourself: “Do I want to be right, or do I want to be accurate?” If the goal is really to be accurate, cherry-picking data takes you further from being truth-seeking.
2. Resulting & Hindsight Bias
After we know the outcome of an event, we tend to think a successful outcome was the result of good decision making and an unsuccessful one the result of bad luck. Resulting happens because humans draw too tight a correlation between the quality of an outcome and the quality of the decision behind it, especially with small sample sizes. Hindsight bias is the tendency to see the outcome as having been inevitable, which makes it hard to learn in retrospect whether the decision was, in fact, the right one.
To compensate, we have to recognize that we can only learn from our decisions if we chronicle why we made them before we know the result. As a venture investor, I write detailed memos ahead of making an investment and pay special attention to the section on risks (what could go wrong). Some also call this a ‘pre-mortem’ and it’s something I want to record in greater detail as part of my decision-making process.
3. Loss Aversion & Minimizing Regret (a.k.a. FOMO)
Traditional economists believe their subjects to be rational actors who maximize utility. Behavioral economists understood, through Kahneman and Tversky’s work on Prospect Theory and loss aversion, that “losses loom larger than gains” — in fact, the pain of losing can be twice as powerful as the pleasure of gaining. Loss aversion umbrellas many common cognitive traps, including the Endowment Effect, the Sunk Cost Fallacy, and the Status Quo Bias.
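Kahneman and Tversky later put numbers on this asymmetry. A minimal sketch of their prospect-theory value function, using the median parameters they estimated in 1992 (a curvature of 0.88 and a loss-aversion coefficient of 2.25 — illustrative here, not from the books' text):

```python
def value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value function: gains are discounted by a concave
    curve, and losses are amplified by the loss-aversion coefficient lam."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

gain = value(100)    # subjective value of winning $100
loss = value(-100)   # subjective value of losing $100
print(abs(loss) / gain)  # ratio is lam: losing feels ~2.25x worse
```

The asymmetry is exactly what the “twice as powerful” line describes: for any stake, the felt magnitude of the loss is lam times the felt magnitude of the equivalent gain.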
Investing because of FOMO often gets investors into trouble and causes even the best models to be thrown out the window, because humans seek to minimize regret even in the face of data showing a lower likelihood of success. Loss aversion isn’t a habit, it’s psychology, which makes it much harder to push back on its influence over rational decision making. Creating previously agreed-upon rubrics and frameworks, and recognizing when you’re being motivated by irrational factors, can go a long way toward preventing decisions focused on minimizing regret.
4. Representativeness Bias & Pattern Matching
Kahneman and Tversky identified the idea of representativeness: people tend to draw conclusions about, or link, events that may have no correlation at all. Famously, Jeremy Lin was a remarkably talented basketball player in high school, yet despite his credentials he received no athletic scholarship offers from universities and later went undrafted by the NBA. The reason? As an Asian-American, he simply didn’t “look” like a basketball player. It’s an especially troubling bias for a venture capital industry that elevates investors who are adept at pattern matching — shorthand for the method by which VCs isolate the variables of success at past early-stage companies and use them to identify the next all-stars.
The massive problem with pattern matching is that it reinforces exclusion. If you’ve never seen something like X in your past sample, how can you count or discount X as a predictor of future success? For an example of how a competitor’s representativeness bias can be used to your advantage, look no further than Billy Beane’s Oakland A’s, who specifically went after players with perceived physical defects or anomalies. Markets for human capital aren’t perfectly rational, and what others have overlooked, they have likely also mispriced.
5. Availability Heuristic
People assess the probability of an event by asking whether relevant examples are cognitively “available” — that is, they judge the likelihood of an event by how easily they can recall something like it happening before. When asked whether gun homicides or gun suicides are more common, people pick homicides, when in fact suicides happen roughly twice as often. Gun homicides get more media coverage, so people assume they happen more.
There is nothing wrong with heuristics — cognitive shortcuts — for smaller, low-stakes decisions, but we must be even more aware of our misconceptions, and why they form, when it comes to high-value decisions. For investors, this bias works in combination with representativeness and leads to decisions made on “gut instinct” when the situation in front of you looks remarkably like past situations with positive outcomes. At the very least, the shortcut creates a positive inclination toward a particular decision, which motivated reasoning then compounds into an undisciplined one. Taking the time to reflect on why you are positively inclined, and to perform adequate diligence, can decouple past outcomes from the quality of the decisions that produced them.
6. Self-Serving Bias
In order to maintain our self-esteem, we humans give ourselves more credit for successes and give others more blame for failures. By engaging in self-serving bias we absolve ourselves of responsibility, but in doing so we lose a critical opportunity to learn. What exacerbates the problem is that humans are naturally good at identifying biases in other people, but really bad at identifying them in ourselves.
Annie Duke suggests that the best way to compensate for this very human shortcoming is to create accountability with others via decision pods. Data shows that when people know they will have to explain their decisions to someone else, they are much more thoughtful about how they spend time, energy, and capital. To force this introspection, seek out people who actively disagree with you and hear them out openly. I’ve realized over the last few years that I get really uncomfortable when everyone tells me something is a good idea. Ideally, I’d only do that thing if at least a few people tell me it’s a bad idea; at least that way I’ve evaluated both (or all) sides of the decision. This means divorcing decision making from ego and focusing on being accurate rather than being right, which is much easier said than done.
One of my personal OKRs for 2019 is to double the number of books I read annually from 8 to 16 (gulp!). Definitely a “stretch goal” but I wanted something that would be difficult, yet attainable. As with any goal setting exercise, there needs to be structure, measurability, and communication. For me that has come in the following two ways:
- Reading at least 25 pages per day (thanks, Shane Parrish!), then checking it off on a daily calendar with a Y or N.
- Publishing takeaways and overall observations in a brief post like this one.
For transparency: over the last 3 months, I’ve hit 25 pages on under 28% of days, so I’m pacing toward around 12 books by year end, i.e., 4 short of my goal. Part of the problem is reading before bed, so in April and May I’m setting aside ~45 minutes each morning to see if that gets me closer to 25 pages daily.
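The pacing math above can be sketched as a quick projection. The pages-per-book figure below is an assumption for illustration (the post doesn’t state one), and this only counts the 25-page habit itself, so it is a floor rather than an exact forecast:

```python
def projected_books(hit_rate, pages_per_session=25, pages_per_book=300, days=365):
    """Project annual book count from the fraction of days
    the daily page habit is actually hit (a lower bound, since
    a reading session can exceed pages_per_session)."""
    return days * hit_rate * pages_per_session / pages_per_book

# At a 28% hit rate and ~300-page books (assumed), the habit
# alone projects well short of the 16-book goal:
print(round(projected_books(0.28), 1))
```

Even generous assumptions make the gap obvious, which is the point of tracking the habit daily rather than just counting finished books.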
Up Next: “Destined for War: Can America and China Escape Thucydides’s Trap?” by Graham Allison
If you enjoyed reading this book summary, please smash that *clap* button so others can enjoy it too! More importantly, I’d be most interested to hear in the comments from people who disagree with my belief that you should “distrust your gut”.
Also, if you’ve read similar books on behavioral psychology or decision making frameworks, please leave me a comment on Twitter @JayKapoorNYC with your recommendations!