Cognitive Biases: “Problem 3: Need to act fast” ⚡️🏃🏻💨

21CP
14 min read · Jul 23, 2022

Below, we will use the enormously useful categorization developed by Buster Benson of Better Humans to explain how cognitive biases work. All quoted definitions below are from Wikipedia.

Cognitive Bias Codex by Designhacks.co

“Problem 3: Need to act fast”

Mark Zuckerberg’s former motto “Move fast and break things” was partially responsible for Facebook’s success as well as for landing it in massive trouble regarding privacy and ethics. Our minds overwhelmingly use the automatic System 1 without engaging the more deliberate System 2, essentially acting before thinking. That often lands us in trouble as well.

“In order to act, we need to be confident in our ability to make an impact and to feel like what we do is important.”

The world can be wildly dangerous, unpredictable and challenging. If we weren’t overconfident in our ability to tackle our surroundings, it’d be hard to get out of bed in the morning. Our ego therefore develops this tendency to “fake it till we make it”. As biologist Carl Bergstrom puts it 🎧: “We don’t lie well so it’s better to lie about things we believe. We evolve to self-deceive in order to be better bluffers” [17:32]. However, that in turn sometimes prevents us from seeing reality because we are so full of ourselves. Buckle up — this is gonna be a long list.

  • Barnum effect / Forer effect: A “common psychological phenomenon whereby individuals give high accuracy ratings to descriptions of their personality that supposedly are tailored specifically to them, yet which are in fact vague and general enough to apply to a wide range of people.” All astrology, fortune-telling, and many personality tests belong to this category. Like “you are stressed this week because you have a lot to do.” Or “love is on the horizon.” People are especially taken with personality descriptions when they believe the descriptions are tailored to them, when the giver of the descriptions is an authority, or when the descriptions are positive (re: Pollyanna principle). Beware of “fortune-tellers” who claim they can tell your future, or people who claim to know your personality but are actually just flattering you with positive descriptions — they could be manipulating or scamming you!
  • Dunning-Kruger effect: “[P]eople with low ability at a task overestimate their ability.” Like when you think the moonwalk looks easy, or when, watching a movie, you keep thinking you could come up with a better plot. Studies show that culture, class and gender play a role in the Dunning-Kruger effect. Japanese people, for example, tend to underestimate their abilities while North Americans tend to do the opposite. Women are more accurate in assessing their abilities than men. People of relatively high social class are more overconfident than those of lower class — this can explain why affluent people sometimes cannot identify with the have-nots. Perhaps the more privileged you are, the more inflated your confidence is, resulting in more self-delusion about your abilities — unless of course you are constantly reminded by society to stay “in your place”, as in the many cases of Asians, women and the lower classes. “Without the self-awareness of metacognition (note: thinking about thinking), people cannot objectively evaluate their competence or incompetence.” This is true of the Dunning-Kruger effect as well as all the other cognitive biases we may have.
  • Effort justification: The tendency for people to attribute more value to an outcome they put a lot of effort into achieving than its objective/actual value. For example: I put so much effort into that difficult relationship, it must be significant to me. Or: my fraternity/sorority is as important to me as family, because I put up with so much to get in. Moral of the story? It is understandable to attach more value to a hard task to make yourself feel better about doing it, but be careful of the person, organization or regime that puts you through hell — you might end up giving them more meaning and loyalty than they deserve.
  • Egocentric bias: “Recalling the past in a self-serving manner, e.g., remembering one’s exam grades as being better than they were, or remembering a caught fish as bigger than it really was.” While we all need a bit of an ego boost from time to time to counter the weight of living, too much egocentric bias can cause us to misjudge our abilities and make bad decisions.
  • False consensus effect: The tendency for people to “see their own behavioral choices and judgments as relatively common and appropriate to existing circumstances.” This effect gives us the false impression that our thinking, behavior, and characteristics are relatively common in society, boosting our self-esteem and reaffirming our desire to conform.
  • Illusion of control: The tendency to overestimate our ability to affect events. For example, stock traders think they have more influence on their investment outcomes than they actually do. (Interestingly, the more control the traders think they have, the more risks they take and the less they earn.) Stressful and competitive environments, the emotional need to control, familiarity with a task, and actively choosing an outcome all contribute to a higher illusion of control. On the other hand, the illusion is weaker for depressed people. Self-servingly, we feel more in control over successful events and less in control over failures (if I win, it’s because of me; if I lose, it’s not because of me). The illusion of control might have developed to give us a confidence boost to deal with chaotic, stressful situations, and it has been shown to correlate with health in older folks. While the illusion of control can motivate us to persist at difficult tasks, it can also “cause insensitivity to feedback, impede learning and predispose toward greater objective risk taking”.
  • Illusion of validity: Overestimating our ability “to interpret and predict accurately the outcome when analyzing a set of data, in particular when the data analyzed show a very consistent pattern — that is, when the data ‘tell a coherent story’.” This is sometimes dubbed WYSIATI, or “what you see is all there is”. Researchers reviewed the predictions versus the actual performance of wealth advisors, athletes and army officers, and found that people were overly confident in their predictions based on historical data, when in fact their forecasts fared little better than chance. From the illusion of validity, we develop an illusion of skill or experience, inaccurately thinking that the skill and experience we build up over time enhance our ability to predict the future. To overcome the illusion of validity, remember that the hands that feed you day after day could be the same hands that kill you one day for food (think poultry), and check your assumptions about predicting the future.
  • Illusory superiority / Lake Wobegon effect: The overestimation of our qualities and abilities, as compared to others.
  • Optimism bias: “The tendency to be over-optimistic, underestimating greatly the probability of undesirable outcomes and overestimating favorable and pleasing outcomes.” One recent example was the West’s over-optimism, at the beginning of Russia’s invasion of Ukraine in early 2022, that Russia would lose. Opposite to negativity bias, optimism bias increases our self-esteem but can also lead to more risk-taking. For instance, optimism bias can prevent you from taking preventive measures for your health, since you’re over-optimistic about the prospects of your health in the future. Interestingly, attempts by researchers to reduce the optimism bias wound up increasing it. But you can still try countering this bias by comparing what you are optimistic about with someone close to you, like comparing your chances of getting a certain kind of cancer with a close family member’s. Also, try experiencing what you’re optimistic about to get a frame of reference; for example, try actually getting drunk to test whether you really are hard to get drunk.
  • Overconfidence effect: A “person’s subjective confidence in his or her judgements is reliably greater than the objective accuracy of those judgements.” We can be over-confident about our performance, our performance compared to others, and how accurate our beliefs are. Over-confidence has launched lawsuits, strikes, wars, and market bubbles and crashes. So watch out for the “con” man/woman in you and in others.
  • Restraint bias: Our tendency to overestimate our ability to control impulsive behavior. Whatever you are addicted to — your phone, food, porn, shopping, money, power, drugs… chances are you think you are more in control than you really are. Fortunately, studies have found a few ways to reduce restraint bias, including believing in your capacity for self-restraint, not exposing yourself to tempting environments, and focusing your attention on your addictions and how they affect you.
  • Risk compensation / Peltzman effect: “The tendency to take greater risks when perceived safety increases.” For example, motorcyclists speed up when they are wearing a helmet. In skydiving, it has been observed that “the safer skydiving gear becomes, the more chances skydivers will take, in order to keep the fatality rate constant”. Well, it seems we like to maintain a certain level of risk to keep living interesting.
  • Self-serving bias: The tendency to see ourselves in an overly favorable manner in order to maintain and enhance self-esteem. In this, we lie to ourselves (self-enhancement) and to others (self-presentation) by attributing the desirable outcomes of an event to our own doing and blaming undesirable outcomes on others or our surroundings. Self-serving bias is found to be higher in people with higher self-esteem, between unfamiliar people, and in situations with negative outcomes; the bias is lower in older people, depressed people, and people in closer relationships. Don’t want to be a self-serving bastard? A constructive way to avoid it is to build close relationships in family, workplace and society. The social version of the self-serving bias is group-serving bias, or in-group favoritism.

Related cognitive biases include: actor-observer asymmetry, defensive attribution hypothesis, exaggerated expectation, false consensus effect, fundamental attribution error, hard–easy effect, third-person effect, trait ascription bias, social desirability bias.

“In order to avoid mistakes, we’re motivated to preserve our autonomy and status in a group, and to avoid irreversible decisions.”

How important are independence and status to you?

“In order to stay focused, we favor the immediate, relatable thing in front of us over the delayed and distant.”

  • Hyperbolic discounting: Our preference for immediate payoffs over delayed payoffs, causing us to make choices today that our future selves would not have made. For example, a study showed that when making food choices for the coming week, 74% of participants chose fruit, whereas when the food choice was for the current day, 70% chose chocolate. Hyperbolic discounting has implications for saving, borrowing, procrastination and voting. One way to overcome it? When planning for the future, think about the needs of your future self (as well as future generations). A small numerical sketch of this preference reversal follows below.
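
Here is a minimal sketch, in Python, of how that preference reversal plays out under the standard hyperbolic discount formula value = amount / (1 + k × delay). The reward amounts, the 5-day gap and the discount rate k are illustrative assumptions, not figures from the fruit-and-chocolate study.

```python
# A minimal sketch of preference reversal under hyperbolic discounting.
# The rewards, delays and discount rate k are illustrative assumptions.

def hyperbolic_value(amount, delay_days, k=0.1):
    """Perceived present value of a reward that arrives after `delay_days`."""
    return amount / (1 + k * delay_days)

SMALL_SOON = 50   # a smaller payoff available sooner (think: chocolate today)
LARGE_LATE = 60   # a larger payoff arriving 5 days later (think: fruit + health)
GAP_DAYS = 5

for lead_time in (30, 0):  # planning a month ahead vs. deciding for today
    v_small = hyperbolic_value(SMALL_SOON, lead_time)
    v_large = hyperbolic_value(LARGE_LATE, lead_time + GAP_DAYS)
    choice = "smaller, sooner" if v_small > v_large else "larger, later"
    print(f"{lead_time:>2} days ahead: choose the {choice} reward "
          f"({v_small:.1f} vs {v_large:.1f})")

# Planning 30 days ahead, the larger, later reward wins (12.5 vs 13.3);
# deciding for today, the smaller, sooner reward wins (50.0 vs 40.0).
```

The same pair of rewards flips in attractiveness purely because the decision moves closer in time, which is the signature of hyperbolic (rather than exponential) discounting.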

Related cognitive biases include: appeal to novelty, identifiable victim effect.

“In order to get anything done, we’re motivated to complete things that we’ve already invested time and energy in.”

This set of biases prompts us to focus on things we’ve invested in but that are not necessarily truly important, and to miss out on other, potentially more worthy opportunities.

This comic by The Oatmeal poetically explains the backfire effect.
  • Backfire effect / Pushback effect: When presented with contradictory evidence, our tendency to reject the evidence and believe our pre-existing ideas even more strongly. In other words, the disagreeing evidence backfires. While the backfire effect hasn’t been replicated in recent studies, that does not mean we are open-minded about opposing views and arguments. We still prefer evidence that supports our pre-existing beliefs, as discussed in confirmation bias. Instead of the stronger “backfiring”, or ignoring counter-evidence and barreling deeper into our own beliefs, we “push back”. As Alexios Mantzarlis, director of an international fact-checking organization, put it, “we are fact-resistant, but not fact immune.” Next time you argue with your family, co-worker or friend, know that they (as well as you) may appear defensive and unyielding, but deep down some of the arguments probably get through.
  • [Logical fallacy] Escalation of commitment or irrational escalation: Our propensity to continue a decision or action despite negative outcomes, in order to stay aligned with previous decisions or actions. There are many proverbs describing this behavior: “Throwing good money after bad”, “In for a penny, in for a pound”, or “If you find yourself in a hole, stop digging.” On one hand, if we don’t persist in the face of adversity, how do we get anything difficult done? On the other hand, we need to know when to stop when a cause is lost (unless the cause concerns ethics and fundamental values such as human rights; see Life > Life Perspectives > Motivations, Values & Purpose). It is harder for groups than individuals to stop escalation of commitment due to the size and complexity of group efforts. To avoid irrational escalation, be very clear about why you want to join a project or cause before committing.
  • Generation effect (Self-generation effect): “That self-generated information is remembered best. For instance, people are better able to recall memories of statements that they have generated than similar statements generated by others.” If you want to remember something that is important, paraphrase it in your own words to remember it as your own statement. Here are some techniques you can use. Also see a related topic in Life > Methods > Life Stories.
  • [Prospect theory] Loss aversion: “[T]he tendency to prefer avoiding losses to acquiring equivalent gains.” We hate losing much more than we like making equivalent gains. Loss aversion prevents us from taking rational, worthwhile risks ▶️. A personal example: there was a time when I avoided taking on a leadership role because I worried that the embarrassment of failing would outweigh the benefits of personal growth (luckily I got out of that phase). Loss aversion has been used by marketers and policymakers to change our behaviors. To minimize irrational loss aversion, try not to frame an outcome as a loss or a gain, and ask yourself, “what’s the worst thing that could happen?” (Source) A small numerical sketch of how losses loom larger than gains follows this list.
  • Processing difficulty effect: Information that takes longer to read or needs more time to process is remembered more easily. Does that mean you should make your emails and text messages long and hard to read? No, because TL;DR. But if you want yourself or others to remember a piece of writing well, take the time to read or write the details — a great story is always better remembered than its synopsis.
  • Sunk cost fallacy: People’s tendency to take past costs into account when making future decisions, even though those costs are no longer relevant. For example, when you have waited for a taxi for 15 minutes, should you keep on waiting or just call an Uber? In this scenario, the 15 minutes you have already waited is irrelevant; whether a taxi or an Uber will come first depends on the traffic situation and other factors. In relationships, spending and voting, it benefits us to forget about bygone costs and reevaluate the situation as it stands now.
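
As referenced under loss aversion above, here is a minimal sketch of the prospect-theory value function, assuming the commonly cited Tversky & Kahneman (1992) parameter estimates (α ≈ 0.88, λ ≈ 2.25); the $100 amounts are illustrative, not from any study mentioned in this article.

```python
# A minimal sketch of the prospect-theory value function behind loss aversion.
# alpha and lam are the commonly cited Tversky & Kahneman (1992) estimates;
# the dollar amounts are illustrative assumptions.

def prospect_value(x, alpha=0.88, lam=2.25):
    """Subjective value of a gain (x > 0) or loss (x < 0) vs. a reference point."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

gain = prospect_value(100)   # how good winning $100 feels
loss = prospect_value(-100)  # how bad losing $100 feels
print(f"win $100 -> {gain:+.1f}, lose $100 -> {loss:+.1f}")

# The loss is weighted about 2.25x as heavily as the equivalent gain,
# which is why a 50/50 bet to win or lose $100 feels like a bad deal
# even though its expected value is zero.
```

The asymmetry (λ > 1) is the formal version of “losses loom larger than gains”: framing the same outcome as a loss rather than a forgone gain changes which side of this curve it lands on.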

Related cognitive biases include: disposition effect, endowment effect, IKEA effect, pseudocertainty effect, status quo bias.

“We favor options that appear simple or that have more complete information over more complex, ambiguous options.”

Taking the path of least resistance can help us get things done, but it can also bias us towards easier solutions that are wrong for us.

  • [Prospect theory] Ambiguity effect: Our tendency to avoid options whose odds of a favorable outcome are unknown. For example, we might choose the “safe” romantic partner over the “wild card”, despite the fact that we are more attracted to the latter. To reduce the ambiguity effect, we can train ourselves to make bolder choices that align with our goals and ethics, even if we don’t have perfect information about them.
  • Information bias: Our tendency to seek irrelevant information that has no bearing on our actions, mistakenly believing that the more information we have before making a decision, the better. When facing an illness, we tend to opt for extra tests even when those test results do not change which treatment to take. I’ve seen myself and my friends asking the same questions about an important decision over and over, such as whether to end a relationship or move to another city, only to fall into decision paralysis with too many irrelevant opinions. To overcome information bias, ask ourselves: will the additional information change the outcome of our action or decision, or is it just a distraction?
  • Interoceptive bias or hungry judge effect: “The tendency for sensory input about the body itself to affect one’s judgement about external, unrelated circumstances”. For example, a study found that parole judges are more lenient when they are well-fed and rested. To make sure you make good decisions, be sure your basic deficiency needs are taken care of.
  • Omission bias: Our tendency to favor inaction over action, omission over commission, “… and to judge harmful actions as worse, or less moral, than equally harmful omissions (inactions).” In On Liberty 📖, philosopher John Stuart Mill called us out on this human bias: “A person may cause evil to others not only by his actions but by his inaction, and in either case he is justly accountable to them for the injury.” Sadly, most of us are generally less insightful than Mill. In the trolley experiment, for instance, we are okay with not actively causing harm to one person to save five others, because we intuitively feel inaction is less harmful and more moral than action, even though more people die due to this inaction. In a brutal regime, we feel okay standing by while other people suffer, because the evil is not actively performed by our own hands. To combat this potentially immoral, even murderous bias, we need to regard inaction as just as harmful and immoral as action. A saying often attributed to Dante Alighieri puts it best: “The hottest places in hell are reserved for those who, in times of great moral crisis, maintain their neutrality.”
  • Truthiness: “[T]he belief or assertion that a particular statement is true based on the intuition or perceptions of some individual or individuals, without regard to evidence, logic, intellectual examination, or facts.” Coined by television host Stephen Colbert, truthiness was defined by his show as: “We’re not talking about truth, we’re talking about something that seems like truth — the truth we want to exist.” Read any heated debate online, and you’ll find endless examples of statements that “feel true” but are in fact illogical. Below are four cognitive biases demonstrating truthiness.
  • Belief bias: Extremely common and thus important, belief bias is a cognitive error where our “evaluation of the logical strength of an argument is biased by the believability of the conclusion.” In other words, if the conclusion “sounds true”, we tend to believe it more although it might not be logical. Similar to the effect of confirmation bias, belief bias makes us “…more likely to accept an argument that supports a conclusion that aligns with [our] values, beliefs and prior knowledge, while rejecting counter arguments to the conclusion.” To minimize the errors of belief bias, give yourself time to think through a problem, think about the negative implications of a conclusion, and generate alternative explanations for the evidence supposedly supporting the conclusion.
  • Illusory truth effect: “[T]he tendency to believe false information to be correct after repeated exposure.” In other words: “repeat a lie often enough and it becomes the truth”, because it has become familiar and easier to process. The illusory truth effect is repeatedly used by advertisers, politicians and propagandists to make lies seem true. When you see the same false information popping up over and over in your news feeds, even if you start off not believing it, eventually you’ll start to believe there is some truth to it. Again, if you don’t want to be manipulated, expose yourself to factual media and information, and stay away from platforms that show nonfactual ads or content as a source of income.
  • Rhyme-as-reason effect: Sayings and statements are judged to be more truthful when they rhyme. The most famous example of this effect is the “If it doesn’t fit, you must acquit” argument made by O.J. Simpson’s lawyer, which indeed helped O.J. get acquitted of murder. Using the rhyme-as-reason effect, you can try to manipulate people into believing something by turning it into a rhyme. At the same time, be careful of neatly rhymed sayings that sound true. After all, truths can rhyme, but so can lies.
  • Subjective validation: “Perception that something is true if a subject’s belief demands it to be true.” This bias has been covered above under “We are drawn to details that confirm our own existing beliefs.” To debias subjective validation, constantly challenge and test your own beliefs.

Related cognitive biases include: Delmore effect, less-is-better effect, Occam’s razor, Parkinson’s law of triviality, zero-risk bias.

Do you have any suggestions, doubts, hypotheses or experiences on this topic? Please comment below 👇!

21CP

21stC Personhood: Cheatsheets for the 2020s is an index/summary of ideas pertinent to today's challenges, compiled for anyone working towards a #FutureWeDeserve