Cognitive Biases: “Problem 2: Not enough meaning”

21CP
10 min read · Jul 5, 2022


Below, we will use the enormously useful categorization developed by Buster Benson of Better Humans to explain how cognitive biases work. All quoted definitions below are from Wikipedia.

Cognitive Bias Codex by Designhacks.co


“Problem 2: Not enough meaning”

In Man’s Search for Meaning 📖, psychiatrist and Nazi death camp survivor Viktor E. Frankl suggested that the primary human drive is not happiness but the pursuit of meaning. I could not agree more. Sometimes, though, we strive so hard to find meaning that we see it where there is none.

“We find stories and patterns even in sparse data.”

Between failing to see a pattern when there is one (a false negative) and seeing a pattern when there is none (a false positive), we humans tend to err toward the latter: better to see a predator that is not there than to miss one that is. In our modern world, however, where the predators have changed and information is overwhelming, this tendency to over-interpret what we perceive often leads us to make wrong assumptions and judgments.

Princess Diana’s fate was supposedly foretold in the text of Moby Dick.
  • Apophenia / Patternicity: Our tendency to see connections between unrelated things. Our brains have evolved to recognize patterns, so much so that we sometimes see patterns where none exist, causing us to bet irrationally or believe in conspiracy theories. Below are three kinds of apophenia.
  • Clustering illusion: “The tendency to overestimate the importance of small runs, streaks, or clusters in large samples of random data (that is, seeing phantom patterns).” Examples include “lucky streaks” in gambling or sports, or conversely, “when it rains, it pours” — the feeling that unlucky events come in clusters. (A short simulation after this list shows how readily pure chance produces streaks.)
  • Illusory correlation: “Inaccurately perceiving a relationship between two unrelated events”, like believing a “lucky charm” would bring you luck.
  • Pareidolia: Perceiving significance in random images or sounds, for example, “seeing images of animals or faces in clouds” or “hearing non-existent hidden messages on records played in reverse”.
  • Anthropomorphism: When we incorrectly use human traits as a basis for understanding or judging a non-human entity. The robot vacuum I have, for example, is unimaginatively called “Robbie”. When activating it, I’d say “Robbie, get to work!” Anthropomorphism can help us cope with loneliness, e.g., Tom Hanks’ character’s volleyball companion “Wilson” in the movie Cast Away. The personification of objects can also aid learning and help marketers sell more products to us through, for example, talking cartoon animals.
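To see how easily randomness produces the streaks we read so much into, here is a minimal Python sketch (my own illustration, not from the original article) that counts the longest run of identical outcomes in simulated fair coin flips:

```python
import random

def longest_streak(flips):
    """Length of the longest run of identical consecutive outcomes."""
    best = current = 1
    for prev, nxt in zip(flips, flips[1:]):
        current = current + 1 if nxt == prev else 1
        best = max(best, current)
    return best

random.seed(42)
trials = 10_000
streaks = [longest_streak([random.choice("HT") for _ in range(100)])
           for _ in range(trials)]

# In 100 fair coin flips, a run of 6 or more identical outcomes is the norm,
# not an anomaly; roughly 80% of sequences contain at least one.
print(sum(s >= 6 for s in streaks) / trials)
```

Runs like these are exactly what gamblers read as “hot streaks”, even though they are statistically inevitable.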

Related cognitive biases include: anecdotal evidence, gambler’s fallacy and hot-hand fallacy.

“We simplify numbers and probabilities to make them easier to think about.”

As much as we like to read meaning into things that are not there, we also oversimplify the math we don’t want to deal with.

  • Conservatism (belief revision): When encountering new evidence, we don’t revise our views sufficiently to match the new information. Conservatism can affect our decision-making in dating, investing and voting. New light may be shed on our dates, stocks or politicians, but we under-react to this new evidence and adjust our investments in them too conservatively, leading to erroneous judgments and potential losses.
  • Extension neglect: Our propensity to “ignore the size of the set during an evaluation in which the size of the set is logically relevant.” In our world of big data, it is crucial to understand extension neglect in order to make valid judgments about matters concerning large quantities. Below are examples of extension neglect.
  • Base rate fallacy or Base rate neglect: “The tendency to ignore general information and focus on information only pertaining to the specific case, even when the general information is more important.” For example, when evaluating the long-term prospects of a company’s stock, an investor may be unduly swayed by a recent PR crisis and ignore baseline information such as the company’s general health, consequently rating the stock lower than it deserves. For another example, most people who get Covid-19 in highly vaccinated countries such as Iceland and Israel are vaccinated, not because Covid is more likely after vaccination, but because most people in these places are already vaccinated: the chance of finding someone who is both unvaccinated and infected is low. In these highly vaccinated places, the more telling Covid data are probably the rates of infection, hospitalization and death in vaccinated vs. unvaccinated populations. (A numerical sketch after this list works through this base rate arithmetic.)
  • Duration neglect / Peak-end rule: We will cover this under Problem 4: What to Remember > We reduce events and lists to their key elements.
  • Insensitivity to sample size: Our tendency to judge the probability of something occurring without considering the size of the sample. Many ads exploit our insensitivity to sample size, claiming that “99% of test participants agree” that their product does wonders while hiding the tiny sample size in the fine print. Since variation is more likely in smaller samples, the few people the advertiser sampled are hardly representative of the larger population and, by extension, of you or me. (A second simulation after this list shows how easily a tiny sample produces a near-unanimous result.)
  • Neglect of probability: Our predisposition to “disregard probability when making a decision under uncertainty… Small risks are typically either neglected entirely or hugely overrated. The continuum between the extremes is ignored.” In one experiment, for example, the anxiety levels of participants who were told they would definitely receive a mild electric shock and of participants who were told they had a 50%/20%/10%/5% chance of receiving the same shock were essentially the same. The researchers’ conclusion? “We lack an intuitive grasp of probability.” In older times, we could not afford even the slightest risk, because any risk could be fatal. In modern times, however, this inability to grasp probability can cause us to take fewer risks than optimal when the probability of harm is uncertain. We live in uncertain times, when global challenges such as climate change and the rise of totalitarianism require bold solutions; neglect of probability may cause us to take timid small steps when sweeping reforms are needed.
  • Scope neglect or scope insensitivity: “The tendency to be insensitive to the size of a problem when evaluating it. For example, being willing to pay as much to save 2,000 children or 20,000 children.” Scope neglect prevents us from seeing the enormous scale of global crises and, as a result, may delay appropriate action.
  • Normalcy bias: “The refusal to plan for, or react to, a disaster which has never happened before” (source). Years before Covid-19, Bill Gates warned that the world was not ready for the next outbreak. Despite his being one of the most famous people speaking on one of the largest platforms in the world (TED), the warning apparently fell on deaf ears. Climate change is another global normalcy-bias disaster in the making, one that many predict has already passed the point of no return; for decades, climate change deniers have vehemently refused to face the truth despite overwhelming scientific evidence. Described as “one of the most dangerous biases we have” despite its unexciting name, normalcy bias is reportedly displayed by a whopping 70% of us during a disaster. This makes me pessimistic about our ability to handle global crises.
  • Proportionality bias: Our innate tendency to assume that big events have big causes. This may also explain our readiness to accept conspiracy theories.
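To make the base rate point above concrete, here is a minimal numerical sketch. The numbers are invented purely for illustration (a 90% vaccination rate, a 5% infection risk for the unvaccinated, and a vaccine that cuts that risk by 80%); they are not taken from Iceland, Israel or any real dataset:

```python
# Hypothetical illustration of the base rate effect; all numbers are assumed.
population = 1_000_000
vaccination_rate = 0.90          # 90% of people are vaccinated
risk_unvaccinated = 0.05         # assumed infection risk without vaccination
vaccine_effectiveness = 0.80     # vaccine cuts the infection risk by 80%
risk_vaccinated = risk_unvaccinated * (1 - vaccine_effectiveness)

vaccinated = population * vaccination_rate
unvaccinated = population - vaccinated

cases_vaccinated = vaccinated * risk_vaccinated        # 9,000 cases
cases_unvaccinated = unvaccinated * risk_unvaccinated  # 5,000 cases

share = cases_vaccinated / (cases_vaccinated + cases_unvaccinated)
# ~64% of cases are among the vaccinated, even though each vaccinated
# person's individual risk is five times lower: the base rate dominates.
print(f"{share:.0%} of cases occur in vaccinated people")
```

Once the base rate is taken into account, the seemingly alarming headline (“most cases are vaccinated”) stops being surprising.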
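Similarly, for insensitivity to sample size, a short simulation (again my own sketch, with an assumed true approval rate of 60%) shows how often a tiny sample yields a unanimous-looking result that a large sample almost never would:

```python
import random

random.seed(0)
true_rate = 0.60      # assumed real approval rate in the population
trials = 100_000

def unanimous(sample_size):
    """Fraction of random samples in which every respondent approves."""
    hits = sum(all(random.random() < true_rate for _ in range(sample_size))
               for _ in range(trials))
    return hits / trials

print(unanimous(5))    # roughly 0.08: tiny samples often look unanimous
print(unanimous(100))  # effectively 0: large samples almost never do
```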

Related cognitive biases include: appeal to probability fallacy, conjunction fallacy, denomination effect, magical number seven, mental accounting, Murphy’s law, subadditivity effect, unit bias, Weber–Fechner law.

“We project our current mindset and assumptions onto the past and future.”

All we can ever do is live in the present moment. Maybe that is why we are not very good at imagining our past and future selves as different from who we are right now.

  • Consistency bias / Self-consistency bias — “When making decisions, our perception is influenced by judgments we have made in the past…” This bias comes from our need to maintain a consistent self-image; we need to believe we have been and will be the same person, neglecting the fact that we change all the time. Our brain wants to be self-consistent so much that we modify our memories to fit our current self. In two experiments, for example, researchers asked people to recall their romantic relationship of a year earlier and their political stance of a decade earlier, respectively. In both, participants recalled their past feelings and views as closer to their current ones than they actually had been. To overcome consistency bias, try writing down the reasons for important decisions and reviewing them objectively later to see how your positions have changed.
  • End-of-history illusion — A psychological illusion “that one will change less in the future than one has in the past.” This illusion affects how we perceive future changes in our personality, core values or preferences. It leads us to believe that things will stay the same, and so to make poor plans for the future. Things are changing faster in our time than ever before. To correct this illusion, remind yourself that you are always changing, no matter your age. See a related discussion in Self > Principles > Context Matters & Things Change.
  • Hindsight bias — “Sometimes called the ‘I-knew-it-all-along’ effect, the tendency to see past events as being predictable at the time those events happened.” If you wonder why you are so insightful all the time, “hindsight is 20–20” might be the answer. Hindsight bias can boost your confidence but also hinders rational thinking. If you want to be more truthful about your ability to predict, try to think of explanations for what happened other than “I knew it!”
  • Impact bias — Our overestimation of how long and how intense our feelings will be in the future. Studies have shown that we are good at predicting whether an event will generate positive or negative emotions (e.g., if our favorite team lost the game, we’d be unhappy; if we won the lottery, we’d be jubilant), but we consistently overestimate how good or bad we’ll actually feel. Knowing the impact bias, it might be a good policy to take more risks in life, to “dance like nobody is watching and love like you’ll never get hurt”: even if someone is watching, it won’t be as embarrassing as you think, and even if love does hurt, it won’t hurt as much as you expect.
  • Outcome bias — “The tendency to judge a decision by its eventual outcome instead of based on the quality of the decision at the time it was made.” Since no one can foresee the future, using the ends (outcome) to judge the means (past decisions) prevents us from objectively evaluating the validity of a past decision. For instance, just because someone you asked out or a job you applied for turned you down, it does not mean the decision to make the move was a bad one. To reduce outcome bias, ignore after-the-fact information and analyze the most logical decision to make given the information available at the time of decision-making.
  • Pessimism bias — “The tendency for some people, especially those suffering from depression, to overestimate the likelihood of negative things happening to them.” The opposite of optimism bias (under Problem 3: Need to act fast), pessimism bias prevents us from fully enjoying life, since we over-worry about undesirable outcomes.
  • Planning fallacy — “The tendency to underestimate one’s own task-completion times.” I am a frequent offender and victim of this fallacy. Anyone who has run a group project knows this pain. Personally, the planning fallacy ironically motivates me to take on more tasks, because I mistakenly think I can finish them quickly. Of course, it also causes me to finish tasks late all the time. To counter the planning fallacy, try breaking a task down into smaller tasks, mindfully going through the steps needed to get it done, and using past completion times to predict the time needed for similar future tasks. See more in Life > Methods > Achieving Goals.
  • Present bias — “The tendency of people to give stronger weight to payoffs that are closer to the present time when considering trade-offs between two future moments”. Present bias deters us from solving problems whose payoffs lie in the long-term future.
  • Projection bias — “The tendency to overestimate how much our future selves share one’s current preferences, thoughts and values, thus leading to sub-optimal choices.” In shopping, we might overestimate how much joy we will get from a product; in politics, we might overestimate how satisfied we will be with a candidate after voting them into office. My personal lesson from dealing with projection bias is “don’t go grocery shopping on an empty stomach”: you will just end up buying a lot of junk food you don’t actually need. From shopping and dating to politics, projection bias can work against our future selves. To reduce its effects, try making decisions when you are relatively calm, content and level-headed.
  • Self-licensing / Moral credential effect: Our tendency to give ourselves “license” to do something undesirable or unethical after we have done something good or moral, thanks to an improved self-image. For example, allowing ourselves to eat junk food because we have had a salad, or to waste energy because we have recycled, or feeling it is okay to be prejudiced because we believe ourselves to be egalitarians… Self-licensing hampers our efforts to do good, but being aware of it can minimize its effects.

Related cognitive biases include: declinism, moral luck, pro-innovation bias, reminiscence bump, rosy retrospection, telescoping effect, time-saving bias.

“We fill in characteristics from stereotypes, generalities, and prior histories whenever there are new specific instances or gaps in information.”

  • Halo effect: The tendency to think one entity is better or worse because of what we think about another entity we associate with it. For example, because the Ronald McDonald House is a charity, we assume the fast-food chain McDonald’s is good too; or because H&M has a recycling program, we assume the company is less environmentally destructive than it is. A kind of association fallacy, the halo effect is often used by corporations, famous people and governments to whitewash, greenwash, etc. Being aware of this bias can prevent us from being fooled.

“We imagine things and people we’re familiar with or fond of as better than things and people we aren’t familiar with or fond of.”

It should not surprise us that we are prejudiced towards others, but how and why?

  • Pollyanna principle / Positivity bias: Our brain tends to process positive and agreeable information more precisely than unpleasant information. As Mr. Spock observes in the Star Trek episode “And the Children Shall Lead”: “Humans do have an amazing capacity for believing what they choose — and excluding that which is painful.”

Related cognitive biases include: well-traveled road effect.

“We think we know what others are thinking.”

This creates so much confusion and misunderstanding. See in-depth discussion in Groups > System 1: Social biases.

Do you have any suggestions, doubts, hypotheses or experiences on this topic? Please comment below 👇!


21CP

21stC Personhood: Cheatsheets for the 2020s is an index/summary of ideas pertinent to today's challenges, compiled for anyone working towards a #FutureWeDeserve