Cognitive Biases: “Problem 1: Too much information.”

21CP
16 min read · Jun 19, 2022


“No one can be a totally fair judge in his own case.” (Hindustani proverb)

Below, we will use the enormously useful categorization developed by Buster Benson of Better Humans to explain how cognitive biases work. All quoted definitions below are from Wikipedia.

Cognitive Bias Codex by Designhacks.co

“Problem 1: Too much information.”

As mentioned, our senses gather 11 million bits of information from our environment every second. While it is estimated that our entire brain can execute as many as 100 billion operations per second, our conscious mind can only process about 120 bits of data per second. Clearly, something’s gotta give. To add to this burden, digital and social media have exponentially increased the amount of information we receive each day, causing information overload. As a remedy, we “self-select and filter and allow in the information that’s meaningful to us and important to us, and then maybe let in a trickle of stuff that’s just, you know, whimsical or fanciful. Like, what George Clooney wore at his wedding,” according to neuroscientist Daniel Levitin. Below are situations where our brain selects, filters and creates biases.

“We notice things that are already primed in memory or repeated often.”

Just like we tend to keep frequently or recently used things within arm’s reach at home or at work, we tend to notice things that are frequently or recently accessed in our minds, a shortcut that speeds up retrieval. While convenient, the tendency to grab whatever is near and dear in our heads might cause us to grab the wrong things. Below are the related cognitive biases.

  • Availability bias / Availability heuristic: “[A] mental shortcut that relies on immediate examples that come to a given person’s mind when evaluating a specific topic, concept, method or decision.” For an unfortunate example: the only answers we can think of on a test are the concepts we studied right before it. Below are a few kinds of availability bias.
  • Attentional bias: “The tendency of perception to be affected by recurring thoughts”, such as seeing things that remind you of your crush everywhere you go.
  • Frequency illusion / Baader-Meinhof phenomenon: “[O]nce something has been noticed then every instance of that thing is noticed, leading to the belief it has a high frequency of occurrence…” For example, after watching a documentary about flat-earthers, you notice a lot more news mentioning flat earth, giving you the impression that there are more flat-earthers than there actually are.
  • Selection bias: Similar to frequency illusion, selection bias is our predisposition to “notice something more when something causes us to be more aware of it.”
  • Context effect: “That cognition and memory are dependent on context…” Things out of context are harder to recall than things in-context, and vice versa. For example, it is easier to recall work-related information at work than at home (working from home blurs this division, of course). In a certain context, even good people make unimaginably harmful decisions. For a discussion on the implications of how context affects us in real life, see Self > Principle: Context matters & Things Change.
  • Cue-dependent forgetting: Our failure to recall a memory without the right cues “that were present when a memory was encoded”. The opposite of over-accessing information at the top of our minds, cue-dependent forgetting is the failure to retrieve information without priming or repetition. For example, when we say something is “at the tip of our tongue”, we know it exists in our memory, but we are missing the relevant cue to access it.
  • Empathy gap / Hot-cold empathy gap: Our capacity to understand another person is “state-dependent”. “For example, when one is angry, it is difficult to understand what it is like for one to be calm, and vice versa; when one is blindly in love with someone, it is difficult to understand what it is like for one not to be…” A Chinese idiom describes the empathy gap poetically: “you cannot talk to a summer bug about ice”. The intrapersonal empathy gap causes us to misjudge our reactions in a different state, such as thinking that dieting will be easy while we are full, only to realize that we crave food more than we expected when hunger strikes again. Unless we “develop the muscle memory to override our instincts in those states”, journalist Shankar Vedantam comments, “the people we are now can be very different from the people we might become,” causing us to become “strangers to ourselves”.
  • Familiarity principle / Mere exposure effect: Our tendency to like things simply because we are familiar with them. That’s how ads get to you: advertisers make sure you are repeatedly exposed to their brands and products until you are familiar enough with them to start liking them and, eventually, consider purchasing them. “What’s more, research shows thinking deeply or being smart does not make you immune to this cognitive bias… Older adults were especially susceptible to this repetition. The more often they were initially told a claim was false, the more they believed it to be true a few days later” (source). This bias also prompts people to believe viral fake news and disinformation, simply because of our heavy exposure to them. The lesson? Carefully choose what you expose yourself to, so that what you end up liking is what you want to like.
  • Mood-congruent memory bias: The “tendency of individuals to retrieve information more easily when it has the same emotional content as their current emotional state. For instance, being in a depressed mood increases the tendency to remember negative events…”, and vice versa. When attempting to remember something emotional from the past, try simulating your mood at the time of the event. Conversely, if you’d like to recall something in a good mood in the future, try not to encode it in a foul mood in the first place. For instance, try not to discuss important matters with your loved ones when you’re angry, because next time you might not recall everything you last discussed until you get enraged again, and when you do, you will only remember the maddening stuff, possibly contributing to a never-ending cycle of irate discussions. This applies to public discourse as well.
  • Negativity bias / Negativity effect: Our tendency to be more affected by negative events than by positive ones of the same intensity. Negativity bias causes us to have a greater recall of bad experiences than of positive ones. This trait has an obvious evolutionary function: our ancestors were likely in dangerous or risky situations threatening their survival when unpleasant memories were made, and remembering these lessons helped them avoid similarly bad situations in the future. However, negativity bias may also lead us to pay far more attention to the few negative comments others make about us than to the positive ones, especially on social media, making us unnecessarily unhappy. The negativity effect may also lead us to recall more negative emotional events than positive ones, misleading us into thinking that our quality of life is worse than it actually is and preventing us from taking worthy risks. In elections, our votes “have been shown to be more affected or motivated by negative information than positive: people tend to be more motivated to vote against a candidate because of negative information than they are to vote for a candidate because of positive information,” prompting us to make biased voting decisions that could adversely affect our future.
  • Recency bias: Recency bias makes recent events seem more important than older ones. For example, when investing in the stock market, “recency bias… convinces us that a rising market or individual stock will continue to appreciate, or that a declining market or stock is likely to keep falling. This bias often leads us to make emotionally charged choices… tempting us to hold a stock for too long or pull out too soon.” This bias affects how we invest not only our money, but also our time, attention and emotion, causing us to over-invest in recent events and under-invest in older ones.

Related cognitive biases/concepts include: anthropocentric thinking, anthropomorphism, illusory truth effect (discussed under Problem 3: Need to Act Fast), implicit association, non-adaptive choice switching, survivorship bias, well travelled road effect.

“We are drawn to details that confirm our own existing beliefs.”

To err is human; so is being self-centered. And being self-centered also makes us err.

  • Choice-supportive bias: “The tendency to remember one’s choices as better than they actually were…”, for example post-purchase rationalization. To counter this bias, it is good practice to regularly and objectively review whether past choices were well made, so as to improve your decision-making without evoking regret. Before sleep, I try to recount how my day went and whether things could have been done better. I find that if you detach yourself from the event, like reviewing a play after a game, you develop the muscles to review your choices and improve your decisions without feeling remorse. It helps me sleep better too, since I feel like I’ve resolved the day’s issues and can move on!
  • Choice blindness: Our failure to remember the choices we made in the past, mistaking other options for our previous choices. This finding has been replicated in jam tasting, financial decisions, and eyewitness testimony. The bias teaches us not to be too attached to our past choices; we may remember them incorrectly.
  • Confirmation bias: “The tendency to search for, interpret, focus on, and remember information in a way that confirms one’s preconceptions.” This is a very important bias to understand in our times. If you wonder why people with different views do not see what you see, and frankly, vice versa, confirmation bias could be the culprit. All of us are only willing to see what we wish to see and, in turn, use what we are willing to see to confirm our pre-existing beliefs and biases. Consequently, we become more and more entrenched in our own views, and less and less capable of seeing other views. What’s worse, social media, and more specifically the filter bubble that traps you with content you already believe in so as to keep profiting from your attention, exacerbates our confirmation biases, driving a constant wedge into the increasingly divisive world we live in.
  • Cognitive dissonance: The psychological stress we experience when holding contradictory beliefs or doing something that contradicts our beliefs. Neuroscience experiments using fMRI have demonstrated that “… when two actions or ideas are not psychologically consistent with each other, people do all in their power to change them until they become consistent… [because] human beings strive for internal psychological consistency to function mentally in the real world.” In other words, our little brains cannot compute that things in the real world can be contradictory, so we force them to be consistent in our minds. According to this theory, we employ four main ways to reduce cognitive dissonance (all from Wikipedia):
  1. Change the behavior or the cognition (“I’ll eat no more of this doughnut.”)
  2. Justify the behavior or the cognition, by changing the conflicting cognition (“I’m allowed to cheat my diet every once in a while.”)
  3. Justify the behavior or the cognition by adding new behaviors or cognitions (“I’ll spend thirty extra minutes at the gymnasium to work off the doughnut.”)
  4. Ignore or deny information that conflicts with existing beliefs (“This doughnut is not a high-sugar food.”)

The bad news is, except for 1, we are kidding ourselves with 2, 3 and 4 in order to reduce cognitive dissonance. The good news is, since the reduction of cognitive dissonance is largely automatic (i.e., System 1), we can use it to trick ourselves into doing good. For example, studies have shown that students studying something hard or uninteresting without a promised reward later become more interested in the subject, since they have to cognitively justify doing that hard work without an apparent reason. And it is the cognitive dissonance between what we believe in and what we witness in the world that drives us to do good or fight against social injustice (or to spend endless agonizing hours writing these words that I don’t know if anyone will ever read). On the other hand, be careful when others use cognitive dissonance to manipulate you. For instance, brands might make something ordinary exorbitantly expensive and leave it to your mind to justify the price: it must be very well designed, or well constructed, or a limited edition. Disinformation campaigns take advantage of the fact that people choose not between lies and truth, but between what is and isn’t consistent with their worldview. People living in an authoritarian regime often resolve the dissonance between what the state has told them and what’s actually happening around them by believing that all the ills in their lives are caused by external forces rather than the blameless autocratic state.

  • Congruence bias: The “tendency of people to over-rely on testing their initial hypothesis (the most congruent one) while neglecting to test alternative hypotheses.” Given that the very first principle of iterative learning (see Self > Method: Iterative Learning) is “test everything”, this bias is an important reminder to check the validity of not just your first hypothesis, but all the alternative hypotheses you can think of too, even if those others might disprove your initial belief. Why did you fail at a job interview, for example? Your first intuition might point to a lack of experience, and it might seem that you can use the same reason to explain subsequent failed interviews. But until you test other hypotheses, e.g., how well you presented the knowledge you did have, your creativity in answering an open-ended question unrelated to experience, a biased interviewer, or a tough job market, you may never know the real reasons you didn’t get those jobs. Psychologist Jonathan Baron suggested asking two questions to avoid the traps of congruence bias:
  1. “How likely is a yes answer, if I assume that my hypothesis is false?”
  2. “Try to think of alternative hypotheses; then choose a test most likely to distinguish them — a test that will probably give different results depending on which is true.”
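Baron’s second question, picking the test most likely to distinguish between hypotheses, can be sketched as a simple comparison of how differently each test is expected to come out under rival hypotheses. The test names and probabilities below are hypothetical, made up for the interview example:

```python
# Two rival hypotheses about a failed interview:
#   H1: "I lack experience"   H2: "I presented my knowledge poorly"
# For each candidate test, we guess P(positive result | H1) and
# P(positive result | H2). All numbers are invented for illustration.
tests = {
    "ask interviewer about experience":   {"p_h1": 0.9, "p_h2": 0.8},
    "ask interviewer about presentation": {"p_h1": 0.9, "p_h2": 0.2},
}

def diagnosticity(test):
    """How strongly a result discriminates between H1 and H2.

    If the probabilities are nearly equal, the result tells us little
    (congruence bias: a 'yes' would feel confirming under either one).
    """
    return abs(test["p_h1"] - test["p_h2"])

# Pick the test whose outcome depends most on which hypothesis is true.
best = max(tests, key=lambda name: diagnosticity(tests[name]))
print(best)  # prints "ask interviewer about presentation"
```

The first test is nearly useless even though a positive result would feel like confirmation, which is exactly the trap Baron’s questions are designed to expose.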
  • Continued influence effect: “The tendency to believe previously learned misinformation even after it has been corrected.” This helps to explain why fake news is so potent and difficult to overcome: even after stories are fact-checked and debunked, they can continue to misinform us because our brain does not update the information, leading us to unwittingly share the debunked misinformation or use it to make future judgments or decisions. To combat this bias, choose factual sources when seeking out information or reading news, because once the info is in our brain, it might be hard to change our minds about it. Below are related biases.
  • Selective perception: “The tendency for expectations to affect perception.” For example, if you are an optimist, you tend to notice events that turn out well and forget events that didn’t. If you are a pessimist, the reverse is true. As such, what you expect becomes your reality as you see it. It follows that you have the power to “make your own heaven and hell, right here on earth”, as the lyrics go, simply by adjusting your expectations.
  • Subjective validation: “[P]eople will consider a statement or another piece of information to be correct if it has any personal meaning or significance to them…”, especially if the statement is positive. Subjective validation is easily observed when we seek advice from our friends, or vice versa. Oftentimes the advice-seeker will pick and choose the advice that is positive or that confirms what they already feel is significant to them. Beware of how subjective validation can fool us: studies have shown that no matter what is said in horoscopes or personality tests, we cherry-pick something positive or meaningful to us and feel the statements to be very true.
  • Semmelweis reflex: The “reflex-like tendency to reject new evidence or new knowledge because it contradicts established norms, beliefs, or paradigms.” For example, when you are in love with somebody, you may not notice their flaws until the day you break up. The antidote? Make a conscious choice to seek out challenging views and opinions. On social media, where algorithms decide what you see and don’t see, it’s advisable to subscribe to a variety of (truthful and fact-checked) views, including views that make you rage or cringe, so as to break out of the filter bubbles social media create for us. If they don’t know what views you truly believe in, they can’t lock you into one bubble or another.
  • Other confirmation biases include: backfire effect (not conclusive), experimenter’s or expectation bias, observer-expectancy effect, ostrich effect, subject-expectancy effect.

“We notice flaws in others more easily than flaws in ourselves.”

We will cover these biases in Groups > System 1: Social Biases.

“We notice when something has changed.”

Noticing changes was important to our survival — when it was clear that “winter was coming”, we prepared; when we noticed changes in our body, we took note to prevent sickness… Being over-sensitive to changes, though, might cause us to make wrong assumptions and decisions.

  • Anchoring or focusing effect: Our over-dependence on an initial piece of information (an “anchor”) when making subsequent judgments and decisions. When shopping online, the first item we come across might become the “anchor” against which all subsequent items are compared and judged. At work or in a negotiation, the first proposed solution will affect how all subsequent solutions are viewed and valued. Experts and novices are both susceptible to anchoring. Moreover, anchoring affects sad and depressed people more than happier ones, and agreeable, conscientious and open-minded people more than callous, careless and cautious ones. Functional fixedness and the law of the instrument are two types of anchoring bias. Getting out of our comfort zone, learning new methods and tools, and solving problems in novel ways will help us mitigate anchoring bias.
  • Assimilation and contrast effects: For example, “[t]hinking about Richard Nixon, a politician strongly associated with scandals, decreases the trustworthiness of politicians in general (assimilation effect), but increases the trustworthiness of every other specific politician assessed (contrast effect).” Similarly, putting yourself among attractive people makes you seem more attractive (assimilation effect), but if you appear right after a more attractive person, you will look less attractive by comparison (contrast effect). As with the decoy effect, assimilation and contrast effects can be exploited in advertising and politics to make someone or something seem more palatable or less desirable, depending on what they appear next to.
  • Distinction bias: The tendency to “over-examine and over-value the differences between things as we scrutinize them.” When presented with multiple opportunities at the same time, such as multiple potential lovers / jobs / pets / places to live, this bias teaches us that evaluating them separately would benefit us more: just because Lover A is better than Lover B doesn’t mean you won’t enjoy your time with Lover B, and just because Place X is more exciting than Place Y doesn’t mean you will notice the difference once you have moved to either place. Evaluating each option on its own merits and demerits helps us make the most sensible choice.
  • Framing effect: “Drawing different conclusions from the same information, depending on how that information is presented.” Prospect theory shows that we humans hate losses more than we like gains of the same size. By extension, we like statements framed to highlight gains (which we like a little) over those highlighting risks (which we hate a lot). Thus, condoms labeled 95% effective are more trusted than the same condoms labeled as having a 5% risk of failure. Meat labeled 75% lean is enjoyed with greater peace of mind than the same meat labeled as containing 25% fat. Note that the framing effect has been consistently shown to increase with age, meaning that as we grow older, we make more inconsistent, risk-averse decisions depending on how something is framed.
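The asymmetry behind the framing effect can be sketched with the value function from Tversky and Kahneman’s prospect theory. The parameters below are their published median estimates; the $100 scenario is made up for illustration:

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value function (Tversky & Kahneman, 1992).

    Gains are discounted mildly (x ** alpha); losses are amplified
    by the loss-aversion coefficient lam: losses loom larger.
    """
    if x >= 0:
        return x ** alpha           # subjective value of a gain
    return -lam * ((-x) ** beta)    # subjective (negative) value of a loss

gain = prospect_value(100)    # felt value of gaining $100
loss = prospect_value(-100)   # felt value of losing $100

# Losing $100 hurts about 2.25x as much as gaining $100 pleases,
# which is why a "5% failure" frame repels more than a
# "95% effective" frame attracts.
print(abs(loss) / gain)
```

With equal gain and loss exponents, the ratio of felt loss to felt gain is exactly the loss-aversion coefficient, so the asymmetry holds at any stake size.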

Related cognitive biases include: decoy effect, money illusion.

“Bizarre/funny/visually-striking/anthropomorphic things stick out more than non-bizarre/unfunny things.”

Our brains are wired to notice things out of the ordinary. When she was 4, my niece asked me why I had a spoon in the bathroom (I used it to open the hard-to-open mirror door), because even in her young mind, spoons and bathrooms did not fit together. This ability to detect unusual happenings helps us adapt to strange, unfamiliar situations, but it also makes us less sensitive to mundane but equally important things.

  • Bizarreness effect: Our tendency to remember bizarre things better than common occurrences. If you want to commit something to memory, find aspects of it that are out of the ordinary, or look at it from a fresh perspective. On the other hand, bear in mind that just because something atypical sticks out in your mind does not mean it happens more often or is more important than everyday happenings. For example, we should be more grateful to someone who loves and supports us consistently than to someone who occasionally takes interest in us. And if you want to succeed, persisting gives you a better chance than chasing shiny objects.
  • Humor effect: We tend to remember humorous things more than non-humorous ones due to “the distinctiveness of humor, the increased cognitive processing time to understand the humor, or the emotional arousal caused by the humor.” This helps to explain why funny memes are so viral. Also, to make ordinary things more memorable, try poking fun at them.
  • Picture superiority effect: “[P]ictures and images are more likely to be remembered than words.” In other words, “a picture is worth a thousand words”. The picture superiority effect helps to explain the proliferation of memes, and how Instagram’s popularity surpassed Facebook’s, only to be usurped by the even more bizarre/funny/visually-striking/anthropomorphic TikTok. In learning and communications, try using visuals to make your points stick. On the other hand, question how content providers influence you through easy-to-remember images on Instagram and TikTok.
  • Salience bias (an availability bias): “The tendency to focus on items that are more prominent or emotionally striking and ignore those that are unremarkable, even though this difference is often irrelevant by objective standards.” Shark Week is a good example: how many of us have encountered a real shark? Very few. Yet so many of us are obsessed with sharks at least one week a year. By comparison, we rarely think about the menace of mosquitoes, even though they are the deadliest animals to humans, killing more people than human murderers do.

Related cognitive biases include: agent detection, Von Restorff effect.

Do you have any suggestions, doubts, hypotheses or experiences with this topic? Please comment below 👇!


21CP

21stC Personhood: Cheatsheets for the 2020s is an index/summary of ideas pertinent to today's challenges, compiled for anyone working towards a #FutureWeDeserve