MINDSET

The Biased Ways Our Minds Craft Our Reality

Reinforcing What We Want to Believe

Alejandro Betancourt
Published in Bottomline Talks · Apr 30, 2024

Photo by Google DeepMind from Pexels.

“The moment we want to believe something, we suddenly see all the arguments for it, and become blind to the arguments against it.” — George Bernard Shaw

We all have beliefs and perceptions about ourselves and the world around us. These beliefs help provide understanding, certainty, and confidence as we navigate life.

Sometimes, new information or experiences challenge our ideas and cause us to feel discomfort and uneasiness. Psychologists refer to this feeling as cognitive dissonance.


Cognitive Dissonance

In the 1950s, psychologist Leon Festinger coined the term cognitive dissonance: the discomfort we feel when we hold two conflicting beliefs, or when our behavior contradicts our beliefs. The mind perceives these contradictions as dissonant or discordant.

For example, a person who smokes knows that smoking can cause cancer and other health issues, yet continues the habit. Because they enjoy smoking, the realization that it is harming them produces unease and discomfort. That tension is cognitive dissonance.

People may also experience dissonance when they learn new information that contradicts their beliefs, and the discomfort grows as they struggle to accept the new evidence.

Festinger’s theory says people try to reduce cognitive dissonance because it feels uncomfortable. When people feel tense, they use mental strategies to make their reality match their beliefs. These strategies include denial, rationalization, and selective attention.

When faced with contradicting facts, people sometimes cling to their beliefs. Cognitive dissonance helps us understand this. Our minds will go to great lengths to hold onto a preferred reality and avoid discomfort from conflicting information and change.

Denial — Outright rejecting or ignoring the contradictory evidence. Pretending it doesn’t exist is easier than questioning our assumptions.

Rationalization — Making excuses to justify holding onto our original belief despite the conflicting data and convincing ourselves that disconfirming information is flawed or unreliable.

Selective attention — Only focusing on specific details that align with our existing views. Overlooking or downplaying facts that contradict what we believe.

These mental maneuvers allow us to maintain our preferred beliefs by skewing how we process challenging information. However, being mindful of when we slip into these strategies can help us recognize when cognitive dissonance may be biasing our thoughts.

“People become attached to their burdens sometimes more than the burdens are attached to them.” — George Washington Carver.

Confirmation Bias

Cognitive dissonance is the uneasy feeling we get when we have conflicting ideas or beliefs. To reduce this mental discomfort, our minds look for ways to make our original thoughts fit the new, contradictory information. These mental adjustments often introduce biases that allow us to craft reality to match our beliefs and needs.

For example, think back to when you bought a new car. You researched different makes and models, compared features, and justified your purchase. After making the significant investment, you probably dwelled on all the great things about the new car: the smooth ride, the fancy dashboard features, the good fuel efficiency. You stopped noticing tiny flaws or drawbacks because you needed to believe you made the right choice. This selective focus is confirmation bias at work.

Confirmation bias causes us to look for and remember information that confirms what we already think or believe. We notice and give more attention to events and data that fit our existing narratives and expectations. We tend to ignore or forget information that contradicts our beliefs. It makes us uncomfortable.

Motivated Reasoning

Motivated reasoning is a bias in which we evaluate information in whatever way supports our pre-existing views.

For example, in a Stanford study, participants were shown fabricated scientific studies about a new skin cream. Some of the studies suggested the cream caused a rash, while others suggested it had health benefits. Participants accepted the studies that supported their prior beliefs and rejected any that contradicted them.

Even when our beliefs are challenged, we often double down and strengthen our original views instead of changing them. Psychologists call this the backfire effect.

In one study, participants were shown factual data that discredited a common misconception they held. Strikingly, this led people to believe the misconception even more strongly.

We do not like being told we are wrong, so we work hard to justify our original thinking.

We can also fall into subjective validation: seeing connections and meaning between unrelated events in order to confirm our beliefs. For instance, a sports fan may believe his team only wins when he wears his lucky jersey. In reality, his clothing has no effect on the wins and losses, but the illusion of control and connection satisfies a belief.

Have you noticed any of these tendencies in your thinking and behavior? In what ways might your objectivity be clouded by the mind’s need to craft reality to your beliefs and needs?

Turning a critical eye inward takes courage but can lead to improving oneself.

“The eye sees only what the mind is prepared to comprehend.” — Henri Bergson

Avoiding the Traps of Biased Thinking

Cognitive biases impact our thinking, but we can become more aware and lessen their effect.

  • Look for different opinions: Don’t just listen to sources you agree with. Try to understand other perspectives, even when they conflict with yours. Being challenged can illuminate blind spots.
  • Ask probing questions: When you feel intensely about something, ask yourself probing questions to uncover potential biases driving your viewpoint. What evidence would change my mind? How might others see it differently?
  • Be alert to loaded language: Words like “clearly” or “obviously” signal biased thinking by shutting down questioning. Retrain yourself to use more measured language.
  • Separate facts from opinions: Facts are objective observations. Opinions are the subjective meaning we assign to those facts. Don’t confuse the two in your thinking.
  • Consider the opposite: Entertain rival hypotheses — if your current belief is X, assume Not-X is true. What evidence could support an alternative explanation?
  • Talk to someone you disagree with: An open, good-faith dialogue with someone holding an opposing viewpoint can reveal blind spots on both sides.
  • Focus on precision over speed: Fast thinking fueled by confirmation bias often wins out over slower, deeper analysis. Slow down and prioritize accuracy over the satisfaction of a quick conclusion.

By countering our natural tendencies, we can loosen the grip of biases and pursue a more balanced perspective, not one crafted to preserve what we want to believe. An openness to contradictory evidence lays the groundwork for views that cohere more closely with reality's complex nature.

While the brain’s bias towards crafting reality can be helpful in some ways, it can also lead us astray from the truth. With awareness, we can watch for these biases and be more open-minded.

Before you believe a study or statistic that supports your beliefs, ask yourself: Would I still find it credible if it contradicted what I think is true? Am I resisting or rationalizing contradictions to keep my preferred reality intact?

We can understand better by questioning our assumptions and examining information, even if it challenges our beliefs. This takes humility and the willingness to admit we may not have all the answers.

As we loosen our grip on our assumptions, we open ourselves to growth and a reality more complex and nuanced than the one our biased minds create. The truth often lies between extremes, making room for paradoxes.

As legendary author Anaïs Nin observed, “We don’t see things as they are; we see them as we are.” The choice to seek truth with an open and curious mind could not be more critical.

Please share this article with your friends and family if you found it helpful. You may also sign up to receive updates when I publish new material!

I first published this article in “Beyond Two Cents” on September 18th, 2023.


© Alejandro Betancourt, 2024. All Rights Reserved.

Entrepreneur, Investor, Executive Coach & Author. Single Dad sharing insights on Mindset, Philosophy, and Self-Improvement.