Behavioral economics: Your feelings don’t care about the facts

Jorge Valencia
10 min read · Aug 2, 2023


“We’re not thinking machines. We’re feeling machines that happen to think.” — Dan Ariely

Do you ever feel like you’re in control of your decisions, confidently navigating life with rationality as your faithful guide? Well, I hate to burst your bubble, but we’re not the masterminds we believe ourselves to be. We’re emotional beings, and our hearts reign supreme, while our rationality takes on the role of an obedient press secretary, justifying the whims of our emotions.

Psychologist Jonathan Haidt introduces a useful analogy to illustrate this concept. Your emotions are an elephant, and your brain is the rider, clinging to the elephant’s back. Sure, as the rider, you can try to steer the elephant in the direction you want to go, but ultimately, that elephant will do whatever it pleases. Rationality can nudge, but emotions have the final say.

Controlling your emotions is like riding an elephant. You can nudge, but the elephant has the final say.

Now, you might be thinking, “Wait, what about all those times I made logical decisions based on sound reasoning?” Well, we like to believe we’re rational, but in reality, our emotions are often the puppeteers pulling the strings behind the scenes.

Now, I know this might sound disheartening, like admitting you’re not the master of your own fate. But guess what? Embracing this truth is actually empowering. It’s an invitation to be humble and acknowledge that sometimes, we’re just not as in control as we’d like to believe. And that’s okay.

By recognizing the supremacy of emotions, we can look at our thoughts and decisions with a clearer lens. We become more aware of the biases that color our judgments and cloud our thinking. We can learn to navigate life with greater clarity, making better decisions in our work, our relationships, and most importantly, within ourselves.

Cognitive Biases and the Illusion of Rationality

Behavioral economics opened the door to the fascinating realm of psychology. A couple of years ago I became obsessed with the idea of being rational in every decision I made. I engaged in “rational” discussions with my teammates, and I designed products for what I believed were “rational” users. But no matter how hard I tried to be rational, things didn’t always work out as expected. I tried to be like Spock and failed miserably.

Humans can’t be fully rational; there’s a reason Spock isn’t human.

In our design meetings, when we debated contentious issues, like the position of a call to action, I would throw around articles and books to justify my “rational” decisions, yet my teammates seemed to act irrationally (and I’m sure they thought the same about me).

Even with users, the outcomes were not much better. They didn’t always follow the path I had meticulously designed; instead, they were drawn to shiny elements in the interface, guided by instinct rather than rational thought. Surprisingly, the designs that performed the best were the ones that “felt right,” not necessarily the ones that were objectively correct.

Why was everyone acting so irrationally? I started feeling perplexed until I stumbled upon the work of Daniel Kahneman, the father of behavioral economics. It was like being pulled out of the matrix — the veil was lifted, and I understood that we are not entirely in control of our decisions. We are prisoners to our cognitive biases, often acting based on gut feelings instead of deliberate reasoning.

The Rational Brain and the Lazy Brain

In “Thinking, Fast and Slow,” Daniel Kahneman introduced the concept of two thinking systems: the lightning-fast, intuitive System 1, and the slower, analytical System 2.

System 1: Fast & Dumb

Imagine System 1 as the eager but not-so-bright intern in your mind’s office. It’s the one who’s always on their toes, responding lightning-fast to emails, but sometimes sending out the most hilarious typos. You know, the intern who excitedly blurts out answers without thinking, leading to the occasional facepalm moment. Yep, that’s System 1 — quick on its feet, but not always the sharpest tool in the shed.

System 2: Rational & Lazy

Now, let’s picture System 2 as the thoughtful but laid-back professor in your mental faculty. This one enjoys sipping a cup of tea, taking their sweet time to ponder and analyze complex problems. While System 1 is running around like a headless chicken, Professor System 2 is calmly contemplating the universe’s mysteries, occasionally drifting off into daydreams. Sure, they might take a while to respond to your questions, but when they do, you can bet it’s a well-thought-out and insightful answer.

Because the analytical System 2 requires effort, we often rely on System 1’s gut feelings and intuitive judgments, even when a more considered approach would be appropriate. The brain is a beautiful mess.

As psychologist Dan Ariely pointed out in “Predictably Irrational,” we may think our choices are based on logic and sound judgment, but in reality, they are often swayed by hidden biases, social influences, and plain old irrationality.

So, while we may not be as rational as we’d like to believe, embracing our cognitive biases with self-awareness can lead to a better understanding of ourselves and the world around us.

5 Biases That Sneak Into Our Daily Lives

It might be overwhelming to learn all the cognitive biases (it's a long list), but don’t worry, you don’t need to become a walking encyclopedia of biases. Simply being aware of the most common ones can go a long way in helping you avoid their traps. Let’s explore five you’ve probably encountered in your own life.

1. Availability Bias: The Memory Trap

Image from thedecisionlab.com

Imagine you’re considering investing in the stock market, but a friend has just recounted their disastrous investment experience, leaving you wary. The availability bias swoops in, making that single negative instance dominate your thoughts, and you start believing that investing is inherently risky.

Are you basing your judgment on a single anecdote or researching the market’s historical performance and risks?

Our brains are wired to prioritize emotionally charged memories, creating an illusion of frequency and probability. Vivid stories may be memorable, but they don’t represent the overall reality.
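As a toy illustration (my own, not from the article), suppose memory recalls events in proportion to their emotional salience rather than their actual frequency. The “felt” probability of a rare but vivid event, like a market crash, then overshoots its real base rate; the salience weight of 10 below is an arbitrary assumption:

```python
import random

random.seed(42)

# Toy model: 1,000 market-years where crashes are rare (5% base rate).
years = [random.random() < 0.05 for _ in range(1000)]
true_rate = sum(years) / len(years)

# Salience: crashes are vivid (weight 10), ordinary years forgettable (weight 1).
# "Recall" samples years weighted by salience, mimicking availability bias.
weights = [10 if crash else 1 for crash in years]
recalled = random.choices(years, weights=weights, k=1000)
felt_rate = sum(recalled) / len(recalled)

print(f"actual crash rate: {true_rate:.2f}")  # close to the 5% base rate
print(f"'felt' crash rate: {felt_rate:.2f}")  # much higher
```

Because vivid years are ten times more likely to be recalled, the remembered sample wildly over-represents crashes, which is exactly the illusion of frequency the bias describes.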

2. Confirmation Bias: The Echo Chamber Effect

Image from thedecisionlab.com

When news and information related to your preferred party come your way, you eagerly absorb it, finding validation for your beliefs and feeling reassured that you’re on the right track. At the same time, you tend to dismiss or downplay any news or arguments from opposing political parties, labeling them as biased, misinformed, or even malicious.

Your social media feed becomes an echo chamber, filled with posts and opinions that align perfectly with your own, further reinforcing your political convictions. As a result, your mind becomes closed to alternative viewpoints and critical analysis, leading you to overlook flaws or shortcomings within your favored party.

This bias creates our own little echo chambers, where we only hear what we want to hear, reinforcing our preexisting beliefs. To overcome this bias, we must actively seek out alternative viewpoints, engage in respectful discussions, and remain open to evolving our beliefs.

3. Sunk Cost Fallacy: Walking Away is Hard to Do

Image from thedecisionlab.com

Ever found yourself stuck watching a boring TV show, refusing to quit because, well, you’re already halfway through the season? That’s the sunk cost fallacy at play. It tricks us into believing that the more time, money, or effort we’ve invested in something, the harder it becomes to let go, even if the situation is dragging us down.

So, the next time you find yourself watching a boring show, reading a dull book, or doing an unfulfilling job, remember to cut your losses and move on.

4. Loss Aversion: The Fear of Letting Go

Image from thedecisionlab.com

We humans are wired to avoid losses like the plague. The loss aversion bias makes us feel the pain of losing something far more intensely than the joy of gaining something of equal value.
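The article doesn’t give a formula, but this asymmetry is what Kahneman and Tversky’s prospect-theory value function models. A minimal sketch, using their commonly cited parameter estimates (α ≈ 0.88 for diminishing sensitivity, λ ≈ 2.25 for loss aversion):

```python
def value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value function (Kahneman & Tversky).

    Gains are concave (x ** alpha), so each extra dollar feels smaller;
    losses are scaled by lam, so a loss stings roughly 2.25x as much
    as an equal gain feels good.
    """
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

print(value(100))   # subjective value of a $100 gain
print(value(-100))  # subjective value of a $100 loss (larger in magnitude)
```

Plugging in ±$100 shows the asymmetry directly: the loss’s subjective magnitude is λ times the gain’s, which is why losing $100 hurts more than winning $100 pleases.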

Imagine you’re designing a mobile app, and during the early stages you included a flashy animation that was very hard to build. As time goes on, user testing and feedback show that the animation is distracting and adds unnecessary loading time, negatively impacting the user experience.

However, despite this feedback, you find yourself hesitant to remove the animation from the design. The loss aversion bias kicks in, and you become attached to the initial idea, fearing that removing it will somehow diminish the product’s appeal or result in the loss of a perceived “wow” factor. You may grapple with the decision, struggling to let go of something you invested time and effort into creating.

Yet, the truth is that by holding on to an outdated or problematic design element, you risk compromising the product’s usability and user satisfaction. So, it’s essential to recognize the loss aversion bias and prioritize the overall user experience over personal attachments.

5. Framing: The Art of Shaping Perception

Image from thedecisionlab.com

Consider a scenario where you are looking to buy a new smartphone. You’re presented with two options — one marketed as “water-resistant up to 30 meters” and the other as “not water-resistant above 30 meters.” Both statements convey the same information, but the framing bias comes into play, shaping your perception of the products.

The “water-resistant up to 30 meters” option seems more appealing because it highlights a specific capability, instilling a sense of durability and reliability. On the other hand, the “not water-resistant above 30 meters” option may leave you feeling uncertain and less confident about its ability to withstand water exposure, even though it describes exactly the same product.

This framing bias impacts consumer decisions, as we tend to be drawn to positive, affirmative statements that highlight the strengths of a product, rather than its limitations.

Other Biases to Take Into Account

  • Hindsight Bias: Believing an event was predictable or foreseeable after it has occurred.
  • Halo Effect: Forming a positive impression of a person or product based on one positive trait or characteristic.
  • Dunning-Kruger Effect: Overestimating one’s competence and knowledge while underestimating the competence of others due to a lack of self-awareness.
  • Default Bias: Preferring to stick with the default option or status quo rather than actively making a different decision.

Understanding the human mind is a superpower, but remember Uncle Ben's advice: with great power comes great responsibility. While cognitive biases can be harnessed for good — to design user-friendly experiences, make better decisions, and stay informed — they can also be used for less noble purposes.

Advertisers, politicians, and even social media platforms take advantage of these biases to manipulate and sway public opinion. Use this knowledge for good. Don’t be evil. You can use these quirks of human behavior to improve your life and the lives of others.

“The confidence people have in their beliefs is not a measure of the quality of evidence but of the coherence of the story the mind has managed to construct.” — Daniel Kahneman

Stop Fooling Yourself And Make Wiser Choices

So, you’ve taken the red pill and ventured into the intriguing world of cognitive biases. Now, get ready to see biases everywhere you look; they’re lurking around every corner, just waiting to trip you up. But now you have a secret weapon to question assumptions, challenge your own beliefs, and make more informed choices.

Use this weapon when designing products by appealing to users’ emotions; remember that feelings trump reason most of the time (even when we’re unaware of it).

Use it when debating with your teammates by relying on data instead of anecdotes and personal preferences, and by not falling into groupthink.

Use it when navigating social media by embracing diverse viewpoints and avoiding the filter bubbles and echo chambers that reinforce your confirmation bias.

And the next time you catch yourself falling into the clutches of a cognitive bias, take a step back and examine your thoughts. Ask yourself why you’re making a particular choice, and whether your emotions are leading the charge. By doing so, you’ll reclaim some control over the reins, allowing your rational mind to step up and nudge the elephant in the right direction.

Suggested readings

If you share my fascination with the field of behavioral economics, I highly recommend starting with the two books cited throughout this article: Daniel Kahneman’s “Thinking, Fast and Slow” and Dan Ariely’s “Predictably Irrational.”



Jorge Valencia

Head of Design at Runroom, Product Designer at the core, lover of the human psyche and a 2000s nostalgic.