How to spot the top 5 real estate blind spots.

What are the mental shortcuts that help us make quick decisions and judgments? Where did they come from? And how will they cost you money, time, and peace of mind when buying or selling real estate?

First, understand that these shortcuts, sometimes referred to as heuristics, are automatic. Your brain uses them to process the available information and arrives at a decision without you being aware of what’s happening.

This isn’t a problem in and of itself. In fact, it’s necessary. It would be untenable to go through life having to calculate the actual risks and rewards of every possible situation. In many, if not most, areas of our lives, arriving at answers that are “close enough for jazz” works out fine. I don’t need to know the actual, precise costs and benefits of, say, taking a particular route to work. I take the route I take and am generally satisfied with it. I can live with the five minutes longer or shorter that a different route might cost or save me. It’s not something I need to do a deep dive on.

Suppose that during one walk to work I see a piano being hoisted into a building by some sort of rope and pulley system. I’d be likely to walk around it without needing to calculate the precise odds of it falling. I don’t need to know the rope’s tensile strength or the weight and strength of the person doing the hoisting. I make an instant judgment and step around the area. I have enough Wile E. Coyote “available” to me to realize that ropes break. I also don’t need a PhD to know that while Mr. Coyote might shake himself back from a flattened state, such an outcome is less likely for me.

But what is actually happening when I make that snap judgment? What’s the process my brain used to do that? If it wasn’t based on facts (e.g., tensile strength), what was it based on? More importantly for the purposes of your next real estate transaction: Is what happens when we make such snap judgments always useful? Does it always lead us to the best possible outcome? When the impact of the wrong choice is the difference between a five-minute detour and possible death, it’s a no-brainer. But what happens when the outcomes are less obvious and don’t demand an instant answer?

In the case of buying and selling your home, what snap judgments are you using? Why do you pick one real estate agent over another? Where did the dollar value you placed on your home come from? Why is one home worth offering more for than another? Is it based on rational thought, or arrived at by a mental shortcut?

There are hundreds, if not thousands, of variables in an ordinary real estate transaction. It is a complex situation with an uncertain outcome. Natural selection encouraged, developed, and refined our brains to use these heuristics to handle this kind of situation quickly and decisively.

In fact, when faced with a life-or-death situation, when an instant answer is necessary, our brains do an admirable job. But real estate is not life or death, and instant answers are not necessary. While there’s a considerable amount of money and a good chunk of peace of mind at stake, it isn’t life or death. Even in the fastest-moving markets, you needn’t generate an instant answer.

For over 200 years, marketers have used our own thinking processes to sell us everything from snake oil to elected officials (not an accidentally similar set of examples). Our own brains betray us daily as we attempt to divine the correct choices in matters where the outcomes are uncertain.

My mission here is to shed some light on these blind spots: to take a peek inside what I am calling the “black box” of residential real estate. My hope is to answer the question: what is your brain doing before, during, and after buying or selling your home? And, in engaging with that question, to help you avoid the most common pitfalls I’ve seen cost people time, money, and peace of mind.

As a disclaimer, let me make a few things clear: The closest I came to even completing a course in psychology was failing Psych 101 in 1982.

This was the year I won both Partier and Drinker of the Year at my fraternity. While this was the first time anyone had won both awards in the same year (a point of pride at the time), it does point to the truth that academia was not a strong suit for me.

This is not a treatise on behavioral economics, nor am I pretending this is technically accurate. I’ve no doubt students of psychology and/or economic theory could point out inconsistencies and inaccuracies in what I write.

My point isn’t to teach you behavioral economics as a theory. I am out to show how these theories shape a very specific, expensive, and significant scenario that, given you’re reading this, you’re likely about to deal with.

I’ve done my research in good faith. I’ve attempted to get at least close to making sense to someone familiar with these theories. Most importantly, however, everything I’ve written about comes from personal observation. This is how people actually behave when shopping for a real estate agent, a home, or putting their own home up for sale.

It’s my hope that you, in seeing what is happening in your thinking, will be able to use these insights to alter your behavior and make more rational decisions.

Buying and selling real estate is not like buying soap or chips, or any other consumer product, yet your brain treats this transaction the same way it treats buying dinner. In other words, if these heuristics are at the source of what brand of soap or chips winds up in your cart the next time you’re at the store, the impact on you is negligible. (And millions of dollars are currently spent to make sure these heuristics are at play when you do fill your cart.) But when these mental shortcuts cause you to lose the house of your dreams to another buyer, or when the sale of your home nets you $25,000 less than you could have gotten and takes an extra 3–4 months — well, that’s a much bigger deal. Especially when rational thinking along the way would have made the difference.

The heuristics we’ll be looking at are:

  • Representativeness
  • Availability
  • Prospect Theory
  • Confirmation Bias
  • Anchoring

This article is designed to examine each heuristic on its own, unrelated to real estate. I’d like to create a little background and shore up your understanding of each. Then, depending on whether your interest is in home buying or home selling, you can reference the steps of those processes elsewhere on my blog, or in my book, Real Estate Blind Spots: A look inside the black box of buying and selling your home.

As you discover the most common blind spots, you’ll develop alternate ways to think about buying and selling a home, creating a path to better results.

Let’s get started!

Representativeness Heuristic vs. Base Rate

Three quick demonstrations should make this shortcut clear:

1. Let me tell you about Sarah. Sarah loves to listen to New Age music and reads her horoscope each day. In her spare time, she enjoys aromatherapy and attending a local spirituality group.

Based on the description above, is Sarah more likely to be a school teacher or a holistic healer?

2. Which of the following outcomes is more likely when flipping a coin ten times: HTTHTTHTHT or HHHHHHHHHH (where H is heads and T is tails)?

Which do you pick?

3. Meet Tom. Tom is of high intelligence, although lacking in true creativity. He has a need for order and clarity, and for neat and tidy systems in which every detail finds its appropriate place. His writing is rather dull and mechanical, occasionally enlivened by somewhat corny puns and by flashes of imagination of the sci-fi type. He has a strong drive for competence. He seems to feel little sympathy for other people and does not enjoy interacting with others. Self-centered, he nonetheless has a deep moral sense.

Which degree is Tom most likely studying for: engineering or social sciences/history?

The correct answers, statistically speaking, are:

  1. Sarah is more likely to be a school teacher.
  2. The coin is just as likely to be flipped in either pattern.
  3. Tom is 17X more likely to be a social science major than an engineering major.

Surprised? Were your answers incorrect? They very likely were because most people don’t answer from the actual probabilities. Why? Because people are horrible at predicting the actual, in-reality, real-life probability of something occurring.

This failure to judge outcomes consistent with reality gets worse when specifics are presented with the data. In fact, often the more detailed the available data, the more likely someone is to use it to determine the answer, regardless of whether the information is relevant to the actual likelihood of the event happening.

Let’s strip away the specifics and look only at the statistics to arrive at answers that are more probable.

  • There are more school teachers than holistic healers in the world, so it’s more likely that Sarah is a teacher.
  • Coin flipping always has a 50/50 chance of landing on heads or tails.
  • Social science/history majors outnumber engineering majors by more than 17:1.

Those statistics on their own wouldn’t shock most people. It’s that these statistics are true, regardless of what other information you may know about Sarah or Tom, that confuses the issue.

Change the question a bit. Suppose you knew Sarah likes books and children instead of horoscopes and New Age music; what would your answer have been? Suppose none of that information was available and all you knew was that Sarah is 5’5” and weighs 120 lbs. What then? Suppose you knew Sarah liked dogs, kids, New Age music, horoscopes, and sushi?

Details matter, but not in the way you think they do. Having more information often leads further from the truth. What’s happening is the information you have causes what you’re looking at to fit into one specific “representative” mold, or mental image. Your brain is a pattern-seeking machine. It’s relentless in hunting for patterns and fitting the world around you into ones familiar to you.

Your brain might be looking for statistical probability, but it’s really bad at it. It’s fooled by random, unrelated data. If something looks like the mental model you already have, your brain pulls in that direction. In other words, if the data on hand is representative of something you already have in mind, that will seem like the right answer.

In the case of the coin flip, the first pattern looks more random; that is, it’s more representative of the image you have in your mind of a random coin flip. It’s the fact that it looks random to you that’s the issue. Now you’ll operate as if it were the more likely outcome, despite the fact that the odds of either exact pattern arising are identical.
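If you’d rather check the arithmetic than trust your gut, the math is simple: any one exact sequence of ten fair flips has a probability of (1/2)^10, or 1 in 1,024. Here’s a minimal sketch in Python (purely illustrative) that computes it and backs it up with a quick simulation:

```python
import random

# Any ONE exact sequence of 10 fair coin flips has probability (1/2) ** 10.
p_exact = 0.5 ** 10
print(f"Probability of either exact pattern: {p_exact:.6f} (1 in {int(1 / p_exact)})")

def frequency(pattern: str, trials: int = 200_000) -> float:
    """Estimate how often an exact 10-flip pattern comes up, by simulation."""
    hits = 0
    for _ in range(trials):
        flips = "".join(random.choice("HT") for _ in range(10))
        if flips == pattern:
            hits += 1
    return hits / trials

print("HTTHTTHTHT:", frequency("HTTHTTHTHT"))  # roughly 0.001
print("HHHHHHHHHH:", frequency("HHHHHHHHHH"))  # roughly 0.001
```

Both patterns land at roughly 1 in 1,024; the one that “looks random” has no edge at all.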

This reliance on what “looks random” is called the Gambler’s Fallacy. It’s another example of how, rather than dealing with the specific probabilities of something happening, we’ll go with our “gut.” We are, in fact, organized to do so.

People don’t use data as data when making decisions. They use data in the form of a story, data after it has been “patterned,” and then pretend their decisions are based on the data, not their interpretation of it.

If you said Tom was an engineering student rather than a social sciences student, you were using your personal interpretation of the information you had. You used subjective characteristics, rather than the fact there are 17 times more people studying social sciences.

For many people, even if they have the numbers of students in each field at their disposal, they still rely on the way the information they have “patterns” itself into a story. If it fits (i.e., represents) what they picture as an engineering student, then they will say he’s an engineering student.

The pull of the base rate (the statistical probability of something being more likely to be true) is far weaker than the Representativeness Heuristic in moving the needle in people’s decision making, yet most people would claim to be rational decision makers.

The truth is we’re unprepared to use the statistical likelihood of something happening to make our decisions.
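To see just how much weight the base rate should carry, here’s a rough Bayes-style sketch of the Tom question. The 17:1 ratio comes from above; the assumption that Tom’s description fits an engineering student three times better than a social science student is entirely hypothetical, chosen just to make the point.

```python
# Base rate from the text: social science/history majors outnumber engineers about 17 to 1.
prior_eng = 1 / 18
prior_social = 17 / 18

# Hypothetical, for illustration only: suppose the description is three times as
# likely to describe an engineering student as a social science student.
likelihood_eng = 0.75
likelihood_social = 0.25

joint_eng = prior_eng * likelihood_eng
joint_social = prior_social * likelihood_social
total = joint_eng + joint_social

print(f"P(engineering | description) ~ {joint_eng / total:.2f}")        # about 0.15
print(f"P(social science | description) ~ {joint_social / total:.2f}")  # about 0.85
```

Even when the description favors the engineering stereotype three to one, Tom is still roughly 85% likely to be a social science student. The base rate does most of the work, whether we feel it or not.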

The Representativeness Heuristic, like all heuristics, is automatic, unseen, operating in the background, and has us make assumptions and take actions that are counter to what reality says would be productive.

Availability Heuristic

The term, first coined in 1973 by psychologists Amos Tversky and Daniel Kahneman, suggests that this shortcut has us operate under the principle that “if you can think of it, it must be important.” If something comes to mind easily, we believe it to be far more common, and more impactful, than it may actually be.

As an example of this heuristic at play, look at this experiment:

Consider the letter K. In any given text (this article, Hamlet, War and Peace, it doesn’t matter) does K more often appear as the first letter of a word or the third letter? And if you had to guess, what would you estimate the ratio of these two is (i.e. __:1)?

In that 1973 paper, Tversky and Kahneman found that roughly twice as many people in the study answered that K appears more often as the first letter of a word than as the third, and they estimated it appears there about twice as often.

This is almost exactly backward. In English texts, words with K as the third letter are roughly twice as common as words that start with K.

So why did people think it’s the other way around? Tversky and Kahneman suggested it’s because it’s easier to think of words that start with K than it is to think of words that have K as the third letter. That is, words with K as the first letter are more available in our minds; therefore, we place more weight on them.
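If you’d like to test the claim yourself rather than rely on what comes to mind, the check is easy to run against any text file. A minimal sketch (the file name is just a placeholder):

```python
import re

def k_first_vs_third(text: str) -> tuple[int, int]:
    """Count words with 'k' as the first letter vs. as the third letter."""
    words = re.findall(r"[a-z]+", text.lower())
    first = sum(1 for w in words if w[0] == "k")
    third = sum(1 for w in words if len(w) >= 3 and w[2] == "k")
    return first, third

# Point this at any reasonably long English text.
with open("sample.txt", encoding="utf-8") as f:
    first, third = k_first_vs_third(f.read())

print(f"K as first letter: {first}, K as third letter: {third}")
```

On most ordinary English prose the third-letter count wins, which is exactly the opposite of what easy recall suggests.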

Additional studies around this heuristic have been conducted in which one group of subjects was asked to share six examples of times they’d been assertive in life. Most in this group could think of six examples. Another group was asked for 12 examples, which few people could complete. Both groups were then asked to rate how assertive they were. The group that had been asked to recount just six occasions scored themselves higher because, as the researchers interpreted it, a greater proportion of their available memories showed them being assertive (Schwarz 1991).

Even things totally fictitious, if present (available) for us, can drive our behavior. I’ve certainly been affected by this heuristic when my wife has had dreams during which my behavior was less than, how shall we say, appropriate. Regardless of the fact this was a dream she had while I was asleep by her side, the memory of what I had “done” was available to her and the impact on our communication in the morning was very real.

Most readers will have had the experience of worrying about a rare but vivid event that they’ve recently heard about. It’s why people are more worried about homicide than stomach cancer (even though ten times more people die from the latter each year). It’s why, when you see a car accident while driving, you slow down even though conditions are the same as they were before seeing the accident.

The Availability Heuristic makes us susceptible to overestimating the likelihood of uncommon but vivid events. We live in fear of terrorism and child abduction, despite these being rare and, at the same time, we ignore asymptomatic diseases that are both common and deadly. The first thought that comes to us, we think more likely and more important. It’s as if the very fact that we think it quickly means it’s more important.

Anything that keeps the idea or event stuck in our mind will impact its availability and thus shape our judgments. The more recent the event, the more likely it is to shape our actions, which is why we can say that time heals all wounds. As memories fade, they become less available.

Vividness is another critical factor in availability. In experiments, psychologists have shown that when test subjects were told of a new disease that had recently been discovered, those taking the time to imagine their life with the disease estimated they were more likely to be infected than the group that was simply told about it but not asked to visualize their life with the disease.

Overall, heuristics are part of our psychological inheritance. Like other naturally selected attributes they protect us from making mistakes that could’ve gotten us killed on the savannah. I suppose it’s for this reason imagined negative outcomes tend to stay in our minds longer, and are more vivid, than positive outcomes — worrying as a matter of survival.

Again, most readers will have experienced some form of this firsthand — fear of negative outcomes is a greater driver of action than desire for positive ones partly because negative outcomes are more available to us.

There is a direct link here to our next heuristic: Prospect Theory.

Prospect Theory

The fact that avoiding loss is more important to people than attracting gains is not much of a surprise.

Prospect Theory shows us how we operate when faced with the choices of potential losses and gains… and it’s not rational! Again, no surprise, right?

According to Prospect Theory, first introduced by Kahneman and Tversky in 1979, we tend to value a gain that’s certain — a bit of the proverbial “sure thing” — more than a gain that is less than certain, even when the expected value of each is the same. Stated another way: even if the statistical odds and net gains of the certain and the less certain options are the same, we will “pay” more for the certain one. We become risk averse.

But in the case of losses, we behave in the opposite manner: We will take even bigger risks to avoid a certain loss. In general, when the risk seems higher, we focus on potential losses, and when the risk seems lower, we switch to focusing on potential gains.

Consider one of the seminal experiment designs:

Test subjects were told to assume there was a disease affecting 600 people and they had to choose between two programs which had different probabilities for success:

• Program A: 200 of the 600 people will be saved.

• Program B: There’s a 33% chance that all 600 people will be saved and a 66% chance that nobody will be saved.

When faced with the choice between a certain outcome and an uncertain one, the majority chose the certain one (Program A).

Then the same subjects were offered two more choices:

• Program C: 400 people will die.

• Program D: There’s a 33% chance that nobody will die and a 66% chance that all 600 people will die.

Now, given the framing of the question to highlight the risk of loss, most people chose D. This despite the fact that the outcomes of Programs A and C are identical, as are the outcomes of Programs B and D.
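A quick way to see that the four programs are statistically interchangeable is to compute the expected number of survivors for each. (The 33% and 66% above are rounded from the 1/3 and 2/3 used in the original study.) A minimal sketch:

```python
from fractions import Fraction

p_all, p_none = Fraction(1, 3), Fraction(2, 3)  # the 33% / 66% above, unrounded

expected_survivors = {
    "A: 200 saved for sure":            200,
    "B: 1/3 all saved, 2/3 none saved": p_all * 600 + p_none * 0,
    "C: 400 die for sure":              600 - 400,
    "D: 1/3 nobody dies, 2/3 all die":  p_all * 600 + p_none * 0,
}

for program, survivors in expected_survivors.items():
    print(f"{program} -> {survivors} expected survivors")
```

All four come out to 200 expected survivors. Only the wording changes, yet the wording flips the majority’s choice.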

When faced with potential loss, people are consistently willing to take greater risks.

Prospect Theory demonstrates that when it comes to potential gains, people are, by default, risk averse, and when it comes to avoiding potential losses, people become risk seeking. The fact that we will take greater risks to avoid losses is the heart of what’s known as loss aversion.

Winning may make us happy, but losing definitely makes us miserable. In other words, we need to win roughly $250 for the pleasure to offset the pain of losing $100. We’re much quicker to feel bad about loss than we are to feel good about gain. There’s a demonstrable asymmetry in our behavior when it comes to going for gains vs. avoiding losses, which Prospect Theory estimates at roughly a 2.5X difference.

This, as you might guess, impacts us in a myriad of ways. First of all, it makes no rational sense. When it comes to avoiding losses, you and I become irrationally risky, meaning our responses differ from statistically appropriate responses, creating more risk of loss… not less. In other words, when faced with the risk of losing money, we tend to take actions that actually increase the chance of losing more money, not less.
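To put a number on that asymmetry, here’s a minimal sketch of a “felt value” function built from the rough 2.5X ratio mentioned above. It’s an illustration of the ratio, not the actual value function from Kahneman and Tversky’s paper.

```python
LOSS_AVERSION = 2.5  # rough ratio cited above

def felt_value(dollars: float) -> float:
    """Gains count at face value; losses 'hurt' about 2.5 times as much."""
    return dollars if dollars >= 0 else LOSS_AVERSION * dollars

print(felt_value(250) + felt_value(-100))  # ~0: a $250 gain barely offsets a $100 loss
print(felt_value(100) + felt_value(-100))  # -150: an even-money bet feels like a loser
```

That second line is why most people turn down a 50/50 bet to win or lose $100, even though its expected value is exactly zero.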

Look at another experiment, this one conducted by the noted behavioral economist Dan Ariely:

• Salespeople were paid a commission to sell TVs.

• TV manufacturer “A” paid the commission up front. That is, salespeople were given the total possible commission for selling all of “A’s” TVs before they had even sold a single one.

• TV manufacturer “B” paid the commission the normal way — after the sale.

• When a salesperson sold TV “B,” they received $12 as commission, but, because they hadn’t sold TV “A,” they had to return $10 of the prepaid commission.

While this still generated a net gain for the salesperson of $2, salespeople consistently oversold TV “A” because the pain of giving back $10 was greater than the pleasure of receiving $12.

This experiment also illustrates the Endowment Effect. In the Endowment Effect, we don’t see gains or losses against absolute zero but against that which we already own. Once the salespeople were given the money by the manufacturer of TV “A,” they considered it theirs. Losing their money was more painful than the possibility of gaining someone else’s, even when the amount of gain is bigger than the loss.

The Endowment Effect shows that as soon as we own something, we start to look at life through the lens of potentially losing that thing. In a market situation, we tend to overvalue what we own. The flip side is that the market tends to overvalue its own possession (the money to buy what you own). Since we’re averse to loss, it’s no surprise that our perspective on valuing what we bring to the transaction focuses on the loss of what we have — either the money when buying or the object when selling.

Confirmation Bias and the Backfire Effect

Confirmation Bias is the idea that, no matter how open-minded we pretend to be, we seek information that confirms what we already believe to be true.

While science might be the game of arriving at the truth through pursuing evidence that contradicts the current hypothesis, our daily lives are full of anything but.

Consider this experiment Peter Wason conducted in the 1960s: Subjects were given a sequence of three numbers (2, 4, 8) and asked to determine the rule that governed the sequence. The subjects could discover the rule by proposing their own number sets and receiving a simple “yes” if a set followed the rule, or a “no” if it did not. They could then guess what the rule was.

Time after time, subjects would come up with a hypothesis for what the rule might be and then propose numbers to prove their hypothesis. Very rarely would someone propose a sequence designed to disprove their original idea.

In other words, given a sequence of 2, 4, 8, they might think, “Oh, that’s a sequence of doubling numbers,” and propose sequences like 4, 8, 16 or 100, 200, 400 or 6, 12, 24, and so on. All of these do, in fact, fit the rule, and after several such attempts, they would become satisfied that they had discovered it.

The problem is that this wasn’t the rule.

All the subjects had done was find more sequences that fit, not the rule itself. It’s only by seeking out information that disproves or doesn’t fit the existing hypothesis that you can hope to discover the rule.

But we don’t do that.

Because of Confirmation Bias, we look for that which supports what we already believe… hence the sheer size of the political pundit industry.

Oh… so what was the rule in the experiment? It was far broader than any of the hypotheses subjects tested: in Wason’s original task, any sequence of increasing numbers satisfied it, which is exactly why piling up confirming examples could never reveal it.

The Backfire Effect is the flip side of Confirmation Bias. Not only do we seek out information that confirms and validates our existing views, but when we do come across information that contradicts them, we tend to double down on what we already believe rather than change our belief to fit the new evidence.

History could not be clearer on this point. How was Galileo treated when he presented the idea that the earth was not the center of the universe? With a parade, as a hero? No. He was tried for heresy and confined to house arrest. But you would never do that, right? You’re not so closed-minded. You’re open to new ideas and welcome all points of view. Really? The evidence shows you’re not that open-minded… (and now watch your immediate reaction to that — how open-minded are you to my even suggesting you’re not open-minded?)

The evidence shows that correcting someone’s misconception on a topic can actually increase their belief in the misconception, not decrease it. Worse, the closer the topic is to our personal sense of self, the more pronounced the Backfire Effect becomes. In other words, trivial topics are not as affected by this phenomenon as topics integral to who we consider ourselves to be.

A study conducted by researchers at Dartmouth College had people read an article containing a quote from President George W. Bush asserting that tax cuts “helped increase revenue to the Treasury.” In some versions of the article, this claim was then corrected with evidence showing that revenue to the Treasury actually declined for the three years following the tax cuts, from $2 trillion in 2000 to $1.8 trillion in 2003… and you know what they say: a trillion here, a trillion there, and pretty soon we’re talking about real money.

So what happened when shown the evidence? People describing themselves as conservatives were twice as likely to think the tax cuts generated more revenue when they read the article with the contrary evidence as they were when they read the article without the evidence.

If for some reason you read this and believe these types of results are particular to conservatives and not liberals, you might consider you are, right now, at the effect of your own Confirmation Bias.

The upshot is that we’re never as open to new ideas as we imagine. Our willingness to consider alternative points of view is constrained on one side by Confirmation Bias, where we seek information that proves we’re right, and on the other by the Backfire Effect, where evidence proving us wrong is not simply ignored; it drives us further into our mistaken beliefs.

Anchoring Heuristic

The Anchoring Heuristic points to our tendency to use the first piece of information offered (the “anchor”) when making decisions.

Think of buying a car — the first piece of information is the manufacturer’s suggested retail price (MSRP). That number becomes the anchor. All negotiations now revolve around that number. Now when the salesman says, “Let me go back to my manager and see what I can do” and comes back with a number that’s lower than the MSRP, you have no choice but to think you’re getting a deal.

This kind of makes sense. You would have no way to really know what the car is worth or where to start the negotiation without some figure, so the MSRP serves a purpose. However, if you think the Anchoring Heuristic is limited to occasions that seem logical or useful, you’d be mistaken. Any random number will serve as an anchor and will shape the amount you’ll pay for something.

Consider this experiment conducted by Dan Ariely with colleagues George Loewenstein and Drazen Prelec, which Ariely reported in his book, Predictably Irrational. They had MIT students take part in an auction of sorts. Up for bid were a bottle of wine, a computer trackball, a textbook, and other items. The researchers would explain the benefits of owning each item and then ask the students how much they would pay for each one.

Before they could write their offers, the students had to write the last two digits of their social security number at the top of the bidding sheet. Then, next to each item, they were to write those same digits as a dollar amount.

For example, if I were taking the test, I would write 14 on top of my bidding sheet, as my social security number ends in 14. Then further down the page next to each item, I would write $14. If my son were doing this he would write 82 on the top and $82 next to each item, and so on.

Now the students could write their offer of what they’d actually pay for each item. The instructions were to write it next to the dollar amount that came from their social security number.

What do you think happened? No doubt the social security number was a random, unrelated number. It couldn’t have anything to do with what people would pay, right? The students were well aware of what they were doing. The lack of any correlation between the number on the page and the value of the items on which they were bidding was in no way hidden.

The results? Students with social security numbers that ended between 80–99 were willing to pay up to 346% more for the items than the students with social security numbers with the last two digits between 00–20.

The experiment could have used today’s temperature, the score of a football game, or the number of angels you can fit on a pin as an anchor. The brain will not discriminate. It fixates on the first number it sees and that becomes the anchor.

By the way, and important to note, this happens with more than money. The house you’re in becomes an anchor in your mind that shapes and determines your thinking about your next house. People can become anchored to the past — when gas was a dollar, what you paid for something last time, the way things used to be — all these exist in our minds as anchors, creating comparisons that shape what we will and won’t accept, how we will and won’t behave.

Like Prospect Theory, Anchoring is also related to the Framing Effect. Look at this example, also from Dan Ariely’s book, Predictably Irrational:

Below are three actual subscription plans offered by the magazine, The Economist. Which would you choose?

1. Web subscription — U.S. $59.00; a one-year online subscription, which includes access to all articles from The Economist since 1997.

2. Print subscription — U.S. $125.00; one-year subscription to the print edition of The Economist.

3. Print & web subscription — U.S. $125.00; one-year subscription to the print edition of The Economist and online access to all articles from The Economist since 1997.

What? Why would anyone choose the second option? Getting both the web and print versions for the same price as print only is a better deal. Why would they even bother adding the second option? Mr. Ariely was determined to find out why they did bother, so he created an experiment in which he gave these three options to his MIT students.

Of the 100 students queried, 16 chose option one and 84 chose option three. None took option two (so at least we know MIT’s admission process is okay). He then offered a different set of options to a second group of 100 students. This time he only gave them two options:

1. Web subscription — U.S. $59.00; a one-year online subscription, which includes access to all articles from The Economist since 1997.

2. Print & web subscription — U.S. $125.00; one-year subscription to the print edition of The Economist and online access to all articles from The Economist since 1997.

Now what happened?

Without the middle, “useless” option, 68 students took the first, cheaper option and only 32 took the $125 plan.

What do you notice about this? By framing the plans and including a decoy option that nobody would take, The Economist got more than two and a half times as many students (84 versus 32 out of 100) to choose the more expensive plan.
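Using nothing but the head counts above, a quick back-of-the-envelope calculation shows what the decoy is worth in revenue per 100 subscribers:

```python
WEB_ONLY, PRINT_AND_WEB = 59, 125  # prices from the offers above

# With the decoy print-only option: 16 chose web only, 84 chose print & web.
with_decoy = 16 * WEB_ONLY + 84 * PRINT_AND_WEB      # $11,444
# Without the decoy: 68 chose web only, 32 chose print & web.
without_decoy = 68 * WEB_ONLY + 32 * PRINT_AND_WEB   # $8,012

print(with_decoy, without_decoy)
print(f"Revenue lift from the decoy: {with_decoy / without_decoy - 1:.0%}")  # ~43%
```

Per hundred subscribers, the “useless” middle option is worth roughly a 43% bump in revenue.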

There’s no doubt that the Anchoring Heuristic and the Framing Effect surround us daily, and the more complex and rare the transaction, the more susceptible we are to them.

In fact, it should be clear at this point that each of these mental shortcuts, these heuristics, shapes, colors, and guides us daily. The rest of this article is an opportunity to notice where they come into play during the buying or selling of your home. My intention is that after you’ve seen the pitfalls they create, you’ll be better prepared to step around them.