Behavioural Economics

The Hindsight Bias: No, you didn’t know it all along.

If you read newspapers from ten years ago, you’ll learn just how unpredictable the world really is.

Neha is a business development manager who is trying out a new business idea. She has been told by her superiors that it's very tricky and most likely won't be successful. But she feels it in her gut that her idea will work.

With her strong argumentation skills she manages to persuade her bosses to give her the go-ahead. "I knew all along that they'd be able to see my vision," Neha thinks.

Everyone's excited about the roll-out of the new idea. All have poured a lot of money and sweat into it. Then the unthinkable happens. It doesn't work. How could that be? Neha runs the numbers — no good. She checks them over and over to see if she has missed something.

Neha looks back on it. True, everybody was against the idea. It had only a small chance of being successful — "Why was I so adamant? Why didn't I listen to them?"

When she goes to her bosses she gets the quintessential “I told you so” from them.


Last week, while driving back from a nearby city, I reached a T-junction. I wasn't sure which way to go, and I instinctively turned left. After driving for half an hour I realised that I was lost. It was night and I was low on gas. "I knew I took a wrong turn. Why didn't I look at Google Maps before taking the turn?!"


From war journals of the 1940s: after Paris was occupied by Germany, citizens were certain that the Germans would leave by the end of the year. Their officers confirmed this too. "England will fall as fast as France did, and then we will finally have our Parisian lives back — albeit as part of Germany," they all thought. However, the occupation lasted four years.

But in today's history books, the German occupation of France seems to form part of a clear long-term military strategy. It's as if everyone knew all along what would happen — there was no ambiguity.


“I knew this was going to happen” — how often do you hear that? How often do you yourself think that?

When a new business plan fails — "I knew this would happen! This plan was doomed to fail from the start."

When all your investments are lost — “Everybody had told me not to invest in the stocks. I was a fool not to listen to them.”

When that girl rejects you after you propose to her — "What was I even thinking? She's completely out of my league. I was only fooling myself!"

When she accepts you — “I don’t know what I was afraid of for so long! Clearly she loves me!”

You often look back on the things you’ve just learned and assume you knew them or believed them all along.

You tend to edit your memories so you don’t seem like a dimwit when things happen that you couldn’t have predicted. When you learn things you wish you had known all along, you go ahead and assume you already did know them. This tendency is just part of being a person, and it is called The Hindsight Bias.



Hindsight bias, also known as the knew-it-all-along phenomenon, refers to the common tendency to perceive events that have already occurred as having been more predictable than they actually were before they took place.

Financial bubbles are often the subjects of substantial hindsight bias. Following the Dot Com bubble in the late 1990s, many pundits and analysts tried to demonstrate how what seemed like trivial events at the time were actually harbingers of future financial trouble. If the bubble had been that obvious to the general population, it would likely have been avoided altogether.

Judgments about what is good and what is bad, what is worthwhile and what is a waste of talent, what is useful and what is less so, are judgments that seldom can be made in the present. They can safely be made only by posterity.
Endel Tulving

Similarly, in 2007, economic experts painted a rosy picture for the coming years. However, just twelve months later, the financial markets imploded. Asked about the crisis, the same experts enumerated its causes: monetary expansion under Greenspan, lax validation of mortgages, corrupt rating agencies, low capital requirements, and so forth. In hindsight, the reasons for the crash seem painfully obvious. “We knew it all along.”


In 1986, Karl Teigen, now at the University of Oslo, did a study in which he asked students to evaluate proverbs. Teigen gave participants famous sayings to evaluate. Most people agreed with all the proverbs he showed them, and then agreed once again when he read to them proverbs that stated opposing views.

When he asked them to evaluate the phrase “Love is stronger than fear,” they agreed with it. But when he presented them the opposite, “Fear is stronger than love,” they agreed with that as well.

Teigen was trying to show that what you think is just common sense usually isn't. Right now you are very likely saying in your head, "Yep, that's true!" Chances are you've fallen prey to hindsight bias as well.


So what exactly causes this bias to happen?

Hindsight bias embodies any combination of three aspects that stack on top of each other — from basic memory processes up to higher-level inference and belief.

  1. The first level of hindsight bias, Memory Distortion, involves misremembering an earlier opinion or judgment — "I said it would happen." You tend to distort, or even misremember, your earlier predictions about an event. You selectively recall information that confirms what you now know to be true. As you look back on your earlier predictions, you come to believe that you really did know the answer all along.
  2. The second level, Inevitability, centres on our belief that the event was inevitable — “It had to happen.” You try to create a narrative that makes sense out of the information you have.
  3. And the third level, Foreseeability, involves the belief that we personally could have foreseen the event — “I knew it would happen.” Since your (fake) narrative is simple, and very easy to generate, you interpret that to mean that the outcome must have been foreseeable from the beginning.

Furthermore, research suggests that you have a need for closure that motivates you to see the world as orderly and predictable, and to do whatever you can to promote a positive view of yourself.


One potential problem with this way of thinking is that it can lead to overconfidence. If you mistakenly believe that you are going to succeed, you might become too confident and more likely to take unnecessary risks.

Such risks might be financial — overconfident entrepreneurs are more likely to take on risky, ill-informed ventures that fail to produce a significant return on investment. They might also be emotional — investing too much of yourself in a bad relationship.

Hindsight bias gets in the way of learning from your experiences, and it also allows you to participate in one of your favourite pastimes — criticising the decisions of others for their lack of foresight.

If you feel like you knew it all along, it means you won’t stop to examine why something really happened. That’s why it’s often hard to convince seasoned decision makers that they might fall prey to hindsight bias. If you have an overconfident boss, you know this very well.

However, it isn't all bad. According to psychologist Rüdiger Pohl, hindsight bias is not necessarily a bothersome consequence of a "faulty" information-processing system. Rather, it is an unavoidable by-product of a function shaped by evolution — adaptive learning.

According to this view, hindsight bias is seen as the consequence of your most valuable ability to update previously held knowledge. This may be seen as a necessary process in order to prevent memory overload and thus to maintain normal cognitive functioning. Besides, updating allows you to keep your knowledge more coherent and to draw better inferences.


So what, if anything, can we do about it?

Researchers Roese and Vohs suggest that considering the opposite may be an effective way to work around this cognitive fault — at least in some cases.

When you are encouraged to consider and explain how outcomes that didn’t happen could have happened, you counteract your usual inclination to throw out information that doesn’t fit with your narrative. As a result, you may be able to reach a more nuanced perspective of the causal chain of events.

Questions like "Given what you knew at the time, what reason did you have to think that X might happen?" and "When did you start to feel guilty, and what information had you learned that led to you feeling that way?" can nudge you back into your earlier state of knowledge.

This exercise keeps you from falling prey to Outcome Bias as well.

Another easy way is to keep a journal. Write down your predictions — for political changes, your career, your weight, the stock market, and so on. Then, from time to time, compare your notes with actual developments.

You will be amazed at what a poor forecaster you are. Instead of textbooks, read historical diaries, and documents from the period. Read newspapers and follow news from five, ten, or twenty years ago. You’ll have a fair idea of just how unpredictable the world really is.
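The prediction journal can even be kept in code. Below is a minimal Python sketch — the file name, record format, and function names are my own assumptions, not something from this article — that logs dated predictions with a confidence level and later scores the resolved ones with a Brier score, where 0.0 means perfect foresight and 0.25 is what constant 50/50 guessing earns:

```python
import datetime
import json

JOURNAL = "predictions.jsonl"  # hypothetical file name, one JSON record per line


def record_prediction(claim, confidence, path=JOURNAL):
    """Append a dated prediction (confidence between 0 and 1) to the journal."""
    entry = {
        "date": datetime.date.today().isoformat(),
        "claim": claim,
        "confidence": confidence,
        "outcome": None,  # fill in later with 1 (happened) or 0 (didn't)
    }
    with open(path, "a") as f:
        f.write(json.dumps(entry) + "\n")


def brier_score(entries):
    """Mean squared gap between stated confidence and actual outcome.

    Only scores entries whose outcome has been resolved; returns None
    if nothing has been resolved yet.
    """
    resolved = [e for e in entries if e["outcome"] is not None]
    if not resolved:
        return None
    return sum((e["confidence"] - e["outcome"]) ** 2 for e in resolved) / len(resolved)
```

Writing down "I'm 90% sure this launch will succeed" today, then scoring it honestly in six months, makes it much harder to claim later that you knew it all along.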

Also, knowing about hindsight bias should arm you with healthy scepticism when investors and businessmen talk about their past decisions. And keep it in mind the next time you get into an argument with your boss or a colleague — the other person really does think he or she was never wrong, and so do you.

About the author:

Hi, I’m Abhishek. I’ve written 50+ essays which have been featured and quoted in Lifehacker, Psychology Today, ACM Digital Library, Springer, and Interaction Design Foundation.

Recommended Reading:

  1. The Dunning-Kruger Bias: The stupid are usually cocksure while the intelligent are full of doubts.
  2. Behavioural hacks restaurants employ to trick you into spending more.
  3. The Outcome Bias: If you don’t meet an accident the first time, you are very likely to drink & drive again.