Looking in the rear view mirror…

Tom Connor
Published in 10x Curiosity
6 min read · Oct 28, 2018

Are you aware of hindsight bias when you react to the events that happen in your life?

It is obvious what you should have done after the event

Monday’s Experts

Always know what’s best

Always tell you what you should’ve done

Monday’s Experts

Always know what’s cooking

How the game was lost and how it could’ve been won

“Monday’s Experts”, Weddings Parties Anything

A favourite song, poking fun at the expert opinions that come out once the result is known. This is common and easy for us all to do: with hindsight the correct decisions and actions are apparent, a linear sequence of events leading to a seemingly inevitable outcome. Whether it be sports, your children, a work colleague or even ourselves, we are always judging events by how they turned out, reviewing them after the fact with the benefit of hindsight. Gone are the uncertainty and the multiple decision paths that confronted individuals in the moment, decisions they were probably not even aware they were weighing up before committing to a direction.

Sidney Dekker has developed a thought-provoking position through his many books and online pieces. In his book “The Field Guide to Understanding Human Error”, Dekker explores our tendency to apply hindsight to our judgement of events:

One of the safest bets you can make as an investigator or outside observer is that you know more about the incident or accident than the people who were caught up in it — thanks to hindsight:

• Hindsight means being able to look back, from the outside, on a sequence of events that led to an outcome you already know about;

• Hindsight gives you almost unlimited access to the true nature of the situation that surrounded people at the time (where they actually were versus where they thought they were; what state their system was in versus what they thought it was in);

• Hindsight allows you to pinpoint what people missed and shouldn’t have missed; what they didn’t do but should have done.

… The effect of knowing an outcome of a sequence of events is huge… It has an enormous impact on your ability to objectively look back on a piece of performance. Actually, you no longer can.

Dekker goes further to explore hindsight bias and how it shapes our reaction to failure. This reaction significantly impacts your (and others’) ability to learn from failure. Taking a hardline approach that singles out “bad actors” and assumes that inherent system safety can be restored once they are moved on grossly underestimates the complexities of the systems we create.

I have previously looked, in “Black Box Thinking”, at how important the organisational or family culture is to providing an open atmosphere where mistakes can be learnt from. Dekker specifically highlights how:

…reactions to failure interfere with your understanding of failure. The more you react, the less you understand. [These reactions can be]:

  • Retrospective. Reactions arise from our ability to look back on a sequence of events, of which we know the outcome;
  • Counterfactual. They lay out in detail what people could or should have done to prevent the mishap;
  • Judgmental. They judge people (e.g., not taking enough time, not paying enough attention, not being sufficiently motivated) for supposed personal shortcomings;
  • Proximal. They focus on those people who were closest in time and space to the mishap, or to potentially preventing it.

A way to overcome this bias is to force yourself into the inside view — a point illustrated in the graphic below as the person looking inside the tunnel.

This is the point of view of people in the unfolding situation. To them, the outcome was not known (or they would have done something else). They contributed to the direction of the sequence of events on the basis of what they saw on the inside of the unfolding situation. To understand human error, you need to attain this perspective.

What perspective are you taking? (Credit — Dekker)

Dekker is writing about safety in organisations, but his lessons can be applied widely to many other situations. Especially salient are his thoughts on how to improve our own frame of reference to get the most learning out of an event or situation. Focusing on the “Bad Apples” will see you:

  • Single out particularly ill-performing practitioners;
  • Find evidence of erratic, wrong or inappropriate behavior;
  • Bring to light people’s bad decisions; their inaccurate assessments; their deviations from written guidance or procedures.

Alternatively, you can adopt a more expansive mindset, what Dekker calls “New View” thinking, which he summarises with the following points:

  • Human error is not a cause of failure. Human error is the effect, or symptom, of deeper trouble.
  • Human error is not random. It is systematically connected to features of people’s tools, tasks and operating environment.
  • Human error is not the conclusion of an investigation. It is the starting point.
  • To create safety, you don’t need to rid your system of 70% human errors. Instead, you need to realise how people at all levels in the organisation contribute to the creation of safety and risk through goal trade-offs that are legitimate and desirable in their setting.
  • Rather than trying to reduce “violations”, New View strategies will find out more about the gap between work-as-imagined and work-as-done — why it exists, what keeps it in place and how it relates to priorities among organisational goals (both stated and unstated).
  • New View thinking wants to learn about authority-responsibility mismatches — places where you expect responsibility of your people, but where their situation is not giving them requisite authority to live up to that responsibility.

Dekker has an excellent 5-part YouTube series going into more detail on human error, as well as a thought-provoking piece on Just Culture in organisations. Todd Conklin digs further into this work with his book “The 5 Principles of Human Performance”:

  1. Error is normal. Even the best people make mistakes.
  2. Blame fixes nothing.
  3. Learning and Improving is vital. Learning is deliberate.
  4. Context influences behavior. Systems drive outcomes.
  5. How you respond to failure matters. How leaders act and respond counts.

The biggest challenge with human error is that before you know you have made an error, it feels exactly like you are doing the work correctly; before you know you’re wrong, it feels exactly like you’re right. (Conklin)

Leaving the last word to Mick Thomas and WPA:

Monday’s Experts

Talking in the tea room

In the workshop and the office talking all around the place

Monday’s Experts

Hey they’ve always got the good oil

Pity you can’t put a bet on at the finish of the race

“Monday’s Experts”, Weddings Parties Anything
