Normalisation of Deviance
Normalisation of deviance is a trap in human psychology that has led to many disasters over the years. As Professor Sidney Dekker says, ‘Murphy’s law is wrong. Everything that can go wrong usually goes right’. Every time a short cut or a known issue fails to lead to an adverse outcome, people relax a little, and slowly it becomes accepted into normal practice — the risk becomes normalised. Until the odds come up and the disaster, always there and through luck lying dormant, is realised.
Normalisation of deviance was coined as a term by Diane Vaughan, who led an exhaustive ten-year study into the Challenger shuttle disaster. Her conclusions did not lay blame on any individuals, as outlined in a brilliant piece by Malcolm Gladwell:
The study’s conclusion is the opposite [of singling somebody out]; it says that the accident happened because people at NASA had done exactly what they were supposed to do: “no fundamental decision was made at NASA to do evil — rather a series of seemingly harmless decisions were made that incrementally moved the space agency toward a catastrophic outcome”
The O-ring fault that led to the shuttle blowing up shortly after launch was a well-known fault, but because nothing untoward had happened in previous launches, every successful launch led to a normalisation of the issue. What was once a critical issue for investigation slowly lost importance and focus, and eventually became a normal part of shuttle launches.
“although erosion itself had not been predicted, its occurrence conformed to engineering expectations about large-scale technical systems. At NASA, problems were the norm. The word anomaly was part of everyday talk… the whole shuttle system operated on the assumption that deviation could be controlled but not eliminated”
What NASA had created was a closed culture that, in her words, “normalised deviance”, so that decisions which to the outside world were obviously questionable were seen by NASA’s management as prudent and reasonable.
Todd Henry highlights the impact of normalisation of deviance on a much more local scale, citing the little items that build to form an overall culture:
Over time, small and seemingly insignificant compromises are made that begin to erode the precision and clarity of the team and eventually begin to degrade the culture. You cannot be successful and disciplined in the big things if you are undisciplined about the small ones. Your inattention to detail will eventually catch up with you. As a leader, when you signal tacit acceptance of deviant behaviour, you are playing a dangerous game. Small, public compromises can erode the team’s trust, or that of your client, and essentially give permission for everyone else to follow suit
This normalisation can be seen in many examples of business culture, none more dramatic than the Volkswagen scandal.
Towards the centre of most great industrial disasters you will find the normalisation of deviance playing a role, whether it be a deliberately overbearing culture, as at Volkswagen, or a slow drift to failure.
So what are the solutions?
After the two shuttle disasters at NASA, the key recommendations were:
- Don’t use past success to redefine acceptable performance.
- Require systems to be proven safe to operate to an acceptable risk level rather than the opposite.
- Value diversity — Appoint people with opposing views or ask everyone to voice their opinion before discussion.
- Avoid conflict of interests — Keep safety programs independent from those activities they evaluate.
- Create a culture that is team-based such that each person would feel like they were letting their colleagues down if they were to break the rules.
- Set the standard — executives breaking or bending rules will set the tone for the company’s culture and make it easier for others to justify their actions.
Leaving the last word to the physicist Richard Feynman:
‘When playing Russian roulette, the fact that the first shot got off safely is little comfort for the next,’ … when new materials, high-energy systems and thin technical margins were involved, ‘we don’t even know how many bullets are in the gun.’
You might also like:
- Black Box Thinking… — Are you creating the environment and systems to be able to learn from your mistakes and improve…?
- A Checklist to save the world (or at least improve your own performance) — Checklists are the way to stop making simple mistakes, and reduce your cognitive load, saving energy for the decisions that really matter.
- Situational Awareness and the Hearts and Minds Safety Program — Safety is a constant topic of awareness in heavy industry — how do you achieve a safe workplace, and how do you create an environment that drives toward safety throughout the organisation?
- Looking in the rear view mirror… — Are you aware of the hindsight bias you are applying to your reaction to events that happen in life?