Normalisation of Deviance

Tom Connor
Published in 10x Curiosity · Feb 27, 2018 · 4 min read

Normalisation of deviance is a trap in human psychology that has led to many disasters over the years. As Professor Sidney Dekker says, ‘Murphy’s law is wrong. Everything that can go wrong usually goes right’. Every time a shortcut or a known issue fails to lead to an adverse outcome, people relax a little and slowly it becomes accepted into normal practice: the risk becomes normalised. Until the odds come up and the disaster, always present and lying dormant only through luck, is realised.
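To see why a run of successes is so seductive, here is a rough illustration of the arithmetic behind that trap (a minimal sketch in Python; the 1-in-100 per-launch failure rate is an assumed figure for illustration, not a real risk estimate):

p_failure = 0.01  # assumed 1-in-100 chance of failure on any single launch
for n in (1, 10, 25, 50, 100):
    # chance of at least one failure across n independent launches
    p_at_least_one = 1 - (1 - p_failure) ** n
    print(f"{n:>3} launches: {p_at_least_one:.0%} chance of at least one failure")

Each individual success tells you almost nothing, yet over 100 launches the cumulative chance of a failure climbs past 60%. That gap, between what each trial appears to show and what the odds actually imply, is the trap.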

Normalisation of deviance was first coined as a term by Diane Vaughan, who led an exhaustive 10-year study into the Challenger shuttle disaster. Her conclusions did not lay blame on any individuals, as outlined in a brilliant piece by Malcolm Gladwell:

The study’s conclusion is the opposite [of singling somebody out]; it says that the accident happened because people at NASA had done exactly what they were supposed to do: “no fundamental decision was made at NASA to do evil — rather a series of seemingly harmless decisions were made that incrementally moved the space agency toward a catastrophic outcome”.

The O-ring fault that led to the shuttle blowing up shortly after launch was well known, but because nothing untoward had happened in previous launches, every successful launch led to a further normalisation of the issue. What was once a critical issue for investigation slowly lost importance and focus and eventually became a normal part of shuttle launches.

“Although erosion itself had not been predicted, its occurrence conformed to engineering expectations about large-scale technical systems. At NASA, problems were the norm. The word anomaly was part of everyday talk… the whole shuttle system operated on the assumption that deviation could be controlled but not eliminated.”
What NASA had created was a closed culture that, in her words, “normalised deviance”, so that decisions which to the outside world were obviously questionable were seen by NASA’s management as prudent and reasonable.

Todd Henry highlights the impact of normalisation of deviance on a much more local scale, citing the little compromises that build to form an overall culture:

Over time, small and seemingly insignificant compromises are made that begin to erode the precision and clarity of the team and eventually begin to degrade the culture. You cannot be successful and disciplined in the big things if you are undisciplined about the small ones. Your inattention to detail will eventually catch up with you. As a leader, when you signal tacit acceptance of deviant behaviour, you are playing a dangerous game. Small, public compromises can erode the team’s trust, or that of your client, and essentially give permission for everyone else to follow suit.

This normalisation can be seen in many examples of business culture, none more dramatically than with the Volkswagen emissions scandal.

Towards the centre of most great industrial disasters you will find the normalisation of deviance playing a role, whether it be a deliberately overbearing culture, as at Volkswagen, or a slow drift into failure.

So what are the solutions?

After the two shuttle disasters at NASA, the key recommendations were:

  • Don’t use past success to redefine acceptable performance.
  • Require systems to be proven safe to operate to an acceptable risk level rather than the opposite.
  • Value diversity — Appoint people with opposing views or ask everyone to voice their opinion before discussion.
  • Avoid conflict of interests — Keep safety programs independent from those activities they evaluate.
  • Create a culture that is team-based such that each person would feel like they were letting their colleagues down if they were to break the rules.
  • Set the standard — executives breaking or bending rules will set the tone for the company’s culture and make it easier for others to justify their actions.

Leaving the last word to physicist Richard Feynman:

‘When playing Russian roulette, the fact that the first shot got off safely is little comfort for the next,’ … when new materials, high-energy systems and thin technical margins were involved, ‘we don’t even know how many bullets are in the gun.’
