Fear is the Mind Killer: Psychological Safety in Business

Sonny Dewfall · Published in The Pinch · Jan 7, 2022 · 5 min read

In our last post we examined Amy Edmondson’s definition of psychological safety as a belief that one “will not be punished or humiliated for speaking up with ideas, questions, concerns or mistakes”[1], and we talked mainly about the positive effects of high psychological safety. That post ended with a line from Frank Herbert’s Dune, “Fear is the mind-killer,” foreshadowing the topic of this article: what can go wrong in teams with low psychological safety.

Before we look at teams of people, it’s worth examining the effects of fear on an individual within a team. In a 2005 study[2], participants (selected for their arachnophobia) completed tasks on a computer with and without the presence of a Chilean rose-haired tarantula. The study found that the presence of fear (in this case, fear of a large, venomous spider) can “attract attentional resources and thus decrease the processing of other information”.

In the context of IT the stakes are often lower, but if you have ever had the misfortune to be in a workplace where you are under intense pressure, you may recognize the same fear-induced thought paralysis. The perception that your actions will be put under a microscope can cause you to second-guess yourself or miss important information, especially in situations that demand complex problem solving.

Another way in which low psychological safety can manifest itself, particularly in the context of an organization’s culture, is in the perception of hierarchy. A rigid “chain of command” style hierarchy may leave individuals feeling that they cannot challenge the opinions of those above them. An oft-cited example of this phenomenon in practice is the Columbia shuttle tragedy: at launch, a large piece of insulating foam broke away from the external fuel tank and struck the orbiter’s left wing, causing an undetermined amount of damage[3].

Rodney Rocha, a structural engineer on the team tasked with assessing the damage, was concerned that the imaging of the strike area was not good enough to rule out a catastrophic failure on re-entry. Boeing’s official report claimed there was no danger, but Rocha remained unconvinced. In the final safety meeting, no probing questions were asked of the Boeing report and the shuttle was given a clean bill of health. Rocha admitted he felt he was “too low down … in the organization [NASA]” to challenge the opinions of the mission management team. His fears were, of course, proved correct, and the shuttle was destroyed at a cost of seven lives.

It is also worth noting that Richard Blomberg, the chair of NASA’s safety watchdog, the Aerospace Safety Advisory Panel, was dismissed a year before the incident. In his own words: “one could speculate I was removed because I was saying things that some people found uncomfortable. It was becoming increasingly hard to say what had to be done on safety grounds because the programme was on shaky grounds financially”[4]. The environment at NASA at the time was demonstrably hostile to individuals voicing concerns, and it resulted in disaster.

As we explored in our last article, teams facing complex problems benefit from cognitive diversity: a team with many different perspectives is more likely to provide the right mix of opinions to solve a problem. Individuals bring diverse ideas to the table and are able to challenge and improve each other’s thinking. This only works in an environment free of the fear of what Amy Edmondson calls “interpersonal risk taking”. Rocha felt that the interpersonal risk of speaking up was too high (understandably so, given that Blomberg’s dismissal suggests speaking up could cost someone their job), and the complex problem of how to deal with the damaged shuttle wing went unsolved.

There is compelling evidence to suggest that this kind of culture has a similarly negative effect within financial institutions. An excellent Fast Company article provides a detailed analysis of how a “reluctance to complain out of fear of retaliation”, amongst other components of a hostile workplace, created the perfect storm that led to the 2016 Wells Fargo mis-selling scandal. The company fired employees who reported unethical behavior via a supposedly anonymous ethics hotline, entrenching a culture of fear and disincentivizing staff from raising concerns.

In many ways the key value that individuals bring to their teams lies in their creativity and problem solving. This is particularly relevant in the increasingly automated world of IT Operations. As SRE practitioners we are constantly trying to automate where possible, which means that in a complex plant a human may be viewing a large amount of information through several layers of monitoring and automation abstraction.

As the complexity of systems multiplies whilst the headcount of teams remains constant (or sometimes shrinks), it is all the more important that humans are empowered to speak up and raise concerns as they see them. Toyota’s concept of “Jidoka”, or “automation with a human touch”, is very relevant here[5]. Everyone on the factory floor is empowered to stop the production line if they see an issue. In an increasingly automated system this is arguably the human’s most important purpose: to identify and resolve “incidents”, irregularities that cannot be dealt with by machines. As humans take on this role in the management of complex systems, it is vital that they are empowered to communicate and to act, and that empowerment requires psychological safety.
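To make the idea concrete, here is a minimal Python sketch of what a Jidoka-style “stop the line” mechanism might look like in a deployment pipeline. It is purely illustrative: the `AndonCord` class and `run_rollout` function are invented for this example and are not part of any real tooling. The key design choice is that `pull()` performs no seniority check; anyone who sees a problem can halt the automation, and it stays halted until a human investigates.

```python
class AndonCord:
    """A Jidoka-style 'stop the line' switch: anyone on the team can pull it,
    and the automated pipeline refuses to proceed until a human clears it."""

    def __init__(self):
        self.pulled_by = None
        self.reason = None

    def pull(self, who, reason):
        # No seniority check: empowerment means anyone can stop the line.
        self.pulled_by = who
        self.reason = reason

    def clear(self):
        self.pulled_by = None
        self.reason = None

    @property
    def is_pulled(self):
        return self.pulled_by is not None


def run_rollout(stages, cord):
    """Deploy stage by stage, halting the moment the cord is pulled."""
    for stage in stages:
        if cord.is_pulled:
            print(f"Line stopped before '{stage}': {cord.pulled_by} "
                  f"flagged: {cord.reason}")
            print("Automation halted; paging a human to investigate.")
            return
        print(f"Deploying to {stage}...")
    print("Rollout complete.")


cord = AndonCord()
run_rollout(["dev", "staging"], cord)  # proceeds normally

# A new team member spots something odd and pulls the cord.
cord.pull("new-team-member", "canary error rate looks wrong")
run_rollout(["production"], cord)      # halts, no questions asked
```

The point of the sketch is not the code itself but the cultural contract it encodes: pulling the cord is cheap, visible and blame-free, which is exactly what low psychological safety makes impossible.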

Hopefully this article has given you some food for thought on how psychological safety is affecting your team. Is there a fear of calling out unsafe changes before they are deployed into Production? Are individuals covering up mistakes for fear of the consequences? If you have any thoughts or questions, please let us know in the comments or reach out directly.

[1] Amy Edmondson, “Psychological Safety and Learning Behavior in Work Teams”

[2] Jason S. Moser, Greg Hajcak and Robert F. Simons, “The effects of fear on performance monitoring and attentional allocation”

[3] ABC News

[4] https://www.theguardian.com/science/2003/jun/22/spaceexploration.columbia

[5] https://global.toyota/en/company/vision-and-philosophy/production-system/



Sonny Dewfall is an SRE, DevOps and Quality Engineering specialist at Accenture.