Why Smart Teams Fail

(spoiler alert: your boss is not really a vampire)

Théo
Jul 21, 2017 · 8 min read

We have all made mistakes (and will, predictably, continue to do so) and are aware that our individual judgment is far from perfect. This has been well documented in books such as Predictably Irrational or Thinking, Fast and Slow and is mainly due to cognitive biases — the influence of factors, both external and internal, that shape our decisions.

But what happens to our faculty of judgment and sense of analysis when we are in a group and work as a team? Teams of really smart people have been known to fail catastrophically more than once: think of Enron, the Columbia shuttle disaster, flight crews that run out of fuel or, more recently, the results of the French rugby squad.

Christian Morel is a sociologist who has studied the causes that lie behind collective bad decisions. His analysis is based on the roles played by 3 key individuals in groups and the 5 behaviours they are likely to adopt, which results in 8 likely scenarios.

Let’s start by looking at the 3 different key roles…

3 roles

Starting from the fact that most modern organisations are founded on the division of labor, Morel defines three key roles in teams during the phases of collective decision-making: The Manager, The Expert and… let’s call him Junior.

The Manager
  • The Manager: the first key figure in a team is the manager, who can also be the boss. He is a coordinator and a leader. He’s the one setting the priorities and goals of the team, as well as taking important decisions based on the information he can gather and the opinions of his team.
The Expert
  • The Expert: he possesses deep knowledge about specific issues. He is usually experienced and in charge of a particular topic, which can be technical, business-related or legal. He can be at the same hierarchical level as the manager or below him, depending on whether he is financially autonomous and has exclusive control of his budget.
Junior
  • Junior: last but not least, Junior. He is the one providing support and assisting the team. He is usually among the newest members of the organization and has no decision power or expertise. He makes good coffee, though.

5 actions

During the process of collective decision-making, these individuals are likely to adopt 5 different attitudes.

They can…

  • carry out: they are fully committed to the execution of the flawed solution. They make sure it is conceived, developed and implemented.
  • request: they ask for the implementation of the solution but don’t take part in its production. They are kept up-to-date regularly through reporting and see that the creation of the solution progresses.
  • tag along: they agree with the solution and follow its implementation without raising any concern.
  • be ignored: they have no say on the conception and implementation of the solution. Their opinion is not taken into account and they are not associated with the project.
  • reject: they openly reject and criticise the proposed solution.

If we combine the 3 different roles (manager, expert and junior) with the various actions we’ve just seen, Morel identifies 8 possible situations.

8 possible situations

The 8 different situations are divided between 3 “top-down” scenarios, 3 “knowledge-related” situations and 2 “bottom-up” cases.

The following are called “top-down” scenarios since the key person in these situations is the person with the most authority — the manager.

1. Imposed catastrophe

Here’s an example of a catastrophe: in the volcanic region of a developing country, the local government chooses to disregard the opinion of experts advising them to evacuate the inhabitants living near the volcano, fearing that an evacuation might lead to major chaos. This is an example of a top-down, imposed situation — I’ll call it a catastrophe. Individuals with authority refuse to evacuate a zone (they carry out the mistake) with the implicit support of the local inhabitants (who tag along, since they are happy not to have to leave all their belongings behind), even though a minority of experts advise against this decision (they reject it).

2. Accepted devastation

The accepted top-down scenario, or devastation, is what happened to Avianca Flight 52 in 1990. This plane ran out of fuel while attempting a go-around at JFK Airport and crashed. The subsequent investigation showed that the Captain did not take into account the fuel issue raised by his First Officer, who tacitly accepted his superior’s decision once he saw that his remark had been acknowledged but no corrective action was taken. This kind of situation is most often encountered in settings where individuals possess highly technical skills, and where managers are also experts. As a consequence, experts with no hierarchical status are not likely to go against their manager’s actions.

3. Lonely nightmare

The nightmare can be illustrated by the explosion of NASA’s Challenger shuttle. It occurs when the manager has to handle the expert’s ignorance. In this situation, the expert is opposed to the solution because he doesn’t know whether it is good or bad, for example because he lacks key information or data. As a precautionary principle, he is against what the manager wants. But depending on how he communicates and the pressure surrounding the team, he might not voice his refusal clearly enough. This was the case for the Challenger explosion, where experts were against launching the shuttle but lacked clear evidence of the impact of the cold weather on the O-rings to convince their managers.

Knowledge-related situations are caused by the expert taking the lead on achieving the inappropriate solution.

1. Accepted disaster

In 1989, a Boeing 737 flying between London and Belfast crashed. The reason? The Captain and the First Officer had turned off the working engine instead of the malfunctioning one. One engine was not working properly, causing vibrations; when the crew shut down the wrong engine, the plane slowed and the vibrations happened to subside, leading the flight crew to believe they had made the right decision. This is an example of a disaster: the manager and the expert worked hand-in-hand, without clear hierarchical difference, to commit an error and aggravate their situation.

2. Required misfortune

An example of misfortune is the extension of the Maginot Line along the northern border of France. The Maginot Line was a long line of fortifications built in the 1930s to deter aggression coming from the East, and in the late years of its construction the decision was taken to extend it along the Belgian border. Although some architects and military planners were sceptical of its effectiveness in case of aggression, they had to obey military leaders and the decisions of politicians. Another example of misfortune: aerial bombardments during wartime, including in Kosovo in 1999 or, more recently, on a Russian airbase in Syria. Although military experts know that these will not be truly effective in diminishing the enemy’s capabilities and will often lead to collateral damage, the decision to bomb is taken under pressure from heads of government and public opinion.

3. Populist debacle

In that case, the expert is pushed to act at the request of the masses. Since there is no authority coming from the managers (who have adopted a passive attitude), the expert is on the front line to help and answer their needs. This can lead him to make irrational decisions.

“Bottom-up” cases are named this way because they are essentially defined by Junior’s actions.

1. Self-supporting wreck

In a wreck, juniors cause the bad solution to be implemented because they ignore expert opinions and the management just tags along. The decision to ignore expert opinions can stem from 2 reasons: either the juniors are convinced that the issue is not really technical and won’t require specific knowledge to be dealt with, or they don’t have the resources (money, time…) to consult with experts. As for the management, they tag along because they share the belief that individuals at the root of the organisation are the best suited to deal with issues that directly affect them.

2. Accepted fiasco

The fiasco is similar to the previous situation, except that in this case the expert approves of the solution. This is likely to happen when he’s a contractor or service provider and isn’t fully involved in the situation at stake. He has been temporarily called upon to solve a problem, and will do his best to solve it with the information at hand, but lacks a global vision of the issue.


Debacle, misfortune, catastrophe… we’ve seen everything that can go wrong when teams make decisions. Of course, it’s better when things always go well and as planned. But they don’t (Murphy’s law), and an accident can be used to greatly improve systems and processes, if lessons are correctly learned and applied.

If you’re interested in learning more about how groups behave, you should read Morel’s book. He goes into greater detail and offers alternative situations.


Please 👏 to let me know if you enjoyed this article!
