Surrounded by fire, hoping for the best

Tom Dolphin
Oct 22, 2017

You may be familiar with the “This Is Fine” meme that has been circulating on the internet for a couple of years:

[“This Is Fine” comic, © KC Green]

People interested in human factors (the things humans do that affect how we interact with machines, systems and each other) and cognitive biases may recognise this as an example of the normalcy bias. Humans have a very strong tendency to believe, and to hope, that things will turn out to be okay, that the signs of unfolding disaster around them will simply go away, and that life will return to normal. There are extreme examples of this tendency and mundane ones, but they often involve a “suspension of belief” so strong (especially in the biggest disasters) that you might find it impossible to accept that the person isn’t doing it deliberately, consciously and wilfully… until it happens to you, and you realise that avoiding it is not under your conscious control, although you can sometimes recognise it and snap out of it.

Waiting for things to go back to normal

In aviation disasters where an aircraft has been evacuated on the ground, survivors have reported seeing people sitting in their seats, awake but disengaged, while everyone else rushed from the burning plane. They are gripped by the intense desire to believe that things are going to be okay and that this isn’t really their worst aviation fear, a plane crash; some even resist when told to evacuate. Amanda Ripley’s book “The Unthinkable” (an excellent book on this topic, and one I recommend everyone reads) describes how people in all kinds of disasters ignore what are objectively very, very obvious signs of danger, because of this normalcy bias.

In another example, the Home Office guidance on responding to a firearms or weapons attack by terrorists advises urging others to go with you, but not staying with them if they refuse to leave. This is partly because they may well be in the firm grip of the normalcy bias and impossible to persuade in the brief time available to you.

“Gahhh, not another stupid fire alarm test!”

A mundane example is one you yourself will probably have experienced in the last month, if not the last week: fire alarms. When the fire alarm most recently went off in your workplace, what did you do? Did you begin the evacuation plan immediately? Did you get up from your desk, leaving your belongings behind, and proceed to the evacuation point? Or did you ignore it for a minute or so, then look around to see what everyone else was doing, find that they were all still sitting around as normal, and decide to go on ignoring it for a while?

The median time to begin evacuation of World Trade Center Tower 1 after the basement bombing in 1993 was five minutes, measured from becoming aware of the fire to leaving one’s cubicle or office (and it would then take 1–2 hours to get down from the higher floors).

Five minutes… Sounds like a pretty quick transition from everyday activities to heading for the door after becoming aware of a fire, right? Not so much… Try counting 300 seconds out now. Go on, I’ll wait…

…still counting…

…296, 297, 298, 299…

300… Yep, it’s a long time, isn’t it? A lot of that time was wasted by people hoping that the situation would resolve itself. There was no fire alarm heard in the World Trade Center that day in 1993, as the system had been taken out by the bomb, so the situation was unclear despite the smoke and noise, and people didn’t know how to react. [This article talks a bit about other factors that affect evacuation times.] Even when a fire alarm does sound, a significant part of the delay in these situations comes from people taking their cues from each other, not wanting to be seen as making a fuss about nothing. No-one does anything until one person decides to take action, and then suddenly everyone else follows, as though a spell has been broken: the spell of the normalcy bias.

Don’t just read the in-flight magazine

Going back to the plane crash scenario, people who study human factors have identified that a significant contributor to breaking through the normalcy bias is listening attentively to the safety briefing before take-off. Even if you fly regularly, the act of considering the possibility of disaster, and what you might do in that eventuality, appears to contribute strongly to your brain’s ability to snap out of the normalcy bias and respond appropriately to the events unfolding. People may find it dull to hear it all again, but it does make a difference.

To use an example my medical friends will recognise, the WHO safety checklist in theatres includes the question “Are there any critical or unexpected steps?”. If you consider the possibility that you might have to convert the laparoscopic (keyhole) surgery to a laparotomy (opening up the belly with a large incision) in the event of bleeding, you will be more likely to recognise that the operation has not gone to plan and that you have reached the critical stage of needing to make that decision. It also makes it easier to take the next step, which is a big one with significant consequences, of making the large laparotomy incision. Normalcy bias will otherwise unconsciously drive the team to keep trying to stop the bleeding with the keyhole laparoscopic instruments because “it’ll be fine in a minute… if I can… just… get… this… bleeding… to stop… more swabs please…”

Don’t Let Your Frog Boil

There is a linked cognitive bias: we tolerate a series of small negative changes more readily than one big one. We’ve all heard the story of the boiling frog, placed in a saucepan of water and heated so slowly that it never responds to the tiny changes in temperature and is eventually boiled to death. Real frogs don’t actually behave this way, but it’s a useful image for illustrating the bias.

In anaesthesia, when the systolic blood pressure drops from the pre-anaesthetic level of 140 mmHg to 120 mmHg, that’s a small change; no problem. Then five minutes later the BP is measured again, and it’s 110 mmHg; again, a small change of little concern. Five minutes more, and 102 mmHg is hardly less than 110 mmHg; forget about it. Now it’s 92 mmHg, which is only 10 mmHg less than the last measurement, so that’s fine.

But hang on. We’ve gone from 140 mmHg to 92 mmHg, which is a drop of more than a third of the starting blood pressure. If the BP had gone straight from 140 to 92 on the first reading, you’d have done something about it, even if just to check it again and keep a close eye on it. But because the decrements were small, none of them triggered your internal alarms; this is a shifting baseline problem.

If you set yourself a floor, or an anchor, beforehand, you are much more likely to respond to changes that hit that floor. If you literally say out loud “Her starting blood pressure is 140 systolic; if it goes below 115 mmHg then I’ll do something about it”, then you’re much more likely to notice and respond when it does.
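To make the shifting-baseline point concrete, here is a minimal Python sketch using the numbers from the example above. The 20 mmHg “step alert” threshold and the variable names are hypothetical, purely illustrative stand-ins for the unanchored “has it changed much since last time?” judgement; this is not clinical advice. Comparing each reading only with the previous one never raises an alarm, while the floor fixed in advance does.

```python
# Illustrative sketch only: shows why an anchored "floor" catches a drifting
# baseline that reading-to-reading comparisons miss. Numbers from the example
# above; the step-alert threshold is a hypothetical choice.

STARTING_SYSTOLIC = 140   # pre-anaesthetic systolic BP, mmHg
FLOOR = 115               # the floor stated out loud in advance, mmHg
STEP_ALERT = 20           # only worry if one reading drops by more than this, mmHg

readings = [120, 110, 102, 92]  # successive measurements, mmHg

previous = STARTING_SYSTOLIC
for bp in readings:
    step_change = previous - bp
    step_alarm = step_change > STEP_ALERT   # judging only against the last reading
    floor_alarm = bp < FLOOR                # judging against the pre-set anchor
    print(f"{bp} mmHg: dropped {step_change} since last reading "
          f"-> step alarm {step_alarm}, floor alarm {floor_alarm}")
    previous = bp

total_drop = STARTING_SYSTOLIC - readings[-1]
print(f"Total drop: {total_drop} mmHg "
      f"({total_drop / STARTING_SYSTOLIC:.0%} of the starting pressure)")
```

Run as written, the step alarm never fires (no single drop exceeds 20 mmHg), while the floor alarm fires from the 110 mmHg reading onwards, and the final line reports a total drop of 48 mmHg, about 34% of the starting pressure.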

Planning to fail beats failing to plan

Likewise, emergency algorithms help you with decision-making, by bypassing the doubt and the normalcy bias. They say things like “If W and X have occurred, it means Y, and you should do Z.” There’s little room for you to say “Well, maybe X will get better in a minute…?” — the algorithm jolts the brain by saying, “Nope, W plus X means Y, and that basically mandates Z”, allowing you to leapfrog into the state of recognising that something actually needs dealing with and won’t resolve spontaneously.
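As a loose illustration (not any real clinical guideline), the structure of such a rule can be written out explicitly; W, X, Y and Z below are just the placeholders from the sentence above.

```python
# A purely illustrative sketch of the "W plus X means Y, so do Z" structure of
# an emergency algorithm. W, X, Y and Z are placeholders, not a real guideline.

def emergency_step(w_present: bool, x_present: bool) -> str:
    """If W and X have both occurred, it means Y, and Z is mandated."""
    if w_present and x_present:
        # There is no "maybe X will get better in a minute" branch:
        # the rule leaves the normalcy bias nothing to argue with.
        return "Y recognised: do Z now."
    return "Criteria not met: keep monitoring W and X."

print(emergency_step(w_present=True, x_present=True))   # Y recognised: do Z now.
print(emergency_step(w_present=True, x_present=False))  # Criteria not met: ...
```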

We also benefit from simulated scenario training: it makes the mental leap from the state of normalcy to a state of alertness and action much easier. It lowers the barrier of the normalcy bias and lets you acknowledge that something is wrong, putting you in a position to be the one who breaks the group inactivity and starts the necessary actions. Your brain has seen this before, and on that occasion (albeit a simulated one) things decidedly did not go back to normal… and so it can make that leap out of normalcy.

Think Ahead

So how can you guard against normalcy bias and a shifting baseline? When you’re offered a safety briefing — on a plane, on a ferry, at the start of a conference in a building you don’t know, or even in your mandatory fire safety training at work! — take it. Have a discussion with your family about how you might escape from the house if a fire breaks out one night, so that everyone knows the plan. When the WHO briefing gets to the question about “critical or unexpected steps”, take it as an opportunity to briefly sketch out a few bad things that might happen and how the team would address them. Watch the Home Office video so you will have an idea how to recognise and respond to a marauding weapons attack. Set yourself a “floor” for your patient’s blood pressure under anaesthesia in advance so you will respond to a drop in pressure much faster. And be prepared to be wrong occasionally, or considered a little bit eccentric, by responding to each fire alarm at work as though it is for real, even when all your colleagues are still in the grip of collective denial; they might be right, but suppose they’re not?

In short, think in advance about how things might go badly awry and there’s a much better chance that your brain will be able to recognise and respond to critical events when they occur. And think now about how you’re going to escape from a fire in your home; you might be very glad you did.
