Image: A Marriage of the Availability Heuristic and Optimism Bias (Likeapicture.com)

Part Three: Understanding Heuristics and Biases in Homeland Security: The Availability Heuristic, Black Swans, Weak Signals and Probability Blindness

What are Heuristics?

Angi English
Published in Homeland Security
Jun 10, 2016


Heuristics are cognitive rules of thumb, mental shortcuts that everyone uses every day in routine decision-making and judgment. They are simple, efficient rules, either hardwired in our brains or learned, that kick in especially when we face problems with incomplete information. As you may remember from earlier posts in this series, heuristics are part of what Daniel Kahneman calls System One thinking (Part Two of the series), a thinking process that is also governed by the triune brain (Part One of the series).

Heuristics are not all bad. They give the brain a break from having to run every decision through the critical, analytical processing of System Two thinking. A homeland security practitioner’s challenge is to know when heuristics are useful, when they are not, and when they will actually make a situation worse.

Being able to recognize the heuristics one is using improves risk perception. Risk perceptions, like many other judgments, are guided by heuristics, implicit and intuitive shortcuts that often contrast dramatically with the logical, probability-based analysis employed by professional experts.

Kahneman et al. write, “[t]he central idea of the ‘heuristics and biases’ program — that judgment under uncertainty often rests on a limited number of simplifying heuristics rather than extensive algorithmic processing — soon spread beyond academic psychology, affecting theory and research across a range of disciplines including economics, law, medicine, and political science.”

Kahneman and Tversky described three general-purpose heuristics — availability, representativeness, and anchoring/adjustment — that underlie many intuitive judgments under uncertainty. Our brains like heuristics because they act as an ointment for uncertainty.

These three important heuristics are simple and efficient because they piggyback on basic computations that the brain has evolved to make.

Availability Heuristic

The availability heuristic is at work when you make decisions based on the memories and mental images of past experiences that come most readily to mind.

In the text Psychology of Terrorism, Bongar et al. state that “[u]nder conditions of uncertainty, emotionally evocative events are more easily imagined and more readily available for cognitive processing. [The] ability to easily recall an event influences our judgment on the likelihood of a similar event, increasing the imagined risk and probability of an event. In the aftermath of a terrorist act, powerfully facilitated by mass media reporting, the event is highly available, thus elevating disproportionately the perception that another act is likely.”

One reason people cannot see climate change as a high-probability event is that they cannot imagine it, while the graphic, readily available images of the 9/11 attacks keep people in fear despite the low probability that such an attack will occur again. This is called “probability blindness.”

Probability Blindness


The term “probability blind” was coined by Cass Sunstein to describe situations in which “the feeling of fear simply sweeps the numbers away.” Daniel Gardner outlines an example in his book “The Science of Fear: How the Culture of Fear Manipulates Your Brain”: “In a survey, Paul Slovic asked people if they agreed or disagreed that a one-in-10 million lifetime risk of getting cancer from exposure to a chemical was too small to worry about. That’s an incredibly tiny risk — far less than the lifetime risk of being killed by lightning and countless other risks we completely ignore. Still, one-third disagreed; they would worry. That’s probability blindness. The irony is that probability blindness is itself dangerous. It can easily lead people to overreact to risks and do something stupid like abandoning air travel because terrorists hijacked four planes.”
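
To put Gardner’s comparison in rough numerical terms, here is a minimal sketch in Python. The one-in-10-million figure is the survey’s hypothetical risk; the lightning figure is an assumed, order-of-magnitude placeholder used only for scale, not a statistic from the article.

    # Probability blindness: the numbers that fear sweeps away.
    # The lightning figure below is an assumed order-of-magnitude value used
    # only for scale; it is not a number taken from the article.
    chemical_cancer_risk = 1 / 10_000_000   # the survey's hypothetical lifetime risk
    lightning_death_risk = 1 / 100_000      # assumed lifetime risk, for scale only

    ratio = lightning_death_risk / chemical_cancer_risk
    print(f"Chemical risk:  {chemical_cancer_risk:.1e}")
    print(f"Lightning risk: {lightning_death_risk:.1e} (assumed)")
    print(f"The assumed lightning risk is roughly {ratio:.0f} times larger.")

Even with a wide margin of error on the assumed figure, the survey’s chemical risk sits orders of magnitude below risks people routinely ignore, which is exactly the gap that fear erases.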

The Daily Show Explains the Availability Heuristic

Examples of the Availability Heuristic

A classic example of the availability heuristic at work is when people refuse to evacuate during a hurricane because they have no available memory of the last one. In 2008, people on the Texas coast failed to evacuate despite warnings of the imminent arrival of Hurricane Ike. The perception among the public was that a Category 2 hurricane was not dangerous. However, some residents later said that if information about the storm’s hazards had been broken down into the dangers associated with wind, water, and storm surge, and conveyed effectively, people in the community might have reacted differently and the outcomes might have been less severe. For homeland security practitioners, this is a vitally important lesson: when talking about hurricanes, use vivid imagery that will evoke prior experiences and memories of similar events. Doing this one thing helps counteract the availability heuristic.

In another example of the availability heuristic, when someone you know gets sick after a flu shot, you are less likely to get one yourself, even though the shot is statistically safe. The news machine is voracious, so when there is an accident, a disaster, or any sort of human tragedy, it is reported and analyzed endlessly. This makes us think that events that are actually very rare happen frequently, while, perversely, events that are relatively common are under-reported precisely because they are not news. In each case, the vivid memory is available and comes quickly to mind.

On the other hand, many people in New Jersey had never seen a storm like Hurricane Sandy, and many were not prepared for it; such a dramatic storm simply did not come to mind. After Sandy, however, a New York Times article described how generator sales soared: Wisconsin-based Generac Power Systems was running three shifts, six days a week, to meet the surge in demand. This kind of reaction is an example of post-Sandy availability bias (overestimating the future probability of a similar storm) and of an availability cascade (an emotional public reaction that spreads).

Halvorson and Rock provide another example of the availability heuristic at work in their article “Beyond Bias”: hearing someone say, “I’m not worried about heart disease, but I live in fear of shark attacks because I saw one on the news.” That is making a decision based on the information that comes to mind most quickly, rather than on more objective evidence. Halvorson and Rock make the point that “[t]his heuristic inhibits us from looking for and considering all potentially relevant information. It can thus block the brain from making the most objective and adaptive decisions. So a doctor who assumes a new patient has a familiar condition, without more carefully analyzing the diagnosis, is drawing a conclusion without fully exploring all the details.”


This was evident, for example, in the failure of governments in the New York City area to invest in flood-proof infrastructure prior to Sandy, with the poster child being the South Ferry subway station. Adam Sobel points out that “the new South Ferry station was completed in 2012 and totaled by the storm — despite well-documented evidence, going back at least 20 years, that a hurricane could cause just the kind of flooding that Sandy caused, in that precise spot as well as others. Now that Sandy has happened, things have changed and all kinds of investments are being made in more resilient infrastructure. But until Sandy, no such storm had happened in anyone’s lifetime in NYC, so it was human nature to act as though it never would happen. One side of the availability heuristic is that we often don’t take risks as seriously as we should if they are risks of things that we have never experienced.” This inability to understand risk is partly due to the long gap in time between a catastrophic event and the weak signals that lead up to it, which brings me to how Black Swan events are related to the availability heuristic.

Black Swans, Availability Heuristic and Weak Signals


The mega storm of 2012 that exposed so many issues in New York and New Jersey can be considered a Black Swan event. The term “Black Swan,” coined by Nassim Nicholas Taleb, describes “a highly improbable event with three principal characteristics: It is unpredictable; it carries a massive impact; and, after the fact, we concoct an explanation that makes it appear less random, and more predictable, than it was.” According to Taleb, “[w]e concentrate on things we already know and time and time again fail to take into consideration what we don’t know. We are, therefore, unable to truly estimate opportunities, too vulnerable to the impulse to simplify, narrate, and categorize, and not open enough to rewarding those who can imagine the ‘impossible.’” Homeland security practitioners and risk managers have always been concerned with severity versus frequency. However, because the weak signals between cause and effect are difficult to determine, Black Swan events can still occur.

The following is a mind-map of the dynamics of a Black Swan event.

Weak Signals Between Cause and Effect

Let’s look at a case study of a Black Swan event prepared by the Licata Risk Advisors group. The firm researched the 2010 BP oil well blowout disaster extensively, reviewing all the government investigative reports, the reports by industry groups, and BP’s own analysis. The disaster cost $60 billion and took 11 lives. Days before the event, BP received a safety award (for activities aboard that very same rig) from the Minerals Management Service, the agency in charge of oversight at the time. This was not just an odd quirk; it is constantly happening in business operations of all kinds. The focus goes to the somewhat frequent events, while managers remain oblivious to the weak signals of much bigger problems brewing beneath the surface. Frequency is easier to manage than severity because it is visible, and there is immediate feedback as to whether it is being managed. BP actually had abundant warning that the well was getting out of control, but the culture was focused so much on cost and speed, and so little on risk management, that the company was somehow able to ignore one sign after another.
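
To make the frequency-versus-severity tension concrete, here is a minimal sketch with invented numbers (illustrative only, not drawn from the BP case): two risks can carry the same expected annual loss even though only one of them is ever visible to managers.

    # Illustrative only: invented numbers, not figures from the BP case.
    # Risk A: frequent, low-severity incidents (highly visible, immediate feedback).
    # Risk B: a rare, catastrophic event (the weak-signal, Black Swan profile).
    freq_a, severity_a = 50, 20_000        # 50 incidents a year at $20,000 each
    return_period_b = 1_000                # one event expected every 1,000 years
    severity_b = 1_000_000_000             # a $1 billion loss when it happens

    expected_loss_a = freq_a * severity_a              # $1,000,000 per year
    expected_loss_b = severity_b / return_period_b     # $1,000,000 per year
    print(expected_loss_a, expected_loss_b)            # 1000000 1000000.0

    # In any given year a manager almost certainly sees several Risk A incidents
    # and almost certainly sees nothing from Risk B, so attention, metrics, and
    # budget drift toward the visible, frequent risk even though the two risks
    # carry the same expected annual loss.

The same asymmetry shows up in the safety-award example above: the metrics that are easy to observe reward frequency management, while the severity risk stays invisible until it materializes.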

The multi-fatality BP Deepwater Horizon explosion and oil spill of 2010 was preceded just five years earlier by a major explosion at the BP refinery in Texas City, Texas, that killed fifteen people and injured nearly two hundred.

Kathleen Tierney, in her book “The Social Roots of Risk: Producing Disasters, Promoting Resilience: High Reliability and Crisis Management,” discusses the BP Texas City refinery and the weak signals ignored by management in the following case study.

Case Study (Tierney)

Evidently BP learned little from that disaster — which, as we will see later, is not unusual — but the key point for now is that the 2005 Texas City explosion itself was preceded by no fewer than six close calls in the previous ten years in the same refinery system. These were by no means minor incidents, and they were investigated at the time they occurred, yet those investigations had essentially no impact on plant operations.

First, safety programs at the plant focused primarily on avoiding worker accidents and injuries, and not on overall system safety. Senior managers were assessed in terms of their efforts to maintain a safe working environment — defined again as low worker injury statistics — rather than on reducing the risk of facility-wide disasters. The focus was on the strong signals of worker safety instead of the weak signals of the buildup of risk over time.

Second, the refinery was operating under severe production pressure. To squeeze out profits, the plant cut back on maintenance and avoided investing in equipment that would have made refining processes safer. BP executives in London made some investments, but they did not address the core problems in Texas City. In 2004, BP executives challenged their refineries to cut yet another 25% from their budgets the following year.

Third, in yet another example of ignoring the weak signals of risk buildup, there were safety experts in high positions in the corporation who were responsible for safety standards but had no power to enforce them. The safety official at the Texas City plant, who was very concerned about the close calls and about overall system safety, was able to exert little influence over management. It is noteworthy that management did not perceive the weak signals of risk buildup and the close calls.

The severe event at Texas City, though rare, is all the more important to understand within the framework of the availability heuristic. Despite well-documented evidence going back several years that pointed to major problems, the weak signals leading up to the event were not recognized, due in part to the availability heuristic.

Conclusion

Understanding the availability heuristic is important for homeland security practitioners because if you want people to react rapidly to a threat, you will want to talk in terms of events or circumstances that are available for recall in the general public’s mind. Sunstein makes the point clearly: “[h]uman beings err because they use the availability heuristic to answer difficult questions about probability — how likely is a terrorist attack, a hurricane, a traffic jam, an accident from a nuclear power plant, a sexually transmitted disease?” This is a unique perspective for understanding risk. From a strict risk management perspective, the availability heuristic has to do with exposure to an experience and the available memory of it. In other words, people answer questions about probability based on whatever examples come quickly to mind, and that shortcut can lead homeland security practitioners and risk managers into faulty judgment.

Angi English has a Master’s in Security Studies from the Naval Postgraduate School’s Center for Homeland Defense and Security and a Master’s in Educational Psychology from Baylor University. She is also a Licensed Professional Counselor and Licensed Marriage and Family Therapist in Texas. She lives in Austin.

Note: The next post will be: Part Four: Understanding Heuristics and Biases in Homeland Security: The Representativeness Heuristic
