Human-Centered Design and the Missile False Alarm in Hawaii

Scott Robertson
Jan 14, 2018

--

I wasn’t in the room where it happened, but I experienced the consequence. On Saturday, January 13, 2018, somebody pushed a button in Hawaii that sent an “Emergency Alert” text message to cell phones statewide: “BALLISTIC MISSILE THREAT INBOUND TO HAWAII. SEEK IMMEDIATE SHELTER. THIS IS NOT A DRILL.”

The warning was also automatically broadcast on television and radio. It caused widespread panic. Residents and visitors huddled in closets and basements. Tourists vainly sought clarification from their hotels about what to do. People hugged their children and called their loved ones. Thirty minutes later, another alert declared a false alarm.

The visibly shaken Governor appeared on television with the equally discouraged Emergency Management Administrator and declared, “This should not have happened.” He said, “An employee pushed the wrong button.” There will be an investigation, and officials will update procedures, but let me suggest that some human-centered design guidelines and human-computer interaction (HCI) practices might have helped avoid the whole thing in the first place.

Here’s what I know from reading press reports:

  1. The accident occurred at a shift change.
  2. During the shift change, there is some kind of routine run-through of a procedure very similar to the actual alert procedure.
  3. In an actual alert, there is a button to click on the screen that will cause the warning to be generated. One person is responsible for pressing this button.
  4. There is a confirmation dialog box that follows the button press.
  5. There is no standardized false alarm alert mechanism.

We can talk all day about hindsight, but the human-centered design process in software development is there to provide foresight, and every one of the five points above is a red flag for practitioners experienced in its application.

First, exactly what is human-centered design? It is an approach to system development that prioritizes the experiences of the people who will use the system. It takes into account how people perceive information with all of their senses; what people are capable of doing physically with their hands, fingers, eyes, and whatever else they are using to interact with a computer system; how people process information, what they can remember, and what taxes their information processing capabilities or confuses them; how human feelings and emotions affect performance and attention; how the context of people’s activity influences what they think and do; and so on.

A human-centered designer is therefore someone with knowledge about the behavioral, cognitive, and physiological sciences, who also knows about the design of interactive computing systems. They should be part of any development team, and play just as important a role as the best software engineer or programmer.

The Human(s) in the Loop

Now, to the points above.

  1. The shift change. Think to yourself what is happening at a shift change. The activity is simple to describe: one person is leaving and another is taking over their job. From a technical perspective, there will be a bunch of steps for transferring the activities. But think about it more deeply from the human perspective. How do you feel at the end of a shift? Anxious to get going? In a hurry? What’s on your mind? The drive? Picking up the kids? Stopping at the store? This is a classic situation where people are distracted, inattentive, and cognitively overloaded. It is a common place where errors happen.
  2. The routine run-through. A typical shift change involves following a checklist of actions. This sounds like a great idea for making sure that people complete every step, but again think deeply about how humans master routines. The human brain is an automated routine learner, and the point of learning a routine is to free up attention. When you were first learning to drive, every movement was an agonizing effort and all attention was on the placement of your hands, the pressure of your foot on a pedal, where you should be looking, and what the next step was. Once you became an expert, however, you could drive “without thinking.” Now, when you drive, your attention can be elsewhere, and you can sometimes complete a journey with little memory of the drive itself. But this routinization comes with a cost. Have you ever driven to the wrong place because you weren’t paying attention? Your driving was excellent, but the whole process was on automatic and the destination was wrong. And, by the way, the wrong destination was a well-practiced one like home. If the run-through at the shift change was almost exactly the same as the actual process for generating an alert, routinization increased the chances of triggering an alert by mistake.
  3. A button click sends an emergency alert to the entire state. There is a mismatch here between the simplicity of the action and the magnitude of the consequence. While speed is of the essence, there is a tradeoff between simplicity and outcome. There are many ways around this problem: labeling the button with appropriate warning language, including a warning icon, and using a danger color (e.g., red) might be enough. Requiring a confirmation as a second step is another approach, which was used here but was ineffective (we will see why in #4 below). Perhaps the most obvious fix is to require a second person to verify the action.
  4. A confirmation dialog didn’t work. How many times have you seen “Are you sure? Yes, No”? Plenty, no doubt. How many times have you found yourself swearing after pressing “Yes” too hastily? This is one of the most common questions an IT consultant asks: “Why did you press to confirm?” And the most common and frustrating answer is, “I don’t know.” Well, the reason is the same as #2 above: routinization. When you are on automatic, you don’t see and you don’t think, at least not consciously. Routinization will combine any commonly occurring sequence of actions into a single action without your awareness. That means that pressing the “Send Alert” button (or whatever it is labeled) and pressing “Yes” to confirm are not two actions in the human mind, but really just one. After happening together a few times, the two button presses are programmed into a single movement by the brain, and this movement cannot be stopped once it starts. Hence the befuddlement afterwards: “I don’t know why I did that.”
  5. There is no standardized false alarm mechanism. Everyone makes mistakes. Error recovery is just as important in system design as any other function. This means that designers must anticipate errors and provide ways to recover. The human-centered design literature is full of advice on how to do this. First, anticipate errors by using what is known about human behavior. The four issues just discussed give you signposts about where errors are likely. Second, design to impede or block errors. If the alert button were a physical button, there might be a flip-open cover blocking it. What might the screen version of this blocker be? Third, design to recover from errors. This means two things: tell the person what happened and explain exactly how to recover. In this event, the person who made the error apparently didn’t know what they had done until they received the emergency alert on their own phone. There needs to be immediate feedback that says something like “An emergency alert has been sent statewide,” and this message should not appear in any other circumstance, including simulations or training. After that, there needs to be a very fast recovery mechanism, with instructions, like “Press this button to send a retraction.” In the actual event, the lack of a standard recovery procedure meant that it took almost 30 minutes to send a text explaining that there was a false alarm. (A small code sketch of these safeguards follows this list.)
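
To make points 3 through 5 concrete, here is a rough sketch of what such safeguards might look like in software. To be clear, this is illustrative TypeScript, not the actual alert system; every name in it (the modes, the typed phrases, and the functions sendAlert and retract) is invented for this example. It shows a typed confirmation phrase that differs between drill and live modes, so that routinization cannot weld the two workflows into one motor sequence; a two-person rule for live alerts; and feedback and retraction that are immediate and unmistakable.

```typescript
// Hypothetical sketch only; not the real emergency management software.
type Mode = "DRILL" | "LIVE";

interface AlertRequest {
  mode: Mode;
  initiatedBy: string;   // operator ID of the person initiating the alert
  typedPhrase: string;   // phrase the initiator typed to confirm
  verifiedBy?: string;   // second operator, required for LIVE alerts only
}

// The confirmation phrase differs by mode, so the drill and the live
// alert can never become the same automated two-click routine.
const REQUIRED_PHRASE: Record<Mode, string> = {
  DRILL: "this is a drill",
  LIVE: "send live alert statewide",
};

function sendAlert(req: AlertRequest): string {
  // Typed confirmation forces conscious attention, defeating routinization.
  if (req.typedPhrase.trim().toLowerCase() !== REQUIRED_PHRASE[req.mode]) {
    return `BLOCKED: typed phrase does not match the ${req.mode} phrase.`;
  }
  if (req.mode === "LIVE") {
    // Two-person rule: a live alert needs a second, distinct operator.
    if (!req.verifiedBy || req.verifiedBy === req.initiatedBy) {
      return "BLOCKED: a LIVE alert requires a second operator to verify.";
    }
    // Feedback that appears in no other circumstance, drills included.
    return "AN EMERGENCY ALERT HAS BEEN SENT STATEWIDE. " +
      "If this was in error, press RETRACT now.";
  }
  return "Drill complete. No public alert was sent.";
}

// One-step recovery: the retraction is pre-drafted, not composed under stress.
function retract(): string {
  return "Retraction sent statewide: the previous alert was a FALSE ALARM.";
}

// A routinized operator who types the drill phrase in LIVE mode is stopped:
console.log(sendAlert({ mode: "LIVE", initiatedBy: "op1", typedPhrase: "this is a drill" }));

// A deliberate, verified live send produces unmistakable feedback:
console.log(sendAlert({
  mode: "LIVE",
  initiatedBy: "op1",
  verifiedBy: "op2",
  typedPhrase: "send live alert statewide",
}));
console.log(retract());
```

The particulars matter less than the properties they illustrate: the drill and the live procedure should feel different in the hand, the highest-consequence action should involve two people, and recovery should be a single pre-built step rather than something composed under stress.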

These are all design guidelines that the HCI expert should have on hand to help during development. However, not everything can be foreseen, and so there are other important HCI processes that need to be carried out when developing systems such as these.

Scenarios

If we focus on the button, the screen, and the moment, then we miss the larger picture. This event happened in the course of an activity with many steps, and at each step there were branches for doing something else. Scenario-based design explores all of the things a person might do while interacting with a computer system to accomplish a range of tasks, including all of the things a person might do wrong and all of the paths of recovery. Note that this is not computer programming. This is not coding. This is thinking about what people do, which is the first principle of human-centered design.

Rapid Prototyping

Scenarios can be tested with very basic versions of software, called prototypes. Prototypes can be very simple, even based on paper or cards, and are designed to be modified easily and thrown away when they don’t work. Prototype testing should be done during the design process, using real people who know the tasks (not the programmers or designers), and it should cover all activities, including errors. Although this is also not programming per se, programmers and software engineers should be involved in prototyping, since it will ultimately guide their implementation.

HCI to the Rescue

Again, I do understand that it is always easy to criticize in hindsight. But, when I hear that someone “pushed the wrong button,” I cannot let that stand as an acceptable reason to scare the wits out of a million and a half residents and a quarter of a million visitors to our state. I don’t blame the person who pushed the button, but I do question the designers and developers of a system in which such a thing could happen.

Human-centered design and HCI are often not taken seriously because they add time and cost, they have roots in the social and behavioral sciences that can seem alien to STEM practitioners, and they require interactions between cultures that often find it hard to understand each other. It is not uncommon to release apps, and even larger and more complex programs, in beta versions with the idea that problems will be discovered in the field and fixed in upgrades. But upgrade cycles based on widespread failures in the field are not good practice, and iterative fixes are very different from good design. In application contexts as critical as statewide emergency alerts, poor design is completely unacceptable.

So, my message to the poor Governor and his staff is: By all means proceed with fixes and procedural workarounds to keep this from happening again. But I would also let loose on the emergency alert system some good interaction designers who are knowledgeable about humans and familiar with human-centered principles. They will come to your rescue before you need it.

--

Scott Robertson

Professor and Department Chair, Information and Computer Sciences, University of Hawaii at Manoa. I study HCI, sociotechnical systems, and digital government.