Human-Centered Design and the Missile False Alarm in Hawaii

  1. The accident occurred at a shift change.
  2. During the shift change, there is some kind of routine run-through of a procedure very similar to the actual alert procedure.
  3. In an actual alert, there is a button to click on the screen that will cause the warning to be generated. One person is responsible for pressing this button.
  4. There is a confirmation dialog box that follows the button press.
  5. There is no standardized false alarm alert mechanism.
  1. The shift change. Think to yourself what is happening at a shift change. The activity is simple to describe: one person is leaving and another is taking over their job. From a technical perspective, there will be a bunch of steps for transferring the activities. But think about it more deeply from the human perspective. How do you feel at the end of a shift? Anxious to get going? In a hurry? What’s on your mind? The drive? Picking up the kids? Stopping at the store? This is a classic situation where people are distracted, inattentive, and cognitively overloaded. It is a common place where errors happen.
  2. The routine run-through. A typical shift change involves following a checklist of actions. This sounds like a great idea for making sure that people complete every step, but again think deeply about how humans master routines. The human brain is an automated routine learner, and the point of learning a routine is to free up attention. When you were first learning to drive, every movement was an agonizing effort and all attention was on the placement of your hands, the pressure of your foot on a pedal, where you should be looking, and what the next step was. Once you became an expert, however, you could drive “without thinking.” Now, when you drive, your attention can be elsewhere and you can sometimes complete a journey without remembering much of the drive at all. But this routinization comes with a cost. Have you ever driven to the wrong place because you weren’t paying attention? Your driving was excellent, but the whole process was on automatic and the destination was wrong. And, by the way, the wrong destination was a well-practiced one like home. If the run-through at the shift change was almost exactly the same as the actual process for generating an alert, the chances of triggering an alert by mistake because of routinization were increased.
  3. A button click sends an emergency alert to the entire state. There is a mismatch here between the simplicity of the action and the magnitude of the consequence. While speed is of the essence, there is a tradeoff between simplicity and outcome. There are many ways around this problem; for example, just labeling the button with appropriate wording, including a warning icon, and using a danger color (e.g., red) might be enough. Requiring a confirmation as a second step is another approach, which was used here but was ineffective (we will see why in #4 below). Perhaps the most obvious fix is to require a second person to verify the action.
  4. A confirmation dialog didn’t work. How many times have you seen “Are you sure? Yes, No”? Plenty, no doubt. How many times have you found yourself swearing after pressing “Yes” too hastily? This is one of the most common questions that an IT consultant asks: “Why did you press to confirm?” And the most common and frustrating answer is, “I don’t know.” Well, the reason is the same as #2 above: routinization. When you are on automatic, you don’t see and you don’t think, at least not consciously. Routinization will combine any commonly occurring sequence of actions into a single action without your awareness. That means that pressing the “Send Alert” button (or whatever it is) and pressing the “Yes” confirmation button are not two actions in the human mind, but really just one. After happening together a few times, the two button presses are programmed into a single movement by the brain, and this movement cannot even be stopped once it starts. Hence the befuddlement afterwards: “I don’t know why I did that.”
  5. There is no standardized false alarm mechanism. Everyone makes mistakes. Error recovery is just as important in system design as any other function. This means that designers must anticipate errors and provide ways to recover. The human-centered design literature is full of advice on how to do this. First, anticipate the errors by using what is known about human behavior. The four issues just discussed give you signposts about where errors are likely. Second, design to impede or block errors. If the alert button were a physical button, there might be a flip-open door blocking it. What might the screen version of this blocker be? Third, design to recover from errors. This means two things: tell the person what happened and explain exactly how to recover. In this event, the person who made the error apparently didn’t know what they had done until they received the emergency alert on their own phone. There needs to be immediate feedback that says something like “An emergency alert has been sent statewide,” and this message should not appear in any other circumstance, including simulations or training. After that, there needs to be a very fast recovery mechanism, with instructions, like “Press this button to send a retraction.” In the actual event, the lack of a standard recovery procedure meant that it took almost 30 minutes to send a text explaining that there was a false alarm.
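The two-person verification suggested in #3 can be sketched in code. This is a minimal, hypothetical illustration; the class and method names (TwoPersonAlertControl, send_statewide_alert) are assumptions for this sketch, not part of any real alerting system:

```python
class TwoPersonAlertControl:
    """Hypothetical sketch: a high-consequence action that requires
    confirmation from two *different* operators before it can run."""

    def __init__(self):
        # Track which distinct operators have confirmed.
        self._confirmations = set()

    def confirm(self, operator_id: str) -> None:
        self._confirmations.add(operator_id)

    def send_statewide_alert(self) -> str:
        # The same operator confirming twice is not enough; the set
        # only grows when a second, different person confirms.
        if len(self._confirmations) < 2:
            raise PermissionError(
                "Two different operators must confirm before an alert is sent."
            )
        return "ALERT SENT STATEWIDE"


control = TwoPersonAlertControl()
control.confirm("operator_a")
try:
    control.send_statewide_alert()   # only one confirmation: blocked
except PermissionError as err:
    print(err)

control.confirm("operator_b")
print(control.send_statewide_alert())  # two distinct operators: allowed
```

The design point is that the guard lives in the system, not in any one person's vigilance: a single distracted operator at a shift change cannot complete the action alone.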
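One way to defeat the routinization described in #4 is to replace the "Yes" button with a typed phrase that differs between drills and live alerts, forcing a conscious choice rather than a programmed button press. A minimal sketch, with the mode names and phrases assumed purely for illustration:

```python
def confirm_alert(mode: str, typed_phrase: str) -> bool:
    """Hypothetical confirmation step that resists routinization: the
    operator must type a phrase that is different for drills and live
    alerts, so the muscle-memory drill phrase cannot confirm a live send."""
    required = {
        "drill": "THIS IS A DRILL",
        "live": "THIS IS NOT A DRILL",
    }
    if mode not in required:
        raise ValueError(f"unknown mode: {mode}")
    return typed_phrase.strip().upper() == required[mode]


# The routinized drill phrase no longer confirms a live alert:
print(confirm_alert("live", "this is a drill"))      # False: blocked
print(confirm_alert("live", "THIS IS NOT A DRILL"))  # True: a deliberate act
```

Because the two phrases can never be merged into one automatic movement, the confirmation regains its function as a genuine decision point.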
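The recovery design in #5, immediate and unambiguous feedback plus a one-step retraction, might look like the following sketch. The in-memory log, class name, and message wording are illustrative assumptions, not a description of the actual system:

```python
import datetime


class AlertSystem:
    """Hypothetical sketch of error recovery: every live send produces
    immediate feedback naming what happened, and a single follow-up
    action issues a statewide retraction."""

    def __init__(self):
        self.log = []  # (kind, message, timestamp) records

    def send_alert(self, message: str) -> str:
        self.log.append(("ALERT", message, datetime.datetime.now()))
        # Immediate, unambiguous feedback that never appears in drills,
        # with the recovery instruction attached:
        return ("An emergency alert has been sent statewide. "
                "Press RETRACT to send a retraction.")

    def send_retraction(self) -> str:
        # Recovery is only meaningful right after a live alert.
        if not self.log or self.log[-1][0] != "ALERT":
            raise RuntimeError("No alert to retract.")
        self.log.append(("RETRACTION", "False alarm",
                         datetime.datetime.now()))
        return "A retraction has been sent statewide."


system = AlertSystem()
print(system.send_alert("BALLISTIC MISSILE THREAT INBOUND"))
print(system.send_retraction())
```

The point of the sketch is the pairing: the message that tells the operator what just happened also tells them exactly how to undo it, turning a 30-minute scramble into a single step.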




Professor and Department Chair, Information and Computer Sciences, University of Hawaii at Manoa. I study HCI, sociotechnical systems, and digital government.

Scott Robertson
