Pressing the Wrong Button: How Intentional Design Could Have Prevented the Hawaii Missile Alert
A False Alarm
In the early morning of January 13th, 2018, an emergency alert went out to the 1.4 million residents of Hawaii warning that a ballistic missile was inbound and that everyone should seek immediate shelter.
The message was sent in error.
The Hawaii Emergency Management Agency took 10 minutes to tweet the error — and 38 minutes to send a correction. In that time, cars were abandoned on the interstate, children were placed in manholes, and many people said what they thought would be their last words to loved ones.
The original explanation was that an employee had “hit the wrong button,” triggering the panic, although a more accurate description is that he “selected the wrong item from a dropdown.” While many people called this an egregious mistake by the employee, this was not a user error; this was a design error. All of this could have been prevented with some thought around common use cases for the system and better naming conventions.
In the annotated screenshot provided by the Hawaii Emergency Management Agency, it’s easy to see that the difference between the option to send a ballistic missile drill and a ballistic missile alarm is strikingly small. According to the official account, the user was supposed to select the option circled in yellow, but selected the one in red.
Designing for Safety
These are the problems we think about as designers at GE. GE makes a lot of large, complex industrial equipment, like jet engines and gas turbines. This equipment runs incredibly fast and at very high temperatures. People often think about how user experience and usability design can make software easier to use, but, in some cases, it may actually be better to add friction to an interaction in order to prevent errors like the one that overwhelmed Hawaii.
The goal is not to confuse or frustrate a user (these are not dark patterns) but to promote caution and intention when engaging in actions with severe consequences. Sending out a serious alert should be more difficult than selecting a topping for a pizza or sending a tweet (even one about ballistic missiles).
Take, for example, Blowout Preventers (BOPs). One of these machines is infamous for its role in the Deepwater Horizon oil spill, which claimed the lives of eleven people and caused massive environmental damage in the Gulf of Mexico. The spill was blamed in part on the failure of a (non-GE) BOP that was unable to contain a huge pressure spike and prevent disaster. It’s important to note that BOPs are not automatic systems; they require manual intervention by a dedicated worker.
The Human Machine Interface (HMI) on a GE BOP is mostly devoted to the “mimic,” an abstract digital representation of the BOP, with all its valves and sensor data. The three emergency measures, from shutting off valves to shearing the pipe, are represented as small red buttons on a black background at the bottom of the page, and are ordered in severity from left to right.
To execute one of these measures, a trained driller selects an option and must confirm it with a two-handed keyboard sequence. The sequence is deliberate enough that an operator can’t perform an action unintentionally, yet fast enough that it doesn’t impose a time-consuming confirmation process that could put lives at risk during an emergency.
Preventing Destructive Errors
Even without a complete view of the emergency alert application used in Hawaii, we can make some suggestions. An ideal redesign would put live alerts in their own section, separate from everything else in the app. GitHub does this with the “Danger Zone”: a special section at the bottom of a repository’s settings that collects all destructive actions and requires the user to type the name of the repository before proceeding. That’s far more compelling than reading (or, more often, ignoring) a simple confirmation popup, and it forces the user to verify that she is working in the intended space.
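The typed-name confirmation pattern can be sketched in a few lines. This is an illustrative sketch only; the class, names, and error handling are assumptions, not GitHub’s actual implementation:

```python
class DangerZoneAction:
    """Guard a destructive operation behind a typed-name confirmation,
    in the spirit of GitHub's "Danger Zone". Illustrative sketch only."""

    def __init__(self, resource_name, action):
        self.resource_name = resource_name  # e.g. the repository name
        self.action = action                # callable performing the destructive work

    def execute(self, typed_confirmation):
        # Require an exact, case-sensitive match of the resource name,
        # so the user must both read the prompt and intend this target.
        if typed_confirmation != self.resource_name:
            raise PermissionError(
                f"Confirmation {typed_confirmation!r} does not match "
                f"{self.resource_name!r}; action aborted."
            )
        return self.action()
```

The friction here is proportional to the stakes: typing a name takes seconds, but it cannot be done by reflex the way clicking “Yes” can.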
Other ideas sound helpful but could be problematic, such as redesigning the confirmation screen. A confirmation only matters if the user recognizes that he has chosen the wrong option, and most people dismiss a yes/no popup dialog without reading it. This tendency even has a name: acceptance fatigue. A simple confirmation is not enough to verify intent on the part of the user.
A second operator could be required to confirm a real alert, but what if that second operator is temporarily indisposed? This approach fails the test of speed in an actual emergency, when seconds matter.
The two-handed keyboard sequence for performing an emergency measure on a BOP works so well because:
- It requires intention; it’s a complex action that the user cannot perform arbitrarily.
- It’s fast; a trained operator can perform the operation within seconds in an actual emergency.
Recognizing that changing emergency management software is probably a long and difficult process, there are still minor changes that could be made today to prevent these serious errors. Separate the test alerts from the real ones and put a dummy option as a spacer between the two. Adopt a naming convention, such as prefixing every test alert with the word “Test.” Consistent capitalization, where only actual alerts are capitalized, would reinforce the difference further and add a third level of visual distinction. This is all a good start and, implemented thoughtfully, could help prevent future false alarms.
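Those three conventions can be combined in one place where the menu is built, so no individual operator has to remember them. A minimal sketch, assuming a simple string-based dropdown; the function and option names are hypothetical:

```python
def build_alert_menu(real_alerts, test_alerts):
    """Build a dropdown where live alerts and drills are distinct three ways:
    grouping (live first, tests last), an inert spacer between the groups,
    a "Test" prefix on drills, and all-caps reserved for live alerts."""
    menu = [alert.upper() for alert in real_alerts]        # live alerts: ALL CAPS
    menu.append("--------")                                 # dummy spacer, not selectable
    menu += [f"Test {alert.lower()}" for alert in test_alerts]
    return menu
```

With this in place, the two entries that were nearly identical in the Hawaii screenshot would instead read “BALLISTIC MISSILE ALERT” and “Test ballistic missile alert,” separated by a spacer.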
When the stakes aren’t high, it can be easy to ignore such use cases, but at GE we design for these scenarios every day. We’ve learned over time what the false alarm in Hawaii has acutely illustrated: sometimes the more valuable design decision is to prevent mistakes by making high-consequence actions more difficult, ensuring that the user is acting thoughtfully and with intention. With that same intention on the part of the designer, we can hopefully prevent these kinds of scenarios from happening again.
Special thanks to Tali Marcus and Matt Jones for their comments and critiques.