Hawaii’s false ballistic missile alert was a failure in user experience design

This past Saturday, many Hawaii residents woke up to a fairly unusual text alert.

“BALLISTIC MISSILE THREAT INBOUND TO HAWAII. SEEK IMMEDIATE SHELTER. THIS IS NOT A DRILL.”

As you might expect, this caused a fair amount of concern for many Hawaii residents. Those who saw the alert were left to wonder for a full 38 minutes whether they were about to be attacked, before a second text alert clarified that the first had been a false alarm.

There’s been a lot of coverage of the event and the ensuing fallout, but I think the most interesting part of this story is the question of how it was possible for something like this to happen in the first place.

The Hawaii Emergency Management Agency’s early explanation for the mistake was (roughly) that, during a shift change, an employee pressed the wrong button, sending out an actual alert of an incoming missile instead of the intended routine test.

The term “human error” was invoked in explanations of the incident given by various government officials and the agency itself, but to a user experience designer the term “human error” sometimes hints at deeper, more systemic problems. Often, errors seemingly caused by people are actually facilitated, or outright caused, by poorly designed systems.

In his book The Design of Everyday Things, user experience expert Don Norman recounts an anecdote about a time he was called upon to investigate the cause of the Three Mile Island nuclear power plant incident. He says:

“The operators were blamed for these failures: “human error” was the immediate analysis. But the committee I was on discovered that the plant’s control rooms were so poorly designed that error was inevitable: design was at fault, not the operators.”

He goes on to conclude that designers must understand both technology and psychology and that machines should be designed with the assumption that people will make errors and behave imperfectly.

This rings true for the system used by the Emergency Management Agency. One of the gaps quickly noted was that the system included neither a mechanism for recovering from an incorrectly issued notice nor a way to easily and quickly send a follow-up clarification. Such precautions could have drastically reduced the chance of inciting panic, and the need for them could have been anticipated if the software’s designers had accounted for the possibility of users making mistakes.
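To make that concrete, here’s a minimal sketch of what a first-class retraction path might look like, written in TypeScript. Everything here, from the AlertGateway interface to the wording of the follow-up message, is an assumption for illustration; it doesn’t reflect the agency’s actual software.

```typescript
// Hypothetical sketch, not the actual HI-EMA system: a service where
// retracting a mistaken alert is a built-in, one-step operation.

interface AlertGateway {
  broadcast(message: string): Promise<void>;
}

class AlertService {
  // The last alert is remembered so a correction can reference it.
  private lastSent: { message: string; sentAt: Date } | null = null;

  constructor(private gateway: AlertGateway) {}

  async send(message: string): Promise<void> {
    await this.gateway.broadcast(message);
    this.lastSent = { message, sentAt: new Date() };
  }

  // One-step retraction: an operator under pressure should never have
  // to compose a correction from scratch through ad hoc channels.
  async retractLast(): Promise<void> {
    if (!this.lastSent) {
      throw new Error("No previously sent alert to retract");
    }
    await this.gateway.broadcast(
      "FALSE ALARM. The previous alert was issued in error. " +
        "There is no threat."
    );
  }
}
```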

It’s also possible that a better-designed interface could have decreased the chance of the mistake occurring in the first place. An article by The Washington Post offers a few additional details about exactly how the error was triggered:

“Around 8:05 a.m., the Hawaii emergency employee initiated the internal test, according to a timeline released by the state. From a drop-down menu on a computer program, he saw two options: “Test missile alert” and “Missile alert.” He was supposed to choose the former; as much of the world now knows, he chose the latter, an initiation of a real-life missile alert.”

Though it’s hard to tell with certainty from the limited information available, this description doesn’t exactly paint a picture of a perfectly usable interface. Having the two options so close together seems poorly considered; at a minimum, separating and visually differentiating them could have greatly decreased the likelihood of the improper selection in the first place.
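One common pattern for this kind of separation is to keep the test path lightweight while making the live path demand a deliberate, typed confirmation, much like tools that ask you to type a name before deleting something important. A hypothetical sketch (the function, the phrase, and the messages are all invented for illustration):

```typescript
// Hypothetical sketch: the test path is one click, but the live path
// requires an explicit typed phrase that a mis-click cannot produce.

type AlertKind = "TEST" | "LIVE";

function issueAlert(kind: AlertKind, typedPhrase = ""): string {
  if (kind === "TEST") {
    return "Internal test dispatched (no public message sent).";
  }
  // Selecting "LIVE" from a menu is not treated as intent on its own.
  if (typedPhrase.trim() !== "SEND REAL ALERT") {
    throw new Error("Live alert blocked: confirmation phrase missing or incorrect.");
  }
  return "LIVE alert dispatched to the public.";
}

console.log(issueAlert("TEST")); // the safe, routine path
try {
  issueAlert("LIVE"); // a bare menu selection is not enough
} catch (err) {
  console.log((err as Error).message);
}
console.log(issueAlert("LIVE", "SEND REAL ALERT")); // deliberate and unambiguous
```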

UPDATE: The following image was released by the Honolulu Civil Beat on Twitter. According to a report from The Verge, the image was described by the public information officer for Hawaii’s Emergency Management Agency as an acceptable representation of the system. Do note, however, that the actual image of the entire screen could not be released for security reasons. It’s unclear whether the reference to a “drop-down” was hyperbolic or simply not represented in this image. You can judge the visual hierarchy for yourself:

“PACOM (CDW)-STATE ONLY” and “DRILL - PACOM (CDW) - STATE ONLY” are the options used to trigger the real and test alerts, respectively.

More details might unearth similar issues. Why was clicking through a single confirmation pop-up considered enough? Why was sending a missile alert tied to a shift change rather than performed as an action during a shift? Did the system make it immediately clear to the user that they had just sent out a real alert? Did the alert send immediately, or was there a window in which to cancel it?
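On that last question, a short cancellable delay before anything actually broadcasts is a cheap safeguard. A rough sketch of the idea, with the ten-second window chosen arbitrarily:

```typescript
// Hypothetical sketch: queue the broadcast, show the operator a
// countdown, and only send if nobody aborts before the window closes.

function sendWithCancelWindow(
  broadcast: () => void,
  windowMs = 10_000
): { cancel: () => boolean } {
  let done = false;
  const timer = setTimeout(() => {
    done = true;
    broadcast();
  }, windowMs);

  return {
    // Returns true only if the cancellation arrived before the broadcast.
    cancel: () => {
      if (done) return false;
      done = true;
      clearTimeout(timer);
      return true;
    },
  };
}

const pending = sendWithCancelWindow(() => console.log("Alert broadcast."));
console.log(pending.cancel()); // true: cancelled in time, nothing went out
```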

These questions and issues are best considered before a problem occurs, and other agencies and organizations should take note. Too often, design is seen as important only for consumer products and ignored for anything internal; in some sectors it isn’t uncommon to see software built without any designers at all. However, as long as the end users of the systems we make are humans, there will always be a need for human-centered design processes. Proper design, informed by testing with real users, can go a long way toward helping even trained experts be effective and avoid or recover from mistakes.

In the last few days, the Hawaii Emergency Management Agency and the other government parties involved have already corrected some of the more obvious flaws in their system, including updated functionality for correcting errors and a requirement that any alert receive sign-off from at least two individuals.
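Here’s one way such a two-person rule might be enforced in software. The operator IDs and the threshold of two are assumptions for illustration; the agency hasn’t published implementation details.

```typescript
// Hypothetical sketch: an alert is only releasable after two distinct
// operators approve it. A Set makes duplicate approvals harmless.

class TwoPersonAlert {
  private approvals = new Set<string>();

  constructor(readonly message: string, private required = 2) {}

  approve(operatorId: string): boolean {
    this.approvals.add(operatorId);
    return this.isAuthorized();
  }

  isAuthorized(): boolean {
    return this.approvals.size >= this.required;
  }
}

const pendingAlert = new TwoPersonAlert("This is a test of the alert system.");
console.log(pendingAlert.approve("operator-1")); // false: needs a second person
console.log(pendingAlert.approve("operator-1")); // false: the same person twice doesn't count
console.log(pendingAlert.approve("operator-2")); // true: now authorized to send
```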

We’ve yet to hear the full analysis of the issues and the changes that are planned, but here’s hoping the investigative teams include at least a few UX design and research professionals.