Hawaii Missile Alert: Design’s Value Conversation

Cliff Seal
Jan 24, 2018

I’m sure you didn’t hear about the accidental missile alert sent to Hawaii residents recently. I’m definitely sure you didn’t read any hot takes on Twitter about it being bad interface design.

The detailed, insightful, and critical takes by experts in the field are great reading for understanding what went awry. But I think we also need to examine how this could happen to our own products and our own users.

We, as design professionals, need to look at how our own users take the heat for our lack of design quality. We need to be able to negotiate for it and see it through.

Design’s value goes deeper than Preventing Crappy UI™.

Design’s value is empowered people.

Let’s be clear: this is bad design.

System screen used to send missile alerts, courtesy of the Hawaii Governor’s office. Source: Honolulu Civil Beat

Of course.

But, bad design can be enabled through systematic devaluation and deprioritization of design competence.

“We don’t need a designer to do this simple interface.”
“The users will be trained.”
“The client isn’t paying for research.”

It’s highly unlikely that whoever wrote the code or made interface decisions was also the person who decided the project was “done”.

Instead, delivery of software is much more like a negotiation between the project manager and the client:

“Does this meet your needs?”
“Not yet, we need these changes.”
“Done. Anything else?”
“Nope. Send the invoice.”

Or whatever.

Even if the responsible people took the unlikely step of deeply reviewing and critiquing the design decisions of this interface—and even if there was validation testing that came back positive—it would still take a relatively seasoned UX-er to know that testing a design outside the scenario of AN INBOUND MISSILE HEADED FOR THE PERSON PUSHING THE ALERT could never represent what would really happen.

There’s no way to replicate that terrifying scenario.

That matters because the same limitation applies in the opposite direction, which is what happened here: testing extremely routine tasks. It’s difficult to test user interaction once the task has become “muscle memory”.

This is important because it sheds light on the menu and confirmation components of the design.

“…the employee was asked in the computer program to confirm that he wanted to send the message.”

Sure. But unless the confirmation dialog said something like…

THIS IS NOT A TEST. You are about to send the message ‘BALLISTIC MISSILE THREAT INBOUND TO HAWAII. SEEK IMMEDIATE SHELTER. THIS IS NOT A DRILL.’ to all citizens immediately.

Type I UNDERSTAND FULLY to confirm.

…then accidentally choosing the wrong dropdown menu item and clicking through the confirmation is entirely plausible. Sure, it could have been improved:

  1. Better confirmation
  2. Separation of test and non-test options
  3. Turn the screen red in non-test mode
  4. Make action undoable for 10 seconds
  5. Generally following usability heuristics ¯\_(ツ)_/¯
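To make a couple of those ideas concrete, here’s a minimal sketch of #1 and #4: a confirmation that can’t be clicked through on autopilot because it requires typing an explicit phrase, and an action that remains cancelable for a grace period. The names here (`confirm_live_alert`, `UndoableAction`) are hypothetical illustrations, not anything from the real alert system.

```python
import threading


def confirm_live_alert(typed: str, phrase: str = "I UNDERSTAND FULLY") -> bool:
    """A confirmation the operator must *type*, not just click —
    a mismatch means no alert goes out (idea #1 above)."""
    return typed.strip() == phrase


class UndoableAction:
    """Schedules `action` to run after `delay` seconds; calling
    cancel() during that grace period aborts it (idea #4 above)."""

    def __init__(self, action, delay: float = 10.0):
        self._timer = threading.Timer(delay, action)
        self._timer.start()

    def cancel(self) -> None:
        self._timer.cancel()
```

The point isn’t the code itself: it’s that both patterns deliberately add friction and reversibility exactly where a mistake would be catastrophic, while leaving routine tasks untouched.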

That’s always clear in hindsight. But, what else happens when a mistake like this is made?

What is the potential impact of terrible design decisions beyond the mistakes they cause?

“The cause of the false alarm was human error.”

“[FCC] Chairman Ajit Pai called the false alert ‘absolutely unacceptable’”.

(🙄 at Ajit Pai telling us what’s “absolutely unacceptable” right now.)

I digress: what else happens because of poor design and a lack of relevant accountability?

The person who pushes the button gets blamed.

I’ve not built a button that alerts citizens of missiles, but I’ve built ones that let people send hundreds of thousands of emails to their customers. Sometimes, these smart people make mistakes like the rest of us and send a broken, expired, or incorrect email to all those customers. Sometimes they just hit send on a mistake that someone else made.

Who gets blamed?

The person who pushes the button gets blamed.

When design competence is pushed aside, the blame is shifted to the user—people who are just trying to do their jobs. People who may very well have been trained ad nauseam on this exact interface and still managed to make a mistake because the experience betrayed them.

You can say they should have paid more attention, but you have no idea how many actions you take on autopilot every day. You don’t notice being on autopilot because things don’t go wrong.

(Until they do.)

That’s why design matters: because people matter.

This employee is likely traumatized by the pain they accidentally caused to millions of people. They will continue to be blamed by some, possibly forever. That person matters. Their life could have been different with competent design.

Hawaii’s Emergency Management Agency has already decided to require a second person to confirm an alert and to add a cancellation option to its system. Those are fine steps.

Thing is: that mistake would likely never happen again, because that task will never be done without full focus by anyone going forward. The trauma of that event will bring that simple interaction into fully-focused consciousness for every user, forever.

Instead, this should catalyze more thoughtful design in systems of scale. It should move us to review our most consequential interactions to value people more.

“Make the right thing easy and the wrong thing hard.”

In other words: tests should have been easy; state-wide warnings should have been hard. But that’s easy for me to say now. Here’s what we can do instead:

If design is your job, take “empathy” out of fluffy-hypothetical-theory land and attach it to your logical brain.

A silly form of empathy would have been asking: how happy was the user after they successfully sent a test?

Fine, but let’s apply it more deeply, instead.

Applied empathy would have been: how can we give the user 100% confidence — at all points in the process — that they are either sending a test or sending a very consequential non-test? How can we give them the tools they need to fix their mistake?

Empathy is useful when we use it as a way to understand the processes of human beings.

It’s nice if they feel happy about an interface they have to use every day.

It’s crucial that they NOT BE BLAMED FOR TERRIFYING MILLIONS OF PEOPLE BY ACCIDENT.

If you’re not a designer, but work in some form of software (especially enterprise applications), start surfacing the value of UX competency now. Use this terribly unfortunate case study as a way to explain that value. Apply it to your product’s consequential interactions.

In other words: stop saying you want your app to “look nice like this other app”, and start saying you want your users to have the confidence of 1000 suns (whatever that means) at all times.

Advocate for design’s true value: empowered people. It’s a rising tide when it matters to an organization.

We must elevate design to a place in which we, as trained, dedicated professionals, can reduce potential suffering by delivering user confidence.

Perhaps then, we can focus on the conversations we really ought to be having instead.


Cliff Seal

Husband, Principal Designer at Salesforce, co-host of TuneDig. Into music, impactful design, bikes, and trying to be awake.