Usability Heuristics in Critical Systems — #5 Error Prevention

Sara Carvalho
UxD Critical Software
5 min read · Jul 6, 2021

Discover how usability can help define and prevent errors in critical systems.

In the last article, we discussed the fourth heuristic — consistency and standards — the importance of achieving consistency across devices and following standard patterns and platform conventions.

This time, we’ll focus on the fifth heuristic — error prevention — and try to understand the reason for this heuristic being so important to our products and services.

#5 Error prevention

Photo by Sigmund on Unsplash

Even better than notifying users when something is wrong is preventing it from going wrong in the first place. In critical industries, it is essential that products and services are error-proof: built in a way that ensures users cannot make potentially fatal mistakes.

Action, speech, perception, recall, recognition, judgement, problem solving, decision making, concept formation… All these mental activities are prone to error. The study of cognitive psychology and a better comprehension of the mental processes involved in using critical systems have helped prevent and reduce injuries and fatalities. Yet there is more to be considered — including the value of mental models of our users and context of use.

“People make mental models through experience, training and instructions.”
Don Norman

Slips and mistakes are the two types of error users can make. A series of planned actions might fail to achieve a goal because the actions didn’t go as planned; these are slips. Slips occur when users aren’t fully aware of what they are doing — unconscious errors caused by lapses of attention, often while the user is in ‘autopilot’ mode. Take cooking as an example: adding sugar to the frying pan instead of salt is a slip.

Mistakes occur when a user’s plan is inappropriate for the situation. People incorrectly interpret what they are seeing and may think they are acting in the correct way, when in fact they aren’t. This can happen when the mental model of the user doesn’t match the interface.

Why do slips and mistakes happen?
People have limited short-term memory. They aren’t built to remember minute details, like whether their TV’s model number ends in 1A or 2B. It’s important to give people options that help them recognize these details rather than recall them. This will reduce the probability of error.
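The recognition-over-recall idea can be sketched in code. As a hypothetical illustration (the function and model names are invented for this example), a picker that filters known options as the user types asks them to recognize a model rather than recall it exactly:

```python
def pick_model(user_input: str, known_models: list[str]) -> list[str]:
    """Hypothetical helper: narrow a list of known models as the user types.

    Offering matches to recognize beats demanding exact recall of '1A' vs '2B'.
    """
    query = user_input.strip().lower()
    # Show everything when there is no input yet; filter as the user types.
    if not query:
        return known_models
    return [m for m in known_models if query in m.lower()]

models = ["TV-4000-1A", "TV-4000-2B", "TV-5000-1A"]
# The user only remembers "4000"; recognition does the rest.
print(pick_model("4000", models))  # → ['TV-4000-1A', 'TV-4000-2B']
```

The same pattern underlies autocomplete fields and dropdowns: the interface carries the memory burden instead of the user.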

People also have limited attention spans, so designers shouldn’t ask them to keep track of multiple notifications at once.

Decision-making ability decreases as stress levels increase, so designers should make their users’ lives as easy as possible.

“Knowledge and error flow from the same mental sources; only success can tell the one from the other.”
Ernst Mach, 1905

How to avoid slips and mistakes
1. Establish good constraints and defaults
2. Support undo functionality
3. Warn your users before destructive actions can happen
4. Research and study your users’ mental models
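The first three guidelines above can be sketched together. This is a hypothetical illustration (the class and method names are invented for this example) of a deletion flow that enforces a constraint, warns before the destructive action, and keeps an undo path open:

```python
from dataclasses import dataclass, field


@dataclass
class DocumentStore:
    """Hypothetical store illustrating constraints, confirmation, and undo."""
    docs: dict = field(default_factory=dict)
    trash: dict = field(default_factory=dict)

    def delete(self, doc_id: str, confirmed: bool = False) -> str:
        # Constraint: refuse to act on unknown ids instead of failing silently.
        if doc_id not in self.docs:
            return f"No document named '{doc_id}'."
        # Warn before destructive actions: require explicit confirmation.
        if not confirmed:
            return f"Delete '{doc_id}'? Call delete('{doc_id}', confirmed=True) to proceed."
        # Support undo: move to trash rather than destroying data outright.
        self.trash[doc_id] = self.docs.pop(doc_id)
        return f"'{doc_id}' moved to trash. Use undo() to restore."

    def undo(self) -> str:
        if not self.trash:
            return "Nothing to undo."
        doc_id, doc = self.trash.popitem()
        self.docs[doc_id] = doc
        return f"'{doc_id}' restored."
```

None of this replaces studying users’ mental models (guideline 4), but it shows how the other three translate into small, concrete decisions at the code level.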

Examples in everyday life

Lawnmower

A lawnmower prevents dangerous situations by switching off when the person handling it loses their grip. Imagine you have a hilly garden: if you let go of the lawnmower, it would likely roll downhill of its own accord.
Someone rings the doorbell, and you must stop what you are doing. If the lawnmower kept running without your supervision, things could go very wrong, very quickly!

Error prevention in critical systems

Therac-25

Intro
The Therac-25 was a medical particle accelerator, a device that increased the energy of charged atomic particles. In 1982, this radiation therapy machine was used to treat cancer patients. The patients were exposed to beams of particles, or radiation, in doses designed to kill malignant tumors.

The problem
This machine caused six accidents resulting in death or serious injury. Patients received radiation overdoses: the high-current electron beam struck them with approximately 100 times the intended dose. Previous models had hardware interlocks to prevent such faults, but these were removed from the Therac-25, which depended instead on software checks for safety.

What happened?
In several ongoing treatments, the machine paused and showed a terse error message — “Malfunction” followed by a code number. However, the corresponding errors weren’t documented in the user guide.
To restart the machine, the operator only needed to press the “P” key, a single-key command, and all treatment specifications remained intact. This pause-and-proceed cycle could repeat up to five times; only then would the machine stop operating. All the while, the machine indicated that no radiation dose had been administered to the patient, even though it had.

If the machine was displaying an error code, why let the user proceed so easily? Why was the error a number, and not a message explaining the malfunction in progress?
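The contrast becomes concrete in code. This is a simplified, hypothetical sketch (not the actual Therac software; class, method, and message names are invented) of an interlock that treats a fault as a hard stop, explains it in plain language, and cannot be cleared with a single keypress:

```python
class TreatmentInterlock:
    """Hypothetical software interlock: a fault halts treatment until reset.

    Unlike the Therac-25's single-key 'P' resume, a fault here cannot be
    cleared by the operator alone, and the message explains the problem.
    """
    # Invented error table: codes mapped to human-readable explanations.
    ERROR_MESSAGES = {
        54: "Delivered dose does not match prescribed dose. Do not resume.",
    }

    def __init__(self):
        self.locked = False

    def report_fault(self, code: int) -> str:
        # A single fault locks the machine; there is no retry counter to exhaust.
        self.locked = True
        detail = self.ERROR_MESSAGES.get(code, "Unknown fault.")
        return f"TREATMENT HALTED (fault {code}): {detail}"

    def start_treatment(self) -> str:
        if self.locked:
            return "Interlock engaged. Supervisor reset required."
        return "Treatment started."

    def supervisor_reset(self, authorization: str) -> str:
        # Resuming requires a deliberate, authorized action, not one keypress.
        if authorization != "SUPERVISOR":
            return "Reset denied."
        self.locked = False
        return "Interlock cleared after review."
```

The design choice here is the one the Therac-25 got wrong: the cost of resuming after a fault is deliberately higher than the cost of stopping.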

Final thoughts

If we design with the user in mind, we can reduce many of these errors and improve usability.
First, designers should help the user build a good mental model of the product. To do this, they should make the available functions as obvious and transparent as possible. When necessary, provide clear and easy-to-follow instructions. When the user interacts with the system, designers must ensure they are provided with informative feedback, so they can judge when their model needs to be adjusted.

Usability is critical in critical systems: as the case of the Therac-25 shows, errors can result in fatalities. Designers should do as much as possible to ensure users have a complete understanding of what’s going on at all times, preventing mistakes before they occur.
