Bad design costs lives.

Dr Cosima Gretton
Design No Harm
5 min read · Jan 30, 2019

Product design is patient safety.

When medical devices are well designed, there is a certain magic. The surgical gown is a beautiful example of human-centred design done well.

Patented in 1973, it is folded in a way that allows the surgeon to open, unfold and slide the gown on without contaminating the sterile exterior. The surgeon washes her hands, then opens the sterile packet containing the gown, folded neatly into a tight square, and spreads it out on a table.

US3721999A: Surgical gown and method of folding

Two folds expose the inside of the gown, into which she inserts her hands. With a single sweeping motion, opening her arms upwards and outwards, gravity unfurls the cloth over her shoulders and down to her feet.

She pulls on her gloves while a nurse takes the piece of paper attached to the strap. The nurse circles the surgeon and hands it back to her to tie the gown around her waist.

The ergonomics of human movement and the physics of cloth unfolding over limbs have been carefully considered to create a design that makes it easy for the user to do the right thing: get the gown on and stay sterile.

It is hard to quantify the impact that this simple folding technique has had on mortality rates during surgery. But not all technologies in healthcare have been created with such care and attention.

Humans first.

Over the last two decades, human-centred design, an approach that starts with the person and designs solutions to closely fit their needs, took the consumer product industry by storm. In aviation, the industry transformed safety through the study of human factors: how humans behave physically and psychologically in relation to environments, products, or services.

But medicine has lagged behind. Healthcare is littered with poorly designed products and confusing interfaces that cause patient harm on a daily basis.

Recent events in the UK highlight the extent of this problem: an inquiry has discovered that over the last 20 years, tens of thousands of patients may have died prematurely as doctors and nurses confused two morphine syringe drivers. Similarly packaged, one delivered morphine at a rate set in millimetres of plunger travel per 24 hours, the other in millimetres per hour. Select the wrong one, and the patient would get a 24-hour dose in the space of an hour.

Despite hazard warnings as far back as 1994, and the discontinuation of the device by Australia and New Zealand in the late 2000s, the UK’s National Health Service continued to purchase the device.
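The arithmetic of the mix-up is simple, which is what makes it so dangerous. A minimal sketch (my illustration, not a model of the actual devices; the rate and concentration figures are hypothetical):

```python
# Two syringe drivers with near-identical packaging but different rate
# units: one advances the plunger N mm per 24 HOURS, the other N mm per
# HOUR. Same dial setting, 24x the delivery rate.

def dose_delivered(rate_mm: float, per_hours: float, duration_hours: float,
                   mg_per_mm: float) -> float:
    """Morphine (mg) delivered over `duration_hours`, where the driver
    advances the plunger `rate_mm` millimetres every `per_hours` hours."""
    mm_advanced = rate_mm * duration_hours / per_hours
    return mm_advanced * mg_per_mm

# A clinician intends 48 mm of plunger travel over a full day
# (hypothetically 2 mg of morphine per mm of travel).
intended = dose_delivered(rate_mm=48, per_hours=24, duration_hours=24, mg_per_mm=2)

# The same setting keyed into the mm-per-HOUR device: the entire day's
# dose arrives in the first hour alone, and the pump keeps running.
first_hour = dose_delivered(rate_mm=48, per_hours=1, duration_hours=1, mg_per_mm=2)

print(intended)    # 96.0 mg spread over 24 h, as intended
print(first_hour)  # 96.0 mg in hour one: a 24x overdose rate
```

The hazard lives entirely in the unit of the rate, a distinction carried only by the device's labelling, which is exactly where the two drivers looked alike.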

But beyond this example, just how big is the problem? And why, in 2019, have we yet to find a solution?

This is the first in a new series on bad design in healthcare: its impact on patients' lives, why it persists, and what to do about it.

As new technologies such as machine learning algorithms take a more prominent role in high-risk clinical decision spaces, understanding how humans respond to and make decisions using technology will become ever more critical.

In each post, we’ll explore stories of clinicians and patients whose lives were impacted by poor product design, scrutinise the human factors involved and propose design solutions for the industry.

Errors under the radar.

But just how many medical errors are actually caused by bad product design? It’s hard to quantify, and that is part of the problem.

In the USA, estimates of annual deaths due to medical error range from 44,000 to 251,454. At the time of the study, the higher estimate placed medical error as the third leading cause of death in the USA. There’s a lot of debate around these numbers. The lack of clarity only highlights the fact that when it comes to medical error, we really have no idea.

How much of that is due to technology? It depends. If we take electronic prescribing as an example, the proportion is significant. A study of errors using prescribing software sold by two incumbents, CSC and Cerner, found that the technology was responsible for 42% of prescribing errors across three major Australian hospitals. Another study found that of prescribing errors where technology is at fault, 98% are socio-technological errors: errors due to the interaction of the human and the machine. In other words, bad user interface design.

Complex systems.

Part of the reason why it is so hard to measure is the sheer scale, complexity and variability of human-computer interactions that occur across healthcare.

Many ask why the industry can’t be more like aviation, which over the last 40 years has doubled down on human factors engineering to vastly improve safety.

But when hundreds of people die in an aviation accident, the incident is discrete, encapsulated and significant. One very public event, a lot of lives impacted, a single standardised human-machine interaction that went wrong. There is a strong driver to conduct a root cause analysis and the problem space is neatly constrained.

Probably the closest comparison to aviation in healthcare is anaesthetics. The anaesthetist needs to put the patient to sleep, keep her alive during the procedure, and wake her up at the end.

The majority of healthcare, however, is what is known as a ‘complex system’: a system whose behaviour is intrinsically difficult to model due to, for example, interactions, dependencies, feedback loops or perverse incentives.

There are infinite small points of failure that are different in each hospital, department, ward and team, but also change dynamically day-to-day. Patient harm occurs when, as James Reason so neatly described it in the 90s, multiple errors line up like ‘holes in Swiss cheese.’ Some failure points only appear temporarily. Those that persist are often seen as small and unassuming, so no one makes the effort to fix them. Being complex, the combinatorics of these failure points across every situation is impossible to characterise. That is the challenge both for measuring the impact of bad design in healthcare and for developing solutions.
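The Swiss cheese intuition can be made concrete with a toy simulation (my illustration, with made-up probabilities): harm reaches the patient only when every defensive layer's "hole" lines up on the same case. In a real hospital the layers are neither fixed, independent, nor even countable, which is precisely the article's point, but the idealised version shows why individually unassuming failure points still matter.

```python
# Toy Monte Carlo sketch of James Reason's Swiss cheese model: each
# defensive layer (prescriber check, pharmacy check, nurse check, device
# alarm) independently fails with some small probability; harm occurs
# only when ALL layers fail on the same case.

import random

def p_harm(layer_fail_probs, trials=100_000, seed=42):
    """Estimate the probability that every layer fails simultaneously."""
    rng = random.Random(seed)
    harms = sum(
        all(rng.random() < p for p in layer_fail_probs)
        for _ in range(trials)
    )
    return harms / trials

# Four independent defences, each failing 10% of the time: analytically
# the holes all align with probability 0.1 ** 4 = 0.0001, i.e. 1 in
# 10,000 cases -- rare enough that each near-miss looks like a one-off.
est = p_harm([0.1, 0.1, 0.1, 0.1])
print(f"estimated harm rate: {est:.5f}")
```

Remove or weaken one layer (say, a confusing device interface that makes the final check unreliable) and the harm rate jumps by an order of magnitude, even though every remaining layer is unchanged.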

Unlike in aviation, the impact of each error is perceived as small and local. Those involved are unlikely to be aware of the same error occurring at other hospitals, and any solutions they develop are not propagated across the industry.

In the new era of digital medicine, this is the challenge for clinicians and product designers. Human factors must be a top priority. We need better reporting systems, better product design, and better legal protection for the users of the devices we make. As software begins to take on more clinical responsibility for decision-making, how humans interact with computers will directly impact patient outcomes.



Medical doctor | Product Lead | Ex-Test & Trace | Founder @AXNSCollective