Should Hospitals Be More Like Airplanes?
“Alarm fatigue” at Pablo Garcia’s hospital sent him into a medical crisis. The aviation industry has faced the same problem—and solved it.
When Pablo Garcia was admitted to the hospital in July 2013, he was 16 years old, a tenth grader at a high school in Stockton, California. He hoped to be an auto mechanic one day. At about 85 pounds, he was quite small for his age, a consequence of his immune disease, NEMO syndrome, and the cruel havoc it had wreaked on his digestive system.
Stockton is a two-hour drive from San Francisco, but with its depressed, farm-based economy and its high crime rate, it’s a world away from the sparkling City by the Bay. While Pablo has a primary care doctor in Stockton, the city lacks the resources and specialists one finds at a prestigious research and teaching institution like UCSF, so he’s been coming to San Francisco for care since he was a small child.
Pablo’s mother, Blanca, is fiercely protective of her four children, especially Pablo and the younger Tomás, both of whom have NEMO syndrome. Her two sons are constantly battling infections — sometimes painful skin infections that weep, itch and blister; other times pneumonias that cause her children to cough and gasp for air. Their digestive systems are never normal. There may be diarrhea one week, nausea the next, and bleeding the week after that. They are malnourished; Tomás must receive his nutrition through a tube threaded into his small intestine. Whenever Pablo or Tomás is in the hospital, Blanca plants herself in the room, partly to lend support, but also to be a final set of eyes and ears. Hospitals, she knows, can be dangerous places.
As luck would have it, on the night of July 26, Pablo and Tomás were both hospitalized at UCSF Medical Center. Since Tomás was the sicker of the two children, Blanca decided to spend the evening in his room, one floor up from Pablo’s. But for that sad coincidence, she would have been by Pablo’s bedside when Brooke Levitt came in with an anomalous dose of Septra, the routine antibiotic he takes, and undoubtedly would have all but tackled the nurse before she could administer the 38 1/2 pills. She still feels a bit guilty that she wasn’t there, because no one knew better than she did that Pablo was supposed to take only a single pill.
The overdose triggered a grand mal seizure and Pablo stopped breathing. Within a minute, however, the Code Blue team arrived and was able to revive him from his brief period of apnea. Even in a place like UCSF, a Code Blue is a rough, chaotic blur. Pablo’s mother watched in horror as a half-dozen doctors, nurses and pharmacists stormed into the room, ignoring her as they methodically went about their business of ensuring Pablo’s respiration, placing large intravenous lines and preparing, if necessary, to shock his chest (luckily, it didn’t come to that). They left nearly as abruptly as they had entered; once Pablo was stable enough to move, he was wheeled to the pediatric ICU at something like a trot, where, thankfully, his seizure ended and he stabilized. His mother accompanied him there, wondering if this was the beginning of the deterioration that would end her son’s life.
Luckily, Pablo recovered in the intensive care unit over the next several days. On the morning of August 5, ten days after the overdose, the doctors were ready to restart Pablo’s Septra.
At the time of his admission, Pablo’s physician had been forced — by a well-meaning policy — to translate the patient’s home dose of Septra (one pill twice a day) into a weight-based dose (5 milligrams of medication per kilogram of body weight). This move set off a series of misadventures, reminiscent of the mangled syntax that can emerge after translating something from English to a foreign language… then back to English again.
But this time around, as the doctors prepared for Pablo’s discharge from the hospital, they chose to override the weight-based dosing policy. The medication was ordered as “Septra, one double-strength pill twice a day” in the computer system. It was just that simple.
The clinicians involved in Pablo’s case that day — physicians, nurses and pharmacists — all made small errors or had mistaken judgments that contributed to their patient’s extraordinary overdose. Yet it was the computer systems, and the awkward and sometimes unsafe ways that they interact with busy and fallible human beings, that ultimately were to blame. And the biggest culprit may well have been the hospital’s incessant electronic alerts. Some automated warnings misled the medical staff; others were lost in the cacophony of alarms going off throughout the hospital.
I wanted to see if medicine might learn from other professionals who need to perform their tasks in a swirling, often confusing, high-stakes environment. The aviation industry seemed like a natural place to look, so I spoke to Captain Chesley “Sully” Sullenberger, the famed “Miracle on the Hudson” pilot. “The warnings in cockpits now are prioritized so you don’t get alarm fatigue,” he told me. “We work very hard to avoid false positives because false positives are one of the worst things you could do to any warning system. It just makes people tune them out.” He encouraged me to visit Boeing’s headquarters to see how its cockpit engineers manage the feat of alerting pilots at the right time, in the right way, while avoiding alert fatigue.
I spent a day in Seattle with several of the Boeing engineers and human factors experts responsible for cockpit design in the company’s commercial fleet. “We created this group to look across all the different gauges and indicators and displays and put it together into a common, consistent set of rules,” Bob Myers, chief of the team, told me. “We are responsible for making sure the integration works out.”
I sat inside the dazzling cockpit of a 777 simulator with Myers and Alan Jacobsen, a technical fellow with the flight deck team, as they enumerated the hierarchy of alerts that pilots may see. They are:
- An impending stall leads to red lights, a red text message, a voice warning, and activation of the “stick shaker,” meaning that the control column (the plane’s equivalent of a steering wheel) vibrates violently. “The plane is going to fall out of the sky if you don’t do anything,” Myers explained calmly.
- Further down the hierarchy are “warnings,” of which there are about 40. These are events that require immediate pilot awareness and rapid action, although they may not threaten the flight path. Believe it or not, an engine fire no longer merits a higher-level warning because it doesn’t affect the flight path. (“Fires in engines are almost nonevents now,” said Myers, because the systems to handle them are so robust.) The conventions for warnings are red lights, text and a voice alarm, but no stick shaker. Impressively, the color red is never used in the cockpit except for high-level warnings — that’s how much thought the industry has given to these standards.
- The next level down is a “caution,” and there are about 150 such situations. Cautions require immediate pilot awareness but may not require instant action. Having an engine quit in a multiengine plane generates only a caution (again, my jaw dropped when I heard this), since the pilot may or may not have to do something right away, depending on the plane’s altitude. A failure of the air-conditioning system — which ultimately can lead to a loss of cabin pressure — is another caution event. With cautions, the lights and text are amber, and there is only one alert modality, usually visual.
- The final level is an “advisory,” like the failure of a hydraulic pump. Since jets are designed with massive redundancy, no action is required, but the pilot does need to know about it, since it might influence the way the landing gear responds late in the flight. Advisories trigger an amber text message — now indented — on the cockpit screen, and no warning light.
For every kind of alert, a checklist automatically pops up on a central screen to help guide the cockpit crew to a solution. The checklists are preprogrammed to match the problems that triggered the alert.
And that’s it. I asked Myers and Jacobsen how, with more than 10,000 data points recorded on every flight, they resist the urge to warn the pilots about everything, as we seem to do in healthcare. “It’s a judgment call,” Jacobsen told me. “We have a team of people — experts in systems safety and analysis — who make that judgment.” Because of this process, the percentage of flights that have any alerts whatsoever — warnings, cautions, or advisories — is low, well below 10 percent.
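For readers who think in code, the tiered scheme described above could be sketched roughly as follows. This is purely an illustration: the class, tier and cue names are mine, not Boeing’s, and the mapping simply paraphrases the conventions Myers and Jacobsen laid out (text for everything, lights from caution upward, voice at the warning level, stick shaker only for a stall).

```python
from enum import IntEnum

# Illustrative tiers, paraphrasing the hierarchy described above.
# These names are invented for the sketch, not Boeing's actual terms of art.
class AlertLevel(IntEnum):
    ADVISORY = 1   # e.g. failed hydraulic pump: indented amber text, no light
    CAUTION = 2    # e.g. engine out: amber light and text
    WARNING = 3    # e.g. engine fire: red light, text, and voice alarm
    STALL = 4      # impending stall: all of the above plus the stick shaker

def modalities(level: AlertLevel) -> set[str]:
    """Return the set of cues a given tier activates, per the scheme above."""
    cues = {"text"}                    # every tier produces an on-screen message
    if level >= AlertLevel.CAUTION:
        cues.add("light")
    if level >= AlertLevel.WARNING:
        cues.add("voice")
    if level >= AlertLevel.STALL:
        cues.add("stick_shaker")
    return cues
```

The point of the structure is the one the engineers made: each tier adds a cue, so a pilot can tell at a glance (or by feel) how urgent a problem is, and the scarcest cues — red, voice, shaking — are never diluted by routine events.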
I wondered whether the designers of individual components sometimes advocate for their own favorite alerts. Myers chuckled. “It’s funny, you’ll get some young engineer whose responsibility is the window heat system. He comes in with this list of 25 messages that he wants us to tell the pilot about his system: it’s on high, it’s on medium, it’s on low, it’s partially failed, you can’t operate it below 26 degrees. . . . He comes out of the meeting — a meeting in which the pilots say, ‘We don’t care!’ — and he’s like [Myers affects an Eeyore voice], ‘This is my job, this is my life, and it doesn’t even make it onto the flight deck.’”
Like many of aviation’s safety solutions, the parsimonious approach to alerts came from insights born of tragedies. “The original ‘gear down’ warning was linked to the throttle,” recalled Myers, meaning that it went off, falsely, every time the pilot slowed the plane. “So the pilots’ learned response was throttle back, disconnect the alert.” Predictably, this led to accidents when pilots ignored the alert even when there truly was a problem. Another example: in the early days of the Boeing 727, some alerts were so frequent and so often wrong that pilots yanked the circuit breakers to quash them.
When I told the Boeing engineers about my world — not only the frequency of computerized medication alerts, but also the ubiquity of alarms in our intensive care units — they were astonished. “Oh, my goodness,” was all Myers could say.
This is excerpted from The Digital Doctor: Hope, Hype, and Harm at the Dawn of Medicine’s Computer Age, by Robert Wachter. McGraw-Hill, 2015.
Part 1: How Medical Tech Gave a Patient a 39-Fold Overdose
When Pablo Garcia was admitted, he felt fine. Then the hospital made him very sick. Blame high-tech medicine.
Part 2: Beware of the Robot Pharmacist
In tech-driven medicine, alerts are so common that doctors and pharmacists learn to ignore them — at the patient’s risk…
Part 3: The Self-Driving Hospital
We tend to trust our computers a lot. Perhaps too much, as one hospital nurse learned the hard way.
Part 5: How to Make Hospital Tech Much, Much Safer
We identified the root causes of Pablo Garcia’s 39-fold overdose — and ways to avoid them next time.