Onlookers gasp as they watch the Space Shuttle Challenger’s total loss in real time. Source: National Geographic.

The Simple Way to Prevent Aerospace Disasters

Aerospace engineers are often ignored when they predict failures. How can black-swan risk analytics help us improve?

Penn Little
Dec 19, 2019 · 8 min read

On January 28th, 1986, the weather at Cape Canaveral, Florida, was unusually cold. The thermometers adjacent to Kennedy Space Center Launch Complex 39 read as low as 18°F. Nearby on pad 39B, Space Shuttle “Challenger” was perched, pointing skyward and readying for her delayed six-day manned mission to low-Earth orbit.

When tragedy strikes, for a nation, for a family, we question ourselves — could this have been prevented?

The common ground between a team of elite engineers who tried to avert a Space Shuttle disaster watched by millions and the solo analysis of an analytical, socially conscious young man teaches us a simple lesson: listening to expert whistleblowers is a solution.

The Elite Engineering Team

Bob Ebeling, a rocket scientist at Utah-based Morton Thiokol, alerted his boss Allan McDonald that the temperature was far too cold to safely launch Challenger. Ebeling was a domain expert who had helped build and maintain the solid rocket boosters (SRBs) used to thrust American Space Shuttles into orbit. A group conference call with some of NASA’s top brass lasted until the wee hours of the morning. In the call, the engineers fiercely advocated for postponement of the launch until the mercury rose to acceptable levels. However, officials chose to proceed, despite McDonald’s refusal to sign off.

McDonald’s team had been forewarned by the launch of Space Shuttle Discovery one year prior. While refurbishing the reusable SRBs, the engineers found black soot between the two rubber o-rings that normally prevent gas leaks from the booster’s solid rocket fuel chambers. In that case, the seal had partially failed, and the team had identified cold temperatures as a significant concern for future launches. As a result, McDonald’s team was emphatic that the forecasted launch temperature (in the mid-30s) could very likely result in a total failure of both o-rings on either of the two SRBs.

The alert went unheeded. Exhausted and demoralized, two time zones westward in Utah, a defeated Ebeling headed home. He recounted later that before he went to sleep, he told his wife, Darlene, “It’s going to blow up!”

The seven-member crew of Challenger was never warned of the potential problem or the engineers’ concerns. At 11:38 a.m. EST on the morning of January 28, 1986, they “waved goodbye and slipped the surly bonds of earth to touch the face of God,” in the somber words President Reagan spoke to a shocked nation only hours later.

It was later determined that the outside temperature at launch was recorded at 36°F, well below the 53°F mark at which the previous partial failure of the now infamous o-rings had occurred.

For Ebeling and McDonald, these events were intensely traumatic. They watched in horror along with millions of ordinary citizens as the disaster they had predicted played out on a nationally televised broadcast. Many of us still retain flashbulb memories of Challenger transforming in mere seconds from a spacecraft moving at 1,480 mph into a debris field.

I interviewed McDonald, now 82, last month. He admitted, “Three of my team members had PTSD for decades following the accident.” McDonald, the leader and most conspicuous member of the SRB team, was demoted for testifying to Congress about the events, then restored to his role by a congressional joint resolution (the only time in history such a reparation has occurred).

Even 30 years later, it was not the strain of Morton Thiokol’s punishment that distressed McDonald’s team, but the fate of the launch’s victims. At the age of 89, Ebeling still felt responsible for the deaths of the Challenger crew. He would later recount on National Public Radio:

“I was one of the few that was really close to the situation. Had they listened to me and wait[ed] for a weather change, it might have been a completely different outcome.”

Have we made progress? Are engineers better prepared for the emotional distress of predicting failures before they occur? Are aerospace companies and regulators better prepared to listen to analysis from engineers on the ground?

Some would say no.

The Entry-Level Engineer

In 2017, Brandon Nelson was employed at Permaswage, an airplane parts manufacturer in Gardena, California, and part of Precision Castparts Corp. (PCC), a massive subsidiary of Warren Buffett’s Berkshire Hathaway Inc. A 2014 UCLA graduate with a B.S. in aerospace engineering, Brandon was an observant, socially conscious young man.

Nelson became concerned that the small hydraulic valve fasteners known as ‘Permaswage fittings’ weren’t being built to customer specifications, nor did they contain the required lubrication. These fittings are on nearly every airplane currently flying that has hydraulic systems, such as retractable landing gear. In fact, since 1996, PCC has acquired 41 companies that make aerospace parts and materials, creating an arguable monopoly.

Nelson unsuccessfully attempted to take his concerns up the supervisory chain at PCC before watching in shock as the parts he considered faulty were shipped to customers.

Similar to NASA in 1986, management at PCC was autocratic. Precision was — and still is — run by Mark Donegan, a CEO whose management style was reported in a 2016 Bloomberg article to be unprofessional at best:

“Those who know the CEO best describe a manager who’s highly effective but at times strains basic decency. These people, most of whom asked that their names not be used for fear of retaliation, say they have witnessed Donegan using profanity and violent language. One heard him threaten to stab someone in the eyes with a pencil. Another says the CEO threatened to rip an employee’s arms off so he could hit the person with the bloody stumps.”

Virtually all airframe manufacturers use these hydraulic fittings. Becoming increasingly concerned, Nelson contacted executives at PCC customers including Boeing, Airbus, Embraer, and Lockheed Martin. Nelson’s godfather, Charles Sena, later recalled the young engineer being “emphatic that these parts could bring down an airplane.”

Nelson felt the refusal to listen was simply irrational. In late September 2017, he texted his brother Justin in frustration:

if something goes wrong on a plane, it’s traceable! The uppers don’t care, dude, and make every1[sic] be dishonest and hide the (inappropriate behavior), hiding doesn’t work!

Similar to McDonald, Nelson finally took matters into his own hands, filing a whistleblower complaint with the Federal Aviation Administration (FAA). An investigation was conducted, but an aviation expert believes the FAA gave the plant advance notice. An FAA spokesperson also confirmed that the inspection was performed on the Permaswage plant floor rather than on parts installed on an actual airplane and subjected to real-life wear and tear. Nelson’s concern was early failure in service, not something that could be observed in pre-selected parts on the shop floor.

Nelson’s UCLA diploma, along with a handout from his memorial service, rests on a bookshelf in his room at his parents’ home in Santa Monica, CA. Source: Author.

Similar to Ebeling and McDonald, Nelson experienced significant emotional strain from the short- and long-term effects of voicing his concerns. Unfortunately, Brandon’s story does not have a neat resolution. Upon receiving the ‘no findings’ letter from the FAA, Nelson became distraught. Still inconsolable about the complaint, he was hospitalized in January 2018 and took his own life that March, still anguished about the possibility of a disaster.

What Can We Learn?

What can we learn from these stories? Nelson, McDonald, and Ebeling were all specialized, intelligent, conscientious men who saw defects that only their expertise and training could recognize. However, none of them was prepared for the emotional fallout of not having their concerns heard. The institutions involved failed to give adequate reassurance to their employees that their responsibility to the public had been discharged. All three men felt a personal sense of responsibility for other people’s safety and suffered as a result.

Could the tragedy of the Challenger explosion and Nelson’s death have been prevented?

Today’s aerospace industry can be considered a complex system: difficult to predict because it exhibits emergent properties arising from interdependent relationships between parts, environment, human factors, and use. Aeronautical failures are black-swan events, edge cases composed of multiple overlapping failures, which makes them difficult to predict. As in other chaotic systems, expert feedback is a signal that may be discounted, but should never be suppressed.
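As a thought experiment, consider how rarely several independently unlikely conditions line up. The sketch below is a minimal Monte Carlo simulation in Python; the three factor probabilities are purely hypothetical, chosen only to illustrate why an overlapping-failure event barely shows up in historical data yet still happens at scale.

```python
# Minimal Monte Carlo sketch: hypothetical probabilities, not real aerospace data.
# A "catastrophe" here requires three independently unlikely conditions to overlap,
# which is why such edge cases are nearly invisible in any single program's history.
import random

random.seed(0)

P_COLD_SOAK = 0.05        # hypothetical chance of unusually cold launch conditions
P_SEAL_DEGRADED = 0.02    # hypothetical chance of a partially degraded seal
P_WARNING_IGNORED = 0.10  # hypothetical chance an expert warning is discounted

FLIGHTS = 1_000_000
catastrophes = 0

for _ in range(FLIGHTS):
    cold = random.random() < P_COLD_SOAK
    degraded = random.random() < P_SEAL_DEGRADED
    ignored = random.random() < P_WARNING_IGNORED
    # Total loss only when all three overlap: the edge case described above.
    if cold and degraded and ignored:
        catastrophes += 1

expected = P_COLD_SOAK * P_SEAL_DEGRADED * P_WARNING_IGNORED
print(f"Catastrophes in {FLIGHTS:,} simulated flights: {catastrophes}")
print(f"Empirical rate {catastrophes / FLIGHTS:.6f} vs. expected {expected:.6f}")
```

Note that if the expert warning is always heeded, the third probability drops to zero and the joint probability collapses with it; suppressing that feedback channel is precisely what keeps the edge case alive.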

The first of the two recent Boeing 737 MAX accidents (Lion Air Flight 610) occurred in Indonesia, killing 189 people roughly seven months after Nelson’s death. In a final report released unusually quickly, investigators concluded that multiple factors contributed to the tragedy, including a new maneuvering system, the FAA certification process, maintenance, and pilot error.

The second Boeing 737 MAX accident, Ethiopian Airlines Flight 302, crashed five months later, killing 157 people and triggering a worldwide grounding of all 737 MAX planes. The investigation is ongoing, with France’s BEA assisting in the analysis of the flight recorders.

While no conclusive evidence suggests the hydraulic parts Nelson voiced concern about contributed to either crash, investigators and experts believe both airplanes contained faulty parts and both flights involved pilot error. Nelson’s distress, though, was at least partially exacerbated by the lack of response to his feedback. Concern for the lives of passengers is a desirable trait in an engineer, and McDonald and Ebeling have shown us that this type of concern can save lives.

Perhaps the solution is simple. Because our systems have so many moving parts, we need black-swan predictive systems: risk-analytics algorithms that incorporate every source of signal. If any feedback loop is broken or suppressed, our predictive models become less accurate, and in aerospace that is dangerous. In other words, our engineers’ feedback matters.
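As a sketch of what “incorporate every source of signal” could mean in practice, here is a toy risk model in Python that pools independent evidence sources in log-odds space, naive-Bayes style. Every name and number is hypothetical; the point is only that removing one feedback channel (here, the engineer’s report) visibly weakens the estimate.

```python
# Toy risk-pooling sketch (hypothetical numbers): combine independent evidence
# sources in log-odds space and see how the estimate changes when one source
# of feedback is suppressed.
import math

def log_odds(p: float) -> float:
    """Convert a probability to log-odds."""
    return math.log(p / (1.0 - p))

def combine(prior: float, likelihood_ratios: dict) -> float:
    """Pool a prior failure probability with per-source likelihood ratios."""
    score = log_odds(prior) + sum(math.log(lr) for lr in likelihood_ratios.values())
    return 1.0 / (1.0 + math.exp(-score))  # convert back to a probability

PRIOR = 0.001  # hypothetical baseline per-flight failure probability

# Hypothetical likelihood ratios: how strongly each source shifts the odds of failure.
sources = {
    "telemetry_anomaly": 3.0,
    "maintenance_finding": 4.0,
    "engineer_report": 20.0,  # the domain expert's warning carries the most weight
}

with_feedback = combine(PRIOR, sources)
without_feedback = combine(PRIOR, {k: v for k, v in sources.items() if k != "engineer_report"})

print(f"Estimated risk, all feedback loops intact:  {with_feedback:.3f}")
print(f"Estimated risk, engineer report suppressed: {without_feedback:.3f}")
```

With these illustrative numbers, the pooled estimate falls from roughly 19% to about 1% the moment the engineer’s report is dropped. The system has not become safer; it has merely gone blind to the signal that mattered most.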

As Ernest Hemingway, the great writer who also died by suicide, once recommended:

“When people talk, listen completely. Most people never listen.”
