Automation in Aviation: Humans vs. Computers

Tingyo Tan
Published in Predict
Jun 28, 2021 · 6 min read

Aviation technology helps prevent dangerous situations before they become a problem. It has drastically improved aviation safety over the past decades, and accident rates are lower than ever. The cockpit is where it matters most: cockpit instruments, monitors, and computer systems are crucial at every stage of flight, in every condition. For instance, pilots flying in IFR conditions are trained to fully trust their avionics to avoid false visual cues. Even in VFR conditions, pilots need to rely on their airspeed indicator, altimeter, navigation systems, and other instruments.

On rare occasions, however, pilots must resolve in-flight failures and make the best decisions to protect the aircraft and its passengers. Even though such failures are becoming rarer, every pilot must be prepared to handle them properly. Most of the time, the fix is simple or a quick reference to the aircraft’s emergency procedures. On more serious occasions, when the situation becomes life-threatening, pilots may have to declare a mayday and perform an emergency landing, relying on their instruments to recognize the failure and resolve the issue on board.

But what if the flight system is feeding false information to the pilots, preventing them from recognizing the cause of the situation and taking the right action? What if the system has a mind of its own and stops the pilots from taking control of the aircraft? What if a flawed design lets the system confuse itself, generate erroneous values, and force the aircraft into an unrecoverable nose-dive? System malfunctions or design flaws can turn into a chaotic battle between pilots and flight computers, one that can end in a fatal accident. Could even modern automated systems be unreliable? Are technology failures simply inevitable?

An airliner flying in the dark at cruising altitude. Pilots sometimes have no visual reference outside the window and must fully rely on their instruments. Photo by Andrés Dallimonti on Unsplash

Modern commercial aircraft rely on fly-by-wire technology, which sends flight control inputs to a computer rather than directly to the flight control surfaces. The computer can immediately recognize failures that pose a danger to the aircraft and protect it from entering a dangerous state. However, this system complexity has repeatedly caused confusion and chaos in the cockpit, prompting improper pilot action and poor decision-making that placed aircraft in unintended but imminent danger. In critical stages of flight, pilots must stay situationally aware to minimize confusion and errors, all while facing the mental challenge of analyzing, and sometimes outthinking, faulty information from the automated systems.
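To make that idea concrete, here is a minimal sketch of the fly-by-wire concept in Python. Everything in it is illustrative: the 15-degree stall threshold, the data structure, and the clamping logic are assumptions made for the sake of the example, not actual Airbus or Boeing control laws.

```python
# Minimal fly-by-wire sketch: the pilot's stick is not wired to the control
# surfaces; a flight computer interprets the input first and clamps any
# command that would push the aircraft outside its safe envelope.
# All names and limits here are illustrative, not real manufacturer values.

from dataclasses import dataclass

@dataclass
class AirData:
    angle_of_attack_deg: float   # AoA as reported by the air data units
    airspeed_kts: float

def elevator_command(stick_input: float, air_data: AirData) -> float:
    """Map a pilot stick input (-1..1) to an elevator command (-1..1)."""
    command = stick_input
    # Envelope protection: if the reported AoA is near stall, override the
    # pilot and command nose-down, even when the AoA reading itself is wrong.
    if air_data.angle_of_attack_deg > 15.0:   # illustrative stall threshold
        command = min(command, -0.5)          # force a nose-down command
    return max(-1.0, min(1.0, command))

# A faulty sensor reporting AoA = 50 deg triggers protection in level cruise:
print(elevator_command(0.0, AirData(angle_of_attack_deg=50.0, airspeed_kts=470)))
# -> -0.5 (an uncommanded pitch-down, despite a neutral stick)
```

The sketch shows both sides of the trade-off: the same logic that saves an aircraft from a real stall will also dive it on a false AoA reading, which is exactly the failure mode in the accidents below.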

One of the most notorious in-flight computer system failures happened on Qantas Flight 72. On October 7, 2008, a Qantas Airbus A330 with 315 souls on board experienced two uncommanded pitch-downs after the Electronic Centralized Aircraft Monitor (ECAM) began showing multiple contradictory warnings, including an overspeed and a stall warning, leaving all three pilots confused and uncertain about which systems they could still rely on. Both dives were commanded automatically by the Flight Control Primary Computer (FCPC) without any pilot input. Roughly 40 minutes later, the A330 made a successful visual emergency landing at Learmonth Airport.

Captain Kevin Sullivan later recounted that “the automation was the worst enemy on that day,” and that it was extremely difficult to “manage that kind of distraction — what’s real, what’s not real” (Smithsonian Channel Aviation Nation, 2019, 20:43). Australian Transport Safety Bureau (ATSB) investigators eventually discovered that Air Data Inertial Reference Unit 1 (ADIRU 1), one of three ADIRUs that relay vital information to the flight computers about the environment outside the aircraft, including its angle of attack (AoA), had mislabeled altitude data as AoA data: the FCPC received a binary value representing an altitude of 37,000 ft but decoded it as an AoA of about 50 degrees (ATSB, 2011). The FCPC then triggered the AoA protection mode to save the aircraft from a stall that was not actually occurring, sending a signal to the Electrical Flight Control System (EFCS) to pitch the nose down. Airbus has since redesigned the AoA-processing algorithm to prevent the same type of accident from happening again.
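The failure mode is easier to see with a toy example. Avionics parameters travel between units as labeled binary words, and the label tells the receiver how to scale the raw bits. The sketch below is hypothetical: the scale factors are invented (and deliberately chosen so the output lands near the report’s figures), not the real ARINC 429 encoding used on the A330.

```python
# Hypothetical sketch of a data-labeling fault: the same raw bits decode to
# wildly different values depending on which parameter label they carry.

from typing import Dict

SCALE: Dict[str, float] = {
    "altitude_ft": 1.0,     # hypothetical: 1 ft per count
    "aoa_deg": 0.00137,     # hypothetical: picked so the numbers line up
}

def encode(value: float, label: str) -> int:
    """Pack a parameter into a raw integer word using its label's scale."""
    return round(value / SCALE[label])

def decode(raw: int, label: str) -> float:
    """Unpack a raw word; the label alone decides how the bits are read."""
    return raw * SCALE[label]

raw = encode(37_000, "altitude_ft")   # the ADIRU means "37,000 ft"
print(decode(raw, "altitude_ft"))     # correct label -> 37000.0 ft
print(decode(raw, "aoa_deg"))         # wrong label   -> 50.69 deg of "AoA"
```

A receiver has no way to tell a mislabeled word from a genuine one; the value is perfectly well-formed, just catastrophically misinterpreted.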

More recently, two similar fatal accidents, Lion Air Flight 610 and Ethiopian Airlines Flight 302, in 2018 and 2019 respectively, were attributed to a design flaw in the new automated Maneuvering Characteristics Augmentation System (MCAS), which was intended to enhance the aircraft’s longitudinal maneuverability and AoA control, combined with a chain of system malfunctions, including an AoA sensor that had been miscalibrated before the flight (a full explanation of MCAS can be found here).

Minutes after takeoff, erroneous AoA data caused MCAS to automatically trim the nose down on both flights, sending both aircraft into steep dives. Investigators later found that the pilots on JT610 had no idea MCAS had taken over their flight controls; had they known about the system, they could have used the stabilizer trim cutout switches to prevent MCAS from making inputs to the aircraft (more information can be found in this final report). On ET302, the pilots did disable the electric stabilizer trim system to stop MCAS, but in doing so they also unknowingly lost the ability to electrically trim the horizontal stabilizer back to a neutral position, and the strong aerodynamic forces on the stabilizer made manual trimming nearly impossible. Under that pressure, the pilots on neither flight could have reacted in time to correct the nose-down trim, and they stood almost no chance of winning the fight against the computer systems.
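One widely discussed lesson from these accidents is the danger of letting a powerful automated function act on a single sensor. The sketch below contrasts a single-sensor trigger with a median vote across redundant sensors; the threshold, the three-sensor setup (the 737 MAX actually has two AoA vanes), and the function names are illustrative assumptions, not Boeing’s actual logic.

```python
# Hedged sketch of single-sensor fragility vs. sensor voting.
# Thresholds, sensor counts, and names are illustrative assumptions.

from statistics import median
from typing import List

AOA_TRIGGER_DEG = 12.0   # illustrative activation threshold

def trim_down_single(aoa_deg: float) -> bool:
    """Trigger nose-down trim from one sensor: fooled by a single fault."""
    return aoa_deg > AOA_TRIGGER_DEG

def trim_down_voted(aoa_readings_deg: List[float]) -> bool:
    """Trigger only if the median of redundant sensors agrees."""
    return median(aoa_readings_deg) > AOA_TRIGGER_DEG

faulty, good_1, good_2 = 74.5, 4.2, 4.0   # one miscalibrated AoA vane
print(trim_down_single(faulty))                    # True: spurious activation
print(trim_down_voted([faulty, good_1, good_2]))   # False: the fault is outvoted
```

With voting, one bad vane cannot activate the trim on its own; the revised MCAS design similarly cross-checks both AoA sensors before acting.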

Flight computers and flight control systems are designed to reduce pilot workload. Pilots are trained to follow the instructions the computers give them, including checklists and emergency procedures, in order to conduct a safe flight. As the technology improves every day, the aviation industry has become more reliable and safer than ever. Nevertheless, it is safe to say that pilots should always have the power to override computer decisions, exercised with extreme caution. Pilots must plan ahead, mentally prepare for every possible emergency, and be able to determine whether the flight computers are providing inaccurate information. But the Janus-faced nature of flight computers and automated systems raises harder questions. Should pilots risk making an error of their own, or trust flight computer systems that could also put the aircraft in an unrecoverable situation? How much control should the pilot exercise? When is the flight system trustworthy, and when is it not? And most importantly, can computers think independently, like pilots?

Photo by Oskar Kadaksoo on Unsplash

“Whatever consciousness is, it’s not a computation or something that can be described by a physical computation.” — Sir Roger Penrose.

As technology improves, the complexity of avionics will certainly increase, and no one can guarantee that automation will be infallible. It is uncertain whether technology will ever be 100% safe and reliable, or whether an accident is inevitable whenever a new technology or system is introduced. Machines are man-made, and humans make mistakes; no one should expect technology to be faultless from the beginning. Thus, every pilot should understand how to resolve such system failures in case they occur.

It may take a while to achieve 100% reliability in aviation, but for now it is important to provide better training so crew members fully understand the capabilities of the aircraft, to ensure aircraft are properly maintained, and to enhance the interface between pilots and the computer so the two can reach agreement in flight when failures occur. Once these conditions are met, and as technology continues to advance, we can then decide whether human judgment or the computer should sit higher in the hierarchy.

Sources:

Makó, S., Pilat, M., Šváb, P., Kozuba, J., & Čičváková, M. (2020, July). Evaluation of MCAS System (Rep.). doi:10.35116/aa.2020.0003

Airworthiness Directives; The Boeing Company Airplanes. (2020, November 20). Retrieved February 26, 2021, from https://www.federalregister.gov/documents/2020/11/20/2020-25844/airworthiness-directives-the-boeing-company-airplanes

The Design, Development & Certification of the Boeing 737 MAX (Rep.). (2020, September). Retrieved February 26, 2021, from The House Committee on Transportation & Infrastructure website: https://transportation.house.gov/imo/media/doc/2020.09.15 FINAL 737 MAX Report for Public Release.pdf

Francis, D. (2019, May 11). MCAS Under the Hood. Retrieved February 27, 2021, from https://www.linkedin.com/pulse/mcas-under-hood-dexter-francis

Human Fallibility in Aviation. (2018, July 24). Retrieved February 26, 2021, from https://aerospaceengineeringblog.com/human-fallibility-in-aviation/
