When Automation Makes Passengers Freight: United Airlines and Seat Assignments
By S.A. Applin, April 10, 2017
A problem with quantitative methods in business, when they are used to drive processes, scripts, and actions, is that they can divorce a business from its people. This undermines trust and leads to unexpected outcomes that become public relations disasters. By trusting employees and customers, businesses can augment their processes, which in turn makes them run much more smoothly.
One of the most shocking cases of this happened on April 10, 2017, when United Airlines needed to seat four employees on an oversold aircraft. The usual offers of cash and a seat on the next flight found no takers among passengers who had purchased their tickets, and everyone was allowed to board. Once the plane had boarded, United ran a computerized lottery to determine who should “volunteer” to leave the aircraft. As many people now know, one passenger required to “volunteer” refused, and was subsequently removed from the plane by security officers, who used enough force to bloody his face.
I’d like to examine this business case, and business processes in this context, through an automation lens. The danger in applying business processes to people is that the processes are often not flexible enough to account for humans taking agency: our ability to make choices from available options as events unfold. Humans are capable of creativity and expression, and can take agency in more ways than the designers of automated systems could ever imagine. As a result, systems that are applied to people yet do not account for people’s needs are almost always set up for eventual failure, because the process is designed too narrowly, with the expectation that people will take agency only through the options designed into the process.
As an anthropologist, my research examines people’s interaction with algorithms and automation. I am particularly interested in processes that are too rigid, and therefore brittle. The United Airlines case illustrates a brittle process: designed for narrow outcomes, and not designed for people, neither customers nor employees.
Airlines move freight as well as baggage, and the aviation industry depends on a vast and complex network of relationships between machines. These networks coordinate the locations of planes, people, and freight, both to transport each to its destination and to coordinate repair, replacement, and other functions in the system. The need to be “on-time” drives airlines to streamline these processes.
What I observed today was, seemingly, a process designed for planes and freight being applied to people. If a company transports freight and needs to remove a package, a computerized lottery might be a fine way to choose it. If you are a computer engineer, or someone who believes in a particular type of “fairness,” removing people from a flight by random selection becomes a clear choice, and an easier one to code than connecting options to a broader system that employees can interrupt in extreme situations.
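To see how little code such a “fair” lottery requires, and how completely it ignores context, consider a minimal sketch. This is purely an illustration of the kind of random-selection logic described above, not United Airlines’ actual system; the function and passenger names are invented.

```python
import random

def select_for_removal(boarded_passengers, seats_needed, seed=None):
    """Hypothetical sketch: randomly pick passengers to remove,
    treating them as interchangeable units of freight. No context,
    no needs, no willingness to volunteer is considered."""
    rng = random.Random(seed)  # seeded here only for reproducibility
    return rng.sample(boarded_passengers, seats_needed)

passengers = ["P01", "P02", "P03", "P04", "P05", "P06"]
removed = select_for_removal(passengers, 4, seed=42)
print(removed)  # four "volunteers" chosen with no human input
```

The brittleness is visible in the signature itself: the only inputs are a list and a count. Everything that matters to a person on that plane, their destination, their reason for traveling, their willingness to negotiate, is invisible to the selection.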
However, people are not freight. We take agency in different ways, we have different needs, and we fly for different reasons to different destinations. As we have seen, using a computer algorithm to ensure “fairness” doesn’t always work with people. This is mostly because we are collective, social, cooperative, and human. It is also because algorithms, at the moment, couldn’t care less about us as people.
This is the problem with current quantitative data-driven design and quantitative data-driven rationales. Anthropology and the other social sciences have spent decades developing qualitative tools such as contextual inquiry. These tools help us understand people: who they are, what they need, and their different perspectives on situations. Unfortunately, businesses have been slow to adopt these methods, because an implied mathematical “rightness” in quantitative data collection overrides any desire to understand the meaning others make of a situation. Scenarios inferred from data, and personas, are often substituted for real information collected on real people in real contexts. This matters. By omitting qualitative research, companies cannot truly know what they don’t know.
Applying this particular algorithmic technique to resolve customer seating was a very expensive mistake for United Airlines: 1) the “eviction” lost time critical to the business; 2) worldwide customer goodwill was destroyed as United Airlines showed its customers what to expect in the extreme circumstance of an oversold flight; and 3) United Airlines will likely incur an expensive lawsuit as well.
United Airlines must develop algorithms and automated processes that empower its gate attendants to resolve extreme problems in more creative ways. United Airlines failed to give its workers agency outside of its own strict processes, and lacked trust in its employees to understand travelers. Had United Airlines engaged in a dialogue with the customers on that flight, it might have found people willing to compromise for a bit more money than was offered, plus another ticket. People are willing to compromise when a mutually beneficial solution to a problem can be found. Trusting its employees to take agency in resolving the problem would likely have produced an easy resolution, saving United time and money and building goodwill in the process.