What is “fairness”?

What happens when technology decides?

Fairness is one of those values that Americans love to espouse. It's just as beloved in technical circles, where it's often introduced as one of the things that "neutral" computers do best. We collectively perceive ourselves and our systems to be fair and push back against any assertion that our practices are unfair. But what do we even mean by fairness in the first place?

Photo by Kate Ter Haar

In the United States, fairness has historically been a battle between equality and equity. Equality is the notion that everyone should have an equal opportunity. It's the core of meritocracy and central to the American Dream. Preferential treatment is seen as antithetical to equality and as the root of corruption. And yet, as civil rights leaders have long argued, we don't all start out from the same place. Privilege matters. As a result, we've seen historical battles over equity: arguments that fairness is only possible when we take into account systemic marginalization and differences of ability, opportunity, and access.

When civil rights leaders fought for equity in the 1960s, they were labeled communists. Still, equity-based concepts like "affirmative action" managed to gain traction. Today, "socialism" has replaced "communism" as the anti-equity epithet. Many purposefully refuse to acknowledge that people don't start out from the same position and take offense at any effort to right historical wrongs through equity-based models. Affirmative action continues to be dismantled, and the very notion of reparations sends many into a tizzy.

Beyond the cultural fight over equality vs. equity, a new battle to define fairness has emerged. A market logic of fairness, long the norm in business, is moving beyond industry to increasingly become America's dominant understanding of fairness.

To understand market-driven models of fairness, consider frequent flyer programs. If you have high status on Delta, you get all sorts of privileges: you don't have to pay $25 to check a bag, you get better seats and frequent upgrades, you get free food and extra services, and so on. We consider this fair because it enables businesses to compete. Delta cares about keeping you as a customer because you will spend far more over the year, or the lifetime of the program, than your perks cost. Bob, on the other hand, isn't that interesting to Delta if he only flies once a year and isn't even eligible for the credit card. Thus, Bob doesn't get the perks and is, in effect, charged more for equivalent services.

What happens when this logic of fairness alters the foundations of society? Consider financial services, where business rubs up against something as practical, and seemingly necessary, as housing. Martha Poon has done phenomenal work on the history of FICO scores, which originally empowered new populations to get access to credit. These days, FICO scores are used for many things beyond financial services, but even within that domain, things aren't as equitable as one might think. The scores are not necessarily fair, and their usage introduces new problems. If you're seeking a loan and have a better score than Bob, you pay a lower interest rate. This is considered acceptable because you are a lower risk than Bob. But just as Delta wants to keep you as a customer, so does Chase. So banks start offering you deals to compete for your business. In effect, they minimize the profit they make directly off the wealthiest because they need high-end customers for secondary and competitive reasons. As a result, Bob is not only burdened with the higher-interest loans; the profits are made off of him as well.


For a moment, let's turn away from business-based environments altogether and think more generally about how the allocation of scarce resources is beginning to unfold thanks to computational systems that promise to distribute resources "fairly." Consider, for example, what's happening with policing, where computational systems now allow precincts to distribute their officers "fairly." In many jurisdictions, more officers are placed into areas deemed "high risk," and at a societal level this seems appropriate. And yet, people rarely think about the incentive structures of policing, especially in departments where officers are expected to clear a certain number of warrants and make a certain number of arrests per month. When officers are stationed in algorithmically determined "high risk" communities, they make their arrests in those communities, thereby reinforcing the algorithms' assumptions.
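The feedback loop described above can be sketched with a toy model (all numbers here are hypothetical, chosen only for illustration): two neighborhoods with identical underlying offense rates, where patrols are allocated in proportion to past recorded arrests.

```python
# Toy model of the predictive-policing feedback loop (hypothetical numbers).
# Both neighborhoods have the SAME underlying offense rate; the only
# difference is a small bias in the "historical" arrest record.
TRUE_OFFENSE_RATE = 0.05   # recorded arrests per patrol, identical everywhere
PATROLS_PER_ROUND = 100    # total patrols the precinct allocates each round

arrests = {"A": 11.0, "B": 9.0}  # slightly uneven historical data

for _ in range(20):
    total = sum(arrests.values())
    for hood in arrests:
        # The "risk score" step: patrols go where past arrests were recorded.
        patrols = PATROLS_PER_ROUND * arrests[hood] / total
        # More patrols produce more recorded arrests at the same true rate,
        # feeding the next round's allocation.
        arrests[hood] += patrols * TRUE_OFFENSE_RATE

# Neighborhood A's share of recorded arrests stays locked at 55%, and the
# absolute gap between A and B keeps widening: the data "confirms" the
# initial bias even though the true offense rates were always equal.
print(arrests)
```

In this sketch the initial 11-vs-9 imbalance never washes out: A's share of recorded arrests stays at 55% in every round, so the system permanently labels A the "high risk" neighborhood on the basis of nothing but where the patrols were sent.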

Addressing modern-day equivalents of redlining isn't enough. Statistically, if your family members are engaged in criminal activities, there's a high probability that you will be too. Is it fair to profile and target individuals based on their networks just because it makes law enforcement more efficient?

Increasingly, tech folks are participating in the instantiation of fairness in our society. Not only do they produce the algorithms that score people and unevenly distribute scarce resources, but through the fetishization of "personalization" and the increasingly common practice of "curation," they have become, in effect, arbiters of fairness.

The most important thing that we all need to recognize is that how fairness is instantiated significantly affects the very architecture of our society. I regularly come back to a quote by Alistair Croll:

Our social safety net is woven on uncertainty. We have welfare, insurance, and other institutions precisely because we can’t tell what’s going to happen — so we amortize that risk across shared resources. The better we are at predicting the future, the less we’ll be willing to share our fates with others. And the more those predictions look like facts, the more justice looks like thoughtcrime.

The market-driven logic of fairness is fundamentally about individuals at the expense of the social fabric. Not surprisingly, the tech industry — very neoliberal in cultural ideology — embraces market-driven fairness as the most desirable form of fairness because it is the model that is most about individual empowerment. But, of course, this form of empowerment is at the expense of others. And, significantly, at the expense of those who have been historically marginalized and ostracized.

We are collectively architecting the technological infrastructure of this world. Are we OK with what we’re doing and how it will affect the society around us?

(Header photo by Robert Steinhöfel)