Facebook’s Civil Rights Settlement is Just the Beginning

Aaron Rieke
Published in Equal Future · Mar 21, 2019

Facebook recently announced changes to its targeted advertising system for housing, job, and credit ads. These moves came as part of a settlement resolving a series of lawsuits that accused the platform of violating longstanding civil rights laws.

Significantly, Facebook will no longer allow landlords, employers, creditors, and similar advertisers to explicitly target — or intentionally exclude — people based on their age, gender, zip code, and hundreds of other sensitive targeting categories. This is an important concession that will go a long way toward curbing overt discrimination by these advertisers.

But unlawful choices by advertisers are just one part of a larger problem. Facebook’s reforms will not limit some of the most powerful and invisible drivers of discrimination in digital marketing. In particular, Facebook has not yet honestly confronted its own role in contributing to discriminatory outcomes.

Facebook argued in court that its advertisers are “wholly responsible for deciding where, how, and when to publish their ads.” This is patently false. As Upturn described in an amicus brief in one of the just-settled lawsuits, Facebook itself plays a powerful role in determining who ultimately sees which ads, beyond the targeting choices that an advertiser makes. In some cases, as a result of Facebook’s own conduct, people can be unfairly excluded from seeing messages about important life opportunities, even when advertisers have no intent to discriminate.

This can happen in at least two ways.

First, Facebook often builds custom targeting lists for advertisers called “Lookalike Audiences.” In essence, given a list of people provided by an advertiser, Facebook will build a new list of users who “look like” those people. Facebook alone builds the new list, using thousands of pieces of data about its users’ behavior on and off Facebook. An ex-Facebook executive described Lookalike Audiences as “the most unknown, poorly understood, and yet powerful weapon in the Facebook ads arsenal.” In the settlement agreement, Facebook agreed to blunt this tool ever so slightly by not directly considering age, gender, and other sensitive factors when building Lookalike Audiences for housing, credit, and job ads.
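To make the mechanics concrete, here is a deliberately simplified sketch of how audience expansion of this kind can work. Facebook’s actual system is proprietary; the `build_lookalike` function, the feature data, and the similarity measure below are all illustrative assumptions, not a description of Facebook’s code.

```python
# Hypothetical sketch of a "lookalike" expansion step. It illustrates the
# general technique of ranking users by similarity to an advertiser's seed
# list; it is NOT Facebook's actual algorithm.
import numpy as np

def build_lookalike(seed_vectors, all_user_vectors, audience_size):
    """Return indices of the users most similar to the seed audience.

    seed_vectors:      (n_seed, n_features) behavioral features of the
                       advertiser-supplied list
    all_user_vectors:  (n_users, n_features) features for the full user base
    """
    # Summarize the seed audience as the average of its feature vectors.
    centroid = seed_vectors.mean(axis=0)

    # Score every user by cosine similarity to that centroid.
    norms = np.linalg.norm(all_user_vectors, axis=1) * np.linalg.norm(centroid)
    scores = (all_user_vectors @ centroid) / np.where(norms == 0, 1, norms)

    # Keep the top-scoring users as the new, much larger audience.
    return np.argsort(scores)[::-1][:audience_size]
```

Notice that nothing in this sketch references age or gender directly. Any demographic skew in the output comes entirely from whatever patterns the seed list happens to carry.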

But this is far from an adequate fix. The vast amount of data that Facebook collects about its users means that there are simply too many proxies for these factors for Facebook’s algorithms to avoid. For instance, musical tastes and shopping habits are likely to track closely with users’ age, gender, and income. There is still a strong risk that, from a small source list of a few hundred mostly white parents, Facebook will create a Lookalike Audience of hundreds of thousands, if not millions, of people, most of whom are also white parents.
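A toy experiment, using entirely synthetic data, shows how little protection dropping a sensitive column provides. In the hypothetical population below, five “interest” features are merely correlated with gender, yet a simple model recovers gender from those features alone with accuracy well above chance:

```python
# Toy demonstration (synthetic data) of why dropping a sensitive column
# does not remove its signal: other features act as proxies for it.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 10_000
gender = rng.integers(0, 2, size=n)  # the hidden sensitive attribute

# "Innocuous" features (think music or shopping interests) that are
# merely correlated with gender in this synthetic population.
interests = rng.normal(loc=gender[:, None] * 0.8, scale=1.0, size=(n, 5))

X_train, X_test, y_train, y_test = train_test_split(
    interests, gender, test_size=0.25, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
print(f"gender recovered from interests alone: "
      f"{model.score(X_test, y_test):.0%} accuracy")  # well above 50% chance
```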

Second, after a target audience has been specified (for example, “all users in New York City”), Facebook itself decides which users will actually see that ad in their news feeds, based on the company’s own profit-maximizing calculations. As users scroll through their feeds, Facebook conducts automated, real-time ad auctions — billions each day — to instantaneously determine which ads to show which users. These auctions are not settled on price alone. They take into account Facebook’s own predictions about which users are most likely to find a given ad “relevant” or engage with it. If Facebook predicts that men are more likely to click on ads for construction jobs, for example, Facebook will steer those ads toward more men. This kind of “optimization,” as Facebook calls it, might be appropriate for certain consumer products. But when it comes to key life opportunities, it may cross the line into unlawful discrimination.
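As a deliberately simplified illustration (the real auction is proprietary and far more complex), a relevance-weighted auction can be sketched in a few lines. The ads, bids, and click predictions below are invented for the example:

```python
# Simplified sketch of relevance-weighted ad ranking. The key point is
# that predicted engagement, not bid alone, decides who sees which ad.
from dataclasses import dataclass

@dataclass
class Ad:
    name: str
    bid: float            # advertiser's bid, in dollars
    predicted_ctr: float  # platform's guess that THIS user will click

def pick_winner(ads):
    # Rank by expected value to the platform: bid * predicted engagement.
    return max(ads, key=lambda ad: ad.bid * ad.predicted_ctr)

# If the model predicts a male user is likelier to click a construction-job
# ad, that ad wins his impression even with identical advertiser targeting.
male_user_feed = [Ad("construction job", 2.00, 0.05),
                  Ad("retail sale", 2.50, 0.03)]
print(pick_winner(male_user_feed).name)  # "construction job"
```

Even with a lower bid, the construction-job ad wins this user’s impression because the platform predicts higher engagement. That prediction, not any advertiser choice, does the steering.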

It would be unreasonable to expect a single settlement to fully resolve these difficult issues. Importantly, the plaintiffs were able to get Facebook to commit to studying “the potential for unintended biases in algorithmic modeling” and to sharing what it learns with researchers and advocates.

But Facebook should offer more than internal research. As Upturn has argued, the company should allow regulators, journalists, and civil rights groups to independently study the nature and actual reach of ads on its platform. This would mean publicly releasing far more information about the ads the company runs. Facebook is working on a similar, but still insufficient, transparency mechanism for political ads. If the company is earnest about wanting to address discrimination, it must expand and improve upon this effort.

This settlement showed that, when pressured, Facebook can make meaningful changes to better protect people’s civil rights. But the company still has a long way to go, because the challenges of digital discrimination run deep. Ensuring that our most vulnerable communities have fair and equitable access to jobs, housing, and financial resources has always been a core social responsibility. It is imperative that Facebook take this responsibility seriously.
