With Big Data Comes Big Responsibility:
Facebook Announces Changes to “Ethnic Affinity” Ad Targeting
Coty Montag, Deputy Director of Litigation
Elizabeth Reese, Harvard Public Interest Fellow
Last week, Facebook announced that it is taking several actions to deter and disable advertisements that may discriminate based on ethnic affinity. Why is this so important? Imagine two people, both 18 and from Washington, DC, who log onto their Facebook pages and see a different set of advertisements. The first, who is white, sees ads for an SAT prep course, available apartments in a booming part of the city, and low-interest college loans. The second, who is Black, sees ads for a fast food job and a high-interest payday loan, but does not see the ad for the housing opportunity, even though he is searching for a place to live. Each set of advertisements — whether included or excluded from the user’s page — was crafted specifically for the teenagers based on their assumed ethnic identity. This raises serious civil rights concerns.
As technology continues to develop, so do ways it can be used to discriminate. In this era of big data and "tailored" content, internet companies know a lot about you and have tremendous power to shape what you see online and how you navigate your world. LDF is committed to working with companies to ensure that such power is not wielded in a way that violates your rights. When Facebook launched its "ethnic affinity" ad targeting, LDF immediately joined a coalition of concerned advocacy groups to ensure Facebook was aware of the ways that this technology could be used to discriminate against users — and potentially violate federal anti-discrimination laws.
Facebook uses an algorithm to assign an "ethnic affinity" to each user to predict race and national origin and then sells advertisers on the ability to target ads to that demographic. Facebook never asks users to self-identify their race or ethnicity. Instead, using "signals" such as language, likes, and group memberships, Facebook identifies and sorts users into its "Hispanic," "Asian-American," or "African-American" ethnic affinity groups, or it puts them in a group called the "general population." It then gives advertisers the ability to target users based on what it identifies as their ethnic affinity. This targeting goes two ways: advertisers may direct ads to users — or away from them — based on purported ethnic identity. Facebook has insisted that it is not racially profiling users; it is merely identifying an affinity toward, for example, "African-American content" that a user might enjoy because she previously showed interest in the Black Lives Matter movement or the film Straight Outta Compton. But this tech-enabled targeting is not without its dangers. We were concerned that Facebook's "ethnic affinity" advertising could be a Pandora's box of potential harm hidden in our newsfeeds.
Regrettably, there has always been a market for discrimination in this country. The United States has a long history of targeting racial minorities with products that take advantage of them and denying minorities equal access to the market for homes, jobs, and educational opportunities. Entire communities of color were, and continue to be, denied equal access to things like mortgage loans because of their race. Unequal access to certain parts of the market can be devastating for minority communities and serve to reinforce contemporary segregation. We need to ensure that today's employers, landlords, and lenders don't rely on targeted advertising to narrow their audience to certain "ethnic affinity" users. Otherwise, the click of a mouse may become a carefully hidden "for whites only" sign. By irresponsibly wielding the power to filter and forward information to millions of people based on ethnic affiliation, companies like Facebook could easily be creating social network-sanctioned segregation — and without users even realizing that the online world they log into is treating them differently because of their race or national origin.
That is why LDF is pleased to see that Facebook is taking steps to deter ethnic affinity advertisements, including developing tools that will find and disable such advertisements in the areas of housing, employment, and credit. Additionally, Facebook announced that it will update its advertising policies and step up educational efforts for advertisers on this issue. LDF is pleased that our conversations with Facebook have helped it recognize the unique responsibility it has to ensure that twenty-first century technology doesn't mean twenty-first century discrimination. We hope that other tech companies will take notice.