Facebook’s Illusion of Control over Location-Related Ad Targeting

Facebook’s advertising principles and statements from its VP of ads, Rob Goldman, emphasize that the Ad Preferences tool allows users to “control how your data informs your ad experience.” However, Irfan Faizullabhoy and I have observed that when it comes to one of the most privacy-sensitive types of data, location, Facebook does not provide meaningful controls and is misleading in its statements to users and advertisers. Moreover, Facebook gives advertisers tools to run ad campaigns targeting people “who live in” or “were recently in” a geographic area as small as a single house. Taken together, these practices create an illusion of control over location-related ad targeting rather than giving users actual control, and that can lead to real harms.

Few Location Controls Are Available

Facebook’s Location Settings mention two things: Location History and the location permissions granted to the Facebook app on the mobile device. The explanation accompanying Location History says it “lets you explore what’s around you, get more relevant ads, and help improve Facebook.” Given that advertising is presented as one of the main uses of Location History, a reasonable Facebook user might conclude that turning off Location History and not granting the Facebook app permission to access location will prevent the geo-targeting of ads. That is not the case.

Exercising Location Controls Has No Effect on Ad Targeting

My profile does not list my current city; I haven’t uploaded photos to Facebook in years; I don’t post content tagged with my location or check in to places. I don’t give WhatsApp, Instagram, or Facebook Messenger access to my location, and I don’t search for places on Facebook. Yet location-based ads that use my actual locations keep coming. Some of Facebook’s explanations for why I am seeing a particular ad even state specifically that I am seeing it because I was “recently near their business”.

The Location Controls provided by Facebook give an illusion of control over the data that informs one’s ad experience, not actual control. Moreover, Facebook makes false claims about the effect of these controls.

Facebook claims that “Local awareness ads were built with privacy in mind. […] People have control over the recent location information they share with Facebook and will only see ads based on their recent location if location services are enabled on their phone.” However, as the Cambridge, MA ad and numerous other ads I’ve seen over the last few months show, this claim is false. The Cambridge ad states that I’m seeing it because I “live or were recently near Cambridge”; since I don’t live in Cambridge, I must be seeing it based on my recent location, despite both location services and Location History being disabled. Indeed, the explanation Facebook gives for showing me this ad relies only on my having been near a specific location.

How Facebook Learns Your Location When Location Settings Are Off

Reading Facebook’s explanations to advertisers (https://www.facebook.com/business/help/1150627594978290) provides insight into how this is done. Specifically, Facebook tells advertisers that it learns user locations from IP addresses, WiFi, and Bluetooth data.
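
To get a sense of how revealing an IP address alone can be, here is a minimal, purely illustrative sketch using MaxMind’s freely available GeoLite2 City database via the geoip2 Python package. This is not Facebook’s actual geolocation pipeline; the database path and the sample IP below are placeholders, and commercial IP-geolocation data is typically more precise than this free database.

```python
# Illustrative only: an IP-to-location lookup against MaxMind's free GeoLite2-City
# database, NOT Facebook's actual pipeline. Requires `pip install geoip2` and a
# downloaded GeoLite2-City.mmdb file (free MaxMind registration).
import geoip2.database

DB_PATH = "GeoLite2-City.mmdb"  # placeholder path to the downloaded database
SAMPLE_IP = "8.8.8.8"           # any public IP address; this one is just an example

reader = geoip2.database.Reader(DB_PATH)
try:
    record = reader.city(SAMPLE_IP)
    # City-level location, coordinates, and a stated accuracy radius (in km):
    # coarse, but already enough for "recently near this business" style targeting.
    print(record.city.name, record.subdivisions.most_specific.name, record.country.name)
    print(record.location.latitude, record.location.longitude,
          "accuracy radius:", record.location.accuracy_radius, "km")
finally:
    reader.close()
```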

Facebook Allows Targeting a Very Small Geographic Area

Targeting a single house on the ad interface by including and excluding areas

The possible harms of not giving users meaningful controls over their location data are amplified by the tools Facebook provides to advertisers for targeting people based on their location. As we’ve shown in a recent academic paper published at the Workshop on Technology and Consumer Protection, Facebook provides advertisers with tools to run ad campaigns targeting people “who live in” or “were recently in” a geographic area as small as a single house.
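
To see how include and exclude radii can be combined to isolate such a small area, here is a rough, hypothetical sketch of the geometric idea: a flat-plane approximation in meters using the shapely library, with made-up coordinates rather than a real map or the actual ad interface. One roughly one-mile “include” circle and three roughly one-mile “exclude” circles leave a targetable sliver of only a few hundred square meters, about one house lot.

```python
# A hypothetical, back-of-the-envelope sketch of the include/exclude idea:
# planar geometry in meters, not real map coordinates or Facebook's interface.
from shapely.geometry import Point

MILE = 1609.0  # roughly the minimum pin-drop radius the ad interface allows

# A hypothetical house we want to isolate, ~1.6 km "north" of the include pin.
house = Point(0.0, 1600.0)

# One included circle plus three excluded circles. The large second argument to
# buffer() raises the polygon resolution so the thin remaining sliver is not an
# artifact of approximating circles with too few segments.
include = Point(0.0, 0.0).buffer(MILE, 256)
excludes = [
    Point(0.0, -20.0).buffer(MILE, 256),      # leaves only a ~20 m thick crescent at the top
    Point(1624.0, 1600.0).buffer(MILE, 256),  # trims the crescent's right arm
    Point(-1624.0, 1600.0).buffer(MILE, 256), # trims the crescent's left arm
]

targetable = include
for excluded in excludes:
    targetable = targetable.difference(excluded)

print("house still inside the targeted region:", targetable.contains(house))
print(f"targetable area: about {targetable.area:.0f} square meters")
# The remaining area is on the order of a few hundred square meters (about one
# house lot), even though every individual circle has a roughly one-mile radius.
```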

This means anyone in the world can create an ad campaign to reach people who have recently visited a particular location, such as a place of worship or an abortion clinic. And since individuals cannot meaningfully stop Facebook from inferring or using their location for advertising, they also cannot avoid such ads. Imagine opening Facebook during a visit to an abortion clinic in order to seek support from friends, and instead seeing an ad for cute baby clothes, created by anyone who wants to target women facing this difficult decision.

Facebook Should Do Better

To faithfully claim that it provides the control and transparency stated in its Advertising Principles, Facebook should:

  1. Give users meaningful controls over the location information it collects, infers, and uses for advertising, including information obtained from sources such as IP addresses, WiFi, and Bluetooth. This would include a dedicated Location section in Ad Preferences and the ability to opt out of location use for advertising entirely or, at the very least, to meaningfully specify the granularity at which it is used and to exclude particular areas from being used.
  2. Change how Facebook makes location targeting available to advertisers, specifically by disabling the ability to target small geographic regions.

Notes

What Facebook Users Can Do: Besides disabling Location History and minimizing the location permissions granted to Facebook, Instagram, and WhatsApp on their mobile devices, Facebook users who want to help researchers and journalists keep Facebook accountable may want to consider installing the browser plugin of ProPublica’s Political Ad Collector and the AdAnalyst tool created by a team of researchers from Northeastern, MPI-SWS, and CNRS.

Safe Harbor for Public-Interest Journalism and Research: If Facebook is truly committed to improving on privacy, it should establish a “safe harbor” for public-interest journalism and research focused on its platform. Facebook’s current Bug Bounty Program Terms, specifically the limitations on test accounts and the prohibition on automated testing, severely limit the types of privacy studies of its advertising system that are feasible.

Assistant Professor of Computer Science at USC. I develop and help deploy algorithms that enable data-driven innovations while preserving privacy and fairness.
