When Is Digital Targeting Discrimination?
Your digital experience is not a random act of advertising. It is a strategically curated display of image and text designed to make you believe certain things about the world and yourself, with the intention of moving you to action. We want you to think something so you’ll do something. But these decisions aren’t made in a vacuum by objective algorithms. They’re made by engineers who build the code and marketers who deploy the budget. And you don’t have much control over that.
Your digital experience, whether you realize it or not, is customized based on your online behavior, your physical location, and a whole host of attributes you might not be aware of. Gender, race, education, household income, and countless other data points are used to build a profile of you, one that advertisers use to curate the ads you’re served, the content you’re recommended, and the social posts you see.
For most of us, digital targeting is a little creepy. It’s like they’re watching me; they know me. But it can be convenient too. I like knowing what other TV shows I might be interested in after binge-watching the last season of Scandal. And I don’t hate the 15% off ad for Blue Apron. It’s little more than a generalization about my possible interests and buying habits. Until it is more. Until it becomes discrimination.
I started in advertising as a philosophy major turned media planner and was fascinated by the audience profiling tools we used to build targeted media plans. From a marketer’s perspective, the most cost-efficient route to market is to identify a target audience and meet them in their watering holes with an appropriate message or offer. It makes sense. But as our digital experiences become more curated reflections of ourselves, based on our known interests and probable future ones, they become, by their very nature, limiting.
In 2015, Carnegie Mellon University published a study showing that Google’s online ad platform was significantly more likely to show an ad for high-earning jobs to men than to women. In the study, researchers built a model to simulate internet users with no search history who began to visit job-finding sites. When the models visited a third-party news site, Google served an ad for a career coaching service promoting $200k+ executive positions. This ad was served 1,852 times to the male profile and 318 times to the female one.
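To make the size of that gap concrete, here is a minimal sketch that works only from the two impression counts reported above. It is not the study’s actual analysis (the researchers controlled for far more than this); it just shows what those raw numbers imply about how the ad was distributed.

```python
# Illustrative sketch using only the impression counts reported above.
# This is NOT the study's methodology, just a simple disparity check.

male_impressions = 1852    # times the executive-coaching ad was shown to the male profile
female_impressions = 318   # times it was shown to the female profile

total = male_impressions + female_impressions

# Share of all impressions of this ad that went to each profile
male_share = male_impressions / total
female_share = female_impressions / total

# Ratio: how many times more often the ad reached the male profile
ratio = male_impressions / female_impressions

print(f"male share:   {male_share:.1%}")    # roughly 85%
print(f"female share: {female_share:.1%}")  # roughly 15%
print(f"impression ratio (male/female): {ratio:.1f}x")
```

On these counts alone, the male profile received the ad nearly six times as often as the female one, which is why the disparity attracted so much attention regardless of its cause.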
The study doesn’t give a reason for the significant difference. It could be that the advertiser set the buy to target males, or that Google’s algorithm determined men were more likely to click. Regardless, given today’s pay gap between men and women, the result of digital targeting has meaning. It’s illegal to discriminate in hiring practices, but it’s not illegal to target ads or content to specific audiences.
Drawing the line between discrimination and digital marketing practices leaves me wanting a solution. But I don’t have one. The role of digital marketing and algorithm-based media is to efficiently deliver the right message in the right medium to an open, receptive audience. This will not change. But perhaps there is a company or organization that wants to play with broadening horizons. I don’t think there is a single rule that can be written for these more subtle digital generalizations, but I did read an interesting piece from Upturn that breaks down in detail what is going on in the payday loan space. High-interest payday lenders are using digital advertising in a measured attack on the financial health of low-income neighborhoods. Not only are they leading financially desperate people into the trap of high-interest loans, but they are also mining and sharing their data with little regulation or care for security. It’s cases like these that clearly show the ethical dilemma on our hands.
What do you think? Could any of your marketing initiatives be unintentionally discriminatory? Have you noticed this before?