Facebook famously released different trailers to “Straight Outta Compton” for users of different “ethnic affinity groups”

Hey, Internet. Quit Being So Creepy.

t.tippy
Learning UX

--

A Growing Body of Research Indicates Users Favor Less Personalized Design

Ten years ago, anyone typing “Egypt” into a Google search field saw roughly the same results as anyone else: perhaps links to a Wikipedia page, National Geographic, the World Factbook. Today, every user receives unique results, custom tailored to their interests and preferences. After entering “Egypt” into the search bar, you might see several news articles, while your brother might see sites dedicated to tourism or the history of the pyramids. Google is not alone in creating a customized experience for each user. Sites like Amazon and Netflix personalize product and movie recommendations, Facebook personalizes each user’s News Feed, and thousands of smaller companies mine user data for a host of opaque purposes.

This personalization can range from the convenient (Google fetching local showtimes when returning a query for a newly released movie) to the offensive (Facebook displaying different trailers for “Straight Outta Compton” to users of different “ethnic affinity groups”). Product designers and engineers express their intent to deliver the most relevant and useful results for their users; however, much of what makes these products “relevant” is calculated from vast swaths of user-generated data, most of which users unwittingly share simply by turning on their phone or computer.

A growing body of empirical research suggests that users are not uniformly comfortable with this customized web experience. A 2009 study from the Annenberg School at the University of Pennsylvania found that 66% of users did not want targeted advertisements, and once users were informed of the tracking mechanisms that support targeted ads, even more (up to 86%) did not want them at all.

Recently, a team of researchers from the Information School at Berkeley examined the current state of personalization and the factors that push it from something that is “beneficial or acceptable, to something that is unfair.” Using an experimental vignette design, the researchers measured users’ perceptions of fairness in response to personalized content. The vignettes were situated in three domains — targeted advertising, filtered search results, and varied retail pricing — and drew on data types including race, gender, residence, and income.

The overall results of this study indicate that users have nuanced, context-dependent attitudes about personalization. However, the research clearly demonstrates that personalization based on household income or race is viewed negatively. The team writes that “Users found the use of income to target ads, filter search results, and display different prices to be unfair, even when the user provided it.” Additionally, the use of race in personalization was seen as unfair across all three domains, regardless of whether it was provided or inferred, accurate or inaccurate.

The team concludes that more surveys, experiments, and interviews are needed to better understand this complex issue. In the meantime, designers and engineers can ask themselves, “Just because we can, does that mean we should?” Collecting data and deploying elegant products that deliver a highly personalized experience to each user is a thrilling technical accomplishment. But when we begin to alienate, and possibly discriminate against, our users, it’s time to go back to the drawing board.

--


Graduate Student in User Experience & Interaction Design. Exploring ethics & accessibility.