To deliver on the promise of personalization, marketers must confront doubts about the data and algorithms at the heart of the technology
Consumers need marketers. Yes, marketers are selling things, but they are also trying to help consumers navigate the growing thicket of choice to satisfying conclusions. Marketers need consumers, too. They need data about them. They need to turn that data into something personal and meaningful. They need to step in with well-intentioned suggestions and step back at the first sign of discomfort.
Consumers understand the value of relevant marketing, but they have not always wanted to pay the price. They are torn, their feelings captured in the American poet A.R. Ammons’s famous line: “One can’t have it both ways and both ways is the only way I want it.” Skepticism and anxiety about marketers’ motives persist. As personalization becomes more sophisticated, marketers must adapt by raising the level of transparency about what personal data they are collecting and illustrating the value they deliver in exchange. They must offer consumers more control over their data. And consumers must be certain that all personalization is done fairly, not as a ruse for abuse.
The promise and peril of personalized marketing
Marketers tout personalization as the new standard for interacting with consumers. Salesforce Marketing Cloud CEO Scott McCorkle calls it “Marketing’s Law of One.” “Right person, right message, right time” is the guiding mantra, with mobile phones increasingly splitting people’s time into slimmer moments. Individual-level data are the era’s most valuable asset, and algorithms are the primary tools for maximizing their value. Algorithms calculate. They reason. They predict. They even choose what and when people see. Data plus algorithms are intended to solve the modern consumer’s biggest problem of too many options and too little time by helping cut through the clutter.
The most sophisticated algorithms are owned by technology giants and digitally forward firms. One of the modern masters is Netflix, whose algorithm pushes out tens of millions of versions of its service and uses the information collected about viewing habits to inform its programming. Three years ago, the mega-hit House of Cards prompted debate about whether its origin was in predictive modeling or a producer’s creative mind. The answer: Both.
Netflix says its programming decisions are a mix of 70 percent data and 30 percent judgment, according to Chief Content Officer Ted Sarandos. “But the 30 needs to be on top, if that makes sense.” Data analysis gives Netflix confidence in lower-profile releases like the documentaries What Happened, Miss Simone? and The Square. “We score with home runs, too, but we also score with singles and doubles and triples,” Sarandos said earlier this year.
Data and algorithms are both the prerequisites for personalized marketing and its primary threat. A key problem with data is simply that consumers do not understand how much is being collected. Just a quarter of people realize their location is being tracked when they are online, according to strategists at the design firm Frog. Nor do they know how much marketers are profiting from it. Credit card companies have long generated a healthy revenue stream from selling anonymous data to third-party networks for targeted advertising. MasterCard, by one estimate, earns more than $1.2 billion a year selling consumer data. Google and Facebook don’t need to sell data to third parties. They can sell access directly to advertisers, who pay twice: for access to the data and to the platform.
The problem with an algorithm is even more basic. It’s not a human being. And people know it. We can forgive errors of judgment in our friends, but shadowy lines of code do not get the benefit of the doubt. Perhaps the most famous example of the anger aroused by an algorithm is the now familiar story of Target sending baby product coupons to a 17-year-old whose father did not know she was pregnant. Target’s marketing department sent out the coupons, but the algorithm that knew more than the dad was the target of the outrage. More recently, Facebook came under criticism for running experiments that adjusted the positive and negative content in users’ feeds without their consent. Facebook justified the experiments on the grounds that it was trying to build a more useful product that kept users more engaged. It did not take a giant leap to see how such experimentation on moods could be directed for the purpose of, say, making consumers nostalgic for comfort food.
How marketing adapts in the era of personalization
This past year, to promote its NX crossover vehicle, Lexus created 1,000 different ads for unique audiences using Facebook’s data and targeting tools. More data sources plus Moore’s Law could eliminate the need for group targeting and push personalization closer to the Marketing Law of One. Marketers tell a beautiful story about how data and algorithms will make lives better. They will sense your craving for sushi, ship you laundry detergent before you realize you’re out, and help you live healthier by keeping you motivated to exercise.
To adapt in the era of personalization, marketers need to put themselves in the mindsets of consumers, whose views of personalization are more nuanced. Research reports and academic journals are littered with statistics showing people find personalized ads more appealing and memorable. But overall, attitudes toward personalization are a mixed bag. In the United Kingdom, for instance, just one in five people is happy with companies using their information to offer more personalized products and services, according to a Deloitte study of personalization. One of the strongest statements about American attitudes comes from a 2015 University of Pennsylvania study, which found more than half of people resigned to giving up their data. The implication is sobering: People don’t want to lose control over their information; they just believe the decision is already final.
As long as people remain skeptical of marketing and conflicted about its use of personal data, algorithm-driven personalization will always run the risk of creepiness. The good news for marketers is that consumers’ comfort with how algorithms are used is malleable and changes across contexts. Almost half of Americans say it is acceptable for a retail grocery loyalty program to keep track of shopping habits in exchange for saving money on their purchases, according to a 2016 Pew Research report. Acceptability drops to one-third for a social media platform that uses their information and photos to deliver personalized ads. Barely one-fifth say it’s acceptable for a technology company to create a smart thermostat that could lower a home’s energy bill in exchange for tracking people’s movements from room to room.
Consumers fear abuse, expect privacy, and want control. More powerful machines do not address these concerns. They feed doubts about marketers’ motives. Netflix and Facebook have algorithms that are the envy of companies everywhere, yet only Netflix is among the top five relevant brands in marketing consultancy Prophet’s 2016 ratings. Facebook doesn’t crack the top 50. Everyone knows Facebook. It makes people happy. It fills an important need. But it is not a relevant brand, says Prophet’s Jesse Purewal, because “it’s not a brand people trust.”
To thrive in this era, marketing must adapt to consumers’ full range of feelings: about sharing their data, about the value they get from algorithms, and about the risks they face by participating. Marketers should take three specific steps to adapt.
1. Marketers must raise the level of transparency about what data they collect and how they use it in their algorithms
They must be more transparent about the value delivered in return. The terms and conditions agreement full of legalese that consumers accept when they sign up for a service is an inadequate format for disclosure. While people are generally aware of data collection, they are uninformed about the specifics.
More and more companies offer an easy-to-read explanation of data collection and sharing in plain English. Then they bury it on a website. Companies should break apart that wall of text and push it out proactively in personalized communications just as they would a personalized product, packaging it inside relevant advice or a relevant offer. Beginning with the initial sign-up, marketers should be more active in offering a choice about information sharing. Marketers should remind consumers they can adjust their preferences more frequently than simply when marketers send updates on privacy policies. Since consumers feel differently about information sharing depending on the context, marketers should offer more granular privacy options, such as by device, time of day, or behavior. And they should apply the same optimized user experience to customizing those preferences that they do to buying a product through an online checkout.
2. Companies must make their algorithms more human
Algorithms are rigidly rule-based actors. They pursue objectives single-mindedly and focus on the data at hand, which can lead to myopia. Writing in the Harvard Business Review, Michael Luca, Jon Kleinberg, and Sendhil Mullainathan recount a story of algorithmic myopia at a consumer packaged goods company that bought goods in China and sold them in the U.S. It used an algorithm to predict which goods would sell best. Those items did sell, but after seven months the company started seeing more returns because the algorithm had not fully taken projected consumer satisfaction into account. Within large organizations, marketers are the employees best positioned to understand the full range of consumer needs. Marketing departments need to be closer to the creation and optimization of these algorithms so that they can help select the right data inputs, the ones that lead to long-term company goals.
3. Marketers must shift to a more proactive, pro-consumer stance about data ownership
The current view of consumer data among most companies is that it belongs to them. They focus on data security and protection. When a company’s servers are hacked and giant chunks of consumer data stolen, the standard response is to apologize in an email and offer free identity theft monitoring for a year.
A more consumer-friendly stance is that consumer data should belong to consumers. One of the most progressive statements comes from Apple CEO Tim Cook who, at a summit on privacy last summer, declared: “We believe the customer should be in control of their own information.” One potential implication of this shift would be to give people copies of data about themselves. They wouldn’t own it, but they would own a copy to use in any manner they choose.
Today, companies look to be helpful by taking people’s data and serving it back to them through data visualizations and advice. In the future, proactive firms would think of ways to provide consumers with raw, online-accessible files for them to own and use. Most people wouldn’t be able to manipulate these files in the sophisticated ways that create the visualizations and insights marketers send them. Today, the assumption is that a single all-knowing tech giant like Amazon can do this best. But the goal of giving people their data would be to spur a new marketplace of intermediaries that could empower people to understand their behaviors in new ways. They might upload the data given to them by, say, three apparel makers, an internet provider, and a streaming service like Netflix. An intermediary might help them refresh their wardrobe, drawing on a full range of TV and movie tastes, a realistic price point, and their favorite clothiers. All the while, the consumer would be the one in ultimate control of personalization.
Marketers who think data privacy and ownership issues will fade as younger consumers become a larger part of the economy take a great risk. Studies do show that Millennials are more comfortable sharing personal information than their Boomer parents. They will exchange data for something valuable in return.
The full truth about young people’s attitudes is nuanced, though. They show more comfort with trading social media information for personalized ads than medical information for easier doctor’s office scheduling. Their digital behaviors also display savvy as well as suspicion. People under 30 have embraced social media platforms like Snapchat and Whisper that offer more anonymity and privacy. Nearly three times as many people 18–29 said they deleted or edited something posted online versus people ages 50–64, according to Pew. Young people are more likely to have cleared their browsing histories, disabled cookies or declined to use their real name on a website.
Relevancy, privacy, and security are fundamental human desires, and each person weighs these elements differently when constructing their own marketing preferences. Personalization already offers a relevancy-for-value trade-off and is well positioned to continue doing so in the future. In building more meaningful relationships, marketers will have to work harder to close the gap between the relevancy people know personalization can bring and the caution they feel about what it means for their personal identities. Trading a mobile coupon for your current location is the basis for a transaction, not a relationship. As the early 20th century cultural critic H.L. Mencken wrote: “It is mutual trust, even more than mutual interest, that holds human associations together.” For marketing to thrive in the era of personalization, consumers must believe there are good human motives at the core of marketers’ machines.
John Balz is a Strategy Director @VML. Follow him on Twitter @Nudgeblog.