OK, Google, You’re Creeping Me Out: Advertising in the Age of Voice Devices

Gilad Rosner
Startup Grind

--

Burger King thinks they’re clever. They just launched a TV commercial that triggers Google Home to tell you what a Whopper is. BK gets points for creativity, but they are wading into a new debate that is bigger than increasing sales of questionable burgers. The slow emergence of voice-enabled home devices brings with it new concerns over how we are advertised to.

Let’s set the clock back four weeks: on March 16th, a gentleman named Bryson Meunier uploaded a video to Twitter of himself asking Google Home what his day looked like. After telling him the time, the weather, and the outlook for his commute, Google Home said,

“By the way, Disney’s live action Beauty and The Beast opens today. In this version of the story, Belle is the inventor instead of Maurice. That rings truer if you ask me. For some more movie fun, ask me something about Belle.”

Mr. Meunier titled his tweet, “New Beauty & the Beast promo is one way Google could monetize Home,” which is mild enough. He also posted to Reddit, noting that he hadn’t searched for anything that might trigger the ad and that he wasn’t a fan of ads on Google Home. The Register took a slightly more prickly tone: “Spammy Google Home spouts audio ads without warning — now throw yours in the trash.”

Serving you ads is core to Google’s business model, and there’s nothing in the terms of service that would prevent them from doing so with Home. What makes this story stranger is Google’s response when The Verge asked for more information: “This isn’t an ad; the beauty in the Assistant is that it invites our partners to be our guest and share their tales.”

Really?

As The Verge and The Register point out, it sure sounds like an ad; there are even sound effects behind it. Google went one step further and attempted to clarify:

“This wasn’t intended to be an ad. What’s circulating online was a part of our My Day feature, where after providing helpful information about your day, we sometimes call out timely content. We’re continuing to experiment with new ways to surface unique content for users and we could have done better in this case.”

It’s not an ad. It’s helpful information. Uh huh. That hair you found in your food? That’s just delicious extra protein we thought you’d like.

There isn’t anything intrinsically wrong with serving ads this way… is there? Ad support is a hallmark of modern entertainment and, since the dawn of the Internet, of useful electronic services. Using Google’s services has always meant being exposed to ads, right alongside your searches or your email. So why the ick factor? Why The Register’s shrill rejection? It comes down to context and expectation.

If you’re not in the privacy practitioner or academic community, you may not have heard of Helen Nissenbaum, but her work is very helpful in understanding why people feel their privacy has been violated. She calls her theory “contextual integrity,” and in a nutshell it says that people experience privacy violations when informational norms are breached.

The norms concern who sends and receives personal information, the type of information, and the constraints governing the flow of the data. Examples: you don’t expect your doctor to have your financial information, just as you don’t expect your accountant to have your medical information. You don’t expect your best friend to tell your parents your secrets. You don’t expect strangers to know about your sex life. Much of this is predicated on the idea that humans construct and live in (more or less) distinct informational ‘spheres’: work life, family life, financial, medical, in public, alone, and so on. The argument I’m making is that slyly slipping ads into interactions with a virtual assistant, especially in the home, (currently) violates expectations:

  • Interactions with a virtual assistant are not search, ergo the expectations of using Google search are not applicable.
  • The ads breach the ‘home’ boundary in an unsolicited manner (also true of the Burger King ad).
  • Despite being subject to blanket terms of service, Mr. Meunier did not assent to being marketed to through his Google Home device.

Admittedly, these are not privacy violations in the sense of Mr. Meunier giving information to Google, but contextual integrity is still useful here in understanding why ads inserted this way turn people off in a way that, say, product placement in movies does not. Google pitches its Assistant as a personal secretary… but do you expect your secretary to advertise to you and then take a fee for it? Scroll back up and look at Google’s strong denial that the Beauty and the Beast promo was an ad: “after providing helpful information about your day, we sometimes call out timely content.” I’d be more convinced by this argument if Google did not stand to gain by it, but they do. If my secretary told me I should check out a movie, I would naturally assume it was friendly encouragement and take it as such. I would feel manipulated if he or she dropped it into normal conversation while in fact getting paid for the mention. Google’s disingenuous response shows that they are trying to normalize this manipulation.

Let’s go one step further and remember that there are children in the home. Mr. Meunier’s son’s squeal of pleasure can be plainly heard on the video. You may recall that in December, US and European privacy and consumer protection advocates filed multiple complaints against the makers of My Friend Cayla, a wildly insecure network-connected doll. As if weak security allowing children to be spied on weren’t enough, the complaint made to the FTC states:

“Researchers discovered that My Friend Cayla is pre-programmed with dozens of phrases that reference Disneyworld and Disney movies. For example: Cayla tells children that her favorite movie is Disney’s The Little Mermaid and her favorite song is “Let it Go,” from Disney’s Frozen. Cayla also tells children she loves going to Disneyland and wants to go to Epcot in Disneyworld… This product placement is not disclosed and is difficult for young children to recognize as advertising. Studies show that children have a significantly harder time identifying advertising when it’s not clearly distinguished from programming.”

The same concern applies directly here. When Google Home starts hawking movies, TV shows, toys, theme parks, and food, children in earshot could have difficulty separating the ad from the actual intent of the interaction. Google’s statement illustrates that this blurring is the goal.

The fact is, our expectations about voice interactions in the home may break down as virtual assistants proliferate. Google has successfully normalized software robots reading your email and then serving you ads based on its content. (As a comparative thought exercise, consider how you’d feel if it were regular postal mail.) Movie producers have successfully normalized product placement. Our semiotic environment is awash in advertising; it’s the modern condition. Perhaps concern for children will hold such things back, but one look at the multicolored cereal boxes placed at kids’ eye height in supermarkets makes me rather skeptical.

“We’re continuing to experiment with new ways to surface unique content for users,” says Google. “Unique content” is doublespeak; you have a salesman living in your home.

I am indebted to Dr. Ewa Luger for her thoughtful input while I was writing this.

--

Gilad Rosner is a privacy and technology policy researcher and the founder of the nonprofit Internet of Things Privacy Forum. http://www.iotprivacyforum.org/