The Privacy Technology Driving the Google-MasterCard Deal

Assi Barak
Sep 4, 2018
Credit: Illustration by Tam Nguyen/Ad Age

Google and MasterCard recently struck a deal after four years of negotiations. In short:

“For the past year, select Google advertisers have had access to a potent new tool to track whether the ads they ran online led to a sale at a physical store in the U.S. That insight came thanks in part to a stockpile of MasterCard transactions that Google paid for”

This has sparked a flurry of privacy debate (The Inquirer, TheNextWeb, The Telegraph, to name a few). A Google spokeswoman, cited by Bloomberg, stated: “Before we launched this beta product last year, we built a new, double-blind encryption technology that prevents both Google and our partners from viewing our respective users’ personally identifiable information”.

Jules Polonetsky of the Future of Privacy Forum, who says he was briefed by Google on this, referred readers to his June 2017 post on Fully Homomorphic Encryption. Steve Weis corrected this in a Twitter thread, pointing instead to Google’s paper presented at Real World Crypto 2017, “Private Intersection-Sum Protocol with Applications to Attributing Aggregate Ad Conversions”.

So: Private Set Intersection (PSI) it is! To quote the paper:

“Protocols for private set intersection (PSI) allow two or more parties to compute an intersection over their privately held input sets, without revealing anything more to the other party beyond the elements in the intersection ..”

“… We consider a particular variant of the PSI problem, which we call the Private Intersection Sum problem. In this setting, there are two parties that have private input sets consisting of identifiers, and one of the parties’ datasets additionally has an integer value associated with each identifier. The parties want to learn the cardinality of the intersection, as well as the sum of the associated integer values for each identifier in the intersection, but nothing more”.
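To make the setting more concrete, here is a minimal Python sketch of that two-party scenario, using the DDH-style “double blinding” that the paper’s protocol family builds on. The identifiers, values, group choice, and helper names below are illustrative assumptions, not Google’s implementation; in particular, the real protocol carries the values under additively homomorphic (Paillier) encryption so that only the sum over the intersection is ever revealed, while this toy attaches them in the clear purely to show what the functionality outputs.

```python
import hashlib
import secrets

P = 2**521 - 1  # toy prime modulus (a Mersenne prime); a real system uses a proper group

def blind(identifier: str, key: int) -> int:
    """Hash an identifier into Z_P and raise it to a secret exponent (illustrative only)."""
    h = int.from_bytes(hashlib.sha256(identifier.encode()).digest(), "big") % P
    return pow(h, key, P)

# Each party picks a secret exponent.
a = secrets.randbelow(P - 2) + 1  # party A's key (e.g. the ad-click side)
b = secrets.randbelow(P - 2) + 1  # party B's key (e.g. the purchase side)

A_ids = ["user1", "user2", "user3"]                # A's identifiers
B_data = {"user2": 40, "user3": 25, "user4": 10}   # B's identifiers with transaction values

# Round 1: each party blinds its own identifiers with its own key and exchanges them.
A_blinded = [blind(x, a) for x in A_ids]
B_blinded = {blind(x, b): v for x, v in B_data.items()}

# Round 2: each side re-blinds what it received. Since (h^a)^b == (h^b)^a mod P,
# matching identifiers collide after double blinding, while neither party ever
# sees the other's raw identifiers.
A_double = {pow(y, b, P) for y in A_blinded}                # computed by B
B_double = {pow(y, a, P): v for y, v in B_blinded.items()}  # computed by A

# The two outputs of the Private Intersection-Sum functionality:
common = A_double & set(B_double)
print("intersection cardinality:", len(common))               # -> 2
print("intersection sum:", sum(B_double[k] for k in common))  # -> 65
```

The key property is commutativity: raising a hashed identifier to one secret exponent and then the other yields the same group element in either order, so matches can be detected without either party revealing its raw set.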

Private Set Intersection is not a perfect solution, as the paper acknowledges in its “Additional Security Precautions” section, stating that additional measures such as Differential Privacy are required:

“In general, though, privacy may be violated as a consequence of certain input distributions. For example, if there are “outlier” v_j values that are unusually large, the sum will be large; a priori knowledge of such values will allow a party to identify users. It is also possible that repeatedly executing the protocol in sequence will leak information due to correlated inputs in different sessions. Such problems are an artifact of the functionality itself and would affect any intersection-sum protocol. One strategy for resolving this issue would be to compose differential privacy techniques … with the cryptographic protocol, by adding appropriately sampled noise to the inputs.”
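As a rough illustration of that mitigation, the sketch below adds Laplace noise to the released sum. The per-user cap and the epsilon value are made-up parameters for the example (Google has published nothing about its actual settings, if any), and clipping each v_j is one simple way to bound the sensitivity that the “outlier” problem above refers to.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise via inverse transform sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def release_sum(values, cap: float, epsilon: float) -> float:
    """Release a sum with epsilon-differential privacy.

    Clipping each value to `cap` bounds any single user's contribution
    (the "outlier" v_j problem above), so Laplace noise of scale
    cap / epsilon suffices for epsilon-DP on the released sum.
    """
    clipped = sum(min(v, cap) for v in values)
    return clipped + laplace_noise(cap / epsilon)

# Example with the toy intersection values from the sketch above.
print(release_sum([40, 25], cap=100.0, epsilon=1.0))
```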

Given the limited information supplied, the privacy question remains open:

  • Which security model did they implement? Honest-but-curious, or malicious?
  • Did they use Differential Privacy, and if so, in which setting and with which parameters?

As with other examples, such as the Differential Privacy Apple deployed in iOS 10 to “help discover the usage patterns of a large number of users without compromising individual privacy” (see the excellent description by Prof. Matthew Green), cryptography experts face a dilemma in addressing this issue. To quote Matthew Green on Apple:

“On the flipside, as security professionals it’s our job to be skeptical — to at a minimum demand people release their security-critical code (…), or at least to be straightforward about what it is they’re deploying. If Apple (Google — A.B) is going to collect significant amounts of new data from the devices that we depend on so much, we should really make sure they’re doing it right — rather than cheering them for Using Such Cool Ideas. (I made this mistake already once, and I still feel dumb about it.)

But maybe this is all too “inside baseball”. At the end of the day, it sure looks like Apple (Google — A.B) is honestly trying to do something to improve user privacy, and given the alternatives, maybe that’s more important than anything else.”
