A Sidenote On Zuckerberg’s Senate Hearing — Why Is Facebook Taking Questions For Cambridge Analytica?

Marion Bergeret
6 min read · Apr 11, 2018


Last night I watched a couple of hours of Mark Zuckerberg’s hearing before the US Senate. There is a lot to say about it: so many different questions were raised, and I could have commented on each of his answers on so many levels. But there’s one point my legal mind really wanted to make. Call it legalistic; I hope it’s of interest to a few of you.

As someone who has been studying and practicing data protection law and the right to privacy for ten years, it struck me that I could offer a more legally grounded framework in which to read the events currently unfolding. I am also only just starting to measure the complexity of the ethical and legal issues attached to data protection, so I do not aim to give an opinion on those in such a brief note. Here we go.

The assumption that Facebook “sold” data to Cambridge Analytica was repeated several times in the media in the days before the Senate hearing, and then again during the hearing itself. There is also a commonly held idea, seemingly accepted even by Zuckerberg himself, that Facebook can be expected to control what third parties like Cambridge Analytica do with data after users give them access to it. Yesterday, Zuckerberg went so far as to say that he was looking into technical solutions to address that issue.

Under the European data protection scheme — old and new — “data controllers” determine the means and the purposes for which personal data are processed. If a data controller entrusts a third party supplier with a mission regarding its own processing of personal data, that third party becomes a “data processor”, and the data controller must hold them accountable (through a data processing agreement — a contract) for how the data processor uses the data the data controller shares with them.

Crucially, data processors do not process personal data for their own purposes. Rather, they are considered to act on behalf of data controllers, with the consequence that (with a few adjustments under the GDPR) (i) data controllers remain responsible to users and regulators alike for any misuse of data under their watch, even when the fault lies with a data processor, and (ii) if data processors act outside the scope of a data controller’s instructions and process personal data other than as requested, they can be exposed to liability as if they were data controllers themselves.

In this case, the current focus seems to me to be on Facebook’s responsibility for the entire issue, as if Cambridge Analytica were Facebook’s data processor, when actually, any data protection specialist would agree both (i) that Cambridge Analytica had their own responsibility towards users, whom they communicated with and obtained consent from directly, and whose data they collected and used for their own purposes from the start, and (ii) that they appear to have failed to comply with any data controller’s basic obligations under both the old Directive scheme and the GDPR.

Cambridge Analytica was not Facebook’s data processor. App developers, and the parties using data collected through them, do not process personal data on behalf of Facebook. They are data controllers in their own right. They have a direct relationship with users, and users allow them, through a specific consent mechanism, to access their data. Facebook did not “sell” data to Cambridge Analytica; Cambridge Analytica collected data through users themselves. And Facebook could not have entirely controlled Cambridge Analytica’s use of the data. To hold otherwise would be to negate the accountability that every company other than Facebook owes users for the use of their data, with potentially detrimental effects for the many data controllers who engage in controller-to-controller data sharing.

The fact that the data Cambridge Analytica was given access to were initially collected and hosted on Facebook holds less relevance in legal terms than current commentary claims it does. Facebook is a data repository of sorts, which users themselves gave developers access to. The university study could have been conducted through a Typeform or other more traditional survey, and yet any data collected could still have been similarly misused, contrary to applicable law.

The fact that Facebook holds so much data that what was shared turned out to be more revealing and powerful than users might have expected when they consented to giving Cambridge Analytica access is an ethical issue, and one somewhat removed from Facebook’s legal responsibility under data protection laws.

In broad terms, data controllers’ obligations include informing users transparently, prior to the collection of their data, about the data being collected (arguably that was done through the Facebook Connect widget), but also about the purposes for which the data will be used (in this case, that was also done: the data were claimed to be collected for a university study), and the people the data may be shared with (or “recipients” of the data; in this case, I believe none were stated to users at the time they consented to their data being shared with the developer). Data controllers are also required to have a valid legal basis for the processing of personal data at all times (in this case consent, at least as regards the primary users who gave access to their accounts).

Assuming, from the information available, that Cambridge Analytica’s collection of users’ data was pretty close to compliant with then-current laws (and arguably that would still be the case, with a few adjustments, today), the crux of the issue here is that Cambridge Analytica, having collected the data quasi-lawfully, did not hold themselves to the obligations they had, by law, as a fully capable data controller.

These include the obligation to process the personal data they collected via their Facebook app solely for the purpose of a university study, i.e. the only purpose they had disclosed to users when they had asked for their consent. When they allowed Trump’s campaign to use the data, they also appear to have failed to comply with their obligation not to share personal data with recipients they hadn’t disclosed to users.

The fact that users’ friends’ data were also collected, without their consent and without a valid legal basis, is another of the major points they would have to address to justify the lawfulness of their actions. The mere fact that Facebook allowed them to do it is no defense in that respect.

Any claim that Facebook “sold” user data to Cambridge Analytica and should, as data controller, have controlled the way in which Cambridge Analytica used the data it collected on the platform is misleading. It overlooks the regulatory reality that Facebook plays little part in the relationship between users and third-party app developers in general. My point is not to condone Facebook’s policy in this case, but to draw attention to the fact that a few of the questions Zuckerberg faced yesterday should be directed at Cambridge Analytica as well.

Facebook will likely be held accountable for allowing users to consent to sharing their friends’ personal data, and for failing to inform regulators and users when it discovered that Cambridge Analytica had misused the data. There might also be a few important loopholes in its developer terms it will have to explain, though these would not necessarily trigger its liability directly. Arguably, some of the underlying shortcomings in the then-current regulation that these issues highlight have already been addressed through reinforced information and consent mechanisms and data breach notification requirements under the GDPR.

Focusing on new technical means Facebook should implement to control the use of data once users themselves have allowed the data to leave its platform skims over the fact that countless smaller players have access to users’ personal data and already have very concrete legal obligations to act responsibly and protect those data as well.

The GDPR increases the sanctions dramatically in situations like these: from a couple of hundred thousand euros under the Directive regime to twenty million euros or 4% of a company’s global annual turnover, whichever is higher.
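As a rough illustration of that ceiling (a hypothetical calculation for a made-up turnover figure, not legal advice), the “whichever is higher” rule can be sketched as:

```python
def gdpr_max_fine(global_annual_turnover_eur: float) -> float:
    """Upper bound on a GDPR Article 83(5) administrative fine:
    EUR 20 million or 4% of worldwide annual turnover,
    whichever is higher."""
    return max(20_000_000, 0.04 * global_annual_turnover_eur)

# For a large company with EUR 40bn in turnover, the 4% branch
# dominates and the ceiling is EUR 1.6bn, not EUR 20M:
print(gdpr_max_fine(40_000_000_000))  # 1600000000.0

# For a small company with EUR 100m in turnover, the flat
# EUR 20M floor applies instead:
print(gdpr_max_fine(100_000_000))  # 20000000
```

The point of the two-branch formula is that the cap scales with company size, so large platforms cannot treat the flat amount as a cost of doing business.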

While small players have been able to stay under most supervisory authorities’ radars in the past few years, my understanding is that they should be preparing for part of the enforcement focus shifting in their direction in the coming months, as evidenced by how seriously the UK’s ICO seems to be taking its investigation of Cambridge Analytica itself.

All views my own. With thanks to Laure Carli and Alexia Karides for early morning proofing 🙌🏻👌🤗.


Marion Bergeret

Tech & Privacy | Legal @Alan (ex @Snips @Sonos @BakerMcKenzie) | Fulbright & Samuelson Law Tech & Public Policy Clinic @Berkeley @Oxford @LeWagon