The Facebook Loophole

As corporations become data powerhouses, will researchers be tempted to evade ethics reviews by offloading data collection to them?

Sebastian Deterding
7 min read · Jul 1, 2014

The media storm over Facebook’s recently published emotion contagion study has given the public an unlikely primer in research ethics: It is refreshing to find TechCrunch or Forbes discussing esoteric matters like “informed consent” or “Institutional Review Boards” (IRBs), otherwise confined to the late night programming of public broadcasters and the everyday sorrows of academics.

But beyond the question of whether the study was legal (probably), ethical (your call), or a smart thing to do (…), it raises a bigger one: Is it a glimpse into an ungainly, gaping hole in current research ethics, opened by industry research and cozy industry-university relations?

The Cornell Double Whammy

A lot of the present kerfuffle revolves around the two university researchers, Jeffrey Hancock and Jamie Guillory, who co-authored the article and were employed at Cornell University when the study was conducted. As such, when they run studies with human subjects, they are typically subject to review and approval by an IRB, which usually requires getting informed consent from all participants. A nuisance for sure, but a critical safety check to ensure that the freedom of research doesn’t veer into things like the Milgram experiment, the Tuskegee syphilis study — or people freaking out that their social network of choice unwittingly manipulated their emotions. Many academics feel that no IRB in the world would have let the Facebook study happen as it did, certainly not without informed consent. So how could it still happen?

Enter Cornell University, which on June 30 released a curt media statement:

[The involved researchers] analyzed results from previously conducted research by Facebook into emotional contagion among its users. Professor Hancock and Dr. Guillory did not participate in data collection and did not have access to user data. Their work was limited to initial discussions, analyzing the research results and working with colleagues from Facebook to prepare the peer-reviewed paper […]. Because the research was conducted independently by Facebook and Professor Hancock had access only to results — and not to any data at any time — Cornell University’s Institutional Review Board concluded that he was not directly engaged in human research and that no review by the Cornell Human Research Protection Program was required.

Cryptic? Here’s the background: The US Federal Policy for the Protection of Human Subjects (“Common Rule”), which Cornell (and most other US research universities) abides by, requires that studies involving human subjects be reviewed and pre-approved by an IRB. The media statement appeals to two exemptions from the Common Rule — a legalistic double whammy:

1. “Look ‘ma, no hands!” — That little word “engage”

Technically, Cornell argues, their researchers weren’t even “engaged in human research”. According to the non-binding guidelines of the US Office of Human Research Protections,

an institution is considered engaged in a particular non-exempt human subjects research project when its employees or agents for the purposes of the research project obtain: (1) data about the subjects of the research through intervention or interaction with them; (2) identifiable private information about the subjects of the research; or (3) the informed consent of human subjects for the research.

The study article itself states in its footnotes that the university researchers “designed research” and “wrote the paper” — the research was “performed” and data “analyzed” only by the Facebook employee.

2. Found Footage: “existing data”

Note the phrase “previously conducted research” in the media statement. The Common Rule states that research is exempt from IRB review and approval if it involves

“the collection or study of existing data, documents, records, pathological specimens, or diagnostic specimens, if these sources are publicly available or if the information is recorded by the investigator in such a manner that subjects cannot be identified”.

Thus, the study could have gotten around IRB approval and informed consent if Facebook had already run the study on its own as part of its constant service tinkering, and the university researchers only came along after the fact and said: Gee, we could use the data you generated for a nice journal paper. The article itself suggests as much:

“Which content is shown or omitted in the News Feed is determined via a ranking algorithm that Facebook continually develops and tests in the interest of showing viewers the content they will find most relevant and engaging. One such test is reported in this study: A test of whether posts with emotional content are more engaging.”

In sum, Cornell says all was kosher because, look, Facebook obtained the data: We never “engaged” in any research here, we just analysed and wrote about “existing data”!

Given that the Cornell researchers “designed [the] research” (which happens before you run a study — one would hope) and engaged in “initial discussions” with the Facebook author who ran it, this is not just disingenuous: If it became common practice, it would tear a massive hole in the ethical checks and bounds on human subject research.

The Buddy Shortcut

University and industry researchers collaborate all the time, and researchers regularly switch camps from academic halls to industry labs and back. Companies like Google, Microsoft, or Facebook all have research departments that sponsor and present at academic conferences, complete with recruitment booths to hire PhDs and post-docs fresh from the mint.

If the logic of Cornell’s defence were to gain a foothold, university researchers could, at such an event, “just have an idea for a study” over a beer with their industry friends — and three months later magically find an email from their industry pal telling them that, “what a coincidence”, the data s/he gathered provides just the right material for the study the university colleague had in mind: shouldn’t we write a paper about this together? Whenever there is a topic that interests both company and university researchers, the latter could conveniently circumnavigate all those pesky IRB and informed consent processes by handing the “data obtaining” part over to their less regulated counterparts.

I fully agree with danah boyd that IRBs and informed consent do not equal ethical deliberation: they are often treated as a formality to protect universities from legal liability, and researchers often try to game them to do what they want to do anyhow. The Facebook study highlights one (new?) way of gaming the system and thus only illustrates, as boyd holds, that “We need ethics to not just be tacked on, but to be an integral part of how everyone thinks about what they study, build, and do.”

Update (July 4): The Corporate Vacuum

Let’s shift focus from the scientists to the Facebook employee. The editor-in-chief of the journal that published the study, the Proceedings of the National Academy of Sciences of the United States of America (PNAS), just released an “Editorial Expression of Concern” that will be printed in the same issue as the study itself. It states:

Obtaining informed consent and allowing participants to opt out are best practices in most instances under the US Department of Health and Human Services Policy for the Protection of Human Research Subjects (the “Common Rule”). Adherence to the Common Rule is PNAS policy, but as a private company Facebook was under no obligation to conform to the provisions of the Common Rule when it collected the data used by the authors, and the Common Rule does not preclude their use of the data. Based on the information provided by the authors, PNAS editors deemed it appropriate to publish the paper. It is nevertheless a matter of concern that the collection of the data by Facebook may have involved practices that were not fully consistent with the principles of obtaining informed consent and allowing participants to opt out.

This statement is not fully correct: The PNAS “Editorial Policy” sets out its own demands for studies published in it, including “Research involving Human and Animal Participants and Clinical Trials must have been approved by the author’s institutional review board” and “For experiments involving human participants, authors must also include a statement confirming that informed consent was obtained from all participants.” These are similar to the Common Rule, but don’t demand adherence to it. Why is that important? Because a Facebook employee — Adam Kramer — was a co-author, and indeed the lead author of the paper. According to the article, he “performed research”: he did not pick up “existing data” somehow “lying around” at Facebook.

The logic of the Cornell press statement allows scientists to publish on Common Rule-exempt data as long as they pay formal lip service to it being “existing”. The PNAS logic is even more troublesome: Because Facebook is a private entity not beholden to the Common Rule, it is apparently also not beholden to the editorial policies of scientific journals if its employees wish to publish in them. Imagine, for the sake of argument, that the study had been single-authored by the Facebook employee. According to PNAS, this may be “a matter of concern”, but it would be permissible for him/her to run the study and publish its results in PNAS, all without review and approval by an ethics board, and without informed consent. (At most, Facebook author A would need a university researcher sidekick B to later exempt A’s data collection as B’s secondary use — or maybe find a Facebook employee C who folds A’s research study into C’s ‘business as usual’ A/B test, so that A could use it as “existing data”. The Common Rule and journal policies typically require IRB approval from all involved institutions/researchers precisely to prevent this kind of passing the buck.)

In Sum

As I’ve argued previously, this is just one outgrowth of digital networked technology enabling new entrants to engage in activities that were previously confined to a select few — like large-scale human subject research. One major issue is that these new entrants are typically not subject to existing laws and regulations (no Common Rule for Facebook), and fervently resist being subjected to them (see Uber, AirBnB, etc.). That is, they claim that somehow, “because new technology, old rules don’t apply.” And as the Facebook emotion study shows, not only does that create an unregulated Wild West for new actors: It also seduces old actors into weaseling out of the laws and regulations that do bind them.

