Illegal, Immoral, and Mood-Altering

James Grimmelmann
Sep 23, 2014

How Facebook and OkCupid Broke the Law When They Experimented on Users

This summer, Facebook and OkCupid revealed that they run behavioral experiments on users. First, in June, a Facebook researcher published a study reporting that when Facebook showed users News Feeds with fewer emotionally positive posts, their own posts became sadder. Then, in July, OkCupid announced that it “took pairs of bad matches (actual 30% match) and told them they were exceptionally good for each other (displaying a 90% match.)”

Most of the resulting discussion has treated this as a story about ethics. Which it is — and the lapses of ethical judgment shown by Facebook and OkCupid are scandalous. But the ethics are only half of the story. What Facebook and OkCupid did wasn’t just unethical. It was illegal. A common assumption is that even if research laws ought to apply to private companies, they don’t. But that assumption is false. Facebook and OkCupid are bound by research laws, and those research laws quite clearly prohibit what they did.

A Little Unavoidable Legal Background

Federal law — primarily the so-called “Common Rule” — regulates research on people in the United States. The details are complicated, the gist simple. If you engage in “research involving human subjects,” you must have two pieces of paper before you start. You need a signed informed consent form from the person you’re experimenting on, and you need approval from an IRB (short for “institutional review board”).

Neither informed consent nor IRB approval bears much resemblance to how Facebook and OkCupid operate. Informed consent requires much more than having boilerplate terms and conditions that everyone clicks through and no one reads — a process even OkCupid CEO Christian Rudder has admitted provides nothing more than “the charade of consent.” Informed consent under the Common Rule means telling participants about the research. It means warning them about the risks. It means giving them a chance to opt out without penalizing them if they do. It means giving them a chance to ask follow-up questions of someone who’ll provide answers.

Then there is IRB approval. An IRB isn’t allowed to approve a project unless it ensures that the research is appropriately safe, that the participants will give genuinely informed consent, that the researchers will protect participants’ privacy, and so on. That’s a far cry from letting researchers “run almost any test they wanted, so long as it didn’t annoy users,” as happened at Facebook. And under the Common Rule, an IRB needs to have a diverse membership: men and women, scientists and non-scientists, insiders and outsiders. It can’t just consist of Christian Rudder approving his own projects, as happened at OkCupid.

Both informed consent and IRB approval are complex and subtle requirements. The Common Rule has an extensive list of exceptions, rules for modifying informed consent, and procedures for fast-tracking IRB review. But all of these are ways to make informed consent and IRB approval work smoothly for institutions that are already committed to ethical research, not get-out-of-jail-free cards for institutions that want to ignore the Common Rule entirely. For example, weakening the informed consent protocol is something that only an IRB can approve; a researcher can’t just decide on her own that getting consent would be too much of a bother.

Let me repeat. The Common Rule is law. If you are subject to it, it is not up to you to decide whether all of its requirements are convenient for you. Consent is not a choice; IRB approval is not optional. It’s on you to carry out your research in a way that complies with the law.

The State(s) of Research Law and Ethics

You may at this point be raising an objection. I thought the Common Rule only applied to federally funded research. You’re right, it does. And I thought Facebook and OkCupid are private companies. Right again, they are. But that is hardly the end of the story.

For one thing, many academic journals require Common Rule compliance for everything they publish, regardless of funding source. So my colleague Leslie Meltzer Henry and I wrote a letter to the journal that published the Facebook emotional manipulation study, pointing out the obvious noncompliance. For another, nothing in Facebook’s user agreement warned users they were signing up to be test subjects. So we wrote a second letter to the Federal Trade Commission, which tends to get upset when companies’ privacy policies misrepresent things. And for yet another, researchers from universities that do take federal funding can’t just escape their own Common Rule obligations by “IRB laundering” everything through a private company. So we wrote a third letter to the federal research ethics office about the Cornell IRB’s questionable review of two Cornell researchers’ collaborations with Facebook.

And there’s something else, something we didn’t mention at the time. Federal law isn’t the only game in town. States have human-subjects research laws, too. Those state laws go further, sometimes much further, than federal law. California has one. So does New York. Leslie and I work at the University of Maryland, so we took a close look at what Maryland had to say on the subject. A very close look.

Maryland: We have good values, good laws, and a good flag.

House Bill 917 passed our General Assembly in 2002, by votes of 135–1 and 47–0. It was spurred by not one but two high-profile research ethics scandals in Maryland, one in which a volunteer died after receiving a dose of a non-FDA approved drug and another in which landlords rented apartments containing lead paint to families with small children. And what it does is elegantly simple: it closes the private-research gap in the federal Common Rule, turning it into a seamless system of protection for research participants, regardless of who pays for the research. In Maryland, all research must comply with the Common Rule, not just federally funded research. If you do research on people in Maryland, you need informed consent and IRB approval. End of story. What’s more, it puts serious teeth in the federal law. If you violate House Bill 917, the Maryland Attorney General can go to court to stop your research.

But wait, you may be saying, Facebook isn’t in Maryland, and neither is OkCupid. True. But they have users in Maryland, and given the size of the experimental groups, it’s overwhelmingly likely that they experimented on residents of the state. Facebook manipulated hundreds of thousands of News Feeds; that’s thousands of Marylanders. OkCupid gave bad recommendations to about five hundred users. Even under conservative assumptions, the odds that OkCupid managed to avoid Maryland entirely are 100:1 against.
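That back-of-the-envelope claim is easy to check. A minimal sketch, assuming (purely for illustration) that OkCupid’s participants were drawn roughly in proportion to the U.S. population, of which Maryland made up about 1.9 percent:

```python
# Illustrative arithmetic only: the population share is an assumption,
# not a claim about OkCupid's actual user base.
md_share = 0.019  # Maryland's approximate share of the U.S. population
n = 500           # roughly the size of OkCupid's mismatch experiment

p_no_marylanders = (1 - md_share) ** n
print(f"P(no Marylanders): {p_no_marylanders:.6f}")    # about 0.000068
print(f"Odds against: {1 / p_no_marylanders:,.0f}:1")  # about 14,600:1
```

Under that assumption, the true odds are closer to 14,000:1 against. Even if Marylanders were heavily underrepresented among OkCupid’s users, 100:1 is a very conservative floor.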

Oh, and one more thing. House Bill 917 also requires that every IRB make the minutes of its meetings available for public inspection. Leslie and I decided to exercise our rights as interested citizens. We sent letters to Facebook and to OkCupid reminding them about Maryland’s research laws and demanding the minutes of their IRBs’ meetings. What happened next will shock you.

Facebook: We Are Above the Law

Our Facebook letter drew a response from Edward Palmieri, a Facebook Associate General Counsel for Privacy. The letter is worth a read, if only for its head-spinning cognitive dissonance. Most of the letter is a detailed description of the emotional manipulation study:

We appreciate your interest in Facebook’s internal product development research, some of which, like the PNAS study referenced in your letter, has been made public through articles published in academic journals. …

The PNAS study is an example of such research. We conducted the review to evaluate claims by some scientists and the press that using Facebook, in particular seeing positive posts from friends in one’s News Feed, could trigger negative emotional reactions. We believed it was important to research this claim, and we elected to share the findings with the academic community. …

As part of the research described in the PNAS study, the News Feed algorithm for a small percentage of randomly-selected users was tweaked during a single week so that certain posts had varying chances of being deprioritized when those users viewed their News Feeds.
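Palmieri’s description is abstract, but the mechanism is easy to picture. A minimal sketch of what “varying chances of being deprioritized” could look like (the word list and function names here are hypothetical; Facebook’s actual code has never been published):

```python
import random

# Toy stand-in for the word-list sentiment check the PNAS study describes.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}

def is_positive(post: str) -> bool:
    return any(word in post.lower() for word in POSITIVE_WORDS)

def filter_feed(posts: list[str], omit_probability: float) -> list[str]:
    """Drop each emotionally positive post with some probability, skewing
    the feed away from positive content for the selected user."""
    return [p for p in posts
            if not (is_positive(p) and random.random() < omit_probability)]
```

The point of the sketch is only that the intervention Palmieri describes is a deliberate, randomized manipulation of what users saw.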

To be honest, we were expecting some nit-picking lawyerly trench warfare. But no. Palmieri’s letter openly and repeatedly acknowledges that Facebook does “research” on its users, and describes that research in loving detail. Under the Common Rule — and thus under House Bill 917, which borrows the Common Rule’s definitions — “research” is “a systematic investigation … designed to develop or contribute to generalizable knowledge” and a “human subject” is a “living individual about whom an investigator (whether professional or student) conducting research obtains data through intervention or interaction with the individual.” That’s exactly what Facebook did, and it’s exactly what Palmieri’s own letter explains that Facebook did.

Now for the whiplash. After a page spent agreeing that Facebook does human subjects research of precisely the sort that the Common Rule and House Bill 917 regulate, the letter abruptly denies anything of the sort.

The federal Common Rule and the Maryland law you cite were not designed to address research conducted under these circumstances and none of the authorities you cite indicates otherwise.

That’s it. That’s Facebook’s entire response to our letter pointing out that it’s legally required to get informed consent and IRB approval before running experiments on people. The claim, quite literally, is that Facebook is above the law that applies to everyone else. Lawyers have a word for this kind of argument: “conclusory.” All it does is set out the conclusion that the author would like to be true. It offers no evidence, no reasoning, just the hope that if you say something confidently enough, everyone else will nod along.

Unfortunately for Facebook, the argument that Maryland’s research ethics law wasn’t “designed to address” Facebook’s research is laughably wrong. House Bill 917 couldn’t be clearer. It says, “A person may not conduct research using a human subject unless the person conducts the research in accordance with the federal regulations on the protection of human subjects.” There you have it. No qualifications, no exception if your name is Mark Zuckerberg.

That leaves some kind of vague purpose-based argument that behavioral experiments like Facebook’s are somehow exempt because they’re different in kind from typical biomedical experiments. That argument might have flown decades ago, before the Common Rule’s drafting, but it’s a non-starter today. Anyone who has spent time in a social science department, or anyone who has spent time with people who have, knows that behavioral experiments are bread and butter for informed consent and IRB approval. The university researchers who worked with Facebook on other behavioral experiments took their projects to university IRBs: not the kind of thing you do if you think the Common Rule is irrelevant to your research. Indeed, the Cornell IRB concluded that the emotional manipulation study itself was “research … conducted independently by Facebook,” not that it wasn’t Common Rule-style research at all. So it makes no sense to say that these laws “were not designed to address research conducted under these circumstances.” They were.

If you’re interested in further legal details, Leslie and I set them out in a letter to Doug Gansler, Maryland’s Attorney General. But there really isn’t much more to it: the Common Rule explicitly allows the states to supplement it, House Bill 917 uses the federal definitions to extend the Common Rule to private research, and all of this was extensively documented at the time. Should Attorney General Gansler take an interest, I wonder whether Facebook will give him the same dismissive treatment it gave us.

OkCupid: Ethically Rudderless

At least Facebook wrote back. OkCupid simply ignored our letter. We’ve had to settle for the next best thing. Even if OkCupid wouldn’t talk to us, its CEO, Christian Rudder, enjoys talking to the press. He was particularly voluble in an interview with On the Media’s Alex Goldman and P.J. Vogt for their TLDR podcast. Goldman and Vogt were dogged interviewers. In the course of their fifteen-minute conversation, they got Rudder to discuss all of the important ethical issues—and Rudder managed to get every single one wrong.


The centerpiece of Rudder’s defense was that it would be unethical not to run experiments on users.

CR: I think part of what’s confusing people about this experiment is the result. The algorithm does kind of work, y’know and power of suggestion is also there. But like, what if it had gone the other way? What if our algorithm was far worse than random? Then if we hadn’t had run that experiment we basically are doing something terrible to all the users. Like this is the only way to find this stuff out, if you guys have an alternative to the scientific method I’m all ears.

The argument overall is specious, but Rudder is right about the science. A cancer drug may not work for everyone in a clinical trial; it might hurt or even kill some of them. But the trial is good for society as a whole, because future patients will receive better care.

But here’s the thing. The central pillar of modern research ethics is that in most cases researchers don’t get to decide for themselves whether an experiment is worth it. Unless the risks are minimal or nonexistent, that decision belongs to the participants, not to the researchers. That’s what the Common Rule does: it systematically takes these decisions out of researchers’ hands, and gives them to participants and IRBs.

The question, then, is not whether OkCupid is allowed to test its algorithms, or whether oncologists are allowed to test cancer drugs. Of course they are. Of course they should. The question is whether these tests will be carried out on willing participants or on unwilling victims. In his TLDR interview, Christian Rudder took three swings at the real issue: informed consent.

First, Rudder claimed that informed consent would have biased the results:

AG: Was there any consideration given to an opt-in procedure where people could, beforehand, be part of it and then just having a control group?

CR: No, there wasn’t. Once people know that they’re being studied along a particular axis, inevitably they’re gonna act differently. Just the same way that people on reality TV don’t act like themselves.

Unfortunately for OkCupid, “gonna act differently” is not the legal standard for waiving informed consent, because if it were, informed consent would never be viable. Telling people things changes their behavior. A little bias in research results is simply part of the price we pay for informed consent.

No, the Common Rule’s standard for waiving informed consent is much higher: that the “research could not practicably be carried out without the waiver.” But of course OkCupid’s matchmaking experiment could have been. All you need is an experimental group of users whose reported match percentages are tweaked and who gave informed consent, and a control group of users whose percentages aren’t tweaked and who gave the same informed consent. Comparing the two isolates the effect of tweaking the percentages.
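To make that concrete, here is a minimal sketch of what such a consent-respecting design could look like (the names are hypothetical; this is one possible protocol, not OkCupid’s actual code):

```python
import random
from dataclasses import dataclass

@dataclass
class User:
    name: str
    gave_informed_consent: bool

def assign_groups(users: list[User]) -> tuple[list[User], list[User]]:
    """Randomize only consenting users into a treatment group (shown
    tweaked match percentages) and a control group (shown true ones).
    Hypothetical sketch of the design described above."""
    consented = [u for u in users if u.gave_informed_consent]
    random.shuffle(consented)
    mid = len(consented) // 2
    return consented[:mid], consented[mid:]  # (treatment, control)
```

Because both groups consented on identical terms, any difference in their behavior reflects the tweaked percentages, not the consent process itself.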

Next, Rudder argued that “informed” consent doesn’t do any good anyway:

[CR]: Like I was in some psych experiments when I was in college, just ‘cause they give you twenty bucks to go to the department and you, y’know, you sign a form. But that is informed consent — which users can’t see but I’m putting in quotes — and you uh, y’know you sit down and you hit a button when some word blinks on the screen or a dot appears and you like move a lever or whatever, and you have no idea what they’re measuring you for. Y’know they don’t tell you anything, they could just be measuring whether you’re obeying their instructions or how you greeted the person of another race at the very beginning of the whole thing and the experiment is just a sham. So like, you’re not really informed. …

It’s not a coincidence that Rudder’s hand-picked example is an experiment with the same central characteristic as the OkCupid mismatch experiment: deception. But there is a crucial difference: an IRB could find that this hypothetical experiment would qualify for the Common Rule’s exception to informed consent. (It would have to be an IRB: a waiver of informed consent isn’t the same as an exemption from IRB review.) The reason “they don’t tell you anything” if they’re testing “how you greeted the person of another race at the very beginning” is that if they did tell you, you’d greet that person the way you thought the researchers wanted you to. In other words, Rudder has singled out a rare case in which informed consent is excused and mistakenly concluded that in the typical case informed consent is useless.

Finally, Rudder flirted with a claim that using a website is itself a kind of implicit consent to be experimented on. In the original blog post, he wrote “But guess what, everybody: if you use the Internet, you’re the subject of hundreds of experiments at any given time, on every site. That’s how websites work.” On TLDR, he expanded on the theme:

CR: Y’know, here’s a way to think about it. Like I think, let’s talk about advertising for a second, ‘cause I think that’s a…Y’know, look, people have become about advertising, have become a lot more savvy than the 1950s when you see a doctor smoking a cigarette in an ad and you’re like “Oh man I totally believe that these cigarettes are good for you.” And I think that, we’re at the beginning of that process right now with these kinds of experiments, is that in 20 years I think everyone will be like “Oh yeah they’re just running some experiment” and they’ll take all of this stuff with a little bit more of a grain of salt, y’know? If you think that OkCupid has unlocked the mysteries of love and has an ironclad algorithm, prophetically can tell you exactly who is right for you, you’re a crazy. Y’know? So like, we’re doing our best, for sure, and it’s the same thing. Like I think people will realize that that’s how these sites work, that’s how they evolve, they’re doing the best job that they can, and they also have their own interests as well. And, and maybe that’s the process that we’re looking at. And that’s the kind of, again the kind of conversation that I think Facebook on accident, and OkCupid on purpose is trying to kickstart.

Three things. First, Rudder couldn’t bring himself to say that the public today knows how websites run experiments on users, only that it might someday learn. Second, a big reason people have become more savvy about misleading advertising is that the government has become more aggressive in policing it. If we want the same level of public savviness about website experiments, a good place to start would be more FTC oversight. And third, there’s an even better way to have a “conversation” with users about the fact that you run experiments on them: tell them about it beforehand and get their permission. The only reason not to ask is that you’re afraid they might say “no.”

Even if he is reluctant to admit it, Christian Rudder knows that the match-making experiment crossed a crucial ethical line. When the experiment was over, OkCupid sent emails to all the unwitting participants telling them of their correct match percentage. That’s not something you’d do if you really thought the initial lie was harmless or that users wouldn’t care. Notice after the fact is no substitute for informed consent up front — but it concedes the point that the experiment was something ethically different from the day-to-day operations of the site. You don’t write to users to tell them you tested a new font.

Rudder is a clever amateur researcher. Unfortunately, he is also an amateur ethicist.

AG: Have you thought about bringing in, say, like an ethicist to, to vet your experiments?

CR: To wring his hands all day for a hundred thousand dollars a year?

AG: Well, y’know, you could pay him, y’know, on a case by case basis, maybe not a hundred thousand a year.

CR: Sure, yeah, I was making a joke. No we have not thought about that.

I’m not sure which is sadder: that Rudder’s immediate reaction to the idea of ethical review was to mock it, or that he had never thought about it before.

A Culture of Contempt

OkCupid used this image to announce the blog post about its experiments on users.

Facebook and OkCupid’s responses have something in common: contempt. Contempt for users, contempt for ethics, contempt for the law. These companies felt so secure in their entitlement to dissect users however they wanted that it was incomprehensible to them that someone might object. It was doubly incomprehensible that the law might take users’ side.

Many of Facebook’s and OkCupid’s defenders have argued, in one fashion or another, that what they did is nothing special. All tech companies run experiments. All tech companies manipulate their users. All tech companies hide behind vague and inscrutable terms of service. These are reasons to be more concerned, not less.

We do need to have a conversation about social science research on users in the age of Big Data. But that conversation should start from the position that how companies treat their users is a subject on which society has already spoken. If we do not start there, we will never get there. Yes, too many companies act within a culture of disdain towards users. But this is a setting in which the law tells them, no, users are people too, and you must treat people with dignity and respect.

We aren’t saying that companies shouldn’t do experiments. We’re just saying that when companies do experiments, they must do them ethically. Informed consent and IRB review aren’t just the right thing. They’re the law.

Addendum (September 25)

Many people have asked whether all website A/B testing now requires informed consent and IRB review. The answer is “no,” and I should have said more about why. Better late than never.

There is a line. Courts, agencies, scholars, and IRBs have spent decades drawing the line that distinguishes regulated “research” from things like doctors’ routine clinical practices, hospitals’ internal quality improvement processes, and historians’ studies of recent events. The line is controversial in places, but no one doubts that it exists; the Common Rule has not stopped doctors, hospitals, and universities from carrying out most of what they do outside of the Common Rule’s IRB framework. So too for websites.

The line has a solid legal basis. The Common Rule defines “research” as “a systematic investigation … designed to develop or contribute to generalizable knowledge,” which excludes work tied to particular settings without larger lessons. It also has numerous exemptions: for research on classroom teaching, for research using surveys, and for research involving only pre-existing, publicly available data, among others. And even for research under the Common Rule, there are procedures to waive informed consent and expedite IRB review in appropriate circumstances.

Facebook was on the wrong side of the line. Facebook retroactively tried to present the emotional manipulation study as “internal product development research.” But it was designed by a team including university researchers, was intended to replicate or refute a published academic study, made broad claims about human behavior that would apply well beyond Facebook, and was itself published in a scientific journal. This was not “internal” research or a secondary analysis of existing data. It had a research purpose from the start, and was indistinguishable in all important respects from standard university research.

OkCupid was on the wrong side of the line. The use of deception is a strong signal that this was not just a test of OkCupid’s website or algorithm: it was an investigation of social and cognitive biases in romantic relationships. (“[D]oes the mere suggestion cause people to actually like each other? As far as we can measure, yes, it does.”) Just like the emotional manipulation researchers, OkCupid published the results for public discussion. OkCupid itself situated the experiment as part of the social process of science: the assembly of consensus knowledge about the world through the design of replicable experiments and the public sharing of results.

“Is A/B testing research?” is a category error. A/B testing is a way of doing things, some of which are regulated “research” and most of which aren’t. (I should have been clearer about this, and about the fact that the same is true of “experiments.”) What Internet companies do is not automatically exempt from research laws just because it takes place online or in the private sector.
