Facebook crosses the line into mind control territory

Enrique Dans

Jun 29, 2014

An article published in the Proceedings of the National Academy of Sciences (PNAS), one of the world's most prestigious peer-reviewed journals, entitled "Experimental evidence of massive-scale emotional contagion through social networks", reveals that at least one member of Facebook's Core Data Science Team, Adam D. I. Kramer, took part in a highly disturbing experiment using Facebook users' walls. The study suggests that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.

Here's what happened. Over the course of a week in January 2012, some 700,000 Facebook users were subjected to a psychological experiment without their consent or knowledge. Updates posted to their walls by friends were intentionally filtered to show either updates in which positive words or connotations predominated, or the opposite. After a week, the researchers evaluated the extent to which these users were then more likely to share updates that were overwhelmingly positive or negative.

The experiment is simple: it establishes the degree of influence that information shared in users' social surroundings exercises over their mood. As the article explains:

Emotional states can be transferred to others via emotional contagion, leading them to experience the same emotions as those around them (…) This research demonstrated that (i) emotional contagion occurs via text-based computer-mediated communication; (ii) contagion of psychological and physiological qualities has been suggested based on correlational data for social networks generally; and (iii) people’s emotional expressions on Facebook predict friends’ emotional expressions, even days later.

But as the editor of PNAS subsequently explained, she found the experiment deeply troubling: it amounted to programming people's moods. In other words, 700,000 people were manipulated over the course of a week to make them happier or unhappier. Some of those affected might even be able to trace what happened on their Facebook page that week and conclude that their mood was indeed conditioned by Facebook.

It is easy to imagine the potential of this ability to control moods: influencing, for example, our willingness to buy certain products or services. When might we expect to read a report showing that plastering somebody's wall with stories about their friends' trips abroad prompted them to book a holiday? Why not manipulate our states of mind in the run-up to elections so as to influence how we vote?

In short, by experimenting on its own clients, Facebook has clearly crossed a line here. All companies that have data on their customers analyze it and often tell the world about it. Analyzing data that has been produced spontaneously or generated by external factors is unlikely to bother anyone, for example establishing whether users might have shared a good or a bad piece of news, or even more personal information. If this experiment had been based on data from users whose walls had been genuinely updated positively or negatively, I wouldn’t have had any concerns.

But it is the exploitative aspect to the experiment that is so deeply troubling, and that makes Facebook now look like some sinister thought control operation.

In light of this story, Facebook deserves to lose every one of its users. It is completely unacceptable that people using a platform to share their personal lives with others should suddenly, and without consent, be subjected to such a highly manipulative experiment.

The entire chain of command that approved this experiment, worthy of Dr. Josef Mengele, should be sacked, and at the same time a rigorous internal audit carried out to discover whether other such experiments have been undertaken.

If this is how somebody like myself reacts, a scientist who obviously supports research and who on the whole has a positive opinion of Facebook and other companies using innovative tools, then I cannot imagine the indignation that those with a more skeptical or pessimistic view of the world might be feeling.

Facebook's experiment means that from now on, social network users will be wary of being manipulated, or of being used as guinea pigs in some brainwashing experiment. What has happened is completely unacceptable, regardless of its supposed legality under the company's terms of service. Whoever authorized this has no values worth speaking of, nor any understanding of ethics.

(In Spanish, here)

Enrique Dans

Professor of Innovation at IE Business School and blogger (in English here and in Spanish at enriquedans.com)