Gabriella Buck
3 min read · Feb 1, 2016
[Image credit: Robertson, R. (2014, July 4). Facebook’s Emotional Manipulation Study — Should We Be Worried? http://www.unionroom.com/facebooks-emotional-manipulation-study-worried/]

The potential myth of informed consent

Psychological testing occupies a contested space in which informed consent can conflict with science’s responsibility to answer open questions. The tension only intensifies when efforts to scrutinize human behavior are aimed at commercial gain. Facebook manipulated the number of positive and negative posts that nearly 700,000 randomly selected users saw in their News Feeds to determine whether moods were transmittable via social media. Contrary to expectations, the results demonstrated that people’s emotions mirrored what they saw, evidence for what the authors termed “emotional contagion.”

Advancing knowledge of human behavior often requires the careful probing of subliminal processes, and a legitimate evaluation of such processes can be rendered invalid if individuals are cognizant of their reactions to a given stimulus. Yet any failure to fully inform subjects of an experiment weakens their ability to act as independent agents in their social sphere. The study stated that altering the News Feed was consistent with Facebook’s data use policy, to which all users agree before creating an account, and that this constituted informed consent. Informed consent as defined by the US federal policy for the protection of human subjects, however, requires an explicit explanation of the research aims and the expected duration of participation, a description of possible risks, and a statement that participation is voluntary. Max Masnick, a researcher with a doctorate in epidemiology, responded to Facebook’s study saying, “As a researcher, you don’t get an ethical free pass because a user checked a box next to a link to a website’s terms of use. The researcher is responsible for making sure all participants are properly consented.” An alternative view would hold that any individual who fails to read Facebook’s data use policy has only themselves to blame.

More disturbing still is the manipulation of users’ sentiments for the ultimate purposes of user engagement and advertising. Jacob Silverman, author of Terms of Service, a critique of social media culture, commented, “If Facebook, say, decides that filtering out negative posts helps keep people happy and clicking, there’s little reason to think that they won’t do just that.” On this view, the unknowing participants in this psychological experiment become mere pawns in a game driven by capitalist interest. Silverman confronts the same issue in his book, arguing that social media platforms like Facebook act as surveillance cameras, mining individuals’ personal data for advertising revenue. According to Silverman, the Internet was already “a vast collection of market research studies; we’re the subjects”.

By publishing this study, Facebook unintentionally exposed technology’s unforgiving tendency to rely on arguably non-consenting users as subjects for psychological tests that boost a platform’s revenue. Social media websites sidestep legal challenge to such practices through a blanket form of consent buried in the Terms and Conditions presented to users at registration. The question remains: how informed must consent be to avoid claims of ethical violation?

References

Arthur, C., & Swaine, J. (2014, June 30). Facebook faces criticism amid claims it breached ethical guidelines with study. Retrieved from http://www.theguardian.com/technology/2014/jun/30/facebook-internet

Arthur, C. (2014, June 30). Facebook emotion study breached ethical guidelines, researchers say. Retrieved from http://www.theguardian.com/technology/2014/jun/30/facebook-emotion-study-breached-ethical-guidelines-researchers-say

Booth, R. (2014, June 30). Facebook reveals news feed experiment to control emotions. Retrieved from http://www.theguardian.com/technology/2014/jun/29/facebook-users-emotions-news-feeds

Goel, V. (2014, June 29). Facebook Tinkers With Users’ Emotions in News Feed Experiment, Stirring Outcry. Retrieved from http://www.nytimes.com/2014/06/30/technology/facebook-tinkers-with-users-emotions-in-news-feed-experiment-stirring-outcry.html