One of Facebook’s many social experiments: emotional manipulation

Kinny Cheng · Published in That Is #SoMe · 3 min read · Apr 4, 2015

Originally published on 12 September 2014

A post by Vindu Goel of the New York Times, back in July (but one that still holds much relevance to the issues at hand):

The social network is facing potential investigations after it disclosed last week that it deliberately manipulated the emotional content of the news feeds by changing the posts displayed to nearly 700,000 users to see if emotions were contagious.

Outrageous?

The company did not seek explicit permission from the affected people — roughly one out of every 2,500 users of the social network at the time of the experiment — and some critics have suggested that the research violated its terms of service with its customers. Facebook has said that customers gave blanket permission for research as a condition of using the service.

“The company did not seek explicit permission” because “customers gave blanket permission”?

Yes, Facebook, I suppose this washes those filthy hands of yours of any general legal responsibility, and probably the ethical kind too, given your continued stance on the matter of privacy.

How about applying some common courtesy and asking for permission again, even if you believe it was already given?

To be fair, Facebook didn’t change the actual content; it changed the types of content, based on their emotional tone, positive or negative, shown in a given user’s timeline. In other words, it gave its algorithmic filtering methods more than just a bit of play:

In the study, which lasted one week in January 2012, Facebook changed the number of positive and negative posts that some users saw in their feeds to gauge how emotions can affect social media.
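Facebook has never published how its experiment classified or withheld posts, so here is a minimal, purely hypothetical sketch of what sentiment-based feed filtering of this kind could look like. The Post structure, the word-list score_sentiment() heuristic, and the filter_feed() function with its suppress and rate parameters are all illustrative assumptions, not Facebook’s actual code:

```python
# Hypothetical sketch of sentiment-based feed filtering. Everything
# here (post structure, sentiment heuristic, filter knobs) is an
# illustrative assumption; Facebook's real method is not disclosed.
from dataclasses import dataclass
import random

POSITIVE_WORDS = {"great", "happy", "love", "wonderful"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible"}

@dataclass
class Post:
    author: str
    text: str

def score_sentiment(post: Post) -> int:
    """Crude word-count sentiment: >0 is positive, <0 is negative."""
    words = post.text.lower().split()
    return (sum(w in POSITIVE_WORDS for w in words)
            - sum(w in NEGATIVE_WORDS for w in words))

def filter_feed(posts: list[Post], suppress: str, rate: float) -> list[Post]:
    """Withhold a fraction `rate` of posts whose sentiment matches
    `suppress` ('positive' or 'negative'). No post text is altered;
    only the emotional mix the user sees changes."""
    kept = []
    for post in posts:
        score = score_sentiment(post)
        is_target = (score > 0) if suppress == "positive" else (score < 0)
        if is_target and random.random() < rate:
            continue  # quietly drop this post from the feed
        kept.append(post)
    return kept

feed = [Post("a", "What a wonderful day, I love it"),
        Post("b", "This is awful, I hate everything"),
        Post("c", "Meeting at noon")]
# Show one user a feed with roughly 90% of negative posts withheld:
print(filter_feed(feed, suppress="negative", rate=0.9))
```

The point the sketch makes is the one that matters here: not a single word of any post is rewritten, yet the emotional mix a user sees hangs entirely on a single suppress parameter.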

While Facebook may never have changed the actual content posted by users, the way that content is presented can matter just as much to how it ends up being interpreted.

Algorithmic filtering, by nature, is more of a hindrance than a benefit to the typical social-network user. It encourages the selective consumption of information, meaning that not everything posted is shown or seen. Humans naturally have their own means of filtering out unwanted information; layering another filter on top only breeds an even greater level of ignorance among users.

By choosing, or filtering, which emotions to present to a user, Facebook misrepresents what was actually posted, and that can only lead to misinterpretation of the conveyed message as a whole, in both its informational and emotional aspects. So what gives Facebook the right to intervene and muck around with how its users choose to express themselves, a basic human right?

Is it really so difficult to understand that the human psyche works much the same online as it does offline? Differing personalities aside, the basic building blocks of human socialisation are virtually identical for any given individual. It doesn’t take a rocket scientist to read between the lines, nor is social media a totally new phenomenon (duh!).

Of course, Facebook is always treading on thin ice, especially where its ongoing, reactive stance on maintaining user privacy (or the lack thereof) is concerned.

And how about these statements from the people responsible:

Richard Allan, Facebook’s director of policy in Europe, said that it was clear that people had been upset by the study.

“We want to do better in the future and are improving our process based on this feedback,” he said in a statement. “The study was done with appropriate protections for people’s information, and we are happy to answer any questions regulators may have.”

Possible interpretation: Mr. Allan feels they did not do a good enough job (of being stealthy) because the study upset a lot of users, which it should probably never have…

“We clearly communicated really badly about this and that we really regret,” Ms. Sandberg said in the NDTV interview. “We do research in an ongoing way, in a very privacy protective way, to improve our services and this was done with that goal.”

Possible interpretation: Ms. Sandberg regrets her current situation (of having to express regret), but not any of the actions behind a study conducted by illicit means. We will continue our preferred ways of getting inside your head…

If you still feel comfortable with Facebook playing with your head like this, then all I can say is…

“Fine”.

Kinny tweets aviation, social media and technology on Twitter.
