The Difference Between Truth and Lies Is Roughly 94,985 Shares

Facebook's fake news problem, explained in one screenshot.

“Words had to change their ordinary meaning and to take that which was now given them. Reckless audacity came to be considered the courage of a loyal ally; prudent hesitation, specious cowardice; moderation was held to be a cloak for unmanliness; ability to see all sides of a question, inaptness to act on any. Frantic violence became the attribute of manliness; cautious plotting, a justifiable means of self-defense. The advocate of extreme measures was always trustworthy; his opponent a man to be suspected.” Thucydides: III 69–85

Here is a screenshot I took on October 31st, a week before the election. The main news story, as it appeared in my feed, was from the New Yorker. But when I clicked on it, a related-articles tab opened underneath (this is a common way for Facebook to suggest relevant news stories). The articles Facebook suggested I read were from two fake news sites, which touted outrageous (and highly viral) headlines about Obama’s removal for treason, his impeachment, and the Republican ‘fight for justice’.

By placing these articles alongside the New Yorker, Facebook was implicitly giving them a kind of endorsement: they appear to be on the same playing field.

It is highly telling that the New Yorker article got 1 share while the worldpoliticus.com article got 94,986 shares. Which do you think had more influence?

Traditional media sources have been calling out Facebook for playing a role in Trump’s victory over Hillary Clinton by spreading false news stories on the social network. My inclination is to be skeptical of such claims — venerable news outlets like the Washington Post and the New York Times have a vested interest in hyping these claims since the fake sites are their direct competitors.

Yet where the fake sites excel is in getting their message spread — stories are shared by millions of people ready to BELIEVE and BE ANGRY at the content of these articles. Of course, not being constrained by the truth means that such sites are free to engineer the headlines and content of their stories so that they are exactly (and scientifically) the type of content people want to share.

Facebook CEO Mark Zuckerberg’s response has been to deny that Facebook had a role in shaping the outcome of the presidential election. It is a two-fold claim: first, that people are not going to be affected by what they read on Facebook to the extent that they will change their vote; second, that Facebook’s news feed is an algorithmic and neutral curation process and, as such, Facebook has no editorial responsibility.

Both these claims are dubious. The billions of dollars in Facebook ad revenue (including political campaign spending) are a strong indication that what people see in their news feed can guide them towards taking an action desired by the advertiser. There is ample evidence that Facebook advertising works. For Zuckerberg to effectively claim it does not is not only disingenuous; it would also, if true, undermine his own business model.

The second claim, that Facebook is a neutral platform, has not been as widely challenged. If anything, people have argued that Facebook should start taking responsibility for what is shown in its feed, a claim that implicitly accepts the notion that the platform is currently neutral. The firing of Facebook’s Trending Stories editorial team (which has been replaced with an algorithm) is seen as evidence that Facebook has indeed been striving towards objectivity.

This argument is false, however, in that it erroneously equates ‘algorithmic’ with ‘neutral’. In fact, if an algorithm is designed (or, in the case of machine learning, allowed to evolve) in such a way that it produces a highly one-sided outcome, and there is no attempt to address this one-sidedness, the results can in no way be described as neutral. And Facebook’s algorithm has become just that: it prizes the shareability of content, how viral it can get, and how many people will engage with it, over and above any other factors (like its truthfulness).
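To make the point concrete, here is a toy sketch of what an engagement-only ranking looks like. This is not Facebook's actual code; the field names, numbers, and scoring rule are invented for illustration. It simply shows that if the score is built from engagement signals alone, the most shared story wins regardless of whether it is true.

```python
# Toy illustration only: a purely engagement-driven feed ranking.
# Field names and values are invented; Facebook's real ranking system
# is far more complex and not public.

stories = [
    {"title": "New Yorker investigation", "shares": 1, "truthful": True},
    {"title": "Obama removed for treason (fabricated)", "shares": 94986, "truthful": False},
]

def engagement_score(story):
    # Nothing in this score rewards, or even looks at, truthfulness.
    return story["shares"]

feed = sorted(stories, key=engagement_score, reverse=True)

for story in feed:
    print(f'{story["title"]}: {story["shares"]} shares')
# The fabricated story ranks first simply because it is shared more:
# an "algorithmic" outcome, but not a neutral one.
```

The design choice the sketch highlights is the absence of any truthfulness term in the scoring function; as long as the objective is engagement alone, the one-sided outcome follows automatically.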

We are not meant to notice the subtle ways in which Facebook manipulates our emotions. The content often comes mediated as if shared by our friends, and so it seems natural that it would appear in our feed. But the algorithm is still there, controlling what we see.