IMAGE: Mipan — 123RF

Does Facebook have an editorial line?

Enrique Dans
4 min read · May 13, 2016


An article in Gizmodo called “Former Facebook workers: we routinely suppressed conservative news” cites a number of Mark Zuckerberg’s ex-colleagues discussing so-called news curation, by which stories about Facebook were kept out of trending topics, while anything with a right-wing slant was often simply eliminated. This follows staff quizzing Zuckerberg about whether the company had a duty to stop Donald Trump becoming president.

The company has responded swiftly, first denying that it suppresses news on the basis of politics and insisting that it takes accusations of bias seriously, then publishing a piece by trending topics head Tom Stocky. He says the allegations are untrue and that Facebook simply audits topics on the basis of their newsworthiness, eliminating only “junk, repeats, hoaxes, or subjects with insufficient sources”; that its algorithm doesn’t distinguish material on the basis of its ideology; that everything it does is systematically registered and checked; and that any violation of these norms would be grounds for dismissal.

Accusations of trending topics bias have long circulated on social networks. The question of algorithms that can detect and select topics is more complicated than it might appear: in general they work not on the number of mentions alone, but also on factors such as how quickly those mentions increase over time and in which parts of the world they occur. We accept the idea of an editorial line in traditional media, and indeed this is one of the main reasons we choose to read, watch, or listen to this or that outlet.

In the case of the social networks, these have always attempted to reflect as wide a range of views and interests as possible, perhaps ignoring that they still might tend toward one end of the political spectrum or the other. The flavor of the news we see in our timelines is created by algorithms according to our interests, as are trending topics. And this is precisely where the problems begin: given that these algorithms cannot be made transparent without inviting their misuse, any difference between what people see and what they expect to see, filtered through their own personal bias, looks to them like a clear case of manipulation.

In which case, are Facebook’s trending topics manipulated? It’s not easy to find out, and if it were the case, it might be due to company policy or to unofficial activity prompted by the political climate, as happens in other organizations.

It appears that although Silicon Valley companies do not discriminate on the basis of their employees’ political views, they probably do tend to attract people of a more liberal or left-wing disposition, who believe in social change, for example. This tendency could end up being reflected in processes such as news selection, if it is done by hand. In the majority of news organizations that I know of, the editorial line is not expressed directly, but is projected through other processes that create a certain climate and that end up conditioning which stories are chosen.

Facebook has been undergoing a transition from a site that started out as a place to share stuff with friends and family to one we can use to access all kinds of information, including news. In which case, accusations of bias are a serious matter.

It is obviously very tempting for Facebook to try to influence how its audience of some 1.5 billion people thinks. What’s more, we already know about its “psychological” experiment, which did it quite a lot of damage.

As the saying goes, there is no power without responsibility, and Facebook will now have to work hard to prove it isn’t using its position to editorialize the news on its timelines and trending topics if it is to quash the rumors.

Whoever controls the internet pretty much controls the world. But before we start conjuring up images of Zuckerberg sitting in his mountain hideaway stroking a cat while he removes and installs governments at will, it would be useful to know whether the anonymous sources quoted by Gizmodo are accusing Facebook of ordering its employees to be biased, whether those employees acted on their own, or whether they were simply disgruntled and trying to get their own back. We also need to work out whether the manual aspect of news selection even allows for bias. And we would need to look through its news archives one by one to see if a particular editorial line was being kept hidden.

Facebook now finds itself facing Caesar’s famous dictum about his wife: it must not only be above suspicion, but must also be seen to be above suspicion. This is surely a topic that will be trending for some time to come. It will be interesting to see how Facebook users respond, and what the traditional media have to say about it.

(In Spanish, here)

Enrique Dans

Professor of Innovation at IE Business School and blogger (in English here and in Spanish at enriquedans.com)