The Facebook “trending” controversy is really about the perception of objectivity, argue commentators.


Much has been written about Facebook’s alleged manipulation of its influential “trending” news section. According to the allegations, that manipulation took two forms. First, Facebook contractors writing headlines for the “trending” section might “inject” stories into the section — even if few Facebook users were discussing them — if they felt that doing so might better represent the day’s overall news (a Black Lives Matter protest, for example). Second, and of greater concern, was that the staff allegedly suppressed and censored conservative-oriented news. As a former contractor told Gizmodo: “I’d discover that CPAC or Mitt Romney or Glenn Beck or popular conservative topics wouldn’t be trending because either the curator didn’t recognize the news topic or it was like they had a bias against Ted Cruz.”

As the ACLU’s Jay Stanley writes, one “can make two complaints about Facebook” here. The first is that it shouldn’t be biased. But, as Stanley points out, “[I]n the sense that [Facebook] is technically a publisher, it has the same right as any newspaper to pick and choose what it will publish, and to be liberal, conservative, or anything else.” Essentially, the company has no legal obligation to be “balanced.” The second, more interesting complaint, as Stanley puts it, isn’t about legal obligation but about perception: Facebook “has implicitly misled its readers into believing that they are seeing an ‘objective’ measurement of mass interest in various stories when they are not.”

That concern, about perceived impartiality, is echoed by danah boyd:

What is of concern right now is not that human beings are playing a role in shaping the news — they always have — it is the veneer of objectivity provided by Facebook’s interface, the claims of neutrality enabled by the integration of algorithmic processes, and the assumption that what is prioritized reflects only the interests and actions of the users (the “public sphere”) and not those of Facebook, advertisers, or other powerful entities.

As Farhad Manjoo writes in The New York Times:

The question isn’t whether Facebook has outsize power to shape the world — of course it does, and of course you should worry about that power … The biggest worry is that Facebook doesn’t seem to recognize its own power, and doesn’t think of itself as a news organization with a well-developed sense of institutional ethics and responsibility, or even a potential for bias. Neither does its audience, which might believe that Facebook is immune to bias because it is run by computers.