Facebook’s news headache

Sarah Frier
Nov 15, 2016
Photographer: David Paul Morris/Bloomberg

I’ve spent a lot of time over the past week talking to current and former Facebook employees, trying to get a sense of how the company sees its role and responsibility in spreading information, particularly false or misleading news.

The first official statement I got from Facebook was along the lines of, “Nothing to see here! News on Facebook had no influence on the election.” (Which is ironic, given that their ad business is built on the influence of the news feed.)

And then there was the post from Mark Zuckerberg, saying that updates may come soon. He noted that the issue is a tricky one: a lot of news isn't completely wrong, just misleading or biased, and Facebook shouldn't be the arbiter of truth.

We hear you, Zuck. But what’s going on behind the scenes?

There are two seemingly divergent forces at play. Over the past couple of years, the social network's news feed has been adapted to feel more human, with algorithmic tweaks based on responses from a panel of about 1,000 real people whom Facebook surveys regularly. Meanwhile, Facebook wants its handling of news to be as unbiased and hands-off as possible. You might recall that the company fired the human editors in charge of its trending topics team after Gizmodo reported that they might have had liberal biases.

As top management holds high-level meetings to come to grips with how much influence the platform might have had, Facebook's goal is to avoid ever again appearing biased against conservatives (or any other group).

Facebookers would probably prefer to spend their time on 360° video and cool software. But here we are. What happens now? For Facebook, everything always comes back to user experience and engagement metrics, and even if some employees do feel a higher sense of journalistic duty, data will be king. As luck would have it, the numbers show that people don't like being misled when they click on links. That means Facebook will probably devote more energy to making sure its algorithm can rate whether stories are misleading or false, not just whether they're relevant.

Meanwhile, there are other key questions that have been the subject of much debate: “Was Facebook responsible for Trump’s win?” and “Is Facebook a media company?” While certainly interesting, those debates don’t get us closer to what we want to know: how Facebook will behave as a key distributor of information, as more and more people rely on it as their main source of news. Some of Zuckerberg’s own employees don’t think he’s doing enough.

And let’s not let Twitter and Google off the hook, either. We’ll be talking more about this subject in next week’s Decrypted podcast. Subscribe here!

This story originally appeared in Bloomberg Technology’s Fully Charged newsletter. Sign up to receive it in your inbox.
