I’ve spent a lot of time over the past week talking to current and former Facebook employees, trying to get a sense of how the company sees its role and responsibility in spreading information, particularly false or misleading news.
The first official statement I got from Facebook was along the lines of, “Nothing to see here! News on Facebook had no influence on the election.” (Which is ironic, given that its ad business is built on the influence of the news feed.)
And then there was the post from Mark Zuckerberg, saying updates may come soon. He noted that the issue is a tricky one: a lot of news isn’t completely wrong, just misleading or biased, and Facebook shouldn’t be the arbiter of truth.
We hear you, Zuck. But what’s going on behind the scenes?
There are two seemingly divergent forces at play. Over the past couple of years, the social network’s news feed has been adapted to seem more human, with algorithmic tweaks based on responses from a panel of about 1,000 real people regularly surveyed by Facebook. Meanwhile, Facebook wants its handling of news to be as unbiased and hands-off as possible. You might recall that it fired the human editors in charge of its trending topics team after Gizmodo reported they might have a liberal bias.
Facebookers would probably prefer to spend their time on 360° video and cool software. But here we are.
As top management holds high-level meetings to come to grips with how much of an impact the platform might have had, Facebook’s goal is to avoid ever appearing biased against conservatives (or any other group) again.
What happens now? For Facebook, everything always comes back to user experience and engagement metrics, and even if some employees do feel a higher sense of journalistic duty, data will be king. As luck would have it, the numbers do show that people don’t like being misled when they click on links. That means Facebook will probably devote more energy to making sure its algorithm can rate whether stories are misleading or false, not just whether they’re relevant.
Meanwhile, there are some other key questions to address:
- How far can Facebook afford to go with its algorithm changes? As Zuckerberg said, fake news is one thing; biased news without proper context is another. While Facebook will probably fix the outright fakes rather quickly, it might be trickier to block opinion masquerading as news.
- While Facebook has courted media partners for its instant articles product, and even paid some of them to try live video, might it become more cautious about having any close ties to news organizations?
- Besides algorithmic changes, are there other signals Facebook can give its users about what to trust while still preserving its neutrality? A more extreme solution: before a user shares a story, a prompt might ask, “Some users have reported this story as false. Do you still want to share it?”
- Will Facebook change how content is visually presented? Facebook is the great equalizer — an article from a reputable journalistic organization looks the same as one from a biased source. Will they be treated differently in the future?
Apart from these, there are other questions that have been the subject of much debate: “Was Facebook responsible for Trump’s win?” and “Is Facebook a media company?” While certainly interesting, those debates don’t get us closer to what we want to know, which is how Facebook will behave as a key distributor of information, as more and more people rely on it as their main source of news. Some of Zuckerberg’s own employees don’t think he’s doing enough.
This story originally appeared in Bloomberg Technology’s Fully Charged newsletter. Sign up to receive it in your inbox.