Facebook Steps Up

Jeff Jarvis
Published in Whither news?
Dec 15, 2016

Today Facebook made changes in its News Feed to address the scourge of fake news (though I really wish we’d call this what it is: lies and propaganda). These are steps in the right direction — not a solution, of course, but a good indication that Facebook is now taking the problem seriously. I saw that intent last week when I visited with Facebook execs in Menlo Park (before attending two Google events about news, where this was Topic A).

Earlier this week, Adam Mosseri, who’s in charge of Facebook’s News Feed, talked me through the changes, which he said are aimed first at “the worst of the worst”:

  • Most importantly, as John Borthwick and I wished in our list of suggestions, Facebook is presenting more information to users at two critical moments: when they choose whether to click and whether to share. The solution is to empower and inform users, not to act as The Great Gatekeeper.
  • Facebook will use various signals, including virality and comments about news, to flag suspicious links among the hundreds of millions it handles every day, sending questionable items to a set of fact-checking agencies, which will check the veracity of the items. When they find a lie, the fact-checkers will report it to Facebook, which will pass the information on to users. Note well that Facebook will not, as I believe it cannot, act as The Great Editor, deciding what is true and false. It passes that task on to outside entities. (I asked whether this is a business arrangement. It is not. If Facebook is going to pay for video, it might want to consider paying for truth, which is also good for business.) A rough sketch of how such a flagging pipeline might work appears after this list.
  • Another wish on Borthwick’s and my list is coming true: Facebook will make it easier for users to report fake news, a function that is now buried deep in a menu maze as daunting as calling the cable company. I asked how they’ll do this. Mosseri said they will A/B test various methods to see what works best.
  • Facebook will make a change to its ranking algorithm to take into account what Mosseri called “informed sharing”: giving more credence to links that people actually read before sharing. If people become less likely to share a link after reading it, Facebook will take that as a signal that something might be fishy. A sketch of how such a signal might be computed also appears after the list.
  • Importantly, Facebook — like Google — will attempt to disrupt the economic motive of fake news factories, making them ineligible for advertising. To do this, the company will look at signals including domain and logo spoofing.
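
To make the flagging idea concrete, here is a minimal sketch of how such a pipeline might be wired together. Facebook has not published its signals, so everything here is my assumption rather than Facebook’s actual system: the virality and dispute-comment thresholds, the Link structure, and the flag_suspicious and apply_verdicts names are all hypothetical. The point is only the shape of the flow: signals flag a link, an outside fact-checker rules on it, and the verdict travels back to users.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Optional

# Hypothetical thresholds; Facebook's real signals and cutoffs are not public.
VIRALITY_THRESHOLD = 500.0   # shares per hour
DISPUTE_RATIO_CUTOFF = 0.2   # fraction of comments disputing the story

@dataclass
class Link:
    url: str
    shares_per_hour: float
    comments: List[str] = field(default_factory=list)
    verdict: Optional[str] = None  # filled in by a fact-checker, e.g. "false"

def looks_disputed(comment: str) -> bool:
    """Crude stand-in for comment analysis: scan for dispute keywords."""
    keywords = ("fake", "hoax", "false", "debunked")
    return any(k in comment.lower() for k in keywords)

def flag_suspicious(links: List[Link]) -> List[Link]:
    """Select links whose spread and comment patterns look questionable."""
    flagged = []
    for link in links:
        if not link.comments:
            continue
        dispute_ratio = sum(map(looks_disputed, link.comments)) / len(link.comments)
        if link.shares_per_hour > VIRALITY_THRESHOLD and dispute_ratio > DISPUTE_RATIO_CUTOFF:
            flagged.append(link)
    return flagged

def apply_verdicts(flagged: List[Link],
                   fact_check: Callable[[str], Optional[str]]) -> None:
    """Hand flagged links to an outside fact-checker; attach whatever verdict
    comes back so it can be shown to users before they click or share."""
    for link in flagged:
        link.verdict = fact_check(link.url)
```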
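And here is an equally rough sketch of the “informed sharing” signal described above: compare shares that follow a click-through (a stand-in for actually reading) against shares made straight from the feed, and let a low ratio drag the link’s rank down. Mosseri described the signal only in general terms, so the counts, the weighting, and the function names are all assumptions for illustration.

```python
def informed_share_ratio(shares_after_reading: int, total_shares: int) -> float:
    """Fraction of shares that followed a click-through, used here as a
    crude proxy for the sharer having actually read the article."""
    if total_shares == 0:
        return 1.0  # no evidence either way; don't penalize
    return shares_after_reading / total_shares

def adjusted_rank(base_score: float, shares_after_reading: int,
                  total_shares: int, weight: float = 0.5) -> float:
    """Down-weight a link's feed rank when the people who actually read it
    stop sharing it; `weight` controls how hard the signal bites."""
    ratio = informed_share_ratio(shares_after_reading, total_shares)
    return base_score * (1.0 - weight * (1.0 - ratio))

# A story shared 1,000 times but read before sharing only 50 times keeps
# just 52.5% of its base rank: 1 - 0.5 * (1 - 0.05) = 0.525.
print(adjusted_rank(10.0, 50, 1000))  # -> 5.25
```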

All this is good. Of course, there’s more I would like to see. In addition to supporting fact-checking, I would also like to see Facebook take alerts about debunked news from media companies. That is, if The Washington Post or The New York Times reports on a rumor or meme, its stories should appear alongside said lies (as they often do already via the related-content links that pop up when you start interacting with an item). Facebook is open to that. As Borthwick and I suggested, I’d like to see Facebook track memes back to their sources and give greater prominence to the brands of those sources.

I would also like to see Facebook share data about how lies spread and what motivates people to reconsider before sharing lies so researchers can study how to inform the public more effectively. And I hope Facebook shares data about miscreants with Google (which should share back) as well as with programmatic advertising networks and recirculation engines (à la Taboola and Outbrain) so we can hold their feet to the fire to take away economic support from aggressive liars. Having said that, I want to repeat that I am concerned about creating blacklists; I would not want to see partisan sites practicing their free speech lumped into fake-news lists.

Finally, I have pushed Facebook to hire an editor. It so happens that a job posting for a head of news partnerships became public this week. The posting has good aspects, chiefly building positive and collaborative relationships with news organizations. I have argued, however, that this function must be closely involved with Facebook’s product work. Look at the changes being announced today: at a platform, product is what matters.

When I talk about fake news these days (as I did nonstop last week in Silicon Valley), someone inevitably argues that the masses don’t care about the truth. I don’t believe that; if I did, I’d quit journalism school and journalism (not to mention give up on democracy and just hand it all over to Putin). Of course, some people will revel in falsehoods and lies; some always have. But I am confident that a significant segment of society wants to get things right if only given the chance. That is why journalism exists. That is why the path Facebook is on matters.

I am very concerned that there will be a populist backlash against technology, namely Silicon Valley technology companies, when the movements that voted for Brexit and Trump realize that immigrants are not costing them their jobs; technology is. I also worry that, especially in Europe, authorities will use hate-speech statutes to press Facebook, Google, et al. to edit and censor conversation on their platforms. This is why it is in Facebook’s interest to tackle this problem. The bigger and better reason for Facebook to do what it is doing, and for both media and users to help, is to improve the quality of discourse on its platform and thus the quality of the experience there, not to mention the quality of civic discourse in society.

On the Google front, note that Borthwick and I wished that Google especially would clean up the hateful mess exposed in search autocomplete (“Jews are…”). It is doing that. Google, like Facebook, is recognizing that it cannot hide behind the idea that it merely reflects reality. No, reality is gamed and altered according to the platforms’ vulnerabilities. At one of the Google news events last week, I also saw suggestions to present alternative perspectives on stories to users. These are positive steps.

Lies, propaganda, fake news, hate, and incivility won’t be “fixed” with any product or algorithm or staffing tweaks. This is a battle we are stuck in forever. So let’s keep fighting. In my post with Borthwick, our last suggestion was to form an ongoing project to work on informed conversations. We’re working on that. More soon.

And, of course, it is critical that we in journalism and we the public not push off our responsibility to deal with fake news and lies onto the platforms alone. We have our work to do, too. Note the great work the First Draft Coalition is doing to educate the public (and journalists) to do their jobs in halting the spread of lies.

Blogger & prof at CUNY’s Newmark J-school; author of Geeks Bearing Gifts, Public Parts, What Would Google Do?, Gutenberg the Geek