Negotiating for News

The Principles at Stake With Facebook, et al

Facebook did not set out to dominate the distribution of news. Nor did news organizations choose to let it. But now we are there, with Facebook providing a sizable and growing share of traffic to news — even before the advent of Instant Articles. Twitter, YouTube, Instagram, Snapchat, Apple — they all present what we used to think of as our news to our readers. And so it is time to have a serious discussion about the principles and business terms at stake in this new era of distributed journalism.

In a panel called Facebook: Friend or Frenemy? during Social Media Weekend at CUNY recently, Columbia’s Emily Bell and I debated some of these issues, with Mashable’s Jim Roberts sometimes playing peacemaker (the full video is below). Here I want to address some of the questions we discussed and others I’ve been mulling over recently.

My starting point is the one Arthur Sulzberger made to the WAN-IFRA newspaper congress in Washington a few weeks ago: We must engage. To ignore the platforms and their people, to continue to believe that we can make a business by demanding that everyone come to us is delusion. This moment also presents an opportunity: Facebook and Google want to make friends with publishers; they see the value in news; they are tripping over each other offering new features and business deals that advantage news. Now is the time to talk and to address the complex questions around news and the platforms.

1. Should an informed society be Facebook’s mission?

I will say no. For if the answer were yes, then that would mean Facebook might be driven to hire its own reporters and editors and make its own content. In our discussion, Emily said she would welcome the platforms paying for journalists. I get her point: She’s asking who will pay for the work we do. But I say that creating its own news content would put Facebook — like Google and the other platforms — in channel conflict with the news ecosystem.

Facebook and Google depend on that ecosystem; they have shown that they and their users value good content. So it is better that they partner with — not compete with — news organizations to provide that journalism. It is in the platforms’ interest for quality news organizations to succeed. Therein lies the mutual benefit that forms the basis of productive negotiation.

Only Mark Zuckerberg — and the people of Facebook — can say whether informing society is the company’s goal. Facebook says its job is to connect people to (1) each other, (2) information, and (3) entertainment; news is only a subset of the second. At the end of the day, Facebook won’t judge its success by whether the public is better informed about the Greek debt crisis. Mission? No. Responsibility? That’s another question.

2. Does Facebook have a civic responsibility for news?

Once Facebook finds itself in the position of being a critical distribution channel of news — and soon, I predict, a key supporter of news organizations through both audience and revenue generation — then, as Dick Costolo saw at Twitter, certain responsibilities fall on its shoulders. Emily Bell sums up the values at stake:

Making sure news is accurate, which seems pretty basic; being accountable for it if it is not accurate; being transparent about the source of stories and information; standing up to governments, pressure groups, commercial interests, the police, if they intimidate, threaten or censor you. Protecting your sources against arrest and disclosure. Knowing when you have a strong enough public interest defence to break the law and being prepared to go to jail to defend your story and sources. Knowing when it is unethical to publish something. Balancing individual rights to privacy with the broader right of the public interest.

We can’t expect Facebook to fact-check all that appears on its platform [did you really have a great time at that party or are you just saying that?] but it and Google can decide to give greater privilege to trusted sources. Facebook’s real-name policy helps identify the sources of information but also causes issues with the protection of sources who are vulnerable to exposure and attack. Google has spent untold dollars fighting government efforts to violate the privacy of its users. These issues are already in play.

There is also potential conflict between Facebook’s community standards and journalistic standards. Pando Daily’s Sarah Lacy just complained on Fox News that her ad for a story that used so-called bad language was initially refused by Facebook. At the International Journalism Festival in Perugia, a researcher spoke about Danish newspaper Berlingske’s removal from Facebook over pictures from a book about naked hippies (Apple — which is now also in the news distribution business — also banned the book). I defend “bullshit” as political speech. Sometimes, news is naked or profane. Should news organizations be held to lowest-common-denominator standards of decency? This, too, needs to be discussed.

I could not pretend to grapple with all these questions and more in a blog post. Nor do I expect Facebook or the other platforms to come with ready-made answers; they don’t have journalistic thumbsuckers in residence to do that. I have said that we need to begin an open discussion of these issues and I’m eager to play host to it; that is a proper role for a journalism school in this process.

3. Should Facebook provide transparency into its algorithm for news?

I hear this often: Editors wish Facebook would be transparent about how its News Feed algorithm works as it decides which of their articles are shown to any given person. The problem from Facebook’s side — or Google’s regarding search or Twitter’s regarding trending — is apparent: if their formulae were revealed, then they would be gamed, reducing the value of the services.

Ah, but can’t they trust the editors of respectable publications to use this knowledge well? Perhaps, but what will those editors do with it? Given their present mass-media business model built on volume, editors will be subject to the moral hazard of using this knowledge to serve whatever gets the most clicks, and inevitably that is crappy cat content. Besides, I’m not sure news organizations would know what to do with information about the algorithm if they had it; it reportedly includes 100,000 weighting factors.

In my discussion with Emily and Jim, I had another thought: Perhaps what trusted news publications should seek from Facebook is an allocation of prioritization and attention. That is, let’s say there is an important story that Facebook’s interest data says a given user wouldn’t care about. But a newspaper’s editors say: “Trust us, this really matters.” So it pulls a switch to assure that anyone following that newspaper will be served this story in a timely manner. If the news organization abuses the privilege — if readers believe they’re being spammed, as they did with Facebook’s short-lived Social Reader — then it will lose followers. The system has to be set up to reward informing the reader over merely enticing the reader to click. No cats allowed.

While we’re on the subject of algorithms… In Silicon Valley, I often hear technologists ask why algorithms freak out editors. My answer: It’s because editors think the algorithm took away their power, their judgment, their prerogative. It’s because the algorithm is a black box they don’t understand, replacing the black box that is the newsroom, the editorial process. It’s also because they still think that everyone should see news all the same, as editors deliver it, because that’s how mass media worked; they fear that customization threatens the value of their news judgment.

Mysterious algorithms also raise the fear of manipulation. Just as with the moral panic over the threat of subliminal advertising in film and then television, this new technology — the social network — raises fears that Facebook will pull puppet strings over us. Well, in truth, all media attempt to manipulate. I’m working on a separate post about the hot topic of media impact and metrics and there I’ll argue that changing behavior is the real goal of journalism, just as for marketing. The question then is of transparency and responsibility: not transparency of code but of goal and motive. This, too, is worthy of discussion.

I tell the technologists that what they should do is explain to editors why news sites need their own algorithms to prioritize and target news to their readers. Facebook needs an algorithm because it must select from the perhaps thousands of things it could show any given user any day. News sites tend to deliver 400 to 600 items of content each day, expecting users to find what matters to them. Ridiculous!

I’m not saying that news should become 100% personalized. I just argued that even on Facebook, there is some news that most everyone should or will want to know. But I do want news organizations to learn from Facebook, reducing the noise and increasing the relevance of what they give each of us. They need one thing to do that: Data.

4. Should Facebook give news organizations data about their users?

Yes. So should Google. This must be at the heart of negotiations between platforms and news organizations. Data about interests is required to give people greater relevance and value wherever they encounter a news organization on the web — on Facebook, on their own sites, through any other platform. I’ve long argued that the platforms should find ways to share this data with news and content providers. I’ve answered others’ arguments about privacy with plenty of ideas about how news organizations and platforms could create new services to generate interest signals and do privacy right. I’ve also said that the better news organizations serve people on Facebook — using interest data themselves — the better it is for Facebook.

The platforms know much more about our users and even our content than we do. I’m not suggesting that they will give away their crown jewels. But hell, if they gave news organizations a fraction of what they know that could be golden for the business of news.

There’s only one problem: The news organizations don’t have the technological know-how to handle that data.

5. Could Facebook give news organizations technology?

Of course, it could. So could Google. Editors tend to look this gift horse in the mouth. They worry that if news organizations surrender to Facebook and Google, handing over their technology needs and even now their sales to the platforms, then one day they’ll find rugs pulled out from under them. Yes, and the answer to that is a negotiated business term: a long-term contractual assurance. If that assurance could be given, imagine all the technological help the platforms could give to news companies.

We need help building membership and user profiles to collect and act on all that data I wish we’d get.

We need help with the forms and use cases of news. See Mark Zuckerberg’s responses to Arianna Huffington and me in his recent Q&A. “One of the biggest issues today is just that reading news is slow,” he said. And: “Making sure news organizations are delivering increasingly rich content is important and it’s what people want.” That’s why Facebook created Instant Articles.

He also said: “There’s an important place for news organizations that can deliver smaller bits of news faster and more frequently in pieces. This won’t replace the longer and more researched work, and I’m not sure anyone has fully nailed this yet.” This was precisely why I had hoped that one of the big platforms might buy Cir.ca and open it up to the news industry: to encourage us to break news into its constituent parts so we could give people alerts that matter to them. Thus news organizations would serve readers better on Facebook, perhaps allowing stories like the tragedy of Ferguson, Missouri to bubble up in people’s Facebook feeds faster.

We need help with distribution. I have been arguing for years that content should travel with business model attached — brand, revenue, data, links. I’d love to see the containerization and portability of news that Facebook has pioneered with Instant Articles as the foundation of an open standard for distributed news (so news companies do not have to rebuild every story for every platform). I am talking with others who also want to work on that. If you’d like to join, let me know.


In our panel, Emily Bell said that “good journalism does not equal profitable journalism.” Jim Roberts and I disagreed. I am not ready to give in to the idea that journalism will be supported only by charity (there isn’t enough) or by paywalls (which redlines journalism for the privileged) or by mass-media advertising (cats!); that is why I run programs in both entrepreneurial and social journalism. I believe that journalism can move from volume- to value-based models by serving people as individuals and communities rather than as a mass; that is why I care so much about data. Google and Facebook are personal-services companies and so must journalism be; that is the premise of my book. I believe that we must engage with technology companies and find common ground and not have the hubris to think that we can compete with them where they excel. The technology companies also need to understand that journalism is not mere content; journalism brings responsibility. I believe it is in their self-interest to collaborate with us. And finally, I believe that they can help us reimagine and reinvent journalism and now is the time to work with them to do that.

What I propose here is merely an outline for the beginning of a conversation with news and platform companies: Facebook, Google, Twitter, Snapchat, Apple, Amazon…. We need to build bridges or we will be left as islands.
