Facebook says it will now publish news even if it violates its standards

Paul Dughi
Thoughts On Journalism
4 min read · Oct 24, 2016

Facebook maintains it’s not a news organization. But when the majority of people get their news from you, you are one whether you want to be or not.

A Knight Foundation study shows 63% of Facebook and Twitter users say Facebook “serves as a source for news about events and issues outside the realm of friends and family.”

Facebook says that whether an image is newsworthy or historically significant is highly subjective. Tough to use an algorithm for that.

“Images of nudity or violence that are acceptable in one part of the world may be offensive — or even illegal — in another. Respecting local norms and upholding global practices often come into conflict. And people often disagree about what standards should be in place to ensure a community that is both safe and open to expression.” — Joel Kaplan, VP Global Public Policy and Justin Osofsky, VP Global Operations and Media Partnerships

After complaints, and after becoming viral news itself over what it filtered (or didn’t), Facebook says it’s changing course to allow more items people might find newsworthy or important, and here’s the rub: “even if they might otherwise violate our standards.”

“Our intent is to allow more images and stories without posing safety risks or showing graphic images to minors and others who do not want to see them.”

Would people like it better if there were a National News Editor for Facebook feeds? Or would they prefer no human intervention at all, letting popularity dictate what they see?

A national poll by Morning Consult gives an indication of how people feel about that issue. For social media, it found that 31% said “reader interest” should determine which news stories show up, versus 11% who said editors should pick. That’s quite different from what they think is happening right now.

Those polled also weighed in on what they think is happening at traditional media companies and what they think should happen. 26% said “reader interest” should dictate coverage, 33% favored a blend of reader interest and editor discretion, and only 15% thought editors alone should pick. Compare that with their perception of current practice: they believe editors pick 36% of the material and reader interest only 13%.

Google sees the world differently

Is Google a news organization? I would classify that as a “no”, but it is a source for a lot of people. I use Google News as one of my first stops when I’m looking for news — it’s an easy way to get multiple sources on a story. Like Facebook, its algorithm picks what’s popular and shares it. Unlike Facebook, it says it has a responsibility to make sure the information it’s sharing is legit.

For more than seven years, Google News has been tagging news stories as “Opinion”, thereby helping users differentiate between fact-based reporting and opinion. It’s similar to how newspapers and TV stations labelled things as “Commentary” or “Editorial.”

Earlier this year, Google News added a Local Source tag to highlight local coverage of major news stories. That was a big step forward for local news organizations, which often had better and more in-depth coverage of stories, but less audience. Before the tag, an Associated Press re-write of a local news story, distributed broadly across a network, would almost always show up higher in the list than the actual reporting done by the local journalists, and sometimes displace it completely.

Now, Google News is introducing what could be an important feature: a “Fact Check” tag for top stories. It will provide links to what Google says are a variety of sources that have fact-checked the trending story.

“We’re excited to see the growth of the Fact Check community and to shine a light on its efforts to divine fact from fiction, wisdom from spin.” — Richard Gingras, Head of Google News, writing on the Google blog on journalism.
