The S̶o̶c̶i̶a̶l̶ News Network

Extra, extra, read all about it! Facebook is now a news company, whether it wants to be one or not

2016: The year Facebook well and truly became a news organization, thanks to the launch of its live video feature. But is Facebook ready to assume the responsibility that comes with it? That remains to be seen.

From platform to publisher

Facebook’s path to becoming a news organization has been steady, if circuitous.

For years the company has been a major player in the news business, because it is one of the largest sources of incoming traffic (and thus dollars from display ads) to news websites. In fact, Facebook now sends more traffic to news sites than any other source — even Google.

Publishers live and die by its newsfeed, carefully crafting headlines optimized for social media success and assigning articles, videos, and photos that will entice Facebook users to like, comment, and — most crucially — share with their friends. Whenever Facebook tweaks the algorithm, as it did again in June 2016, shockwaves ripple through the media industry. Most famously, Upworthy — an erstwhile digital darling that was at one point the fastest-growing publisher in history on the strength of its Facebook-optimized “curiosity gap” headlines — saw its traffic plummet after one algorithm tweak in 2013. It never recovered, and has since laid off writing and editorial staff and pivoted to video.

Facebook introduced a “trending news” section in 2014 that highlights the topics and stories that are most popular on the site at any given moment. The section is managed by human editors — off-site contractors rather than full-time Facebook employees — with assistance from the newsfeed algorithm. In May 2016, a former “trending news” editor claimed that editors were encouraged to suppress conservative news stories. Facebook denied the claim, but faced a media firestorm from right-wing publishers and news outlets.

In 2015, Facebook began partnering with publishers to host content natively via “instant articles.” Instead of posting links to their own websites, publishers are now encouraged to post full articles directly on Facebook. With this feature, Facebook is both the platform where the articles are shared and the host that receives all the traffic. (It does split ad revenue with publishers.)

And just last week, Facebook announced it will limit “clickbait” stories and punish the publishers that post them. Editorial staffs worldwide will be forced to re-examine the stories they assign, how they write headlines, and what they share with their Facebook followers.

Clearly, Facebook has grown into a juggernaut in the news business. But until the launch of live video, Facebook’s clout was in controlling what news its users saw, not in providing users the ability to create the news.

Going live

Introduced to all users in 2016, live video is Facebook’s latest gambit to attract and retain users. With a smartphone and the click of a button, any Facebook user can begin streaming a live video to their friends, which can then be shared with Facebook’s 1.6 billion users.

When Mark Zuckerberg spoke at a recent conference, he highlighted uses for Facebook Live that resemble the popular content on Snapchat, a fast-growing social media rival: people dancing, skiing down a hill, and getting a haircut.

Initially, Facebook’s vision of a feel-good, fun Facebook Live was being realized. Eight hundred thousand users watched live as two Buzzfeed employees used rubber bands to explode a watermelon, and a live video of a mom wearing a Chewbacca mask has amassed more than 160 million views since being posted in May.

But everything changed in July 2016. In the span of a week, three incidents broadcast on Facebook Live forever changed perceptions of what the technology is capable of and what its true purpose is. Facebook user Diamond Reynolds went live after her fiancé, Philando Castile, was shot by a police officer. Reynolds broadcast in real time as Castile bled to death. The following day, Michael Bautista went live while Micah Johnson killed five police officers and wounded seven others in Dallas. A few days later, hundreds of Turkish users broadcast members of the military battling in the streets during an attempted coup.

Controversially, Reynolds’ video was taken down by Facebook in the immediate aftermath of its airing. An hour later, the video reappeared with a “graphic video” warning. Facebook offered no explanation for the temporary removal other than to blame a “technical glitch.”

Facebook’s policies for taking down live videos are complex. Users can “flag” graphic content, and if enough users do so, it triggers one of two responses: automatic removal by a bot, or review by an employee. Facebook contracts out this review process to a company in the Philippines, where workers determine whether posts violate Facebook’s community standards.
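
To make that flow concrete, here is a minimal sketch, in Python, of how such a flag-and-review pipeline might work. The threshold, scores, and function names are assumptions for illustration only, not details of Facebook’s actual system.

  # Hypothetical moderation pipeline sketch, based only on the process described above.
  # Thresholds and names are illustrative assumptions, not Facebook's real values.

  FLAG_THRESHOLD = 100        # assumed number of user flags needed to trigger any action
  AUTO_REMOVE_SCORE = 0.95    # assumed model confidence above which a bot removes the video

  def classify_graphic_content(video: dict) -> float:
      """Stand-in for an automated model that scores how graphic a video is (0.0 to 1.0)."""
      return video.get("graphic_score", 0.0)

  def handle_flags(video: dict, flag_count: int) -> str:
      """Route a flagged live video to one of the two responses described above."""
      if flag_count < FLAG_THRESHOLD:
          return "no_action"                   # not enough flags to trigger anything
      if classify_graphic_content(video) >= AUTO_REMOVE_SCORE:
          return "removed_automatically"       # the bot pulls the video with no human input
      return "queued_for_human_review"         # a contractor checks it against community standards

  # Example: a heavily flagged video the model is unsure about goes to a human reviewer.
  print(handle_flags({"graphic_score": 0.6}, flag_count=250))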

What happened to Reynolds’ video? Was it an automated process with no human intervention? Was it an editorial decision made by a contract worker, who very likely lives in another country and may be unaware of the societal implications of the video’s removal? Was it a Facebook VP who made the call? No one knows, and Facebook won’t say.

Unsurprisingly, these incidents elicited strong reactions from the media:

Facebook is confronting complexities with live videos that it may not have anticipated just a few months ago, when the streaming service was dominated by lighter fare such as a Buzzfeed video of an exploding watermelon. Now Facebook must navigate when, if at all, to draw the line if a live video is too graphic, and weigh whether pulling such content is in the company’s best interests if the video is newsworthy. — Mike Isaac and Sydney Ember, New York Times
Facebook Live was to be another cheery venue for recipe videos. Facebook sees Live — and really, its entire platform — differently than its users do. It was not promoted as a digital flare gun to draw attention to atrocity, and it became one only because of user ingenuity. — Kate Knibbs, The Ringer
The risk is this: Facebook’s control over what the vast majority of people see online — news included — is overwhelming. Before the advent of Live Video, though, Facebook could more easily claim to be a neutral provider, simply serving up 3rd-party stories via an allegedly objective algorithm that was ultimately directed by the user itself, and using that user direction to build the best identity repository in the world to sell ads against. And while the reality of Facebook’s News Feed is in fact not objective at all — algorithms are designed by people — actually creating the news will, I suspect, change the conversation about Facebook’s journalistic role in a way that the company may not like. — Ben Thompson, Stratechery

The implication of all of these articles is clear: Facebook was unprepared for live video to be used this way, and it must face the reality that it is no longer just a platform for sharing news… it is a news organization that has provided its users with a powerful new tool to create and break news as it happens.

Ethical considerations

With live video, Facebook is providing users with a direct and immediate way to report news to its more than 1 billion daily users. No matter how much Facebook wants users to share light-hearted and fun videos, it was only a matter of time before the technology was used for breaking news. Facebook has unleashed this technology on the world, and it has a responsibility to put standards and practices in place to deal with situations that will inevitably arise.

What happens during the next mass shooting, when a hostage goes live and law enforcement requests the video be taken down? What happens when pranksters go live with an elaborate hoax that causes a panic? What happens when an attacker goes live while committing an act of terrorism?

No one knows. And if Facebook does know, it certainly isn’t telling.

Yet with Facebook Live, editorial decisions are being made every day:

  • What is considered newsworthy?
  • What is considered acceptable?
  • What is considered inflammatory?
  • What is considered dangerous?
  • What does the newsfeed algorithm prioritize: cute puppies or breaking news?

It’s much easier for Facebook to blame technical glitches and its opaque algorithm (which, we must remember, is actually designed by humans) than be open and transparent about how and why such decisions are made. But as Ben Thompson of Stratechery writes:

The most powerful journalistic entity in the world, though, doesn’t get the luxury of sweeping such significant editorial decisions under the rug: that rug will be pulled back at some point, and it would be far better for society and for Facebook were they to do so themselves.

Complicating matters is Facebook’s primary goal: Keeping users on the site as long as possible, in order to serve them as many ads as possible. And the company has been remarkably successful at achieving it, to the tune of a $360 billion market cap and annual profits of more than $3.5 billion.

In the summer of 2014, Facebook was heavily criticized for its newsfeed being full of ice bucket challenge videos as opposed to stories about racial tensions in Ferguson, Missouri, following the shooting of Michael Brown. And it makes sense why that happened: Users are simply less inclined to “like” stories about racial tensions and police shootings than baby photos and videos of people dumping water on themselves. (Facebook made a meager attempt at addressing this issue by offering users several emoji — sad, shocked, surprised, etc. — that represent different “reactions” to a story other than “like,” but that is far from a solution to the problem.)

This likely made good business sense — the ice bucket challenge videos were a viral phenomenon that delighted users — but Facebook arguably did its users a disservice by not providing them with more information about a crucially important news story.

If Facebook is to more fully embrace its position as a news organization, it will have to confront a tension: optimizing feeds for the feel-good content that keeps users on the site as long as possible will often be at odds with showing users the most important news of the day.

In many ways, Facebook is a company in conflict with itself. Is it a platform for friends, family and social networking, or a platform for sharing and creating news?

As Margaret Sullivan of the Washington Post writes:

Yes, social media platforms are businesses. They have no obligation to call their offerings “news” or to depict their judgments as editorial decisions. They are free to describe their missions as providing a global town square or creating a more connected globe.
But given their extraordinary influence, they do have an obligation to grapple, as transparently as possible, with extraordinary responsibility.

Facebook also needs to consider its responsibility to make users aware of the dangers and blowback they may face from going live, and of the potential for their videos to be shared widely on Facebook and rebroadcast by news outlets. Though the company does publish community standards, they are far from specific and are difficult to find.

As Mike Isaac and Sydney Ember reported in the New York Times:

Bruce Shapiro, the executive director of the Dart Center for Journalism and Trauma at the Columbia University Graduate School of Journalism, said companies with live-streaming and large audiences have a responsibility to inform the public and its users about potential repercussions.
“Companies and platforms are going to have to educate people about how their videos are going to have a big impact,” he said.

Recommended steps

How can Facebook better handle these issues? Here are five recommendations:

1) Consider societal/ethical implications prior to launching new technology

It is clear that Facebook was unprepared for live video to be used in such a way. While its innovation is to be admired, Facebook should spend more time considering the potential societal and ethical impacts of new technologies before they are made available to its 1.6 billion users.

2) Hire a public editor/ombudsman

All powerful news organizations need a public editor/ombudsman to serve both as a liaison to the public and as an independent observer and enforcer of journalistic ethics. Facebook is notoriously secretive, and its executives are rarely made available to the media. A public editor would have access to these executives, and would be able to hold them accountable and provide answers to users who question their decisions.

3) Transparency

Facebook’s community standards are not enough. The company needs to be more open about how it determines newsworthiness, what constitutes graphic content, why videos are removed, and if/when government agencies or businesses request videos to be taken down.

One idea is to produce a quarterly report that identifies how many videos are taken down and for what reasons.

It is unacceptable that Facebook refuses to provide an explanation beyond “a technical glitch” for the temporary removal of the Philando Castile video.

4) Create an editorial staff of veteran journalists, on call at all times for when users break news using Facebook’s technology

When news breaks, Facebook needs to be ready to quickly and correctly determine whether a video is newsworthy and appropriate. This work should not be outsourced to an algorithm or to low-level employees who are not experienced in making such decisions.

A team of veteran journalists would be a tremendous asset, and would help users trust Facebook in these situations.

5) Make the newsfeed algorithm less opaque

Along with the original Coke recipe and Colonel Sanders’ 11 herbs and spices, Facebook’s newsfeed algorithm is one of the great mysteries of American business. But having provided a tool for users to break news in emergency situations, Facebook owes users more clarity about what is most likely to appear in feeds and why.

Further reading