Facebook Live: The Pandora’s Box of 2016

Is it possible to bring social streaming under control?

Marta Jurasik
ART + marketing
9 min read · Jul 19, 2016


In April, Facebook launched a new service allowing netizens to publicly stream live video, calling it Facebook Live. On launch day, the company’s CEO Mark Zuckerberg said this about his new baby:

“We built this big technology platform so we can go and support whatever the most personal and emotional and raw and visceral ways people want to communicate are as time goes on.”

Little did he know how emotional, raw and visceral this new medium would quickly become. Let’s look back at last month’s events:

June 13th, Magnanville, France. Larossi Abballa stabs a policeman and his partner to death in their home. And then shows the crime scene to the whole world on Facebook Live.

July 6th, Falcon Heights, USA. African-American Philando Castile bleeds out in his car after being shot by a police officer. The world watches his girlfriend’s dramatic report live on Facebook.

July 7th, Dallas, USA. An exchange of fire breaks out between police and snipers. It’s broadcast by a passer-by via Facebook Live.

July 13th, Norfolk, USA. Three men are shot by a gunman who approaches them as they listen to music inside a car. It happens while one of the victims is live on Facebook.

And that’s the problem. Facebook is used by 1.09 billion people every day. At this very moment, while some of those people are innocently enjoying holidays in the Maldives or catching Pokémon with Pokémon Go, other Facebook users are starving, dying, becoming victims of sexual assault, inciting xenophobia or spreading misogyny. With Facebook Live, each of those 1.09 billion people, no matter who they are and what they desire, has received a tool with which they can spontaneously share their experiences with the world in real time. No matter what those experiences are. And they can go viral in no time at all.

This is a channel with enormous potential, both good and bad. For the girlfriend of Philando Castile, the man shot four times by a police officer for no apparent reason, Facebook Live was the only means of resistance she had. Not only did she make millions of people all around the planet aware of the incident, she also collected evidence which may eventually help to convict the perpetrator. The man who broadcast the shooting in Dallas alerted people to the terrifying and dangerous events unfolding around him.

One netizen, commenting on Mark Zuckerberg’s Facebook page, put it this way:

“Facebook platform is not just changing the entertainment world, but it’s dramatically improving peoples’ sense of themselves and their relationship to power.”

But there’s a much darker side that’s been unleashed. Larossi Abballa, who murdered two innocent people on behalf of ISIS, showed that Facebook Live and similar services can be used to terrible effect. Race hatred, pornography, child abuse: these are just some of the evils lurking within this modern-day Pandora’s Box. Do we want to watch such footage? What effect does watching real-life graphic violence in real time have on people? What rights do the victims of such crimes have? And how was it possible that Facebook allowed Larossi Abballa’s footage of the crime scene to go viral?

While advocates of freedom of expression may herald the arrival of an age in which we can finally grasp an uncensored reality, few would deny that this new tool poses serious problems. And few would argue that there is never a case in which a video should be taken down from the site.

But who decides? Are there rules to distinguish between content that is acceptable, even if it shows violence, because its dissemination serves a public good, and content that is evil? How many people would Facebook have to hire to screen all the Facebook Live videos 24/7? Didn’t anyone at Facebook predict what might happen when they launched a tool that aimed, to use the favourite buzzword of the IT industry, to “disrupt” the way people communicate? It seems not.

During the last couple of weeks, the lack of Facebook guidelines on live-streamed content has made headlines around the world. The media keeps asking how Facebook is going to address the tsunami of live reports containing violence, terror propaganda, xenophobic or homophobic content, pornography and all the other evils of the world.

The other day, I was scrolling down my Twitter feed and boom, my eyes landed on breaking news: “Facebook published its policies on graphic live videos”. Wow, I thought, it’s awesome that companies of this size can adjust so quickly when the world urgently needs something from them.

Well, after clicking the link, it occurred to me that, in fact, they can’t.

As it turned out, Facebook had merely done its public relations homework and responded ASAP (all PR textbooks say that in a crisis you should address it ASAP) with a press release. Apart from a bunch of well-worn statements, I read that the infamous, extremely vague Facebook “Community Standards” also apply to live videos. In short: a Facebook employee will review a video only if it gets reported or goes viral. If they recognize violence, they will decide whether it should be removed or whether an explicit content warning should be added. Okay, nihil novi sub sole: nothing new under the sun.

But there was one very valuable piece of information in the press release:

“For instance, if a person witnessed a shooting, and used Facebook Live to raise awareness or find the shooter, we would allow it. However, if someone shared the same video to mock the victim or celebrate the shooting, we would remove the video.”

It’s a description of a real-life situation. It answers the question of how the social media giant will tackle footage of a shooting. Now I know what its moderators will do in such a case. More importantly, the moderators will know what to do.
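To make that rule concrete, here is a minimal sketch of the decision logic it implies, written as if it were one step in a moderation pipeline. Facebook has published no such code or schema; every name below (the intent labels, the LiveVideoReport fields, the review function) is hypothetical, invented purely for illustration.

```python
from dataclasses import dataclass
from enum import Enum, auto

# Hypothetical labels a human reviewer might assign. Facebook has not
# published any real schema; these names are invented for illustration.
class Intent(Enum):
    RAISE_AWARENESS = auto()    # e.g. documenting a shooting or seeking the shooter
    MOCK_OR_CELEBRATE = auto()  # e.g. resharing the same footage to glorify it
    UNCLEAR = auto()

class Decision(Enum):
    ALLOW = auto()
    ALLOW_WITH_WARNING = auto()
    REMOVE = auto()
    ESCALATE = auto()

@dataclass
class LiveVideoReport:
    contains_graphic_violence: bool
    intent: Intent

def review(report: LiveVideoReport) -> Decision:
    """Apply the rule from the press release: identical footage is allowed
    or removed depending on the sharer's apparent intent, not the pixels."""
    if not report.contains_graphic_violence:
        return Decision.ALLOW
    if report.intent is Intent.MOCK_OR_CELEBRATE:
        return Decision.REMOVE
    if report.intent is Intent.RAISE_AWARENESS:
        return Decision.ALLOW_WITH_WARNING
    return Decision.ESCALATE  # ambiguous intent needs a human judgment call

# The same video, two different outcomes:
print(review(LiveVideoReport(True, Intent.RAISE_AWARENESS)))   # ALLOW_WITH_WARNING
print(review(LiveVideoReport(True, Intent.MOCK_OR_CELEBRATE))) # REMOVE
```

Notice that everything difficult lives in the intent field: judging why someone is streaming is exactly the call no automatic filter can make reliably, which is why written guidelines for human moderators matter so much.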

Currently, Facebook moderators act in really unpredictable ways. Sometimes legal content is taken down, while hate speech or depicted violence ‘does not violate Facebook’s community standards’ (Facebook’s answer to my own reports of racist posts). A weird thing happened, for instance, to the video capturing the dramatic moments after Philando Castile was shot in his car by a police officer. An hour after it had gone live, it vanished from Facebook. When the disappearance raised questions about censorship across social networks, the video came back online. Facebook claims this was due to a technical problem (no further details, though). But incidents like these may be the result of indecisiveness on the part of moderators who don’t have any binding guidelines to walk them through the jungle of incidents they encounter.

I don’t blame Facebook for only releasing a statement. To be honest, I am perfectly aware that the problem they’re facing is too complex to be solved overnight. It’s hard to create rules if nobody, not even Mark Zuckerberg, knows what Facebook actually is, legally and ontologically. Facebook is constantly evolving away from its origins as a social network and is nothing if not schizophrenic in its aspirations. On the one hand, it doesn’t want to take responsibility for what it broadcasts. On the other hand, it aggressively fights for original editorial content. The launch of Facebook Live coincided with the global roll-out of Instant Articles, a Facebook feature that lets publishers post full-length content directly on Facebook, so that users don’t need to leave the safe confines of their favourite mobile app. But that’s not enough. According to recent media reports, Facebook is paying out more than $50 million to publishers (including the New York Times and CNN) and celebrities who live stream content on Facebook Live.

One might expect that a company which has started to act like a media outlet would take responsibility for its reporters. What would happen to a TV station if it showed a terrorist inciting violence? In the very best case, the producer loses their job. Did anyone lose their job at Facebook after Larossi Abballa went live? We don’t know.

Facebook prefers to be perceived as a platform connecting people with each other, with its role reduced to that of a provider of communication tools. Following this argument, if Apple doesn’t take responsibility for content recorded with iPhones, why should Mark Zuckerberg’s firm do more, even in the face of overwhelming evidence that playing a greater role is required?

But maybe they’re right, maybe it’s not Mark Zuckerberg’s or his communications officers’ task to define the company’s status and set its responsibilities. Shouldn’t this be undertaken by policymakers? Because this is ultimately a question of Internet censorship, with implications that go much further than Facebook Live.

It is not Facebook’s responsibility to set the rules for what should be censored from the Internet. Reporters Without Borders, an organisation fighting for freedom of information, has already protested against Facebook’s practice of deleting journalists’ posts that report on terrorism. The German wing of the organisation, Reporter ohne Grenzen, calls on German and European politicians not to leave the war against hate and terror propaganda solely to the administrators of Internet platforms and social networks.

“We have to discuss, in public, the limits of freedom of expression. Such sensitive questions must not depend on obscure processes at companies that put their business interests first, and, no doubt, do not feel as compelled to support press freedom as required in a democracy. In any state founded on the rule of law, what can and cannot be said — no matter online or offline — should be defined by independent courts” — said Matthias Spielkamp of Reporter Ohne Grenzen, as quoted on ROG’s website.

Will politicians make Facebook do something? Will the swelling industry of live video streaming (Facebook’s competitors such as Meerkat and Twitter’s Periscope face similar challenges) be regulated? In the past, both the EU and Germany have taken measures to regulate digital companies. If they apply the same thinking to the case of Facebook Live, we might see either of the following scenarios.

The first scenario, let’s call it “kill ’em all”, is what has happened to the sharing-economy giants Airbnb and Uber as a result of government interventions. The former recently had to remove the vast majority of Berlin apartments from its listings because of a new law, introduced out of concern for residents facing soaring rents. The latter ended up turning into a taxi-ordering app because a court in Frankfurt ruled that only licensed drivers can make money from driving people about.

The second scenario is the “lip service” scenario, which would mirror the actions taken in the European fight against online hate speech. At the end of May 2016, Facebook, Twitter, YouTube and Microsoft, together with the European Commission, announced the Code of Conduct on Illegal Online Hate Speech, which is in fact an obscure list of what IT companies should do to make the virtual world a better place. It’s not legally binding, although I must say it looks pretty on paper. Plus, when you follow Facebook’s activities in Berlin (meetups with civil rights organisations, conferences, workshops), you might think the company is bending over backwards to meet the government’s expectations. Meanwhile, the same company has been taken to court for failing to remove racist or homophobic content, and is accused of a lack of transparency by German journalists on a daily basis.

Will there be a third way, a win-win scenario for Facebook Live? I hope so. After all, I believe it’s a great tool that gives people a real new power. Fire was an astounding and powerful invention once upon a time, too. But we all know how playing with fire can end.
