Facebook Admits Its Failure To Stop Incitement Of Violence In Myanmar

BAILEY T. STEEN | THURSDAY, NOVEMBER 8, 2018

Facebook, the social media company focused on “bringing people together”, appears to have another human rights crisis on its hands. On Tuesday, the company published an independent assessment analysing how its platform was used to incite violence against the Rohingya population during the genocide in Myanmar that began in 2016.

“The report concludes that, prior to this year, we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence. We agree that we can and should do more,” wrote Facebook’s public policy manager Alex Warofka in a public statement. “We know we need to do more to ensure we are a force for good in Myanmar.”

Conducted by Business for Social Responsibility, a non-profit committed to human rights protections, the report found the site was “being used to foment division” that ultimately “resulted in offline violence”. It also argued that Facebook’s newly announced, though largely unspecified, changes to its content policies should prevent such events from repeating themselves, according to statements cited by The Verge. The investigation into Facebook’s conduct began near the end of 2017, a few months after independent outlets (including our own) began shining a light on the horrific conditions in Myanmar that big tech institutions had ignored and suppressed.

In November of last year, TrigTent reported that Facebook and YouTube were reportedly removing images documenting the ‘ethnic cleansing and torture’ conducted by the Myanmar government, erasing traces of these political hate crimes without first reporting the evidence to proper human rights authorities.

“Three years of documentation, just gone, in a moment,” said Obayda Abo-Al Bara, manager of the Idlib Media Center, who spoke to The Intercept last year. Rohingya activist Mohammad Anwar added: “I did feel that Facebook was colluding with the Myanmar regime in the Rohingya genocide.”

For critics of the social media empire, such as The Intercept journalist Avi Asher-Schapiro, these blind image removals were “at best, a destruction of evidence” and “at worst, complicity in the atrocities”. He writes: “First-hand accounts of extrajudicial killings, ethnic cleansing, and the targeting of civilians by armies can disappear with little warning, sometimes before investigators notice.

“When groups do realize potential evidence has been erased,” he continued, “recovering it can be a [nightmarish] ordeal. Facing a variety of pressures — to safeguard user privacy, neuter extremist propaganda, curb harassment and, most recently, combat the spread of so-called fake news — social media companies have over and over again chosen to ignore, and, at times, disrupt the work of human rights groups scrambling to build cases against war criminals.”

In September 2018, a 479-page United Nations report, summarised by reporters for The Guardian, concluded that Facebook issued a “slow and ineffective response” when its “standard reporting mechanism alerted the company to a post targeting a human rights defender for his alleged cooperation” with the UN. The post, which investigators found was shared over a thousand times, labelled the unnamed defender a “national traitor” who, commenters said, should be “murdered in the street” because he is a Rohingya Muslim.

Facebook’s inability to address such genocidal extremism in Myanmar has been a well-covered catastrophe, prompting a coalition of activists from Myanmar, Syria, and six other countries to issue specific demands: that the social media company increase transparency and enforce standards strong enough to prevent extremism while preserving free speech online. Those demands went unaddressed until this week.

The report suggests the company didn’t crack down on incitement in Myanmar because the crisis drove an estimated 20 million increase in overall user engagement. “There are deep-rooted and pervasive cultural beliefs in Myanmar that reinforce discrimination and which result in interfaith and communal conflict,” the report said, adding that Facebook was being used to spread those opinions both by individuals and by organized groups who stood to gain politically.

By following the golden rule of capitalism, profits above morals, catering to the extremist crowd, whose activity was monetised through advertising revenue and inflated user numbers, was simply good business for Facebook. Although Zuckerberg and co have agreed to publish more data on policy enforcement, these are empty promises tied to no mandate, with no detail on how regularly the reports will be published or how evidence of war crimes will be collected and preserved going forward.

Facebook isn’t the total villain, of course. While these problems of unaccountability persist, Zuckerberg has announced a new team of 99 native Myanmar speakers dedicated specifically to addressing extremism. The Verge details how the group has already taken action on over 64,000 pieces of content violating Facebook’s policies on incitement of violence and its loosely defined “hate speech”, with the company claiming 63 percent of those posts were manually reviewed. Facebook still outsources a sizeable portion of its moderation to automated reviewers, but the move shows a partial commitment to localised oversight.

“There are a lot of people at Facebook who have known for a long time that the company should have done more to prevent the gross misuse of its platform in Myanmar,” said Matthew Smith of Fortify Rights, a non-profit human rights organization focused on Southeast Asia, who spoke with The New York Times. “This assessment is encouraging and overdue, but the key to any assessment is implementation.”

Thanks for reading! Bailey T. Steen is a journalist, designer and film critic residing in the heart of Victoria, Australia. His articles have been published on TrigTent, Medium, Steemit and Janks Reviews. For updates, follow @atheist_cvnt on either Twitter, Instagram or Gab.Ai, while you can contact him for personal or business reasons directly at bsteen85@gmail.com. Cheers, darlings. 💋