A Twitter exercise: What my students think about Facebook’s leaked guidelines on sex, terrorism and violence
Today, I introduced my students to The Guardian article that revealed what many of us don’t know about Facebook: it moderates content based on an “internal bible” that is as dense as Facebook’s fine-print Terms & Conditions.
Unknown to most, Facebook follows a set of “rules” for filtering the signal from the noise, the #NSFW from the family-friendly content, The Guardian further reveals.
Facebook eats the world
We all know that Facebook has grown enormously over the years, to 1.8 billion monthly active users to date. The Guardian story reveals that the social network has been dealing with an overwhelming volume of information. With about 4,500 moderators combing through Facebook content every day, Facebook admitted that this number wasn’t enough to cover every piece of content, prompting the company to recently announce that it is hiring 3,000 more moderators.
The Guardian writes:
One document says Facebook reviews more than 6.5m reports a week relating to potentially fake accounts — known as FNRP (fake, not real person).
Using thousands of slides and pictures, Facebook sets out guidelines that may worry critics who say the service is now a publisher and must do more to remove hateful, hurtful and violent content.
Yet these blueprints may also alarm free speech advocates concerned about Facebook’s de facto role as the world’s largest censor. Both sides are likely to demand greater transparency.
Facebook will allow users to livestream attempts to self-harm because it “doesn’t want to censor or punish people in…” (theguardian.com)
Facebook’s secret rules and guidelines for deciding what its 2 billion users can post on the site are revealed for the… (theguardian.com)
20-minute Twitter paper
Having assigned my students to read and react to the article, I also used it to design my week 2 lecture on digital publishing.
To help me make a point, I introduced the “20-minute Twitter paper,” where students would post their reactions to the lecture, 140 characters per tweet, Twitter-style.
(Search Twitter for the hashtags #PUBLISHA51 and #PUBLISHA52.)
As a kicker, I also instructed them to get their friends and followers to like and retweet their posts, and hopefully engage them in a conversation about the discussions in my class.
I got the timer ready. And they had to wait for my go signal to start tweeting.
(Note: I was surprised to find that some students had no Twitter accounts. I guess times have changed.)
After 20 minutes, here’s what I picked up from their tweets.
One student was surprised how powerful Facebook has become.
Some students admitted that Facebook has been their platform for entertainment, and that it has been part of their lives.
After I showed them the darker side of Facebook, where sexual content, violence, hate speech and even suicides now happen live on the platform, some students said they were sometimes “traumatized” by what they had seen.
Just imagine the Facebook moderators who have to go through such content every day. As The Guardian discovered, there has been high job turnover among Facebook’s moderators, who blamed on-the-job stress as the culprit. Some even admitted that they suffer from anxiety and post-traumatic stress.
Is Facebook a publisher or not?
Facebook has long denied that it is a publisher, insisting it is just a platform. However, with The Guardian’s recent revelations, it is becoming more apparent that Facebook is taking on the role of a publisher, one that has to draw the line between what people can or cannot publish on its “platform.”
It’s not hard to see that Facebook increasingly has to face the music and deal with the violent, sexual, fake and hateful content that its users generate. The Guardian wrote:
“Facebook cannot keep control of its content,” said one source. “It has grown too big, too quickly.”
Many moderators are said to have concerns about the inconsistency and peculiar nature of some of the policies. Those on sexual content, for example, are said to be the most complex and confusing.
It was so nice to see students reflect on their use of this social networking platform. Facebook is now the most dominant force on the planet. Media critic and journalist Emily Bell, in a controversial article, wrote that Facebook is eating the world and that “Social media hasn’t just swallowed journalism, it has swallowed everything.”
Something really dramatic is happening to our media landscape, the public sphere, and our journalism industry, almost… (cjr.org)
Facebook captures the majority of our attention these days. In its 2017 report, We Are Social says that social media, mainly Facebook, continues to grow, thanks to Internet-enabled mobile devices such as smartphones.
Social media use surged by more than 20% in 2016, with Facebook in particular posting impressive increases, despite already being the world’s most popular social platform for the past decade.
Nearly 2.8 billion people around the world now use social media at least once a month, with more than 91% of them doing so via mobile devices.
Today marks a momentous milestone for all things digital, with the new Digital in 2017 Global Overview revealing… (wearesocial.com)
Can Facebook fix its current problem of dealing with increasing sexual, hurtful and violent content on its platform?
Facebook has introduced ways to flag such content. But is that enough to help clean up a social network that connects more people every day?
From what I have gathered from my students, it is now OUR responsibility to make sure hurtful comments, violence and sexual content are flagged, not shared. It is not an easy road to take. It will be bumpy. And mistakes will happen. But we need to start somewhere.
Indeed, Facebook is bigger, has more people in it, and is front and center of people’s political, personal, and social lives.
Facebook faces a tougher battle these days. Now forced to censor hateful, violent and sexual content based on its leaked guidelines, it also needs to learn how to be more transparent.
Yes, it respects freedom of speech, but it cannot hold anyone accountable for what they post. All Facebook can do is decide whether a post should be deleted or not, a decision made harder when content falls into “grey areas” like propaganda, misinformation, and fake news.
A related Guardian article noted:
“Although Mark Zuckerberg is being polite about it, there’s absolutely no way that Facebook will start preventing people from sharing what they want to share. That’s the core idea of the site,” said writer and professor Clay Shirky, who studies social networks.
Facebook’s business model relies on people clicking, sharing and engaging with content — photos, memes, opinions, news and gossip — regardless of veracity. “People trade untrue stories that encapsulate things they believe about the world all the time,” he said. “Facebook is in the business of letting people share stuff they are interested in.”
The challenge of fake and misleading news has come to the fore in the wake of the US presidential election. Facebook… (theguardian.com)
Who is policing Facebook?
At the moment, nobody is. Politicians and some governments are trying, but Facebook is a global community of people with diverse views. Thus, as its head of global policy management said, it is tough for the platform to decide which views and content are deemed OK to share.
Here’s Monika Bickert, Facebook’s head of global policy management:
“We have a really diverse global community and people are going to have very different ideas about what is OK to share. No matter where you draw the line there are always going to be some grey areas. For instance, the line between satire and humor and inappropriate content is sometimes very grey. It is very difficult to decide whether some things belong on the site or not,” she said.
“We feel responsible to our community to keep them safe and we feel very accountable. It’s absolutely our responsibility to keep on top of it. It’s a company commitment. We will continue to invest in proactively keeping the site safe, but we also want to empower people to report to us any content that breaches our standards.”
She further stressed that Facebook is a new kind of technology company, different from traditional news companies, where more transparency and moderation of content are mandated.
“We don’t write the news people read on the platform,” she said.
In not so many words, she is saying that Facebook will NOT be a media company where editors are the rightful set of eyes to determine whether content is too violent, sexual or hurtful. However, Facebook will NOT curtail freedom of speech, even if it means disturbing the sensibilities of some of its communities; and hopefully it does not monetize these actions, as some experts have warned.
The nude photo investigation that rocked the Marine Corps this week has shed light on Facebook’s secret groups, a… (nbcnews.com)
About the author: He works as a communications manager in a technology firm. On weekdays, he teaches students about digital publishing and the media as we know it. In his spare time, he blogs his thoughts about everything including media, journalism, teaching and politics. Follow him on Twitter via @erwinoliva