Facebook ethics questioned after rulebook leak
All eyes were on Facebook today as The Guardian revealed Facebook’s secret rules and guidelines for deciding how to manage their 2-billion-strong online community. Strategically published on Sunday evening to make Monday headlines across other media titles, the documents illustrate the difficulties of managing a social network larger than some countries.
From reading through the guidelines, and seeing how widely the media have covered them, there are certainly a few ethical questions raised. Part of this is because subjects such as ‘revenge porn’ must be clearly defined, which in turn spells out what is deemed appropriate. The other issue is confusion around internet culture as a whole, something highlighted by the Robin Hood Airport case from 2009.
Clearly, having a team of only 4,500 moderators (soon to be 7,500) to manage 2 billion users is a pain point. And when the logical dos and don’ts of a standard operating procedure become public, they reveal the ethically uncomfortable underbelly of social media.
Pretty much everyone has an experience of witnessing content on social media that is either inappropriate or, at its worst, downright harrowing. It’s not too difficult to find investigative journalists and citizens on Twitter who choose to share images of human suffering and destruction from war-torn areas — always far more graphic than anything in mainstream media.
On Facebook I’ve seen terrorist propaganda showing slaughter, pornographic images being shared, and videos of animals being set alight for fun. It’s the type of content that makes you want to remove social media from your life; it’s the unfiltered reality of life, and no Instagram filter can solve that.
Beyond content, sometimes it’s the nature of the community that gets called into question. A few years ago I was interviewed by BBC Radio Sussex to explain what Facebook friends actually mean. This followed an incident in which somebody announced their suicide on Facebook but was ignored by all 2,000 of their friends. Six years later the situation has become more serious: it’s now possible to live-stream a death.
To paraphrase what I said in 2011,
“Facebook’s business model is reliant on the wealth of personal information they hold about us. A company who provides a social network also has the responsibility to safely manage their country of people. You wouldn’t have a city without police, so why is this any different on social media?”
Without wanting to repeat the views shared in The Guardian’s article, it’s fair to say that Facebook continues to face a reputational issue. Every country they operate in has its own set of cultural principles and laws; it doesn’t make sense for Facebook to follow only their own rules.
There is a clear social responsibility on technology companies who give people the opportunity to communicate and share content in new ways. However, safeguards must be put in place for when the system is abused.