The Good, The Bad & The Ugly about Clean Internet
Image recognition AI has become powerful enough to detect whether an image depicts sexual organs, nudity or blood. Yet it still struggles to distinguish pornographic content from holiday beach photos, a movie fight scene from real violence, or hatred from irony. That is a job only humans can do accurately, and it is called content moderation.
So, in this world, there are two kinds of people: those who have access to a clean internet, and those who have to clean it. Cleaning it is the job of the content moderator.
A content moderator is a person responsible for checking user-generated content submitted to an online platform such as social media. They make sure that the content is shareable, which means it doesn’t include any illegal or offensive elements. It’s up to the content moderator to decide whether to keep the content or delete it.
What’s great about this job is the mission behind it. Content moderators keep the internet clean and safe for everyone, so from a certain point of view they are the internet’s “gatekeepers”.
However, being a content moderator can be a grim experience. People who do this work often remain anonymous. They moderate for Facebook, Twitter and other platforms, but they are usually hired by outsourcing firms for relatively low wages. Before starting the job, they must sign a non-disclosure agreement that forbids them from discussing or sharing the content they see; breaching it can lead to fines or other penalties. They are almost always under surveillance.
Moreover, the content can be truly shocking: child abuse, necrophilia, war crimes. Viewing it can have severe effects on moderators. The images linger in the memory and can cause lasting psychological trauma that manifests in various ways, from eating disorders to depression and alcoholism.
In their documentary ‘The Cleaners’, Moritz Riesewieck and Hans Block showed the real face of this job. They told the story of a content moderator who was constantly assigned content related to suicide attempts, until he took his own life.
The harsh reality is that a clean internet comes at a price, a price paid by more than 15,000 people across the globe, most of them in Manila, in the Philippines. They are the ones who bear the scars of trying to polish the digital space.
Moreover, a deeper look at this issue reveals a real dilemma. It arises when moderators delete content that is shocking but needs to be seen, such as images of war crimes. If Facebook, Twitter or YouTube take down this kind of content, it becomes much easier to cover up such crimes.
Facebook gives us more power to decide what we see on its platform and uses algorithms to suggest only content related to the things we like. Someone who likes cats and dogs, for example, will keep seeing only cat and dog videos and could remain blissfully ignorant of what is happening outside their bubble.
This raises a question: do we want to be so cautious about the internet, so protected from disturbing content, that we forget about human integrity and become unaware of social injustice?
The answer can be no. Even as Facebook hands us control over what we see and what content we get, it is safe to say that people are becoming more and more aware of what is happening in the world. Social media exposes things we were never able to see before; it pushes us to develop our critical thinking, to reshape our mindsets and to discover new causes that raise our awareness.
We have actually witnessed this with the many movements grouped under the term hashtag activism, which lets people or organisations launch a widespread movement through social media platforms. Everyone has probably heard of, or even supported, movements like #NeverAgain or #MeToo, the global movement that reached even China, a country known for its social media censorship. There, #MeToo was considered the biggest coordinated student protest movement that had surfaced in China. It started in 2018, when PhD student Luo Xixi posted on Weibo, the Chinese version of Twitter, that she had been sexually harassed by her tutor in 2004. He denied it, but the post spread across the world and was viewed around 3 million times. It led thousands of students to petition for anti-harassment policies, because sexual harassment is pervasive in China and little could be done about it until #MeToo emerged. The pressure ultimately forced the Chinese government to publish a new civil code with an article defining sexual harassment, and the Supreme Court acknowledged that, for the first time in China, victims of sexual harassment could file lawsuits.
Another example of hashtag activism is the more recent #BlackLivesMatter, a social and political movement that gave people of colour the power and a voice to speak out against injustice, police brutality and racism.
The internet can be good and powerful. It is a great equalizer, accessible to everyone, and social media is a space for sharing our thoughts with the world and a reflection of what we care about. Used well, social media can become a tool to reduce social injustice, fight racism, raise awareness and give a voice to the marginalized.
About this article
This article has been written by a student on the Grenoble Ecole de Management’s Advanced Masters in Digital Strategy Management. As part of a content creation assignment, students are given the task of writing articles based on their digital interests and disseminating them online. Articles are marked, but we make minimal changes to the content. Thanks for reading! James Barisic, Programme Director, MS DSM.