L1ght Reading

The Hippocratic Oath Vs. The Zuckerberg Legacy

Photo by Piron Guillaume on Unsplash

When paramedics begin training, the first thing they learn is to protect themselves before treating patients. The principle is drilled into them: a wounded rescuer cannot help anyone, and puts both themselves and their patients at risk.

That’s why when you see an ambulance rushing to the scene of a car accident or some other disaster, and the paramedics pour out of the vehicle with life-saving equipment, the first thing they do is put on gloves and masks. They make sure they are protected and not exposed, and only then do they approach the wounded. The same goes for firefighters, or any other job that entails helping others.

As a parent, your instinct is always to put your child’s safety before your own, but when you board a plane, the safety video clearly states that if something goes wrong, you should put the oxygen mask on yourself first and only then turn to assist your child.

Both examples emphasize the same point: if someone’s life depends on you, you must first take care of yourself and make sure you are well enough to assist.

The question remains: who trains Internet content moderators? Who are they, and what do they do?

Internet content moderators are the people who work day and night reviewing material flagged by other users, and material flagged by automated systems that detect nudity, violence, and other prohibited content on social networks. They are the ones who save our children (and us) from toxic content, shortly after it has hurt some people but before it hurts many more.

These are the people who clear ISIS beheading videos from our timelines, remove pedophilic material from our chats, and stop live streams of massacres by racist Islamophobes before they reach a wide audience.

How can we help the content moderators stay on top of their game so they can help us? How can we make their jobs (and with that, their lives) easier so they protect us better?

There have been numerous articles discussing the problem of minimum-wage employees moderating content for social networks all over the world. These minimal, spartan workplaces offer little to no psychological assistance. This, in turn, results in high rates of PTSD and numerous suicide attempts.

Articles are starting to pop up showing that many companies providing outsourced moderation services to social networks are now seeing their profits shrink because of high rates of internal problems and negative publicity.

To answer the first question, the moderators should be provided not only with a decent office environment but also with mental health assistance that can support and treat the psychological damage that naturally comes with the job. Exposure to toxic material isn’t just a science-fiction trope; it is a daily reality of this workplace, and these moderators should be given round-the-clock psychological support.

As for what we can do to make their lives easier? Well, there’s good news and bad news.

The bad news first ;)

Human nature generates toxicity. We (meaning humanity) create physical garbage at an alarming rate, and it’s only growing. The same goes for digital toxicity. People enjoy hiding behind avatars and having a secret life where they can be, well, assholes. This phenomenon is not going away. It’s getting worse. By now, it’s an epidemic.

The good news is that we are getting better at prediction. We predict the weather, how our bodies will react to certain stimuli, and the reasons why humans behave the way they do, all thanks to advancements in technology, mathematics, and machine learning. These advancements can help detect and predict toxicity, violence, predatory behavior, shaming, bullying, and more before it even happens.

But again, the bad news…

To be able to predict these toxic phenomena, we need context: a history of events, if you will. Only a chain of events can lead us to an accurate prediction before things turn toxic.
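To make the idea concrete, here is a toy sketch (all names and scores are hypothetical, not any real system’s API): instead of judging each message in isolation, we look at the recent chain of events. A real predictor would use a trained sequence model over message features; a simple rolling average over per-message toxicity scores stands in for that here.

```python
from collections import deque

def flag_escalation(scores, window=3, threshold=0.5):
    """Return the index where the rolling average of the last `window`
    per-message toxicity scores first reaches `threshold`, or None.
    Scores range from 0.0 (benign) to 1.0 (clearly toxic)."""
    recent = deque(maxlen=window)
    for i, score in enumerate(scores):
        recent.append(score)
        if len(recent) == window and sum(recent) / window >= threshold:
            return i  # intervene here, before the conversation peaks
    return None

# A conversation that slowly escalates: no single early message is
# toxic on its own, but the trend is visible before the worst arrives.
history = [0.1, 0.2, 0.4, 0.6, 0.7, 0.9]
print(flag_escalation(history))  # flags index 4, before the 0.9 message
```

The point of the sketch is that context does the work: a threshold on individual messages would only fire at the very end, while the windowed view catches the escalation earlier.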

You may think this revelation would be gladly embraced by social networks, as it’s almost too good to be true. This is the beginning of a solution that removes toxicity without human pain, neither for the moderators nor for the users. So how come they aren’t putting their all into solving the problem? That’s the question we should all be asking…





Ron Porat

CTO & Founder @ L1ght
