Virtual Sweatshops

Aaron Mayer
Published in Impact Labs
5 min read · Nov 18, 2019

I recently watched Congresswoman Katie Porter grill Mark Zuckerberg during his October 23rd testimony to the House Financial Services Committee — you can watch the clip for yourself here — and I bristled when I heard that Facebook’s content monitors are allowed only 9 minutes of supervised breaks per workday.

After some follow-up reading, I learned that the job is a lot worse than it sounds.

The role of a content monitor on a site like Facebook is to watch the never-ending stream of videos uploaded to the site every second and flag inappropriate content to remove it from the platform.

It turns out that “inappropriate” is the understatement of the century.

The videos content monitors have to watch range from the mundanely disturbing to the criminally obscene — from the gruesome and macabre to the downright sickening. The monitors watch instances of rape, torture, mutilation, and beheadings. They watch these videos for hours every day — with just 9 minutes to punctuate the torrent of horrors — while the content pours in ceaselessly and mercilessly: a deluge of the most vile and putrid acts of depravity, loosed from the twisted Pandora’s Box that Facebook has become.

Facebook pays content monitors to watch these videos and remove them because the technology to automate the process isn’t robust enough yet. Computer vision algorithms are very sophisticated, and they’re ordinarily quite good at determining what is happening in a given image, but they’re not perfect, so humans are still needed to pick up whatever falls through the cracks.

Unfortunately, it is still very easy to fool computer vision algorithms. You may have seen this image of a few stickers being affixed to a stop sign, rendering it invisible to self-driving car algorithms. More recently, researchers have developed tee shirts that can shield you from the watchful gaze of AI in surveillance cameras. When it comes to nudity, some people find clever workarounds (like the first 10 seconds of this Sexplanations video) to evade the algorithms that flag nudity on YouTube. Because these algorithms aren’t perfect, and because people continually adapt to avoid the algorithms’ scrutiny, human review tasks like content monitoring will be a necessary component in the architecture of AI-enabled applications for the foreseeable future.
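The human-in-the-loop arrangement described above can be sketched as a simple triage rule. This is a hypothetical illustration only — the function, thresholds, and labels are my assumptions, not Facebook's actual system:

```python
# Hypothetical sketch of human-in-the-loop content moderation triage.
# The score is an imagined classifier's confidence that a video is abusive;
# the thresholds are illustrative, not any real platform's values.

def triage(score: float,
           remove_above: float = 0.95,
           review_above: float = 0.30) -> str:
    """Route a video based on the classifier's confidence.

    High-confidence detections are removed automatically; the ambiguous
    middle band — where adversarial tricks tend to land — goes to a
    human monitor; clear negatives are approved.
    """
    if score >= remove_above:
        return "auto-remove"
    if score >= review_above:
        return "human-review"
    return "approve"

# An adversarially perturbed upload that drags the score down lands in
# the human-review queue instead of being removed automatically.
print(triage(0.97))  # -> auto-remove
print(triage(0.55))  # -> human-review
print(triage(0.05))  # -> approve
```

The design choice that matters here is the middle band: the better the automated classifier gets, the narrower that band becomes — but as long as it exists, a human has to watch what falls into it.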

As long as that is the case, content monitors will continue to develop PTSD-like symptoms induced by the secondary trauma of watching these videos. And since the monitors are contractors rather than full-time employees, they aren’t entitled to the same benefits, and their fractured workforce makes it difficult to unionize and demand better working conditions.

More than one hundred years ago, working conditions in an entirely unrelated economic sector created a different moral quagmire:

Sweatshops.

The Triangle Shirtwaist Factory in Greenwich Village, NYC.

Workers in the garment industry around the turn of the 20th century were often locked into overheated sweatshop factories with scant safety precautions and deplorable air quality. Emergency exits were barred to prevent people from taking breaks, and they were sometimes forced to work upwards of 15-hour shifts, all while being intimidated and bullied so as to prevent them from unionizing and fighting for their rights and dignity.

Sound familiar?

In a cruel parallel of fate, there is another similarity between sweatshop workers and content monitors at Facebook: many of the workers are immigrants or people of very low socioeconomic status. No one chooses to work in horrendous factories or watch beheadings unless they see the job as one of their only means of earning a livelihood. Most of the contractors working for Facebook come from the Philippines, where economic mobility is much more constrained than in the US, and Facebook is easily able to take advantage of the wage arbitrage among a population that shares a language and similar culture with Americans.

As is so often the case, labor rights intersect with classism and neocolonialism in troubling ways. What some people think of as simply an economic issue is an issue of social justice as well.

It was only after the infamous Triangle Shirtwaist fire of 1911 that legislators and policymakers finally stepped in to establish safer standards in the garment industry, such as proper ventilation, maximum shift lengths, and fire escapes. I shudder to imagine what a modern equivalent of the Triangle Shirtwaist fire would be, and if we wait until we smell smoke, it will already be too late. We shouldn’t have to wait for tragedy to strike before we hold industry leaders accountable. Until we do, today’s caste of virtual sweatshop workers will continue to suffer.

The 1911 Triangle Shirtwaist Factory fire. 146 workers perished; most were women and girls.

We often think of software as a clean industry. After all, the developers who engineer the programs that create our digital world are only manipulating 1s and 0s, and we interact with the products they build through carefully curated screens and finely manicured devices.

Though we don’t witness it firsthand, the plight of Facebook’s content monitors reminds us that a significant amount of pain is incurred by others in the process of making our experiences painless. And this story extends far beyond Facebook: the pain extends from the poor labor practices in the Chinese factories that manufacture our iPhones to the extraction and pollution generated by mining the rare earth metals that power them. We have to be wary as users of these products: just because we don’t see their harmful effects directly doesn’t mean they don’t exist.

While conditions have improved for garment workers in the US, one could argue that the misery hasn’t been ameliorated, it’s only been outsourced. There are still tens of thousands of factory workers in sweatshops around the world — some of them much worse than the Triangle Shirtwaist factory — and many human rights activists have done amazing work in exposing what is essentially modern slavery. Unfortunately, from the shirts we wear to the social media platforms we use, we dirty our hands when we support the companies upholding the practices that harm people we’ll never meet.

Facebook’s content monitors pose a moral challenge, but that challenge is not unique to Facebook: how can we remain complacent about the goods and services in our lives, knowing that so much suffering goes into their production?
