Colonial Repetition in Ghost Work

Joseph Geldman
DataEthics4All
Aug 30, 2021

Though it can sometimes seem that AI operates in a space beyond human biases and prejudices, we must remember that the operations and activities of artificial intelligence are fundamentally the products of a human intelligence which has, over the centuries, been constantly shaped and re-shaped by colonial prejudices and the West’s philosophies of domination. These philosophies, which see developing regions as an economic opportunity for cheap labour whose efforts can be ‘erased’ at the point of sale, are being repeated in “ghost work” sectors across the Global South.

Karen Hao, in the MIT Technology Review, has written about the theory of “coloniality”, and about how biases arising from colonial structures and attitudes have been repeated, albeit more subtly, in the Information Age. A Google search for “CEO” still predominantly returns images of men, almost all of them white. This is partly a reflection of reality, mediated through image search results, but the fact remains that the results reinforce and re-assert stereotypes and power structures. This is a fairly simple example of “algorithmic discrimination”. It reflects an unequal reality, but also speaks to inequalities in the field and sub-fields of AI. A 2019 report by AI Now found that “only 18% of authors at leading AI conferences are women”, and that only 2.5% of Google’s workforce is Black.

But the realities of race-based discrimination have material consequences which are more complex and more troubling than the biased results of a Google search or inequalities in an American workforce. More insidious forms of structural inequality are seen in the character of the work ‘behind’ AI, which is poorly paid, personally unfulfilling, and all but ‘invisible’ in itself. The AI scholars Mary Gray and Siddharth Suri have described this sort of employment as “ghost work”.

“Ghost work” is a term which recognises the contract labour that often goes into providing artificially intelligent services. It is, by its nature, invisible: when the user employs the translation functions of Amazon Alexa or Google Assistant, for example, they are given the impression that the translation is entirely the work of a disembodied intelligence. This is an erasure of the human contribution which goes into providing AI services. The work itself is often monotonous (as anyone who has captioned a YouTube video will know), but it can also be morally taxing. A BBC article from 2019 reported that one such ghost worker was required to watch videos of animal torture and child pornography in order to provide data examples for online content moderation algorithms.

A large number of these “data associate” roles are based in the West. In the UK, Amazon’s data associates work in Cambridge and enjoy a “competitive” starting salary of around £22,700 per annum. Alongside these Western locations, however, associates are employed in large numbers in the Global South, including in the Indian cities of Chennai and Hyderabad. There, data associates are paid around 2.72 lakh rupees (£2,632) per year: a significant pay inequality, made all the more pronounced by the fact that the work these employees do is essentially the same.
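The scale of that disparity is easy to make concrete. A minimal back-of-the-envelope sketch, using only the salary figures quoted above (the rupee-to-pound conversion is the article’s own):

```python
# Back-of-the-envelope comparison of the pay gap described above.
# Both figures are the article's own; the INR->GBP conversion is implied
# by its quoted equivalence (2.72 lakh rupees ~= GBP 2,632).
uk_salary_gbp = 22_700     # Cambridge "data associate" starting salary, per annum
india_salary_gbp = 2_632   # Chennai/Hyderabad equivalent, per annum

pay_ratio = uk_salary_gbp / india_salary_gbp
print(f"A UK data associate earns roughly {pay_ratio:.1f}x an Indian counterpart")
# → roughly 8.6x, for the same category of work
```

Roughly an eight-to-nine-fold gap, before even accounting for differences in contract security.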

And these are just the data associates with full-time contracts. A great deal of ghost work is done on short-term contracts, and the work of translation, correction, and moderation that builds AI insights is a frequent side-hustle for graduates who possess the relevant critical skills but cannot find full-time work.

All of this exposes the dark underbelly of large, Western-organised projects for the expansion of digital economies in the Global South. A report by the International Finance Corporation (IFC) suggested that the digital economy in Africa might be worth $712 billion by 2050. This is a very impressive development goal, but it measures prosperity in terms of GDP, which notoriously ignores other indicators of development, such as whether individuals will be able to find meaningful, well-paid, secure jobs. As it is, the IFC report identifies Africa’s growth as the natural consequence of “entrepreneurship” and seems to imply that the benefits will naturally ‘trickle down’ to the individual.

But as we have seen in other regions of the Global South, the Western corporations and governments leading this drive for new prosperity are practising the same exploitative behaviours as their colonial predecessors: they are providing work, yes, but frequently work without the expectations of security that are present in equivalent jobs in the Global North. “Ghost work” is often a monotonous form of labour which exploits the relative lack of power of developing countries to introduce regimes of work that are both poorly paid and unrewarding. And, given the way that Western AI companies present their products as the pure work of artificial intelligence systems, there is yet another insidious commonality with colonial production: erasing or undermining the efforts of the human workers who keep the systems running.

Other Sources:

https://arxiv.org/pdf/2007.04068.pdf

https://script-ed.org/article/algorithmic-colonization-of-africa/

https://www.amazon.jobs/en/teams/alexa-data-services

https://www.linkedin.com/jobs/view/data-associate-at-amazon-2431577136/

https://www.bbc.co.uk/news/world-africa-46520946


Joseph Geldman is an MSt English student at Oxford, interested in ethics, environment, and politics.