Pandemic Panopticon: Exporting China’s AI Surveillance State

Lizzie Hughes
surveillance and society
3 min read · Aug 15, 2024

In this post, Elise Racine reflects on her piece ‘The Far-reaching Implications of China’s AI-powered Surveillance State Post-COVID’, which appeared in the 21(3) special issue of Surveillance & Society on AI & Surveillance.

Anton Grabolle / Better Images of AI / AI Architecture / CC-BY 4.0

As a doctoral candidate at the University of Oxford, my work focuses on the societal impacts, policy implications, and fundamental rights challenges of emerging data sources and technologies within global health. I am thrilled to share some of my findings in “The Far-reaching Implications of China’s AI-powered Surveillance State Post-COVID,” published in Surveillance & Society’s special issue on AI & Surveillance.

The article examines how the COVID-19 pandemic has accelerated the digitalization of public health practices, fostering a new class of AI-powered surveillance technologies. It focuses on China’s Alipay Health Code contact tracing application, used in 200+ cities to automatically determine contagion risk from personal data. While such tools have legitimate public health uses, they also carry the potential for serious harms, including:

1. Mission creep: Reports suggest the app has been misappropriated by authorities in Henan province to curtail protesters’ mobility, raising concerns about usage beyond its original purpose and its possible incorporation into China’s extensive surveillance apparatus. This could set dangerous precedents for state-sponsored automated social control, undercutting democratic ideals by curtailing dissent under the guise of promoting collective wellbeing.

2. Global implications: Through its Belt and Road Initiative, China is exporting not only AI-enabled surveillance technologies but political know-how and securitizing logic to countries where democracy is fragile or non-existent, essentially propagating its model of digital authoritarianism.

3. Health securitization: The shift in global health governance towards securitization may facilitate the normalization of these surveillance systems beyond the pandemic.

4. Biopolitical consequences: These developments have significant ramifications for power dynamics and individual rights, potentially reinforcing disparities, augmenting inequities, and fueling justifications for discrimination.

I conclude by emphasizing the need for more scholarship on the role of AI systems in current political-economic arrangements, particularly in how they may alter the world order in ways that target marginalized and minoritized groups.

The article is, ultimately, the product of the convergence of two things: my passion for this topic and the exciting opportunity to present on it at the Public Tech Leadership Collaborative (PTLC)’s salon series on the misuse of data/tech for surveillance.

My passion stems from over a decade of research, beginning during my undergraduate years at Stanford University. I spent several summers with the Tibetan refugee community in Dharamsala, India, investigating the Tibetan self-immolations, including the relationship between protest activity and state-sanctioned violence and repression (e.g., the Chinese government’s attempts at masking human rights violations by controlling ICTs).

I built upon this knowledge at the London School of Economics, analyzing how authorities in China’s Xinjiang region have collected vast amounts of biometric data from the Uyghur population, ostensibly for free health checks, and co-opted these systems to surveil and control the ethnic minority. Knowing that exceptional circumstances like public health emergencies can accelerate the deployment of powerful technologies while bypassing democratic oversight, I have delved deeper into these issues throughout the COVID-19 pandemic and my time at Oxford (e.g., vaccination/immunity passports).

A peer-learning collaborative committed to ensuring that data and technology serve the public interest, PTLC was a natural platform to share this work. Drafting the case study was a wonderfully collaborative experience, emerging from conversations with the Program Director, Charley Johnson, and another PTLC member, in which we refined and sharpened the focus. I then presented the case study during a two-part salon series, receiving valuable feedback that illuminated new perspectives. The salons provide private spaces for scholars, practitioners, and government leaders to collectively sense-make, cultivate trust-based relationships, and have exploratory, intentionally facilitated discussions about complex issues.

The iterative process inspired me to write the opinion piece, while the far-reaching implications of the developments we deliberated underscore the importance of examining such topics in multistakeholder forums like PTLC. For those interested in gaining a more comprehensive understanding of the pressing challenges posed by AI-enabled surveillance, I encourage you to explore Surveillance & Society. Your engagement can help foster the dialogue necessary to develop solutions that uphold dignity and fundamental rights.

Lizzie Hughes
Associate Member Representative, Surveillance Studies Network