Big Tech Won’t Save Us: The Case for Social Transformation over Coronavirus Surveillance

Surveillance and structural inequality go hand-in-hand; in the era of coronavirus it will be no different.

As the COVID-19 pandemic spreads across the globe, experts in every country are desperate to stop it in its tracks. Anxious about the economic fallout of entire countries on lockdown, governments are more eager than ever to partner with tech companies to roll out technology for containment measures such as contact tracing. In Poland and India, stay-at-home orders are enforced with the help of mandated selfie uploads confirmed by facial recognition software and geotagging, while China is supplementing similar measures with ‘smart’ thermal scanners. In the U.S., Palantir has been contracted to build a database to track the spread of COVID transmission.

The irony of this approach isn’t lost on the many people still reeling from a decade marked by tech scandals involving data leaks and the mishandling of sensitive information. Many have likened the surge in surveillance as a pandemic response to the Patriot Act, introduced in 2001 after 9/11. While it’s easy to point out the various ways that surveillance tech unilaterally infringes on civilians’ privacy, much of the current coverage of this issue fails to consider 1) the actual efficacy of this technology in the first place, and 2) the underlying structural issues that surveillance will only heighten if implemented.

Despite the pristine advertising, the purported benefits of the technologies deployed for surveillance purposes, whether artificial intelligence (AI) or other algorithmic systems, are routinely overinflated. Facial recognition systems, for example, have failed to overcome fundamental challenges of pose, illumination, and occlusion, which contribute to high false positive rates. Deep neural nets mislabel images in ways that reveal a fundamental lack of understanding of abstract concepts such as race, class, and gender.

Technologies in this field, much like the humans who create them, invariably carry biases and inaccuracies, and these disproportionately impact historically marginalized groups. AI-enabled surveillance technologies used in policing, for example, have been demonstrated to systematically misidentify black and brown people as criminals. U.S. Immigration and Customs Enforcement (ICE) risk assessment software reportedly removed the option to release detained individuals. AI-enabled surveillance technologies used in healthcare have been demonstrated to deprioritize the lives of black and brown people seeking medical treatment. It’s no coincidence that there are so many links between neo-Nazis, massive digital surveillance, and the tech industry. There is no reason to believe these biases will disappear when the same technologies are deployed in the coronavirus response.
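
To make the stakes concrete, here is a minimal sketch of how such disparities are typically surfaced in an algorithmic audit: a system can look accurate in aggregate while producing very different false positive rates across groups. The data below is invented purely for illustration; real audits such as Gender Shades apply this kind of per-group breakdown at scale.

```python
# A minimal sketch of a per-group error-rate audit. The records here are
# made up for illustration, not drawn from any real system.
from collections import defaultdict

# (group, true_label, predicted_label) for a hypothetical face-matching system
predictions = [
    ("group_a", 0, 0), ("group_a", 0, 0), ("group_a", 0, 1), ("group_a", 1, 1),
    ("group_b", 0, 1), ("group_b", 0, 1), ("group_b", 0, 0), ("group_b", 1, 1),
]

false_positives = defaultdict(int)
negatives = defaultdict(int)
for group, truth, pred in predictions:
    if truth == 0:  # only true negatives can be falsely flagged
        negatives[group] += 1
        if pred == 1:
            false_positives[group] += 1

for group in sorted(negatives):
    rate = false_positives[group] / negatives[group]
    print(f"{group}: false positive rate = {rate:.0%}")
```

Even in this toy example, one group is wrongly flagged twice as often as the other, a disparity that a single overall accuracy number would hide entirely.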

In both the short and the long term, communities of color disproportionately bear the brunt of the structural flaws that plague the surveillance technologies being proposed today. Digital misidentification or exclusion can result in restricted access to essential resources; in the context of a pandemic, this may well mean death. This debate is about more than anonymity and privacy: many computer scientists grant that tech-based interventions like the contact tracing protocol developed by Apple and Google are cryptographically sound (see the sketch below). The important question facing society is whether a world in which movement is regulated through tech is actually what we want.
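
For readers unfamiliar with what that soundness means, the sketch below illustrates the core idea of decentralized exposure notification: phones broadcast short-lived identifiers derived from a secret daily key, and matching happens entirely on-device. This is a simplified illustration in the spirit of the Apple/Google design; the key sizes, derivation labels, and rotation interval are assumptions, not the actual specification.

```python
import hmac
import hashlib
import os

def daily_key() -> bytes:
    """Generate a random per-day key that never leaves the phone."""
    return os.urandom(16)

def rolling_id(day_key: bytes, interval: int) -> bytes:
    """Derive a short-lived broadcast identifier from the daily key.

    The identifier reveals nothing about the key, and identifiers from
    different intervals cannot be linked without the key itself.
    """
    msg = b"ROLLING-ID" + interval.to_bytes(4, "big")
    return hmac.new(day_key, msg, hashlib.sha256).digest()[:16]

# Phones broadcast rolling IDs over Bluetooth and record the IDs they hear.
# If a user tests positive, only their daily keys are published; everyone
# else re-derives the corresponding rolling IDs locally and checks for a
# match, so no central server ever learns who met whom.
key = daily_key()
broadcast_today = [rolling_id(key, i) for i in range(144)]  # e.g. one per 10 minutes

heard = {rolling_id(key, 37)}  # pretend we overheard this ID nearby
exposed = any(rid in heard for rid in broadcast_today)
print("possible exposure:", exposed)
```

The point stands regardless of the cryptography: even a genuinely privacy-preserving design leaves open the question of whether tech-mediated regulation of movement is desirable at all.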

The abuse of personal data after a crisis is well-documented in humanitarian interventions. In the context of mapping informal settlements in developing countries, increasing the visibility of marginalized communities has led to harassment and to policies that force residents out of their homes. The same pattern has been documented in the U.S.: in the aftermath of Hurricane Katrina, African Americans who lived in public housing found that their homes had been replaced, by their own Congressman, with condos priced out of their reach. Given governments’ and corporations’ track records of data abuse, it is not a stretch to imagine that information collected now about the consumer habits, mobility patterns, or social interactions of marginalized communities might later be used to expand state control and profiling for non-health purposes. For communities of color, surveillance technologies offer a dismal risk-to-reward ratio that could fundamentally reshape the relationship between governments and their people.

“We must resist the use of dubious technology to cheat our way back to the status quo and instead prioritize the most vulnerable in our communities.”

The pro-tech argument generally goes: even if a system doesn’t work perfectly, lives saved are lives saved, and it’s better than nothing. But although some lives might be saved, entire communities are put at heightened risk by the blind spots produced by the inherent biases of AI-enabled surveillance technologies. A flaw in a proximity tracing app that produces false positives, for example, will produce more of them for certain kinds of users, particularly workers in the health and service sectors (a back-of-the-envelope illustration follows below). Consider the millions of hourly workers, grocery store employees, nurses, delivery drivers, housekeepers, and gig workers, on the front lines of the pandemic and at high risk of infection and community spread. Despite now being labeled “essential” (what some consider mythmaking that sugarcoats exploitation), these workers have historically been deemed less valuable and poorly compensated, leaving them far less financially stable. These front line workers are often members of communities of color and are therefore particularly at risk. A faulty app has real consequences for them: false positives can bar them from work and income, while false negatives can mean the denial of potentially life-saving warnings and services. Surveillance measures that unilaterally target those most likely to interact with COVID carriers will inevitably lead to middle and upper class white communities breathing a sigh of relief as their social distancing measures are relaxed, while working class communities and communities of color continue to struggle without the same resources.
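
As a back-of-the-envelope sketch, suppose every logged contact carries the same small chance of a spurious exposure flag. The rates and contact counts below are illustrative assumptions, not measured values, but they show how even a uniform flaw concentrates its burden on high-contact frontline workers.

```python
# Sketch: how a uniform per-contact false-positive rate burdens
# high-contact workers most. All numbers are illustrative assumptions.

FALSE_POSITIVE_RATE = 0.01  # chance a single logged contact is wrongly flagged

def weekly_false_alert_prob(contacts_per_day: int, days: int = 7) -> float:
    """Probability of at least one spurious exposure alert in a week."""
    total_contacts = contacts_per_day * days
    return 1 - (1 - FALSE_POSITIVE_RATE) ** total_contacts

for label, contacts in [("remote office worker", 2),
                        ("grocery store clerk", 60),
                        ("hospital nurse", 120)]:
    print(f"{label}: {weekly_false_alert_prob(contacts):.0%} chance of a false alert per week")
```

Under these made-up numbers, a remote worker sees a false alert about one week in eight, while a nurse is all but guaranteed one every single week, and each alert can mean lost shifts and lost income.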

The efficacy of the surveillance applications that governments around the world are deploying is limited not only by technical glitches, but also by deep-seated social factors. COVID-mitigating applications like health surveys or contact tracing only work if a critical mass of the population uses them (see the sketch below), and marginalized groups largely remain excluded, by design and by choice. In terms of necessary infrastructure, not everyone has a smartphone: two in ten Americans don’t own one, and those who don’t are disproportionately members of COVID-vulnerable groups such as the elderly, the homeless, and communities of color. Another prerequisite for benefiting from technology-based services is trust in the government and in the relevant companies. Many communities of color hold a historically entrenched skepticism toward government surveillance and health interventions. A mere nod to the Tuskegee syphilis experiment or the not-so-distant U.S. deportation campaign supported by Palantir Technologies suffices to explain why members of non-white communities might be reluctant to share their information, much less have their movements tracked. And Palantir, the same company that supercharged mass deportation, is now contracting with the U.S. and U.K. governments for COVID response.
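
Why does critical mass matter so much? A contact is only logged when both parties run the app, so the share of traceable contacts falls off roughly with the square of adoption. A minimal sketch, assuming uniform adoption and random mixing:

```python
# Sketch: contact tracing coverage scales with adoption squared, because
# BOTH parties in a contact must run the app. Figures are illustrative.

for adoption in (0.2, 0.4, 0.6, 0.8):
    coverage = adoption ** 2
    print(f"{adoption:.0%} adoption -> {coverage:.0%} of contacts traceable")
```

At 40% adoption, under these simplifying assumptions, only about one in six contacts is even visible to the system, and every group excluded by missing smartphones or missing trust drags that number lower.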

“Integrating mobile apps that exploit location data to track whether someone may have come in contact with a COVID carrier does not add more hospital beds, doctors or nurses in poverty stricken areas.”

Even if we were willing to ignore all of these practical problems with the technology and its application, the naive focus on surveillance tech as a panacea for COVID containment fails to address the root of the issue: structural inequality. Reinforcing the apparatus of state surveillance does not change the fact that people of color, in particular black and indigenous communities, are overrepresented in both COVID cases and deaths. Integrating mobile apps that exploit location data to track whether someone may have come in contact with a COVID carrier does not add more hospital beds, doctors, or nurses in poverty-stricken areas. Pouring funding into technology that can detect COVID infection from vocal signatures doesn’t provide the social safety nets that allow people to take time off from work when they are ill. All of these issues exacerbate the spread of the current pandemic in ways that fundamentally cannot be addressed by surveilling the population. Surveillance technology relies on robust and easily identifiable definitions of who is at risk and who is a public health threat; the problem is that these definitions are always socially contingent, and will ultimately exploit and exacerbate the existing structural inequities in our society.

If the social cost of implementing surveillance technologies to combat coronavirus is too high, this raises the question of alternatives. How do we adequately control the spread of the pandemic without relying on measures that disproportionately harm those most vulnerable? The answer lies not just in an aggressive response to the pandemic, but in an aggressive response to structural inequality. Relying on surveillance as a solution also neglects the opportunity cost: the resources devoted to developing these systems could be allocated very differently.

“The answer lies not just in an aggressive response to the pandemic, but in an aggressive response to structural inequality.”

Experts across the board agree that social distancing is a necessary part of the pandemic response, but it is only sustainable in the long term if people have social safety nets that make staying home viable. Without guaranteed paid sick leave, guaranteed housing, free and accessible meals, robust child care, and other social services, people are far less able to shelter in place. We also need to push for the expansion of hospital networks and for free and accessible COVID testing and treatment.

The profound needs societies are confronted with are not unique to this pandemic or limited to epidemiology, nor should the search for solutions start and stop here. What we are seeing is the full horror of historic inequality and marginalization. Merely acknowledging this past amounts to the same inaction that has left communities of color bearing the disproportionate impact of this crisis. Above all, we need to dismantle the crisis of imagination in the global north: the assumption that solutions to social problems must be more amenable to profits than to the needs of the people. We must resist the use of dubious technology to cheat our way back to the status quo and instead prioritize the most vulnerable in our communities.

Written by Sholei Croom, Franziska Putz, P. M. Krafft, & Nathalie Fernandez

On behalf of the Movement for Anti-Oppressive Computing Practices and NoTechForTyrants.
