Surveillance Pandemic: Privacy Risks and Racial Dynamics of Digital Contact Tracing

The YX Foundation
The YX Foundation Journal
Oct 8, 2020

by Sakshi Garg

Mentor: Prof. Anita Say Chan; Student Editor: Alexis Queen

At the beginning of August, the Harvard Crimson reported that the university had begun piloting “Tracefi,” a Wi-Fi-based contact tracing program, in preparation for the upcoming semester (Cobb). Although Tracefi’s website stresses that all data collected contains no personally identifiable information unless a patient tests positive for COVID-19, many cybersecurity experts raised concerns about the invasiveness of the program (“How It Works”). In interviews with the Crimson, Georgetown research fellow Tim Hwang noted that Wi-Fi tracing is “notoriously vulnerable to cybersecurity problems” and questioned whether Tracefi’s massive amounts of data could be utilized for other purposes, while Harvard faculty defended the program relative to the contact tracing efforts of other universities.

These concerns exemplify the current debate over efforts to digitize contact tracing. Contact tracing is a necessary measure against a virus as transmissible as the coronavirus, but can such tracing technology be built without being instrumentalized for invasive, racialized surveillance practices?

We have already begun seeing the criminalization of Black populations within COVID-19 mitigation efforts. Leaked data from the New York City Mayor’s Office showed that more than 90 percent of the people arrested by the NYPD for violating social distancing orders were Black or Latinx (Offenhartz). The Brooklyn District Attorney’s Office also released data showing that, of 40 social distancing arrests, only one person was white (Burns).

Likewise, in April, Detroit’s Assistant Police Chief stated that law enforcement would use risk terrain modeling, a predictive policing technique developed at Rutgers University, to expand the city’s pre-existing surveillance program, Project Greenlight (PGL), after the city’s mayor announced $1,000 fines or six months in jail for people flouting Gov. Gretchen Whitmer’s shelter-in-place orders (Fussell; A Critical Summary). Created in 2016, PGL has local businesses install surveillance cameras that feed directly to Detroit’s Real Time Crime Center, which is heavily equipped with facial recognition software and reportedly possesses ID photos of nearly every resident of Michigan (Egan). Now this software will be used to virtually patrol impoverished neighborhoods for congregating crowds, digitally identify faces, and then issue tickets and arrests.

Several civil liberties advocates have warned about the slippery slope of accepting such surveillance programs during a pandemic. They are also concerned with equity: these arrests, fines, and surveillance technologies disproportionately affect people of color, in sharp contrast to the nationwide (and heavily white) armed demonstrations and protests against those very same social distancing orders (Gross).

These racial disparities in the policing of public health have immense potential to transfer over to tech-assisted contact tracing programs. Mobile contact tracing has already begun overseas, and so have violations of privacy. An Amnesty International study of 11 countries flagged apps in Norway, Bahrain, and Kuwait as particularly invasive due to their collection and storage of GPS data and identifying information. The Norwegian app even continuously uploaded user data to a central server, requiring its users to also agree to their personal information being used for research purposes. It has since been suspended.

Similarly, in mid-March, the Israeli counterterrorism agency Shin Bet launched a digital contact tracing program relying on a classified database known as “the Tool,” which collects metadata (location, calls, texts) from cellular providers on every person using phone services in Israel (Amit et al. 1127). Based on “the Tool,” the Israeli Health Ministry sends a text to everyone who has come within two meters of a patient during the 14 days preceding the diagnosis. The Tool had been in existence for 18 years, gathering data on Israeli citizens, but was only revealed to the public after its deployment in the contact tracing program. Considering that decentralized alternatives like the Hamagen app had also been developed for contact tracing, researchers at the Israel Democracy Institute concluded that there was no justification for the Tool’s use and its “massive blow to democracy and to human rights in Israel” (Altshuler).
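To make the centralized/decentralized contrast concrete, here is a minimal Python sketch of how decentralized designs in the spirit of apps like Hamagen can work. It is purely illustrative: the class and method names are my own, not any real app’s API, and real protocols add cryptographic key schedules, Bluetooth signal handling, and time windows. The key point survives the simplification: phones exchange random, unlinkable tokens, and exposure matching happens on the device, so no central database of who met whom ever exists.

```python
import secrets

class Phone:
    """Toy model of a decentralized contact tracing client.
    Illustrative only; not the API of any real app."""

    def __init__(self):
        self.sent_tokens = []       # random ephemeral IDs this phone broadcast
        self.heard_tokens = set()   # IDs overheard from nearby phones

    def broadcast(self):
        # A fresh random token each interval: unlinkable to identity or location.
        token = secrets.token_hex(16)
        self.sent_tokens.append(token)
        return token

    def hear(self, token):
        # Record tokens from phones that came within range.
        self.heard_tokens.add(token)

    def report_positive(self):
        # On diagnosis, only the patient's own random tokens leave the device.
        return list(self.sent_tokens)

    def check_exposure(self, published_tokens):
        # Matching happens locally; no server learns who met whom.
        return any(t in self.heard_tokens for t in published_tokens)

# Alice and Bob come within range and exchange tokens; Carol stays home.
alice, bob, carol = Phone(), Phone(), Phone()
bob.hear(alice.broadcast())
alice.hear(bob.broadcast())

# Alice tests positive and publishes only her own tokens.
published = alice.report_positive()
assert bob.check_exposure(published)        # Bob was nearby
assert not carol.check_exposure(published)  # Carol was not
```

Compare this with the centralized design above: “the Tool” required cellular providers to hand a government agency the location, call, and text metadata of every phone user, whether or not they were ever exposed.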

Similar violations would be all too possible in the United States, where app development is still underway. The blurring of lines between contact tracing and surveillance was made especially evident during the early days of Black Lives Matter protests in response to the police killing of George Floyd in Minneapolis. Minnesota law enforcement claimed to be using contact tracing to track protestors, a claim it later retracted (Hu). Whether or not the statement was true, Julie Bartkey, a spokesperson for the Minnesota Department of Health (MDH), told Vox in an interview that the MDH has no policies explicitly forbidding law enforcement from accessing information collected by contact tracers or tools (Morrison). The lack of legislation and accountability measures surrounding these contact tracing developments is thus especially concerning in the current climate of heightened fears of police violence and targeted surveillance.

These fears are only compounded when examining which private employers and tech companies have a hand in the development of tech-assisted contact tracing. In Michigan, a $1 million contract to run the state’s contact tracing efforts was recently awarded to Rock Connections LLC, a company backed by billionaire and private security contractor Dan Gilbert (Ciak). Gilbert’s 500-camera mass surveillance system in downtown Detroit served as the very model for Project Greenlight’s development in 2016. In this way, digital contact tracing efforts could serve as a prime opportunity for private corporations that profit financially from increased incarceration and policing in Black communities to expand surveillance networks across American cities.

However well-meaning, contact tracing apps carry very material risks when deployed without refined accountability and oversight mechanisms. From Facebook sharing user data with Cambridge Analytica to Edward Snowden’s exposure of uninhibited government surveillance, concern for privacy breaches and misuse of data is at an all-time high (Gurumurthy). If data from contact tracing apps lives on after the pandemic, private corporations, governments, and law enforcement could have easy access to information on people’s whereabouts or healthcare needs. Current privacy rules regarding healthcare data don’t protect Americans from such intrusions, and typical regulations are even being eased to give physicians flexibility to deal with rapidly increasing cases of COVID-19 (Halper). This would enable tech giants to curate profiles of people’s medical histories, exposing them to targeted marketing and potentially unintended discrimination from insurance companies looking to raise their health premiums.

This scope of data sharing could have severe racialized implications as well. Relying on contact tracing apps could enable law enforcement to investigate those who may have COVID-19, providing another avenue with which to increase policing, detain undocumented immigrants, and surveil (and silence) protestors. Some politicians have taken action to prevent such weaponization, with the New York State Senate passing legislation prohibiting police and immigration enforcement from viewing contact tracing data (“Civil Rights”).

While such legislation is definitely a step in the right direction, it is imperative for cities to entirely reconsider their use of surveillance technologies given their impact on communities already facing widening disparities amidst the coronavirus pandemic. Current responses have turned issues of public health and wellbeing into law enforcement problems by criminalizing communities for being under-resourced. Cities like Detroit have invested millions of dollars into surveillance technology to punish those in low-income communities who aren’t social distancing, rather than addressing the factors causing people to be outside. “Where else are you going to go if you go to a home that doesn’t have resources? That doesn’t have water, may not have electricity, no access to real food, no income coming in?” Tawana Petty, director of the Data Justice Program for the Detroit Community Technology Project, remarked in an interview with WXYZ Detroit (Gross). The bias magnified by these COVID-19 responses is especially apparent when predominantly white conservatives publicly protest wearing masks with no legal repercussions. Instead, these surveillance programs end up punishing Black communities for existing structural disparities and years of disinvestment from local governments.

Alternatively, some large cities are operating highly effective contact tracing programs without the use of technology. In Massachusetts, the prominent international nonprofit Partners in Health (PIH) has developed the Community Tracing Collaborative, or CTC (Wallace-Wells). The program borrows much of its model from PIH’s work training community health workers to build durable medical systems in some of the poorest areas of the world. It emphasizes building trust within communities and even provides care-resource coordinators to help solve problems, like the need for food and shelter, that often prevent people from isolating.

The Harvard campus is not immune to this culture of policing. The Harvard University Police Department (HUPD) has a well-documented history of racist and sexist conduct both internally and externally, with reports of excessive force against Black people on campus as recently as February of this year (Schumer; Xu). With HUPD’s heavy presence on campus, it is vital for students to think critically about the implications of digital contact tracing initiatives and associated privacy concerns. But beyond the university, in an increasingly digital world rife with opportunities for abusive surveillance, we must all pay close attention to the interactions of budding technologies, privacy rights, and policing in order to work toward a more equitable future.

Sources


The YX Foundation is a coalition dedicated to community engagement at the intersection of deep technology and critical race theory.