Pandemic Politics: SKYNET, protests and the misuse of wartime surveillance technologies

John R. Emery

The COVID-19 pandemic has brought existing shortcomings of governance to the fore, with the White House at the centre of questions on whether the US government will be able to adapt to ongoing challenges. Image Credit: John Loo via Flickr.

In August, International Affairs teamed up with the Future Strategy Forum (FSF) for the ‘Pandemic Politics’ series on US politics and the COVID-19 pandemic. International Affairs’ 50:50 in 2020 initiative is partnering with the FSF to support its mission of amplifying the expertise of women and sharing the insights of PhD students on COVID-19 and grand strategy; the military; and democracy.

This week in Pandemic Politics, Julie George’s introduction, as well as Katrina Ponti’s, Dakota Foster’s, and John R. Emery’s blogposts discuss COVID-19, democracy and governance.

SKYNET was once best known as the artificial superintelligence that serves as the main antagonist in the 1984 film The Terminator, but the name has taken on a new relevance in the past decade. SKYNET also refers to a real machine-learning program used in the covert CIA targeted-killing campaign in Pakistan to identify ‘legitimate targets’ for drone strikes. The program was brought to light by The Intercept in 2015, after the publication received leaked documents, including a 2012 NSA PowerPoint, describing a cloud-based behaviour analytics program built on Pakistani mobile phone metadata. In a morbid example of techno-fetishization, the program rests on a ‘ridiculously optimistic’ machine-learning algorithm that could, it was claimed, predict someone’s probability of ‘terroristness’. SKYNET pieces together daily routines (where you travel, who you travel with, whether you power down or swap SIM cards frequently) and then uses an algorithm to ‘predict’ whether your behaviour makes you statistically worthy of targeting in a drone strike, based solely on your ‘pattern of life’. These means of war are disturbing enough when used internationally, but when SKYNET comes home, it raises the prospect of near-total surveillance of protests.
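Why is such an algorithm ‘ridiculously optimistic’? The core problem is base rates: when actual targets are a vanishingly small fraction of a surveilled population, even a seemingly accurate classifier will flag far more innocent people than real targets. The sketch below illustrates this arithmetic; every number in it is hypothetical, chosen only to show the shape of the problem, not drawn from the leaked documents.

```python
# Illustrative base-rate arithmetic (all numbers hypothetical): why a
# "highly accurate" classifier applied to an entire population still
# mislabels enormous numbers of innocent people.

population = 55_000_000       # hypothetical: mobile users scanned
true_targets = 100            # hypothetical: actual individuals sought
false_positive_rate = 0.001   # hypothetical: 0.1% of innocents flagged
true_positive_rate = 0.99     # hypothetical: 99% of real targets flagged

innocents = population - true_targets
false_alarms = innocents * false_positive_rate   # innocents wrongly flagged
hits = true_targets * true_positive_rate         # real targets caught

# Of everyone the algorithm flags, what fraction is actually a target?
precision = hits / (hits + false_alarms)

print(f"innocents wrongly flagged: {false_alarms:,.0f}")
print(f"share of flagged people who are real targets: {precision:.2%}")
```

Under these made-up but generous assumptions, tens of thousands of innocent people are flagged, and well under one per cent of those flagged are actual targets — a statistical reality that no amount of metadata can wish away.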

Communities of colour in the US have long been the testing ground for ‘predictive policing’ technologies that have failed spectacularly and contribute to the disproportionate targeting of the black community. This pattern has continued in the official response to the recent Black Lives Matter (BLM) protests after the murder of George Floyd. In particular, we have witnessed an increased willingness to deploy SKYNET-style surveillance technologies previously reserved for overseas military use.

One of the most visible of these developments has been a sharp spike in aerial surveillance of protestors. During the BLM protests, US Customs and Border Protection (CBP) flew Predator drones over a number of cities, including Minneapolis. Reminiscent of SKYNET practices, it was reported that the FBI may have flown a Cessna Citation jet ‘equipped with “dirtboxes”, equipment that can collect cell phone location data,’ over Washington DC. This type of aerial mass data collection is more than likely paired with machine-learning algorithms that correlate the images with social media posts and (controversially) facial recognition software to determine one’s liability for arrest. Indeed, Buzzfeed News obtained documents showing that the Drug Enforcement Administration (DEA) was given sweeping authority to ‘conduct covert surveillance’ and collect intelligence on protestors. Thus, with SKYNET-style analytics and geofence warrants — reverse location searches that sweep up information on every device in an area — freedom of assembly was effectively criminalized and mass surveillance ensued.

In recent years, broad data collection in the US has increasingly come under the spotlight. Many practices remain unregulated, if only because of their novelty. However, the 2018 Supreme Court ruling in Carpenter v. United States barred the government from accessing mobile phone location data without a warrant. Chief Justice John Roberts wrote in the decision that ‘we decline to grant the state unrestricted access to a wireless carrier’s database of physical location information.’

Nevertheless, in the case of immigration and border enforcement, the Trump administration has sought to circumvent the spirit of the law by buying access to user location data on the open market. According to the Wall Street Journal, the federal government ‘has bought access to a commercial database that maps the movements of millions of cellphones in America’, harvested from weather, gaming and e-commerce apps, as if CBP and ICE were marketing companies. This ‘near perfect surveillance’ was the subject of a disturbing New York Times investigation that demonstrates the perils of circumventing judicial oversight. Location data are crude measurements devoid of context. Relying on them to infer wrongdoing can easily produce false accusations, and they allow authorities to disrupt peaceful protest by identifying, tracking, and even arresting protestors after they return home.

It is difficult to disentangle the role of the US empire abroad and racial injustice at home. Mass data collection on communities of colour is not ‘objective and neutral’, but perpetuates the systemic injustice upon which these algorithms of oppression are built. While we must draw boundaries around new technologies as they arise, humans and machines are entangled in complex webs of socio-technical relations.

Mobile phone videos have been critical in shedding light on police brutality, but given the rise of military-style surveillance in the domestic sphere, one of the few weapons in the protestors’ arsenal could now lead to algorithm-driven persecution. SKYNET has resulted in innocent deaths when applied abroad, and we should be deeply worried about its use at home. It is up to us to push for more democratic accountability to dismantle and abolish the means of war being turned on our own communities. Assumptions of technological superiority ignore the complexity of socio-technical interactions and the inability to deduce individual innocence or guilt using Big Data alone.

John R. Emery received his PhD in political science from the University of California, Irvine. He is an incoming Stanton Nuclear Security Fellow at Stanford University’s Center for International Security and Cooperation (CISAC). His work on drones, counter-terrorism, and ethics has been published in Critical Military Studies, Ethics & International Affairs, and Peace Review.

This series is made possible by The Center for Strategic and International Studies (CSIS), the Henry A. Kissinger Center for Global Affairs at the Johns Hopkins School of Advanced International Studies (SAIS) and the Bridging the Gap Project (BtG).
