Diversity & Inclusion in Surveillance AI

Code Societies 2018, Day 7, with Sarah Aoun

written by ruth.wtf

Working at the intersection of tech, data and human rights, Sarah Aoun joined us for a session of Code Societies to sharpen our understanding of surveillance in contemporary culture. Living between Brooklyn and Beirut, Sarah is currently a 2017–2018 Ford-Mozilla Open Web Fellow.

Sarah began by posing the question: what is facial recognition technology?

https://www.youtube.com/watch?v=t4DT3tQqgRM

Reviewing the reading material assigned for this session, we familiarized ourselves with the work of Joy Buolamwini, founder of the Algorithmic Justice League, a collective that aims to highlight algorithmic bias. Buolamwini refers to the racial biases in facial recognition technology as “the coded gaze” and notes that “calls for tech inclusion often miss the bias that is embedded in written code”. This is likely because many people (perhaps not us at Code Societies) believe technology to be objective and unbiased compared to humans. However, it’s evident that machine bias stems from non-diverse training datasets, which is a human error. And because these systems are distributed through shared, widely used libraries and software, they can spread unregulated and unaudited for accuracy.
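To make the idea of an accuracy audit concrete, here is a minimal sketch (not from the session) of the kind of disaggregated evaluation Buolamwini’s work popularized: instead of reporting one overall accuracy number, measure accuracy per demographic subgroup and look at the gap. The labels, group names, and data below are invented purely for illustration.

```python
# Illustrative sketch: disaggregated accuracy audit for a classifier.
# The predictions, labels, and group tags below are invented toy data.
from collections import defaultdict

def audit_by_group(y_true, y_pred, groups):
    """Return per-subgroup accuracy and the worst-case gap between subgroups."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for truth, pred, group in zip(y_true, y_pred, groups):
        totals[group] += 1
        hits[group] += int(truth == pred)
    accuracy = {g: hits[g] / totals[g] for g in totals}
    gap = max(accuracy.values()) - min(accuracy.values())
    return accuracy, gap

# Toy example: a classifier that looks fine overall but fails one subgroup.
y_true = ["f", "f", "m", "m", "f", "m", "f", "m"]
y_pred = ["f", "m", "m", "m", "m", "m", "f", "m"]
groups = ["darker", "darker", "lighter", "lighter",
          "darker", "lighter", "darker", "lighter"]

per_group, gap = audit_by_group(y_true, y_pred, groups)
print(per_group)                  # {'darker': 0.5, 'lighter': 1.0}
print(f"accuracy gap: {gap:.0%}") # a 50% gap hidden inside 75% overall accuracy
```

The point of the sketch is that a single aggregate metric can mask exactly the disparities Buolamwini documented; auditing means always breaking the numbers out by subgroup before calling a system accurate.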

https://www.youtube.com/watch?v=UG_X_7g63rY

An interesting counterpoint to Joy’s work is detailed in Nabil Hassein’s article Against Black Inclusion in Facial Recognition. Nabil questions “whose interests would truly be served by the deployment of automated systems capable of reliably identifying Black people” and is confident that “no technology under police control will be used to hold police accountable or to benefit Black folks or other oppressed people”. He supports this with the example of biased risk-assessment software used in the American justice system to predict the likelihood of incarcerated folks committing future crimes.

When there is a shift in who operates new technologies at large, tools that were once used for good can be perverted. Software that was once intended to unlock equality too often ends up being manipulated to unlock great wealth and, correspondingly, power.

Sarah presented three tenets of software creation:

  • who codes matters
  • how we code matters
  • why we code matters

We began discussion with the prompt: when we’re thinking about how to make things inclusive, we should ask the question — what harm can the technology do?

This discourse generated some answers and some more questions:

  • Changing the narrative around surveillance toward stronger, more resilient communities
  • Could we have AI motivated by care and love?
  • Assuming AI is borderless, how does it resolve global conflict? How does it interface with the entire world?
  • The importance of having different people in the room: the more diverse the group, the less likely unconscious bias is to creep in. It also encourages smaller upstarts to try AI implementations
  • Destruction/hacking of systems vs. inclusion and re-massaging of systems

In order to navigate how to operate AI systems responsibly, we looked to the history of surveillance, not as a merely post-Snowden occurrence but as a foundation the US was built on, woven into history through codes such as the Blood Quantum laws, a set of race-based rules governing land and territory. Also noted was the Louisiana practice of having enslaved people overseen by working-class white people: by giving those who might otherwise have found solidarity with the enslaved a position of power over them, slaveholders were able to keep both groups under tighter control. Another example raised was the 18th-century Lantern Laws, which required non-white people in New York City to carry a lantern at all times after sunset. This is uncomfortably reminiscent of a currently growing practice of installing all-night floodlights in low- and moderate-income areas whose residents are overwhelmingly Black or Latino.

Left: minutes of a meeting on the Lantern Laws. Right: floodlights installed in low-income areas.

We also studied the more recent case of the New York municipal ID cards. Not being a New Yorker or even an American myself, I wasn’t familiar with this situation, and hearing about it felt like a heaviness I hadn’t been expecting. In 2015 the mayor of New York City launched IDNYC, a municipal identity card system intended to help undocumented people get a foothold in the city without needing formal ID. During enrolment, the city administration retained the details of everyone who applied for the card, including photocopies of passports, driving licenses, birth certificates and other personal documents. Since coming into office, the Trump administration has been attempting to access this data, presumably to serve as a hit list for deportations.

Launch of the IDNYC cards in 2015.

After further group discussion of these issues, the session ended with a quote:

“The master’s tools will never dismantle the master’s house” — Audre Lorde