An interpretation of Rodin’s “The Thinker” atop Ruha Benjamin’s stunning book.

Reflections on Ruha Benjamin’s “Race After Technology”

“The animating force of the New Jim Code is that tech designers encode judgements into technical systems but claim that the racist results of their designs are entirely exterior to the encoding process. Racism thus becomes doubled — magnified and buried under layers of digital denial.” (p. 11)

“Ironically, this problem of misrecognition actually reflects a solution to a difficult coding challenge. A computer’s ability to parse Roman numerals, interpreting an “X” as “ten,” was a hard-won design achievement. That is, from a strictly technical standpoint, “Malcolm Ten Boulevard” would garner cheers.” (p. 79)
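
To make that mechanism concrete, here is a minimal sketch of my own (it is not from the book) of how a naive address normalizer could produce exactly this error, assuming it expands any token made entirely of Roman-numeral letters with no awareness that an “X” might be part of a name:

```python
# A hypothetical, naive street-name normalizer: any token composed
# only of Roman-numeral letters is treated as a number to spell out.
ROMAN = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}
WORDS = {1: "One", 2: "Two", 3: "Three", 4: "Four", 5: "Five",
         6: "Six", 7: "Seven", 8: "Eight", 9: "Nine", 10: "Ten"}

def roman_to_int(token: str) -> int:
    """Standard subtractive Roman-numeral parsing (e.g., "IX" -> 9)."""
    total = 0
    for ch, nxt in zip(token, token[1:] + " "):
        value = ROMAN[ch]
        # Subtract when a smaller numeral precedes a larger one.
        total += -value if ROMAN.get(nxt, 0) > value else value
    return total

def normalize_street_name(name: str) -> str:
    """Expand every all-Roman-numeral token into a number word."""
    out = []
    for token in name.split():
        if all(ch in ROMAN for ch in token):
            out.append(WORDS.get(roman_to_int(token), token))
        else:
            out.append(token)
    return " ".join(out)

print(normalize_street_name("Malcolm X Boulevard"))  # Malcolm Ten Boulevard
```

The Roman-numeral parsing here works perfectly, and that is precisely Benjamin’s point: a hard-won, technically correct solution can still encode a judgement about whose names count as names.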

“If we consider that institutional racism in this country is an ongoing unnatural disaster, then crime prediction algorithms should more accurately be called crime production algorithms.” (p. 83)

“The ethnoracial makeup of the software design team, the test photo databases, and the larger population of users influence the algorithms’ capacity for recognition, though not in any straightforward sense.” (p. 112)

“There are already reports of citizens being denied welfare services, including children unable to receive school lunches when their Aadhaar could not be authenticated. In this way the New Jim Code gives rise to digital untouchables.” (p. 133)

“So, are robots racist? Not if by “racism” we only mean white hoods and racial slurs. Too often people assume that racism and other forms of bias must be triggered by an explicit intent to harm; for example, linguist John McWhorter argued in Time magazine that “[m]achines cannot, themselves be racists. Even equipped with artificial intelligence, they have neither brains nor intention.” But this assumes that self-conscious intention is what makes something racist.” (p. 59)

Throughout the book, Benjamin dissects four recurring arguments used to justify discriminatory technology:

  1. Arguing technology rises above subjectivity (even though technology is itself subjective),
  2. Arguing that personalization frees us from stereotyping (even though personalization techniques rely on stereotyping),
  3. Advocating for merit over prejudice (even though methods of measuring merit are themselves prejudiced),
  4. Elevating prediction as a tool for social progress (even though prediction techniques amplify historical inequities by relying on historical data).

“The power of the New Jim Code is that it allows racist habits and logics to enter through the backdoor of tech design, in which the humans who create the algorithms are hidden from view.” (p. 160)

“Justice, in this sense, is not a static value but an ongoing methodology that can and should be incorporated into tech design. For this reason, too, it is vital that people engaged in tech development partner with those who do important sociocultural work honing narrative tools through the arts, humanities, and social justice organizing.” (p. 193)

These same four arguments appear in academic publishers’ refusals to correct my name on my past publications:

  1. “The historical integrity of the archive would be eroded.” (Technology rises above subjectivity.) Publishers view the academic archive as a kind of objective record, something independent and separate from the subjectivity of society and its future. This is, of course, no different than the argument that Southerners make for keeping racist statues of Confederate Civil War leaders: our racist history is more important than your inclusion and voice.
  2. “We won’t let you change your name, but you get the privilege of choosing between having separate identities, linked identities, or just using your initials.” (Personalization frees us from stereotyping.) Just as with the personalization of race, control is an illusion: the algorithm, the database schema, and the developer remain in charge of our identities, rather than ourselves. This is no different than the developers of racist recidivism prediction algorithms saying, “No, you don’t get a say in the predictions we make, but don’t worry: we’re basing our predictions on people just like you.”
  3. “We can’t change your name because then academic search engines would get your citation counts wrong.” (Merit over prejudice.) Here, the publishers argue that my merit as a scholar—the reductive, biased measure of my citation count—is more important than letting me use my name, as if the counts aren’t already wrong because I’m publishing under my new name. This is no different than college admissions committees saying to a Black student, “Don’t worry, we only rely on objective measures of intelligence, like the SAT and ACT,” as if such tests aren’t themselves prejudiced.
  4. “We can’t change your name, because then the web of citations will break, limiting our ability to mine it for insights.” (Prediction as a tool for social progress.) Here, the publishers envision the graph of citations as a tool for progress: for helping scholars find links, for helping search engines organize knowledge, and even for helping algorithms make new discoveries, as if these forms of predictive data mining are somehow a path to my justice. This is no different than telling a Black prisoner, “We can’t shorten your sentence, because then who would do all of this sub-minimum-wage labor that funds the prisons?”

Amy J. Ko

Professor of programming + learning + design + justice at the University of Washington Information School. Trans; she/her. #BlackLivesMatter.