An interpretation of Rodin’s “The Thinker”, atop Ruha Benjamin’s stunning book.

Reflections on Ruha Benjamin’s “Race After Technology”

Amy J. Ko
Bits and Behavior

--

For many years, the intersection of race and technology has been a personal, but not professional curiosity. I’d sigh every time a web form asked for my racial or ethnic identity, but wouldn’t let me express my Scandinavian/Asian identity. I’d chuckle every time someone on Twitter would call me a White person because the lighting in my headshot hid my color. I’d bristle at discussions of racial representation in technology, with well-meaning people talking about the predominance of White and Asian people, but forgetting there are White/Asian people too, who are often excluded and othered by both groups. My personal experience was one of mild inconvenience, a lingering sense of not being seen for who I am, and a lifelong sense of the racial and ethnic erasure that came with my immigrant families’ early twentieth century assimilation.

But over the past few years, especially as I began to accept my transness, I started to notice more of the seams in my largely White-identified experience. It wasn’t just that web forms excluded my race, but my gender identity too. When I came out as a woman, strangers online started angrily dismissing my assertiveness as bitchiness, apparently because of my name and longer hair. I battled a thousand databases and their keepers for the right to change my name. Stressing the gender dimension of my intersectional position had fragmented my technological experiences further, revealing the exclusionary structures woven through society and its tools. I began to see, through my own eyes, that the bias against my womanhood, my transness, my sexuality, and my racial identity, were not separate social phenomena, but one: a bias towards White, straight, cisgender men.

Now, I’m not the kind of social justice warrior that wants to frame White, straight, cisgender men as the enemy. I have so many friends who are White, straight, cisgender men! (Sorry, bad racism joke.) So let me explain further: it’s not that such men are inherently racist, sexist, or transphobic. Most are just ignorant. And not ignorant as in unintelligent, but ignorant as in simply unaware of how the world is built and optimized for them and by them, and few others. This ignorance is unfortunate for many reasons: they control the world, they design the world, they police the world, and they get most of the world’s attention and praise. And because they grow up seeing that men like them own the world, many are also gifted an immense (over)confidence in the rightness of their views and actions. Throughout history, this has created an immensely challenging social tension: the very people who are doing the excluding do not know they are doing it and are resistant to demands for change.

One reason I understand this experience is because I presented as a White, straight, cisgender man for decades. I know what it felt like to feel like the world was built for me; well, actually, I didn’t, because I didn’t feel it at all. The world just was, and I was in it, and everything I seemed to want or aspire to seemed to be available to me. I didn’t look Asian enough to experience much racism except in hot childhood summers when I darkened, and I hid my femininity well enough that it only led to a bit of bullying as a child. And so the world was everything I wanted it to be: opportunity, an audience, and a playground for my ideas. Moving from cis to trans, hetero to homo, and man to woman broke all of that, and made the world’s bias against me visible, in technology, in law, and in society.

Ruha Benjamin’s book, Race After Technology, attempts to make this visible, but in a much more sweeping and objective way, weaving a quilt of culture, media, experience, and research that covers the entirety of racial phenomena in technology. She shows, in ways I only understood subjectively, how the central bias in the origins of the United States has been built into nearly every structure imaginable, and in ways that have never been more invisible.

While the book does not convey a central thesis explicitly, it conveys one nonetheless. It begins by building upon a few key histories of race in America. The first, slavery, needs little discussion, as a racist sin or as the foundation of America’s economic prosperity and the poison seed in its constitutional framing. After emancipation, slavery evolved into Jim Crow laws, which systematically denied nearly every human right to Black Americans including voting, property, jobs, equal access to public services, and even their freedom, for crimes as simple as loitering. After Jim Crow was dismantled by the Civil Rights movement came the New Jim Crow, where political leaders, no longer able to explicitly deny Black Americans rights, turned to implicitly denying them, through unequal policing, mandatory sentencing, and the war on drugs, robbing the freedom of 1 in 3 Black men in their lifetimes. While Benjamin does not dwell on these histories, she clearly invokes them, suggesting how America’s law and culture have always, at best, had anti-Black disregard, if not anti-Black disdain and disgust.

What Benjamin adds to this history is what she calls a New Jim Code. If slavery was total racism, Jim Crow was explicit legal racism, and the New Jim Crow was implicit legal racism, the New Jim Code is computational racism. Her argument, in essence, is that in the exact same ways that racism has been encoded into behavior and law, it is now built into technology. Except: unlike the earlier forms of racism in our country, and much like the New Jim Crow, people write racism into code often unknowingly. And unknowingly, as I noted above, because the people writing code are often cis hetero White men, ignorant of who they are excluding by mindlessly creating things that only serve themselves.

But Benjamin’s idea is much more subtle than just a critique of who codes. A key idea lurking in her tapestry is that the people who create technology follow a common pattern of using what we see in people, what we hear about people, and what we observe them do, as a proxy for who they really are and how they’ll behave in the future. It’s the abstractions in computation inherent in data and algorithms in which the racism lurks, erasing identity, erasing difference, and erasing agency. The book is therefore not a critique of software developers, but a critique of software itself, as a medium that is inescapably unjust in its stereotyping, modeling, and computation of human diversity. In her words:

“The animating force of the New Jim Code is that tech designers encode judgements into technical systems but claim that the racist results of their designs are entirely exterior to the encoding process. Racism thus becomes doubled — magnified and buried under layers of digital denial.” (p. 11)

The book is bursting with examples. In natural language processing for Google Maps lurks an ignorance of Black history:

“Ironically, this problem of misrecognition actually reflects a solution to a difficult coding challenge. A computer’s ability to parse Roman numerals, interpreting an “X” as “ten,” was a hard-won design achievement. That is, from a strictly technical standpoint, “Malcom Ten Boulevard” would garner cheers.” (p. 79)
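It’s worth pausing on how little code it takes to produce this failure. Here is a minimal sketch, entirely my own invention and assuredly nothing like Google’s actual pipeline, of the kind of text-normalization rule that earns engineering cheers while knowing nothing of Black history:

```python
import re

# A hypothetical text-normalization table (not Google's actual code): expand
# Roman numerals into spoken numbers so "Henry VIII" reads as "Henry the Eighth".
ROMAN_WORDS = {"I": "One", "II": "Two", "III": "Three", "IV": "Four",
               "V": "Five", "X": "Ten"}

def normalize(text: str) -> str:
    # Naively treat any standalone token made of Roman-numeral letters as a
    # number, with no check for names or cultural context.
    def expand(match: re.Match) -> str:
        return ROMAN_WORDS.get(match.group(0), match.group(0))
    return re.sub(r"\b[IVX]+\b", expand, text)

print(normalize("Malcolm X Boulevard"))  # → "Malcolm Ten Boulevard"
```

The rule is “correct” by its own narrow specification, which is exactly Benjamin’s point: the judgment about what “X” means was encoded, and the harm was declared exterior to the encoding.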

Crime prediction algorithms, when used to inform policing, create a tight loop of Black incarceration:

“If we consider that institutional racism in this country is an ongoing unnatural disaster, then crime prediction algorithms should more accurately be called crime production algorithms.” (p. 83)
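That “tight loop” is simple enough to sketch. The toy simulation below (my own illustration, not Benjamin’s) gives two neighborhoods identical true crime rates but a historically skewed arrest record, then deploys patrols in proportion to past arrests; because arrests track patrols rather than crime, the record never corrects itself:

```python
# A toy feedback-loop simulation (my own illustration, not from the book).
true_crime = [1.0, 1.0]   # two neighborhoods with identical underlying crime rates
arrests = [60.0, 40.0]    # historical record already skewed by over-policing

for _ in range(10):
    total = sum(arrests)
    # Deploy patrols proportional to where arrests have happened before...
    patrols = [a / total for a in arrests]
    # ...and record new arrests where the patrols are, not where the crime is.
    new = [p * c for p, c in zip(patrols, true_crime)]
    arrests = [a + n for a, n in zip(arrests, new)]

print([round(a / sum(arrests), 2) for a in arrests])  # → [0.6, 0.4]
```

Despite equal crime, the initial 60/40 disparity in the data persists indefinitely: the prediction produces the very evidence that appears to justify it.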

Photography, in all its attempts to capture reality, is calibrated for Whiteness:

“The ethnoracial makeup of the software design team, the test photo databases, and the larger population of users influence the algorithms’ capacity for recognition, though not in any straightforward sense.” (p. 112)

The exclusionary caste system of India is reified in India’s government identity system, Aadhaar:

“There are already reports of citizens being denied welfare services, including children unable to receive school lunches when their Aadhaar could not be authenticated. In this way the New Jim Code gives rise to digital untouchables.” (p. 133)

These examples, and the hundreds of others in the book, are less about the specific injustices encoded into technology, and more about the pervasiveness of White ignorance of difference. As Benjamin says in her introduction:

“So, are robots racist? Not if by “racism” we only mean white hoods and racial slurs. Too often people assume that racism and other forms of bias must be triggered by an explicit intent to harm; for example, linguist John McWhorter argued in Time magazine that “[m]achines cannot, themselves be racists. Even equipped with artificial intelligence, they have neither brains nor intention.” But this assumes that self-conscious intention is what makes something racist.” (p. 59)

Benjamin ends the book by observing, across all of the examples she analyzes, that the New Jim Code emerges from four rhetorical moves:

  1. Arguing technology rises above subjectivity (even though technology is itself subjective),
  2. Arguing that personalization frees us from stereotyping (even though personalization techniques rely on stereotyping),
  3. Advocating for merit over prejudice (even though methods of measuring merit are themselves prejudiced),
  4. Elevating prediction as a tool for social progress (even though prediction techniques amplify historical inequities by relying on historical data).

Of course, all four of these ideas do more than fail to avoid racism—they encapsulate it, masking it from view, behind private enterprise, inside impenetrable algorithms:

“The power of the New Jim Code is that it allows racist habits and logics to enter through the backdoor of tech design, in which the humans who create the algorithms are hidden from view.” (p. 160)

Benjamin closes the book with a review of critical, justice-focused tools for abolishing these racist beliefs: an approach to design that prioritizes equity over efficiency and social good over market imperatives, justice audits of designs, and data usage guidelines to combat bias. She leaves readers with an approach:

“Justice, in this sense, is not a static value but an ongoing methodology that can and should be incorporated into tech design. For this reason, too, it is vital that people engaged in tech development partner with those who do important sociocultural work honing narrative tools through the arts, humanities, and social justice organizing.” (p. 193)

You might be wondering why I started this reflection talking about myself. After all, much of what I’ve learned about injustice from being out about my gender, sexuality, and transness was not about race: I am still White and Asian, and still perceived as such (whatever sloppy social construct “such” is). How could Benjamin’s observations about race and technology possibly have anything to do with my gendered marginalization?

This is why Benjamin’s book is so powerful: while the dominant force behind injustice and exclusion in America is race, the methods of the New Jim Code apply just as readily to other facets of identity, and often simultaneously. For example, consider the injustice of not being able to change my name on prior publications with most academic publishers (including ACM, though it has plans). This injustice, while it affects people who have changed their name for other reasons, harms transgender scholars most. However, why are publishers so reluctant to allow name changes? Their rationale has little to do with trans rights. In fact, their arguments against allowing change align with Benjamin’s four rhetorical moves of the New Jim Code:

  1. “The historical integrity of the archive would be eroded.” (Technology rises above subjectivity.) Publishers view the academic archive as a kind of objective record, something that is independent and separate from the subjectivity of society and its future. This is, of course, no different than the argument that Southerners make for keeping racist statues of Confederate Civil War leaders: our racist history is more important than your inclusion and voice.
  2. “We won’t let you change your name, but you get the privilege of choosing between having separate identities, linked identities, or just using your initials.” (Personalization frees us from stereotyping.) Just as with the personalization of race, control is an illusion: the algorithm, the database schema, and the developer remain in charge of our identities, rather than ourselves. This is no different than the developers of racist recidivism prediction algorithms saying, “No, you don’t get a say in the predictions we make, but don’t worry: we’re basing our predictions on people just like you.”
  3. “We can’t change your name because then academic search engines would get your citation counts wrong.” (Merit over prejudice.) Here, the publishers argue that my merit as a scholar—the reductive, biased measure of my citation count—is more important than letting me use my name, as if the counts aren’t already wrong because I’m publishing under my new name. This is no different than college admissions committees saying to a Black student, “Don’t worry, we only rely on objective measures of intelligence, like the SAT and ACT”, as if such tests aren’t themselves prejudiced.
  4. “We can’t change your name, because then the web of citations will break, limiting our ability to mine it for insights.” (Prediction as a tool for social progress.) Here, the publishers envision the graph of citations as a tool for progress: for helping scholars find links, for helping search engines organize knowledge, and even for helping algorithms make new discoveries, as if these forms of predictive data mining are somehow a path to my justice. This is no different than telling a Black prisoner, “We can’t shorten your sentence, because then who would do all of this sub-minimum wage labor that funds the prisons?”

Thus, the key arguments behind the racism of the New Jim Code are the very same tools that exclude and oppress me. In this way, resisting the New Jim Code is not fundamentally about resisting the specific choices that technologists make (although that work is critical), but about resisting the New Jim Code’s ideas about what constitutes progress. My oppression comes from these oppressive ideas, therefore Black liberation is my liberation.

Of course, while I can find affinity with the racial justice movement, I am not the greatest victim of racism. I was not born into it. I do not live its everyday oppressions. It is not woven through my every interaction with people and technology. I am merely someone who can now see the privilege I had because I gave it up, and will spend the rest of my life trying to get it back. Because of Benjamin’s book, it’s increasingly clear to me that the shortest path to this restoration is demanding racial justice.

#BlackLivesMatter

--


Professor, University of Washington iSchool (she/her). Code, learning, design, justice. Trans, queer, parent, and lover of learning.