Race After Technology by Ruha Benjamin

Jasmine Phelps
Published in DesignThinkingfall · Nov 29, 2021

As the weather cools down and the option to be indoors under a blanket looks more and more promising, I find myself scrolling through the myriad of streaming services I am subscribed to, looking for something to watch. Naturally, I gravitate towards my tried and true, Netflix. Part of the appeal, albeit a little creepy, is the fact that the algorithm continues to learn my viewing habits and suggests content I could potentially enjoy. But have you ever considered that the show thumbnail you see won’t necessarily look the same on others’ profiles? Subconsciously, I think I had begun to notice the differences, but it was truly brought to light for me by the show Maid. I recalled the show’s artwork in my personal feed and had heard buzz about the show on social media, though those posts highlighted other characters.

As I sat at a friend’s house getting ready to binge as much as my body would allow, I went to search for the show and noticed something that had previously sat in my subconscious: the difference in artwork. Was the cover art I see on my dashboard somehow being marketed to me based on my race? While I have enjoyed Anika Noni Rose’s (the woman in my thumbnail) work before, that is certainly not to the point that her photo alone would get me to watch the show. The same thing happened with the series You, where I was shown a cover photo of Tati Gabrielle, aka Marienne, rather than the main cast members, despite the fact that I had already watched seasons 1 and 2 and likely wouldn’t need much of a push for season 3. This is not unique to me; Netflix has explained, “Given the enormous diversity in taste and preferences, wouldn’t it be better if we could find the best artwork for each of our members to highlight the aspects of a title that are specifically relevant to them?” Sure, your gut may want you to say yes, but ultimately logic tells me that way of thinking can be a slippery slope.
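Netflix has written publicly about personalizing title artwork, and one common way to frame that kind of per-profile selection is as a contextual bandit: show a variant, watch whether the viewer clicks, and lean toward whatever has worked for “people like them.” The sketch below is my own rough illustration of that idea, not Netflix’s actual system; every name in it, the viewer segment, the artwork variants, and the exploration rate, is made up.

```python
# Illustrative sketch only (not Netflix's code): per-profile artwork
# selection framed as a simple epsilon-greedy contextual bandit.
import random
from collections import defaultdict

ARTWORK_VARIANTS = ["main_cast", "anika_noni_rose", "margaret_qualley"]
EPSILON = 0.1  # fraction of the time we explore a random variant

# Running click and impression counts per (viewer segment, variant).
clicks = defaultdict(int)
impressions = defaultdict(int)

def choose_artwork(segment: str) -> str:
    """Pick the variant with the best observed click-through rate for
    this viewer segment, exploring a random variant EPSILON of the time."""
    if random.random() < EPSILON:
        return random.choice(ARTWORK_VARIANTS)

    def ctr(variant: str) -> float:
        shown = impressions[(segment, variant)]
        return clicks[(segment, variant)] / shown if shown else 0.0

    return max(ARTWORK_VARIANTS, key=ctr)

def record_outcome(segment: str, variant: str, clicked: bool) -> None:
    """Update the running statistics after the viewer reacts."""
    impressions[(segment, variant)] += 1
    if clicked:
        clicks[(segment, variant)] += 1

# Toy usage: whatever signals feed the "segment" decide what each
# profile is shown, which is exactly where bias can creep in.
variant = choose_artwork(segment="profile_1234_viewing_history")
record_outcome("profile_1234_viewing_history", variant, clicked=True)
```

Nothing in a loop like this needs to mention race explicitly for the outcomes to sort viewers along racial lines; whatever correlates with the segment signal quietly becomes the basis for what each of us sees.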

Examples like the one above only begin to raise an eyebrow. Race After Technology by Professor Ruha Benjamin of Princeton University, as she puts it, “examines the relationship between machine bias and systemic racism, analyzing specific cases of ‘discriminatory design’ and offering tools for a socially-conscious approach to tech development.” I was immediately drawn to the title of this book, which prompted a quick Google search to determine the subject matter. Currently, I work in the tech education space, specifically around serving those who have limited access to high-quality STEM education, which often means working with students from demographic groups that are underrepresented in STEM fields. Race After Technology appeared to be an interesting blend with my passion for DE&I in STEM, especially when it comes to preparing students to have careers in those fields. I felt that this book could provide insight into design practices that I was perhaps not privy to. Further, my purpose in pursuing my Master’s was to harmonize my business background with the language of engineering and eventually provide programs to students that speak both languages as technology becomes increasingly interwoven with business.

Overall, I really enjoyed the book and found it useful, especially going forward in my career. There were points where the book felt like a research paper with esoteric language, but I still walked away feeling like I learned a lot and expanded my thinking. Ruha Benjamin breaks down this extensive topic, which she calls the New Jim Code, into four dimensions: Engineered Inequity, Default Discrimination, Coded Exposure, and Technological Benevolence. These dimensions interrogate the notion that new technologies are superior because they are neutral, objective, and without bias. Benjamin counters that argument with a myriad of examples from both the past and present, showing that the application of these new technologies tends to reflect existing inequities. For the purposes of this post, I will be highlighting two of the dimensions.

Default discrimination is loosely defined in the book as discrimination that develops from socially and historically ignorant design practices. An example in the book discussed a situation in 2013 where a driver using Google Maps heard the voice navigation say to “turn right on Malcolm Ten Boulevard,” interpreting the X in the street name as the Roman numeral rather than as the name of the civil rights leader the street was justifiably named after (a short sketch below illustrates how a rule like that could produce exactly this misreading). “This illustrates how innovations reflect the priorities and concerns of those who frame the problems to be solved, and how such solutions may reinforce forms of social dismissal, regardless of the intentions of the individual programmers” (p. 79). Rather than treating cases like this as isolated glitches in the system, we should look at them from a much broader perspective, as symptoms of a systemic problem.

Secondly, coded exposure scrutinizes the use of surveillance technology and the experience of “being watched (but not seen)” (p. 47). In contrast to the previous example, which points to the invisibility of Black culture in technology, coded exposure can be the opposite: a hyper-visibility of Black people and people of color. Black people are disproportionately targeted and observed by police and law enforcement. The book cites these racial surveillance methods and numerous others, including the use of AI to infer potential criminality from someone’s appearance, which are used by police yet are ultimately poor at actually distinguishing Black faces. Benjamin writes, “This is why we must separate ‘intentionality’ from its strictly negative connotation in the context of racist practices and examine how aiming to ‘do good’ can very well coexist with forms of malice and neglect” (p. 61).
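Returning to the Malcolm Ten example for a moment: I don’t know how Google’s navigation voice actually normalizes street names, but a naive, hypothetical rule that expands any standalone run of Roman-numeral letters into a spoken number would produce exactly this misreading. The sketch below is purely illustrative, and every detail in it is my own assumption.

```python
# Purely hypothetical illustration, not Google's code: a naive text
# normalizer that expands standalone Roman-numeral tokens before a
# street name is read aloud.
import re

ROMAN_TO_SPOKEN = {"I": "One", "II": "Two", "III": "Three",
                   "IV": "Four", "V": "Five", "X": "Ten"}

def naive_spoken_form(street_name: str) -> str:
    """Replace every standalone Roman-numeral token with a spoken number,
    with no awareness of names like 'Malcolm X'."""
    def expand(match: re.Match) -> str:
        token = match.group(0)
        return ROMAN_TO_SPOKEN.get(token, token)
    return re.sub(r"\b[IVX]+\b", expand, street_name)

print(naive_spoken_form("George V Avenue"))      # "George Five Avenue" (the intended case)
print(naive_spoken_form("Malcolm X Boulevard"))  # "Malcolm Ten Boulevard" (the failure)
```

Whether or not the real system works anything like this, the point stands: a default that was never tested against Black cultural references fails quietly, and the failure reads as a one-off glitch instead of the design choice it is.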

All in all, this book was an enlightening and modern look at the intersection of race and technology, one that I would encourage anyone, in business, STEM, or otherwise, to read.
