Gender, Race, and Power in AI

A Playlist

Gender, Race, and Power in AI is the product of a year-long survey of literature at the nexus of gender, race, and power in the field of artificial intelligence. Our study surfaced some astonishing gaps, but it also made clear that scholars of diverse gender and racial backgrounds have been sounding the alarm about inequity and discrimination in artificial intelligence for decades.

We are concerned that in the rush to diagnose and solve ‘new’ problems, this critical scholarship is being overlooked, when it deserves far greater attention. So we’re offering up what we like to think of as a playlist: some of the greatest hits and deep cuts from the literature on gender, race, and power in AI. By sharing the work that has inspired us, we hope that others might read along with us.

Lastly, there is a large field of literature on these topics, so this is only a small sample of favorites. If you would like to signal boost the work of others (or your own!), please fill out this form. We’ll do our best to add relevant work periodically to keep it fresh as our research continues.


Adam, A. (1998). Artificial Knowing: Gender and the thinking machine. London: Routledge.

Ahmed, S. (2012). On Being Included: Racism and Diversity in Institutional Life. Durham: Duke University Press.

Amaro, R. (n.d.). As If. E-Flux. Retrieved from https://www.e-flux.com/architecture/becoming-digital/248073/as-if/.

Banet-Weiser, S. (2018). Empowered: Popular Feminism and Popular Misogyny. Durham: Duke University Press.

Bivens, R. (2017). The gender binary will not be deprogrammed: Ten years of coding gender on Facebook. New Media & Society, 19(6): 880–898.

Bolukbasi, T., Chang, K., Zou, J., Saligrama, V., and Kalai, A. (2016). Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings. Retrieved from https://arxiv.org/abs/1607.06520.

Browne, S. (2015). Dark Matters: On the Surveillance of Blackness. Durham: Duke University Press.

Bruce, M. and Adam, A. (1989). Expert Systems and Women’s Lives: A Technology Assessment. Futures, 21(5): 480–497.

Buolamwini, J.A. (2017). Gender Shades: Intersectional Phenotypic and Demographic Evaluation of Face Datasets and Gender Classifiers. Retrieved from https://dam-prod.media.mit.edu/x/2018/02/05/buolamwini-ms-17_WtMjoGY.pdf

Buolamwini, J. and Gebru, T. (2018). Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification. Proceedings of Machine Learning Research, 81: 1–15.

Chun, W.H.K. (forthcoming). Queerying Homophily, in Apprich, C., Chun, W.H.K., Cramer, F., and Steyerl, H. (Eds). Pattern Discrimination. Minneapolis: University of Minnesota Press.

Cockburn, C. (1981). The Material of Male Power. Feminist Review, 9: 41–59.

Costanza-Chock, S. (2018, Jul. 27). Design Justice, A.I., and Escape from the Matrix of Domination. Journal of Design and Science. Retrieved from https://jods.mitpress.mit.edu/pub/costanza-chock

Crenshaw, K. (1991). Mapping the Margins: Intersectionality, Identity Politics, and Violence against Women of Color. Stanford Law Review, 43(6): 1241–1299.

Criado Perez, C. (2019). Invisible Women: Data Bias in a World Designed for Men. New York: Abrams Press.

Daniels, J. (2019, Apr. 3). “Color-blindness” is a bad approach to solving bias in algorithms. Quartz. Retrieved from https://qz.com/1585645/color-blindness-is-a-bad-approach-to-solving-bias-in-algorithms/.

Edwards, L. (1995). Modelling Law Using a Feminist Theoretical Perspective. Law, Computers, & Artificial Intelligence, 4(1): 95–111.

Edwards, P. (1990). The Army and the Microworld: Computers and the Politics of Gender Identity. Signs, 16(1): 102–127.

Ensmenger, N. (2015). Beards, Sandals, and Other Signs of Rugged Individualism: Masculine Culture within the Computing Professions. Osiris, 30(1): 38–65.

Eubanks, V. (2018). Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. New York: Macmillan Books.

Forsythe, D. (2002). Studying Those Who Study Us: An Anthropologist in the World of Artificial Intelligence. Stanford: Stanford University Press.

Gaboury, J. (2018). Becoming NULL: Queer relations in the excluded middle. Women and Performance, 28(2). Retrieved from https://www.womenandperformance.org/bonus-articles-1/jacob-gaboury-28-2/.

Gould, S.J. (1981). The Mismeasure of Man. New York: W.W. Norton & Company.

Greenbaum, J. (1995). Windows on the Workplace: Computers, Jobs, and the Organization of Work in the 20th Century. New York: Monthly Review Press.

Halberstam, J. (1991). Automating Gender: Postmodern Feminism in the Age of the Intelligent Machine. Feminist Studies, 17(3): 439–460.

Hall, W. and Lovegrove, G. (1988). Women and AI. AI & Society.

Hamidi, F., Scheuerman, M.K. and Branham, S.M. (2018). Gender Recognition or Gender Reductionism? The Social Implications of Automatic Gender Recognition Systems. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. Retrieved from https://dl.acm.org/citation.cfm?id=3173582.

Haraway, D. (1988). Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective. Feminist Studies, 14(3): 575–599.

Harding, S. (1986). The Science Question in Feminism. Ithaca: Cornell University Press.

Hicks, M. (2017). Programmed Inequality: How Britain Discarded Women Technologists and Lost Its Edge in Computing. Cambridge: MIT Press.

Hutchinson, J. (2001). Culture, Communication, and an Information Age Madonna. IEEE Professional Communication Society Newsletter, 45(3): 1–7. Retrieved from http://www.lenna.org/pcs_mirror/may_june01.pdf.

Keyes, O. (2018). The Misgendering Machines: Trans/HCI Implications of Automatic Gender Recognition. Retrieved from https://ironholds.org/resources/papers/agr_paper.pdf

Klein, L. and D’Ignazio, C. (2019). Data Feminism. Cambridge: MIT Press. Retrieved from https://bookbook.pubpub.org/pub/dgv16l22

Leavy, S. (2018). Gender Bias in Artificial Intelligence: The Need for Diversity and Gender Theory in Machine Learning. 2018 ACM/IEEE 1st International Workshop on Gender Equality in Software Engineering.

McGlotten, S. (2017). Black Data, in Johnson, P. (Ed.) No Tea No Shade: New Writings in Black Queer Studies. Durham: Duke University Press.

McPherson, T. (2012). “U.S. Operating Systems at Mid-Century: The Intertwining of Race and UNIX,” in Nakamura, L. and Chow-White, P. (Eds.) Race After the Internet. London: Routledge.

Miltner, K. (2018). Girls Who Coded. Science, Technology, & Human Values, 44(1): 161–176.

Mishra, V. and Srikumar, M. (2017). Predatory Data: Gender Bias in Artificial Intelligence, in Saran, S. (Ed.) Digital Debates: CyFy Journal 2017. New Delhi: Observer Research Foundation.

Nakamura, L. (2014). Indigenous Circuits: Navajo Women and the Racialization of Early Electronic Manufacture. American Quarterly, 66(4): 919–941.

Nelson, A. (2016). The Social Life of DNA: Race, Reparations, and Reconciliation After the Genome. Boston: Beacon Press.

Noble, S.U. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. New York: NYU Press.

Oldenziel, R. (1997). Boys and Their Toys: The Fisher Body Craftsman’s Guild, 1930–1968, and the Making of a Male Technical Domain. Technology and Culture, 38(1): 60–96.

Oldenziel, R. (1999). Making Technology Masculine: Men, Women, and Modern Machines in America, 1870–1945. Amsterdam: Amsterdam University Press.

Parsheera, S. (2018). A gendered perspective on artificial intelligence. Proceedings of ITU Kaleidoscope 2018 — Machine Learning for a 5G Future. Retrieved from https://ieeexplore.ieee.org/document/8597618.

Parvin, N. (2019). Look Up and Smile: Seeing through Alexa’s Algorithmic Gaze. Catalyst: Feminism, Theory, Technoscience 5(1): 1–11.

Phillips, A. and Taylor, B. (1980). Sex and Skill: Notes towards a Feminist Economics. Feminist Review, 6: 79–88.

Prescod-Weinstein, C. (n.d.). Diversity is a Dangerous Set-up. Medium. Retrieved from https://medium.com/space-anthropology/diversity-is-a-dangerous-set-up-8cee942e7f22.

Roberts, D.E. (2012). Fatal Invention: How Science, Politics, and Big Business Re-create Race in the Twenty-first Century. New York: The New Press.

Scott, J.W. (1986). Gender: A Useful Category of Historical Analysis. The American Historical Review, 91(5): 1053–1075.

Sengers, P. (1999). Practices for Machine Culture: A Case Study of Integrating Artificial Intelligence and Cultural Theory. Surfaces, 8.

Spiel, K., Keyes, O. and Barlas, P. (2019). Patching Gender: Non-binary Utopias in HCI. CHI ’19 Extended Abstracts, May 4–9, 2019, Glasgow, Scotland, UK. Retrieved from https://katta.mere.st/wp-content/uploads/2019/04/non_binary_preprint.pdf.

Strok, D. (1992). Women in AI. IEEE Expert, 7(4): 7–22.

Suchman, L. (2003). Located Accountabilities in Technology Production. Scandinavian Journal of Information Systems, 14(2): 91–105.

Sweeney, L. (2013). Discrimination in Online Ad Delivery. arXiv. Retrieved from https://arxiv.org/abs/1301.6822.

Taylor, A. (2018). The Automation Charade. Logic, Retrieved from https://logicmag.io/05-the-automation-charade/.

Taylor, K.Y. (2017). How We Get Free: Black Feminism and the Combahee River Collective. Chicago: Haymarket Books.

Varma, R. (2007). Women in Computing: The Role of Geek Culture. Science as Culture, 16(4): 359–376.

Varma, R. (2010). Why so few women enroll in computing? Gender and ethnic differences in students’ perception. Computer Science Education, 20(4): 301–316.

Wachter-Boettcher, S. (2017). Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech. New York: W.W. Norton & Company.

Wajcman, J. (1991). Feminism Confronts Technology. University Park: Penn State University Press.

Wilson, E.A. (2010). Affect & Artificial Intelligence. Seattle: University of Washington Press.

Woods, H.S. (2018). Asking more of Siri and Alexa: feminine persona in service of surveillance capitalism. Critical Studies in Media Communication.

Wynn, A. and Correll, S. (2018). Puncturing the pipeline: Do technology companies alienate women in recruiting sessions? Social Studies of Science, 48(1): 149–164.

Zdenek, S. (2007). ‘Just Roll Your Mouse Over Me’: Designing Virtual Women for Customer Service on the Web. Technical Communication Quarterly, 16(4): 397–430.