‘Achievement and acclaim, we have learned many times over, do not provide immunity to racism, sexism, misogynoir, intimidation, censorship, or haterade’

Joy Buolamwini (Poet of Code), Dr. Timnit Gebru, and Deborah Raji receive the 2020 EFF Pioneer Award. See full acceptance remarks here https://youtu.be/ZVzldpfEhpk?t=7334

Dr. Timnit Gebru, a leading A.I. researcher, was let go from her job at Google earlier this week. This response is written by Joy Buolamwini, the founder of the Algorithmic Justice League, a group focused on equitable and accountable A.I.

Before the headlines, the covers, the blockbuster papers, the awards, and Coded Bias, the feature-length film that glimpses our friendship, Dr. Timnit Gebru, Deborah Raji, and I locked arms in sisterhood. This was a sisterhood formed knowing that as outsiders in academic institutions and emerging researchers exploring the limitations of artificial intelligence, we would need each other.

This week, Timnit was ousted from Google for demanding research integrity, and Deborah was featured in the 2021 Forbes 30 under 30. These cases are examples of how, as highly visible and accomplished Black women, we live at the intersections of privilege and oppression, praise and evisceration. The contrast between the Forbes recognition and Google’s Gebrugate reminded me of how, despite the impact and recognition of our prior research, Amazon attacked us for showing that they, like their peers, sold biased A.I. products. …


The Algorithmic Justice League commends IBM’s decision to stop providing general purpose facial recognition technologies, and calls for next steps: systemic change requires resources. IBM should lead its industry peers, and each company should commit to provide at least one million dollars to support racial justice in the tech sector.

Yesterday, IBM put action behind written principles when they announced a decision to stop providing general purpose facial recognition and analysis technology. In the announcement, IBM CEO Arvind Krishna stated:

“IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms, or any purpose which is not consistent with our values.”


An urgent letter from the Algorithmic Justice League

re(Sisters) Unite. Photo: $han

The Algorithmic Justice League is an organization that combines art and research to illuminate the social implications and harms of artificial intelligence. Our mission is to raise public awareness about the impacts of A.I., equip advocates with empirical research to bolster campaigns, build the voice and choice of most impacted communities, and galvanize researchers, policymakers, and industry practitioners to mitigate A.I. bias and harms. More at https://ajlunited.org.

AJL Family,

We are holding space to grieve, to mourn, and we are also full of righteous anger. The murders of George Floyd, Breonna Taylor, Ahmaud Arbery, Nina Pop, and Tony McDade are only the latest in what feels like an endless chain of police and vigilante violence against Black men, women, children, trans folks, and nonbinary people. …


Pressure on the neck,

protests in the streets,

people standing up,

people shouting down,

rest cannot be found

when the system pulls the trigger

then asks you to be bigger.

Peace will not abound when cries of pain meet tears of gas,

when civil demands for life meet rubber bullets and batons,

when a jog on the block ends your clock,

when bedtime becomes a date with death,

when the crowd watches your killing

because your darker hue is the cue

to dismiss your humanity.

It is time to kill the indifference that permits racism and brutality alongside monumental statements of freedom and justice for all. …



Thank you Chairman Cummings, Ranking Member Jordan, and committee members for the opportunity to testify today.

I am an algorithmic bias researcher based at MIT. I’ve conducted studies showing some of the largest recorded gender and skin type biases in AI systems sold by companies including IBM, Microsoft, and Amazon.

You’ve probably heard facial recognition and related technologies have some flaws.

In one test I ran, Amazon Rekognition even failed on the face of Oprah Winfrey, labeling her male.



Today my op-ed on racial and gender bias in facial analysis and recognition technology was published by TIME magazine. Again, I highlighted that Amazon, like its peers, demonstrated racial and gender bias in the Amazon Rekognition gender classification feature we audited in August 2018. We conducted this audit months after sending preliminary results to the company in June 2018 and receiving no response at the time. The methodology for the audit study, the parameters used, and full instructions to replicate the dataset have been available since last year.
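For readers who want to see what such an audit measures, here is a minimal illustrative sketch in Python. It is not the actual audit pipeline; the data, groups, and classifier outputs are invented for the example. The core idea is simply to disaggregate error rates by intersectional group rather than report one aggregate number.

```python
# Illustrative sketch only: computes gender classification error rates
# disaggregated by intersectional group, the kind of measurement behind
# the audit described above. All data here is invented.
from collections import defaultdict

def disaggregated_error_rates(predictions, labels, groups):
    """Return the error rate for each (skin type, gender) group."""
    errors = defaultdict(int)
    totals = defaultdict(int)
    for pred, truth, group in zip(predictions, labels, groups):
        totals[group] += 1
        if pred != truth:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

# Toy run: aggregate accuracy is 75%, which hides that the only error
# falls on darker-skinned women.
preds  = ["male", "male", "male", "female"]
truths = ["male", "male", "female", "female"]
groups = [("lighter", "male"), ("lighter", "male"),
          ("darker", "female"), ("darker", "female")]
print(disaggregated_error_rates(preds, truths, groups))
# {('lighter', 'male'): 0.0, ('darker', 'female'): 0.5}
```

A single aggregate accuracy number can look acceptable while concealing exactly the disparities that disaggregation reveals.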



Update: “Twenty-six researchers, including Yoshua Bengio, a recent winner of the Turing Award, the industry’s highest honor, called for Amazon to stop selling its Rekognition AI service to police departments in a post on Wednesday. Bengio was joined by Anima Anandkumar, a former principal scientist at Amazon’s cloud division, and staffers from Google, Microsoft Corp., Facebook Inc. and several universities.

The group defended the work of two other AI researchers who found Amazon’s software had much higher error rates when predicting the gender of darker-skinned women in images, compared with lighter-skinned men. Amazon had argued against the results and methodology of that study, authored by the University of Toronto’s Inioluwa Deborah Raji and Joy Buolamwini, a researcher at Massachusetts Institute of Technology.” …


www.safefacepledge.org

Since publishing my MIT research findings on racial and gender bias in facial analysis technology sold by IBM, Microsoft, and Megvii (Face++), technology executives, startup founders, and senior scientists have solicited my help in improving their products and research projects.

I have intentionally focused my efforts on using conclusive research findings to call out issues with facial analysis technology instead of working on short-term technical patches that do not mitigate potential abuses. …


https://www.forbes.com/sites/igorbosilkovski/2018/11/13/30-under-30-in-enterprise-tech-drone-school-ai-accountability-and-buying-cars-from-home/#7ee71d0855c9

As American Thanksgiving approaches, I have much to be grateful for. The week of November 11 has been especially kind to me. I was named a Forbes 30 under 30 honoree on the Enterprise Technology list. As part of their #PublicInterestTech campaign, the Ford Foundation released a video highlighting my MIT thesis findings on gender and racial bias in facial analysis technology from IBM, Microsoft, and Face++. I also had the honor of sharing my research findings and recommendations with the Federal Trade Commission during the seventh of their hearings on Competition and Consumer Protection in the 21st Century.

https://www.fordfoundation.org/about/library/multimedia/fighting-the-coded-gaze-how-we-make-technology-benefit-all/

To end the week, I organized an outing for members of the Center for Civic Media (my academic home) and the Comparative Media Studies program, to view the gallery debut of “AI, Ain’t I A Woman?”. The piece is on display at the Cooper Gallery as part of the “Nine Moments for Now” exhibition curated by Dell Hamilton. …


Soledad O’Brien interviews Joy Buolamwini asking about police use of Amazon Rekognition. View full interview on Matter of Fact TV — https://matteroffact.tv/artificial-intelligence-is-biased-shes-working-to-fix-it/

THE AMAZONIANS WILL NOT SAVE US FROM FML — FAILED MACHINE LEARNING

Reuters broke the story. Sources shared that Amazon built a hiring tool with significant bias against women. If the word “women’s” or the names of certain women’s colleges appeared in a candidate’s resume, the resume was ranked lower. Here was a case of “FML” — Failed Machine Learning — that had the potential to negatively influence job prospects for women while breaking anti-discrimination laws. Concerned Amazonians emerged from rivers of data to provide evidence of bias contamination and efforts towards containment. In this case, they recognized their failed machine learning and acted appropriately by halting the project. The contamination extends beyond Amazon’s internal hiring AI tool. …
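To make the failure mode concrete, here is a minimal, hypothetical sketch of how a resume ranker trained on historically male-skewed hiring data can penalize gendered tokens. The tokens and weights below are invented for illustration; this is not Amazon’s actual model.

```python
# Hypothetical illustration of a learned resume scorer. The weights are
# invented; in a real system they would be learned from past hiring
# decisions, which is how historical bias enters the model.
LEARNED_TOKEN_WEIGHTS = {
    "executed": 1.0,   # verbs over-represented in past (male-skewed) hires
    "captured": 0.5,
    "women's": -2.0,   # token correlated with historically rejected resumes
}

def score_resume(text: str) -> float:
    """Sum learned token weights; higher scores rank the resume higher."""
    return sum(LEARNED_TOKEN_WEIGHTS.get(token, 0.0)
               for token in text.lower().split())

print(score_resume("Executed projects as captain of the chess club"))          # 1.0
print(score_resume("Executed projects as captain of the women's chess club"))  # -1.0
```

Nothing in this sketch takes gender as an input; the penalty rides in on proxy tokens learned from historical data, which is why a system like this can discriminate without ever seeing a protected attribute.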

About

Joy Buolamwini

Founder, Algorithmic Justice League. www.ajl.org | www.poetofcode.com | Telling stories that make daughters of diasporas dream and sons of privilege pause
