Three ways data can shed light on institutional racism

Examples from the Call for Code for Racial Justice

--

“Darkness cannot drive out darkness, only light can do that. Hate cannot drive out hate, only love can do that.”
– Martin Luther King, Washington National Cathedral, March 31, 1968

Darkness is a good way to describe the range of emotion triggered over the last year by high-profile cases of racial injustice in the US and across the globe: anger, frustration, and even despondency that, through the deaths of members of the Black community like George Floyd and Ahmaud Arbery, we see how little has changed in the 50+ years since MLK spoke in Washington.

This same darkness was seen on internal networks in IBM as the Black Lives Matter movement gathered steam. Members of the Black community shared stories of the racism they had faced, and made clear that it was not confined to one segment of the community: people with PhDs described being stopped on the highway or receiving different treatment from customs officials when traveling with white colleagues.

Yet, there was a glimmer of light.

The conversation switched towards the question of what role technology, and specifically data and AI, could play in addressing institutional racial bias. A grassroots initiative emerged. Teams were formed and solutions were sketched out on IBM’s internal cloud, using open source principles like sharing code on GitHub to promote collaboration. In October 2020, the most mature solutions were released through the Call for Code global initiative, which asks developers and problem solvers to fight against the most pressing issues we as a society face.

As part of the Call for Code team, I've had the honor of working on the plans to bring these projects to life. The role of data and AI in these projects is fundamental and significant. Let me share with you some of the remarkable ways the Call for Code for Racial Justice is using this technology for real societal impact.

Analyzing multiple viewpoints when a crime occurs

Often after a crime, people will have different perspectives on what actually occurred. Witness and police accounts may tell different stories, with each party making assumptions based on their vantage point and past experience. These differences are magnified if there is racial bias within the police force or law enforcement agency.

The Incident Accuracy Reporting System shows how natural language processing can collect interpretations of a crime in the format that’s easiest for the end user, such as via speech, rather than writing out a lengthy incident report. The app converts this into text and analyzes all testimonies, hunting for disparities. These are highlighted on a dashboard that can alert legal staff and the community to the most contentious accounts of crimes.
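To make the disparity-hunting step concrete, here is a minimal, dependency-free sketch of the kind of pairwise comparison such a system might run on transcribed testimonies. The function name, the sample accounts, and the use of simple string similarity (rather than the richer NLP the actual app would apply) are all illustrative assumptions, not the project's real implementation.

```python
from difflib import SequenceMatcher

def pairwise_agreement(testimonies):
    """Score every pair of testimony transcripts from 0 (no overlap)
    to 1 (identical), so the least-aligned accounts can be flagged
    for review on a dashboard."""
    scores = {}
    names = list(testimonies)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            ratio = SequenceMatcher(None, testimonies[a], testimonies[b]).ratio()
            scores[(a, b)] = round(ratio, 2)
    return scores

# Illustrative accounts of the same incident.
accounts = {
    "officer": "The suspect ran a red light and refused to stop.",
    "witness_1": "The driver slowed at the light before the police pulled up.",
    "witness_2": "The driver slowed down at the light, then the police arrived.",
}

scores = pairwise_agreement(accounts)
print(scores)
```

In this toy example the two witness accounts score much closer to each other than either does to the officer's account, which is exactly the kind of disparity a reviewer would want surfaced.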

Spotting judicial bias

When arrested for a minor crime, Black youths in the US are incarcerated at roughly three times the rate of white youths. Understanding this judicial bias can help defense lawyers directly address racial disparities in the judicial system when representing their clients.

The Open Sentencing application ingests data from multiple sources, including the US Sentencing Commission, and uses the IBM AI Fairness 360 open source toolkit for examining bias in machine learning models, refined here to isolate disparity in federal sentencing outcomes for Black versus white defendants. The system can take into account data from different US states and examine each stage of the complex criminal justice system: from pretrial to adjudication to sentencing and incarceration.
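One of the core statistics a toolkit like AI Fairness 360 reports is disparate impact: the ratio of favorable-outcome rates between an unprivileged and a privileged group. The sketch below computes that statistic in plain Python over fabricated, purely illustrative records; it is not the Open Sentencing codebase or the toolkit's own API, just the underlying arithmetic.

```python
def favorable_rate(rows, group):
    """Fraction of a group's records with the favorable outcome
    (here, not being sentenced to incarceration)."""
    group_rows = [r for r in rows if r["race"] == group]
    return sum(1 for r in group_rows if not r["sentenced"]) / len(group_rows)

def disparate_impact(rows, unprivileged="black", privileged="white"):
    """Ratio of favorable-outcome rates (unprivileged / privileged).
    Values below ~0.8 are a common red flag under the 'four-fifths rule'."""
    return favorable_rate(rows, unprivileged) / favorable_rate(rows, privileged)

# Made-up records for illustration only; a real pipeline would ingest
# sources such as US Sentencing Commission data.
records = [
    {"race": "black", "sentenced": True},
    {"race": "black", "sentenced": True},
    {"race": "black", "sentenced": True},
    {"race": "black", "sentenced": False},
    {"race": "white", "sentenced": True},
    {"race": "white", "sentenced": False},
    {"race": "white", "sentenced": False},
    {"race": "white", "sentenced": False},
]

print(disparate_impact(records))  # 0.25 / 0.75 ≈ 0.33, well below 0.8
```

A ratio this far below 1.0 would signal exactly the kind of sentencing disparity the application is designed to surface for defense lawyers.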

Navigating complex policies

The voting process can be difficult to navigate even for the most highly educated. Restrictive local processes, changing requirements and regulations, an inability to access the correct voting location, and a lack of information mean that millions of minority voters, including members of the Black community, have had their votes go uncounted or purged, or simply not cast due to a lack of faith and trust in the system.

The Five Fifths Voter solution focuses on the key areas of voter suppression: voter registration restrictions, voter ID laws, voter purging, felony disenfranchisement, and gerrymandering. It scours sources from vote.org to the Google Civic Information API and presents personalized information via a simple web app.
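As a sketch of what that API lookup might involve, the snippet below builds a request URL for the Civic Information API's voterinfo method, which returns polling places and contests for a given address. The helper name and the sample address are assumptions for illustration, the key is a placeholder, and only the most basic query parameters are shown; consult the API's own documentation before relying on this.

```python
from urllib.parse import urlencode

# Published base endpoint for the Civic Information voterinfo method.
BASE = "https://www.googleapis.com/civicinfo/v2/voterinfo"

def voter_info_url(address, api_key):
    """Build (but don't send) the GET request URL that asks the API
    for voter information tied to a registered address."""
    return f"{BASE}?{urlencode({'address': address, 'key': api_key})}"

url = voter_info_url("1600 Pennsylvania Ave NW, Washington, DC", "YOUR_API_KEY")
print(url)
```

In a real app, the response would then be parsed for polling locations and deadlines and rendered in the web front end.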

Obviously there are many more ways data and AI can help us realize the ideas put forward by Martin Luther King and the American civil rights movement. Through these examples I hope you have a clearer sense of the role of technologists in turning darkness into light in the fight against racial injustice.

Check out this iHeartRadio podcast for more on the inception of Call for Code for Racial Justice and find more stories like this on the Call for Code Digest.

--

Daryl Pereira
IBM Data Science in Practice

A senior content strategist with a passion for sustainability and tech focused on the intersection of marketing, media and education.