Trial by code?

James Smith
SI 410: Ethics and Information Technology
Feb 12, 2021
Image from Forbes.com

The U.S. has the highest incarceration rate in the entire world! Faced with this huge problem, the justice system has turned to algorithms to assess risk and automate decisions within the cash bail system. Some courts are going as far as using these algorithms for sentencing and parole decisions. Providing an accurate and fair assessment of people’s past and future behavior is no easy feat, especially for computers.

Prisoner count from statista.com

Advocates of this system primarily base their argument on the idea that numbers don’t lie: a computer will make a fair assessment of people regardless of their backgrounds. In Critical Questions for Big Data, boyd and Crawford quote Chris Anderson’s claim that “with enough data, the numbers speak for themselves.” On the surface, this seems to make logical and coherent sense. It’s something we’ve all learned in high school stats class: increase your sample size (data) and your experiment becomes more accurate. While this is true to some degree, human error and bias still creep in, because truly raw, unbiased data rarely exists. As boyd and Crawford point out, “too often, Big Data enables the practice of apophenia: seeing patterns where none actually exist, simply because enormous quantities of data can offer connections that radiate in all directions.” Critical, life-changing decisions such as court rulings are too fragile to be swayed by such biases and must be made thoroughly and fairly. In a sense, this software is even more biased than we are, because it carries a plethora of biases from the data straight into its decision-making.
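
To see how apophenia can arise, here is a minimal, purely illustrative simulation (made-up numbers, no real data): generate thousands of random, meaningless features and a random outcome, and some features will still look strongly correlated with the outcome by chance alone.

```python
# A minimal sketch of "apophenia" in big data: with enough purely random
# features, some will appear correlated with an outcome by sheer chance.
# All quantities here are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_people, n_features = 500, 10_000

outcome = rng.integers(0, 2, size=n_people)          # random yes/no labels
features = rng.normal(size=(n_people, n_features))   # random, meaningless attributes

# Correlation of each random feature with the random outcome
corrs = np.array([np.corrcoef(features[:, j], outcome)[0, 1]
                  for j in range(n_features)])

print("strongest spurious correlation:", np.abs(corrs).max())
# Typically around 0.15-0.20 here: a "pattern" where none actually exists.
```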

Critics of this system point out many valid concerns suggesting that it is too immature to be adopted. Much of the time, these machine learning algorithms are trained to identify patterns for the future based on a history of data. Unfortunately, more often than not, prejudice and racism bleed into that data, so when the models train on it they amplify those biases. The way crime data is reported suggests the information is more telling of the behavior of police and other law enforcement than of the people it’s applied to. ProPublica reports that a huge concern with the software used in these decisions is that it is “biased against blacks,” because historically more Black people have been arrested and imprisoned than their white counterparts. This doesn’t mean Black people commit more crimes; rather, police decide to arrest and charge a higher number of Black people. The proposed algorithms can’t tell the difference and thus play into a dangerous bias with racial implications, ultimately harming people of color. Sure, we can use AI and big data to make funny Instagram filters, but let’s hold off on their role in the justice system for now.
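
To make that concern concrete, here is a toy sketch, not any real risk-assessment tool, of how a model trained on arrest records can end up mirroring policing patterns rather than underlying behavior. It assumes scikit-learn is available, and every rate in it is invented for illustration.

```python
# Toy sketch: two groups with identical true offense rates, but one group is
# policed more heavily, so its offenses show up in the arrest data more often.
# A model trained on those arrests then assigns that group a higher "risk."
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 20_000
group = rng.integers(0, 2, size=n)            # two demographic groups, 0 and 1
behavior = rng.binomial(1, 0.10, size=n)      # identical true offense rate (10%)

# "Arrested" labels: same behavior, but group 1's offenses are recorded
# far more often than group 0's.
p_recorded = np.where(group == 1, 0.9, 0.3)
arrested = behavior * rng.binomial(1, p_recorded)

X = group.reshape(-1, 1)                      # the model only sees group membership
model = LogisticRegression().fit(X, arrested)

print("predicted risk, group 0:", model.predict_proba([[0]])[0, 1])
print("predicted risk, group 1:", model.predict_proba([[1]])[0, 1])
# Group 1 gets roughly 3x the predicted "risk" despite identical behavior.
```

Even in this crude example, the model’s “risk scores” track how heavily each group is policed, not how people in each group actually behave.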
