Last year in Wisconsin, a man named Eric Loomis, who pled guilty to several charges for his involvement in a drive-by shooting, was sentenced to six years in prison. That alone is not unusual (he has priors and is a registered sex offender), but the judge's method of determining the sentence was.
The judge said he had arrived at his sentencing decision in part because of Mr. Loomis’s rating on the Compas assessment, a secret algorithm used in the Wisconsin justice system to calculate the likelihood that someone will commit another crime.
Compas is a computer software system that was “designed to determine [the prisoner’s] risk and needs and inform dynamic case plans that will guide the offender throughout his or her lifecycle in the criminal justice system.” In plain terms, it estimates the risk that an offender will commit another crime or a similar one.
The case is now heading to the Supreme Court, where justices will consider whether machine-assisted sentences like this one are compatible with the right to due process.
Is this a bad thing?
Plenty of industries rely on computers to help them sort through complex data sets, and I’m sure some prisoners’ rap sheets are complex. Rather than relying solely on their discretion, their unconscious biases, and their mood, judges could use this kind of program to treat convictions more fairly…the key word being could.
The scary thing about a computer helping to determine a prisoner’s sentence is that Compas relies on a proprietary algorithm that some accuse of being racially biased. If a black-box algorithm is adding to our mass incarceration problem, then we need to go back to the drawing board.
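To see how a seemingly “neutral” model can still produce skewed results, here is a purely hypothetical sketch in Python. To be clear: this is not Compas’s actual algorithm, which is secret. The features, weights, and zip codes below are all invented for illustration only.

```python
# Purely hypothetical toy risk model -- NOT Compas's actual algorithm,
# which is proprietary and has never been disclosed.

# Invented proxy feature: zip codes flagged as "high risk" by the model.
HIGH_RISK_ZIPS = {"53206", "53233"}  # hypothetical values

def risk_score(prior_arrests: int, age: int, zip_code: str) -> float:
    """Toy linear risk score. All weights are made up for illustration."""
    score = 0.3 * prior_arrests
    if age < 25:
        score += 0.5
    # In a segregated city, a "race-neutral" feature like zip code can
    # act as a proxy for race. This is how a model that never looks at
    # race directly can still produce racially skewed outputs.
    if zip_code in HIGH_RISK_ZIPS:
        score += 1.0
    return score

# Two defendants with identical records get different risk scores
# purely because of where they live:
defendant_a = risk_score(prior_arrests=2, age=30, zip_code="53206")
defendant_b = risk_score(prior_arrests=2, age=30, zip_code="53217")
```

The point of the sketch is that without access to the model’s internals, a judge has no way to know whether a high score reflects the defendant’s conduct or a proxy feature like their address.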
I think that if judges are going to use a tool like this, they should have to know how it works and what its shortcomings are. This is a case where public good trumps proprietary technology.