Risk-assessment algorithms challenged in bail, sentencing and parole decisions

Jason Tashea
1 min read · Feb 22, 2017
Eric Loomis, 35, was arrested in 2013 for his involvement in a drive-by shooting in La Crosse, Wisconsin. No one was hit, but Loomis faced prison time on a number of charges, including driving a stolen vehicle. He pleaded no contest, and the judge sentenced him to seven years, saying he was “high risk.” The judge based this analysis, in part, on the risk assessment score given by Compas, a secret and privately held algorithmic tool used routinely by the Wisconsin Department of Corrections.

Michael Rosenberg, Loomis’ attorney for his trial and appeal, argued that Compas — which is short for Correctional Offender Management Profiling for Alternative Sanctions — violated Loomis’ right to due process because the proprietary nature of the algorithm made it impossible to test its scientific validity and because the tool improperly considers gender in determining risk.

Last July, the Wisconsin Supreme Court affirmed the lower court’s decision that the risk assessment may be considered as one factor among many used in sentencing. The unanimous court also concluded that the tool did not violate Loomis’ due process right to not be sentenced on the basis of gender. Rosenberg declined an interview request.

The case of Wisconsin v. Loomis reflects an ongoing national debate about the use of algorithms in bail, sentencing and parole decisions. With increased adoption of these tools, defense attorneys raise due process concerns, policymakers struggle to provide meaningful oversight, and data scientists grapple with ethical questions regarding fairness and accuracy.

Continue reading at www.abajournal.com.
Jason Tashea

Tech, data, & the legal system. Founder @JusticeCodes, Staff Writer @ABAJournal, & Adjunct Prof @GeorgetownLaw.