AI plays a larger role in pretrial recommendations in New York

Yasmin Alameddine
Published in Bias In AI
Mar 13, 2020

New York’s cash bail reform means fewer people are sitting in prison, but judges must rely more on artificial intelligence

Rikers Island, New York City’s main jail | Photo from Shutterstock

On Jan. 8, William Soto, 33, appeared in the Bronx County Criminal Court to determine his bail for four violent felony charges. One month earlier, Soto allegedly choked his grandmother, nearly causing her death.

Judge Audrey E. Stone started the hearing by detailing complaints filed by Soto’s grandmother dating back to 2015, and Soto’s two failed court appearances.

Judge Stone then shared what the court’s release assessment, powered by artificial intelligence, suggested for Soto. He scored in the second-highest risk category, which only four percent of people fall into. Still, the assessment recommended that Soto be released under supervision until trial.

Judge Stone said the assessment had factored in Soto’s stable housing since his last arrest. But, Stone noted, the assessment did not factor in that Soto lived with his grandmother, so returning home would endanger her.

Still, Judge Stone concluded that Soto should be released until his trial date. Soto does not have to put up cash bail, but must volunteer with Bronx Community Solutions, where he is supposed to help pick up trash in a park or wash walls marred by graffiti.

If Soto had been charged a few months earlier, his fate may have been different.

He might have been assessed a large bail amount, and if he could not afford it, he would have joined the other 8,500 inmates at Rikers Island, New York City’s main jail.

As of Jan. 1, New York state has eliminated cash bail for most felonies. The law changed how judges make release decisions and heightened the role artificial intelligence plays in release assessments. AI critics argue the New York law is part of a broader criminal justice reform movement in the United States, one that should lead to the eradication of all AI-powered pretrial risk assessments.

AI is ubiquitous in pretrial risk assessment

Artificial intelligence is commonly used in pretrial risk assessment algorithms.

These tools incorporate socio-economic status, family background, neighborhood crime, employment status and other data to predict someone’s likelihood of appearing at trial, or committing another crime.

Pre-trial risk assessment tools are meant to “inform, not replace” judicial decisions, according to a report by AI Now Institute, a research institute studying the implications of artificial intelligence.

“AI is coded by people, and people have biases,” says Jonas Shaende, chief economist at the Fiscal Policy Institute. Shaende warns that in applying AI to a sensitive area like the criminal justice system, we “do not know the potential consequences of this technology.”

Because AI tools are built on past crime data, they can compound and reinforce patterns of bias, according to a report by the Electronic Privacy Information Center (EPIC).

For example, African-Americans are incarcerated at nearly six times the rate of whites, and Hispanics at nearly twice the rate of whites, according to the Sentencing Project, an advocacy center working to reduce incarceration in the United States.

An AI tool designed to reduce prison populations led some judges in Virginia to impose harsher sentences on young or black defendants, and more lenient ones on rapists. In a Florida county, an AI-enabled risk assessment tool proved twice as likely to flag black defendants as future criminals as white defendants. That tool, COMPAS, also rated younger defendants as higher risk of committing a crime than older defendants.

Despite evidence that AI tools can produce biased results, governments often do not examine the implications, says Liz O’Sullivan, co-founder of Arthur AI, a startup that helps clients maintain control over their AI systems. As for New York, “the city and the NYPD loves its toys,” says O’Sullivan.

“There is no real transparency on what jurisdiction has what tools,” says Ben Winters, a fellow at EPIC. Winters says that because most of the algorithms are proprietary, some are not subject to open-government laws, and the tools differ by state, city and county. Some AI-driven tools have been in place for several years, while others are still being tested.

The bail reform law’s effect on New York

On Jan. 1, New York joined New Jersey and California in reducing cash bail. Under the new bail reform law, judges must release people charged with misdemeanors and nonviolent felonies, or impose only non-monetary restrictions on them. Judges can still impose bail for violent felonies if they find that non-monetary conditions will not be enough to guarantee a person shows up on their trial date. Overall, judges are encouraged to seek the least restrictive methods for the accused before their trial dates.

Before this law, someone arrested in New York City underwent a pretrial interview to determine the likelihood they would return for their court date. They would either be held in jail until they could pay bail, or be placed under supervised release, often with electronic monitoring.

More than 67 percent of people in New York jails were held in pretrial detention, often because they could not afford bail, according to the Center for Court Innovation’s 2019 report. “The blunt ugly reality is that too often, if you can make bail you are set free, and if you are too poor to make bail, you are punished,” said Gov. Andrew Cuomo in a press conference in 2018.

The law asks judges to consider the least restrictive conditions necessary to ensure that someone returns for their trial date. Even if an assessment determines that the individual is a flight risk, the judge is encouraged to set conditions like phone calls or text messages to court supervisors, curfews, or the surrender of passports or firearms.

Now, 90 percent of people arrested and charged, but not yet convicted of a crime, will stay out of jail. This could reduce New York’s jail population from 21,000 to fewer than 3,000 people in pretrial detention, according to the Center for Court Innovation’s 2019 bail reform report.

The law also demands more transparent and stricter measures: deadlines for trial (three months for misdemeanors and six months for felonies) will be better enforced, prosecutors and defense lawyers must share material to be used during trial within 15 days of an individual’s arraignment, and judges must explain their release decisions on the record, according to a report by the Data Collaborative for Justice.

The move to eliminate bail has angered some, particularly as New York City has experienced its highest crime rate in five years, according to New York Police Department data. Serious felonies, like auto theft, burglary and homicide, are up more than 16 percent compared to the same period in 2019.

Aubrey Fox, executive director of New York’s Criminal Justice Agency | Photo from Criminal Justice Agency

The bail reform law heightened the use of AI-powered assessments

New York’s bail reform law is one of many pushes to reform criminal justice, from a bipartisan criminal justice bill passed in 2018, to cash bail reform in California and New Jersey, to the election of progressive district attorneys in Boston, Philadelphia, San Francisco, and Dallas.

Bail reform policy “can and should affect the way these tools are being used,” says Aki Younge, who co-created Automating NYC, a resource for New Yorkers to learn to use AI for good.

New York’s Criminal Justice Agency aims to be a one-stop shop for lawyers, judges, people who have been arrested and their families. CJA shares research, offers resources for navigating the cash bail system, sends reminders to people awaiting trial, and manages supervised release programs.

The CJA’s updated release assessment is based on a data set of 1.5 million cases from 2009 to 2015. The assessment is meant to be a “decision-making aid, not a decider” for judges evaluating a person’s criminal history. The assessment recommends that 88.6 percent of individuals be released on recognizance (no restrictions) while they await their court date, that 4.2 percent be given supervised release, and that 7.2 percent be given release with an “other recommendation” in addition to supervised release, like community service hours or removal of firearms.
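To make the three recommendation categories concrete, here is a minimal hypothetical sketch of how a release assessment might map a computed risk score to a recommendation. CJA has not published its scoring logic in this form; the function name, the 0-to-1 score scale, and the thresholds below are invented for illustration only.

```python
def recommend(risk_score: float) -> str:
    """Map a hypothetical 0-1 risk score to a pretrial recommendation.

    Thresholds are illustrative assumptions, not CJA's actual cutoffs.
    """
    if risk_score < 0.5:
        # Lowest-risk band: release with no restrictions
        return "release on recognizance"
    elif risk_score < 0.8:
        # Middle band: supervision such as check-ins or curfews
        return "supervised release"
    else:
        # Highest band: supervision plus extra conditions,
        # e.g. community service or removal of firearms
        return "supervised release with other conditions"

print(recommend(0.2))  # release on recognizance
print(recommend(0.9))  # supervised release with other conditions
```

Even in this toy form, the output is only a recommendation: under the new law, the judge still decides whether and how to apply it.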

CJA’s is the only publicly available, interactive release assessment tool in the country, according to Aubrey Fox, executive director of the agency. This gives the public a “look into the black box” and gives the “data-based justice system the evaluation it needs,” says Fox.

“We were like, wow, is what we spent the last two years doing still relevant?” Fox says, recalling his team’s reaction to the New York bail reform law. But since the law passed, he says, he has seen a lot of interest in how the court uses release assessments.

The CJA projects that 11 percent of individuals will be given a recommendation that may include some type of supervised release. Judges now use the assessment’s percentages and recommendations to help set a baseline level of supervision.

For example, depending on the individual’s level of risk, the judge will decide whether to order frequent supervision, like weekly in-person meetings with a court supervisor, or infrequent supervision, like a monthly text message reminder.

Defense attorneys can also request updated release assessments from the CJA if their case is still pending and the arrest occurred before the new bail reform law passed.

As criminal justice reform sweeps the U.S., some groups wish to eradicate AI assessments altogether

“Pretrial risk assessment tools will fail to achieve and may frustrate the aims of bail reform,” a national coalition of 120 civil rights organizations said in 2018. The coalition opposed the use of risk assessment instruments in pretrial decision making and suggested moving toward ending cash bail entirely, according to a MacArthur Foundation report.

Other critics say that using pretrial risk assessments “may distract from other reforms.” Alternatives include increasing support services to help incarcerated people succeed upon release, or requiring a meaningful hearing before preventive detention can be imposed, according to the MacArthur report.

Some organizations are already moving away from using the release assessment tools.

The Pretrial Justice Institute in Maryland, which has advocated for pretrial changes for 50 years, announced it is no longer advocating for the use of these tools. Arnold Ventures, which helped many jurisdictions implement the tools, has also walked back its support, saying in a press release, “Implementing an assessment alone cannot and will not result in pretrial justice goals we seek to achieve.”

Looking towards what is next for AI risk assessments in New York

In its concluding report released in November, New York City’s Automated Decision Systems Task Force — created by Mayor Bill de Blasio in May 2018 to provide a framework for policy around AI-enabled tools — suggested a centralized, city-wide structure for education around AI-enabled tools, and clear reporting of the information that came from these tools.

But the tools cannot erase any prejudices or biases a judge or magistrate holds, according to Media Justice’s report. “It’s about the human in the loop,” says O’Sullivan.

“The ideal state is not: no more algorithms forever,” says Deepra Yusuf, a co-creator of Automating NYC. She emphasizes that lawmakers and developers need to set up an unbiased, transparent framework for AI assessment tools, so they can work “towards equity and justice.”
