If you are booked into jail in New Jersey, a judge will decide whether to hold you until trial or set you free. One factor the judge must weigh: the result from an algorithm called PSA, short for Public Safety Assessment, that estimates how likely you are to skip court or commit another crime.

New Jersey adopted algorithmic risk assessment in 2014 at the urging, in part, of the nonprofit Pretrial Justice Institute. The influential Baltimore organization has for years advocated the use of algorithms in place of cash bail, helping them spread to most US states.

Then, earlier this month, PJI suddenly reversed itself. In a statement posted online, the group said risk assessment tools like those it previously promoted have no place in pretrial justice because they perpetuate racial inequities.

“We saw in jurisdictions that use the tools and saw jail populations decrease that they were not able to see disparities decrease, and in some cases they saw disparities increase,” says Tenille Patterson, an executive partner at PJI.

Asked to name a state where risk assessment tools didn’t work out, she pointed to New Jersey. State figures released last year show jail populations fell by nearly half after the changes took effect in 2017, eliminating cash bail and introducing the PSA algorithm. But the demographics of defendants stuck in jail stayed largely the same: about 50 percent black and 30 percent white.

Pete McAleer, a spokesperson for New Jersey’s Administrative Office of the Courts, said the PSA alone could not be expected to eliminate centuries-old inequities and that the state’s reforms had prevented many black and Hispanic defendants from being detained. State officials are seeking ways to eliminate the remaining disparities in the state’s justice system, he said.

PJI’s switch from advocate to opponent of risk assessment algorithms reflects growing concerns about the role of algorithms in criminal justice and other arenas.

In an open letter last July, 27 prominent academics urged that pretrial risk assessments be abandoned. The researchers said the tools are often built on data that reflects racial and ethnic disparities in policing, charging, and judicial decisions. “These problems cannot be resolved with technical fixes,” the letter said.

Last month, concerns about racial bias prompted Ohio’s Supreme Court to delete a recommendation that the state adopt risk assessment tools from a list of proposed bail reforms. The court’s recommendations will become law automatically this summer unless both chambers of the state legislature block them. In December, a Massachusetts commission rejected risk assessment tools in its report on bail reform, citing potential racial bias. California voters will decide in November whether to repeal a new state law that would eliminate cash bail and require the use of risk assessments.

Beyond bail systems, critics have cautioned against blindly trusting algorithms in areas as diverse as facial recognition, where many algorithms have higher error rates for darker skin; healthcare, where researchers found evidence that a widely used care management system pushed black patients to the back of the line; and online recommendations, which have been accused of amplifying conspiracy theories and hate speech.

Risk assessment algorithms like the PSA have spread across the US as part of efforts to scrap or minimize the use of cash bail in deciding who gets to go home while awaiting a court date. Criminal justice reformers view cash bail as unfair to people from poorer, minority communities, who are often jailed over only minor charges. Rather than leaving such decisions to a single judge, whose personal biases may creep in, combining human judgment with mathematical formulas built on statistics from past cases was thought to be fairer. A recent report from the Philadelphia nonprofit Media Mobilizing Project and the Oakland nonprofit MediaJustice found such tools are in use in at least parts of 46 states and the District of Columbia.
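Tools like the PSA are typically simple points-based formulas rather than opaque machine-learning models: a handful of facts from a defendant’s record are assigned weights, summed, and mapped to a score the judge sees. The sketch below illustrates that general approach with entirely hypothetical factors, weights, and cutoffs; it is not the actual PSA formula.

```python
# Minimal sketch of a points-based pretrial risk score.
# All factors, weights, and thresholds here are hypothetical illustrations,
# NOT the real PSA inputs or scoring rules.

from dataclasses import dataclass


@dataclass
class Defendant:
    age: int
    prior_convictions: int
    prior_failures_to_appear: int
    pending_charge: bool


def risk_score(d: Defendant) -> int:
    """Sum weighted points drawn from case-history factors (illustrative only)."""
    score = 0
    if d.age < 23:
        score += 2  # hypothetical extra weight for younger defendants
    score += min(d.prior_convictions, 3)  # cap how much prior convictions can add
    score += 2 * min(d.prior_failures_to_appear, 2)
    if d.pending_charge:
        score += 1
    return score


def risk_category(score: int) -> str:
    """Map the raw score onto the coarse scale a judge might actually see."""
    if score <= 2:
        return "low"
    if score <= 5:
        return "moderate"
    return "high"


if __name__ == "__main__":
    d = Defendant(age=21, prior_convictions=1,
                  prior_failures_to_appear=0, pending_charge=True)
    s = risk_score(d)
    print(s, risk_category(s))  # e.g. 4 moderate
```

Real tools use more factors and calibrate their weights against historical case data, which is exactly where the critics’ concern lies: if the underlying records reflect biased policing and charging, the resulting scores inherit that bias.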
