Criminal Justice

Risk-assessment algorithms challenged in bail, sentencing and parole decisions


Illustrations by Bob Fernandez

Eric Loomis, 35, was arrested in 2013 for his involvement in a drive-by shooting in La Crosse, Wisconsin. No one was hit, but Loomis faced prison time on a number of charges, including driving a stolen vehicle. He pleaded no contest, and the judge sentenced him to seven years, saying he was “high risk.” The judge based this analysis, in part, on the risk assessment score given by Compas, a secret and privately held algorithmic tool used routinely by the Wisconsin Department of Corrections.


Michael Rosenberg, Loomis’ attorney for his trial and appeal, argued that Compas—which is short for Correctional Offender Management Profiling for Alternative Sanctions—violated Loomis’ right to due process because the proprietary nature of the algorithm made it impossible to test its scientific validity and because the tool improperly considers gender in determining risk.

Last July, the Wisconsin Supreme Court affirmed the lower court’s decision that the risk assessment may be considered as one factor among many used in sentencing. The unanimous court also concluded that the tool did not violate Loomis’ due process right to not be sentenced on the basis of gender. Rosenberg declined an interview request.

The case of Wisconsin v. Loomis reflects an ongoing national debate about the use of algorithms in bail, sentencing and parole decisions. With increased adoption of these tools, defense attorneys raise due process concerns, policymakers struggle to provide meaningful oversight, and data scientists grapple with ethical questions regarding fairness and accuracy.

In 2014, Eric Holder, then the U.S. attorney general, articulated the uncertainty swirling around these tools in a speech given to the National Association of Criminal Defense Lawyers’ 57th Annual Meeting. “Although these [risk assessment] measures were crafted with the best of intentions, I am concerned that they may inadvertently undermine our efforts to ensure individualized and equal justice,” he said. “They may exacerbate unwarranted and unjust disparities that are already far too common in our criminal justice system and in our society.”

Angel Ilarraza, director of consulting and business development at Northpointe Inc., the Michigan-based company that created Compas, thinks that this concern is ill-founded. “There’s no secret sauce to what we do; it’s just not clearly understood,” Ilarraza says.

ALGORITHMS AT WORK

Compas uses an algorithm, a term Ilarraza does not like because he thinks it is confusing, that weighs the answers to 137 questions completed by the charged person and supplemented by his or her criminal records. These inputs are plugged into the algorithm, which is a set order of operations like a math equation. Based on this process, the person’s likelihood of committing a future crime (the output) is scored on a scale of 1 (low risk) to 10 (high risk). Beyond Wisconsin, Compas also is used in California, Michigan and New York, among other jurisdictions.

The questionnaire runs the gamut of a person’s criminal history and personal background as a way to decipher risk. Questions include whether an alleged offender experienced his or her parents’ divorce or has a telephone at home, and whether the screener thinks the defendant is a suspected or admitted gang member.
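To make the mechanics concrete, here is a minimal sketch in Python of how a weighted questionnaire can be reduced to a 1-to-10 risk level. Every factor, weight and cutoff in the sketch is a hypothetical placeholder; Northpointe’s actual formula is proprietary and is not reproduced here.

```python
# Hypothetical illustration of a questionnaire-based risk score.
# The factors, weights and cutoffs below are invented for
# demonstration; they are NOT Northpointe's proprietary model.

# Each answer is coded as a number (0/1 for no/yes, counts or ages for history).
answers = {
    "prior_arrests": 3,          # from criminal records
    "age_at_first_arrest": 19,   # from criminal records
    "parents_divorced": 1,       # questionnaire item (1 = yes)
    "suspected_gang_member": 0,  # screener's judgment (1 = yes)
}

# A "set order of operations like a math equation": multiply each
# input by a weight and add the results together.
weights = {
    "prior_arrests": 0.8,
    "age_at_first_arrest": -0.1,
    "parents_divorced": 0.5,
    "suspected_gang_member": 1.5,
}

raw_score = sum(weights[k] * answers[k] for k in weights)

# Map the raw score onto a 1 (low risk) to 10 (high risk) scale by
# comparing it to cutoffs derived from a reference population.
cutoffs = [-1.0, 0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5]  # hypothetical deciles
risk_level = 1 + sum(raw_score > c for c in cutoffs)

print(f"raw score: {raw_score:.2f}, risk level: {risk_level} of 10")
```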

Ilarraza, supporting the Wisconsin Supreme Court view, is quick to point out that the tool is meant to inform decision-making. “It facilitates the implementation of evidence-based practices,” he says.

Christine Remington, the Wisconsin assistant attorney general who argued Loomis for the state in the supreme court, agrees. “I don’t think there’s any question that [Compas] is a good thing,” she says. It allows the corrections department to “tailor limited resources in the best way possible.”

Compas recently came under scrutiny from ProPublica, an investigative journalism organization. Assessing the tool’s outputs in Broward County, Florida, ProPublica found that it was 61 percent predictive of rearrest, “somewhat more accurate than a coin flip.” The analysis also found that the algorithm falsely flagged black defendants as “future criminals” at almost twice the rate of white defendants.
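For readers parsing those figures, the short sketch below uses invented data to show the two kinds of statistics at issue: overall predictive accuracy, and the rate at which people who were not rearrested are nonetheless flagged as high risk, broken out by group.

```python
# Invented data illustrating the two statistics in the ProPublica analysis:
# overall predictive accuracy, and the rate at which defendants who were
# not rearrested were nonetheless flagged as high risk, by group.

records = [
    # (group, flagged_high_risk, actually_rearrested)
    ("A", True,  True), ("A", True,  False), ("A", False, False), ("A", True,  False),
    ("B", False, False), ("B", True,  True), ("B", False, True),  ("B", False, False),
]

# "Predictive of rearrest": how often the flag matched the outcome.
correct = sum(flag == rearrested for _, flag, rearrested in records)
print(f"overall accuracy: {correct / len(records):.0%}")

for group in ("A", "B"):
    # Among people in this group who were NOT rearrested, how many
    # did the tool nonetheless flag as high risk?
    flags = [flag for g, flag, rearrested in records if g == group and not rearrested]
    print(f"group {group}: flagged despite no rearrest = {sum(flags) / len(flags):.0%}")
```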

Northpointe disputes ProPublica’s findings. The back-and-forth can be read in full on ProPublica’s website.

This clash illustrates a newfound popular interest in these tools. But using math to guide decision-making in the criminal justice system is not new. According to Richard Berk, a professor of criminology and statistics at the University of Pennsylvania, an Illinois parole board started to use algorithms in the 1920s.

“In the ’20s, parole boards were worried about what parole boards are worried about today: If I release somebody, are they going to commit a horrible act?” Berk explains. Back then, the tools were simple mathematical tabulations that assessed risk by comparing people up for parole to those previously released.

Since then, improvements in the math behind these tools have increased their accuracy, and technological advances allow statisticians to wrestle with bigger data sets on computers. The point remains, however: U.S. criminal justice systems have used math to guide decision-making for about a century.

Even with this history, how these tools affect defendants’ equal protection and due process rights remains unresolved.

GENDER FACTORS

Sonja B. Starr, a professor at the University of Michigan Law School, says the problem “is a matter of what factors go into these instruments” and not the instruments per se. For example, she argues that using gender as an input “counts against men to be men, and that is a pretty straightforward violation to Supreme Court precedent.”


Although the issue of gender was not an equal protection claim in Loomis, the court wrote of Compas: “If the inclusion of gender promotes accuracy, it serves the interests of institutions and defendants, rather than a discriminatory purpose.”

Starr says the U.S. Supreme Court “rejected that very reasoning” in the 1976 case Craig v. Boren. The court had reviewed an Oklahoma law that banned men younger than 21 from buying certain alcoholic beverages. The state supported this policy with statistical evidence that showed that young men were almost 10 times more likely than women to be arrested for drunken driving. The court ultimately found that “prior cases have consistently rejected the use of sex as a decision-making factor, even though the statutes in question certainly rested on far more predictive empirical relationships than this.”

Going further, Starr thinks other inputs raise issues for indigent defendants. She says providing equal opportunity under the law regardless of socio-economic status “is nothing less than the central goal of the criminal justice system.” However, some tools, including Compas, use factors such as how often people change addresses or whether they have trouble paying bills, which rely on statistical generalizations that underprivileged people are more likely to commit crimes. This, Starr argues, flies in the face of established law.

If defense attorneys wanted to make either of Starr’s arguments in court, they would have to know the algorithm’s factors and how they are weighed. Like the risk assessment in Loomis, some of the tools used by government agencies are proprietary and “black boxed,” meaning there is little or no ability to review the math, so the tools cannot be independently challenged. And because they are used in bail decisions and sentencing, these tools do not fall under the usual evidentiary rules of discovery.

“There’s never justification for secrecy of the algorithm” in the criminal justice system, says Frank Pasquale, a professor at the University of Maryland Francis King Carey School of Law and author of The Black Box Society: The Secret Algorithms that Control Money and Information.

Remington, who argued for the state in Loomis, has a different view. “We don’t know what’s going on in a judge’s head; it’s a black box, too,” she says. She thinks, although the math is hidden, “Compas will help give a little more transparency.”

One risk assessment tool being used in bail decisions may avoid many of the critiques that Starr, Pasquale and Holder articulate. Developed by the Laura and John Arnold Foundation, a Houston-based philanthropic organization, the Public Safety Assessment-Court tool is not black-boxed and does not rely on gender or socio-economic factors.

Currently used in 30 jurisdictions, the PSA-Court tool considers nine factors related to a person’s criminal history and requires no questionnaire. It assesses how likely that person is to fail to appear for a court date and to commit a new crime or a violent crime while on release.

Those factors include previous misdemeanor and felony convictions, prior failures to appear for a court date, and the defendant’s age at the time of arrest.
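As a rough illustration of what a fully transparent, criminal-history-only checklist looks like, consider the sketch below. The factors mirror the article’s description, but the point values, threshold logic and function name are invented for illustration and are not the foundation’s actual weights.

```python
# Sketch of a transparent, checklist-style pretrial assessment in the
# spirit of the PSA-Court tool described above. The factor list follows
# the article; the point values, threshold logic and function name are
# invented for illustration, not the foundation's actual weights.

def pretrial_risk_points(prior_misdemeanors, prior_felonies,
                         prior_failures_to_appear, age_at_arrest):
    points = 0
    points += 1 if prior_misdemeanors > 0 else 0
    points += 2 if prior_felonies > 0 else 0
    points += 2 if prior_failures_to_appear > 0 else 0
    points += 1 if age_at_arrest < 23 else 0
    return points

# Because every factor and point value is visible, the parties can see
# exactly why a given defendant received a given score.
score = pretrial_risk_points(prior_misdemeanors=1, prior_felonies=0,
                             prior_failures_to_appear=2, age_at_arrest=21)
print(f"failure-to-appear / new-crime risk points: {score}")
```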

In discussing the choice to make the factors open to public scrutiny, Matt Alsdorf, the vice president of criminal justice at the Arnold Foundation, says that “it’s important from a fairness perspective for all the parties to understand what goes into a risk assessment.”

The Arnold Foundation is funding studies to track the tool’s impact. Results from Lucas County, Ohio, which adopted the tool in January 2015, found that outcomes did not show a race or gender bias. The share of people released without the need for bail increased from 14 percent to almost 28 percent. The percentage of pretrial defendants arrested for other crimes while on release was cut in half, from 20 percent to 10 percent. The percentage of pretrial defendants arrested for violent crimes while on release also decreased, from 5 percent to 3 percent.


According to the foundation, the early successes in Ohio can be attributed to the tool’s ability to help judges make informed decisions that allocate resources more effectively, rather than decisions driven by emergency-release rules meant to alleviate jail overcrowding.

Even while promoting the tool’s openness, Alsdorf, an attorney, is uncertain whether a legal imperative to open algorithms in the criminal justice system exists. However, he does say it is important for “a lot of researchers” to be “poking and prodding.”

This point of view conflicts with those who run businesses built around their protected intellectual property, which raises challenges for policymakers who try to strike a balance between private sector innovation and the rights of defendants.

“Right now, it’s the Wild West,” Berk says. “It’s a mess.” At the federal level, that mess does not show signs of improving and leaves numerous issues unresolved.

“There is a very real danger that these tools and the appeal of ‘objective risk scores’ will silently codify racial disparities in bail determinations under a veneer of scientific rigor,” says Scott Levy, director of the Fundamental Fairness Project at the Bronx Defenders, a legal aid organization. “It is essential that appropriate oversight and transparency mechanisms are in place.”

ATTEMPTED REGULATION

Although there is a lack of current law that tackles the issues Levy raises, attempts to regulate algorithms have been made. In 2012, President Barack Obama proposed the Consumer Privacy Bill of Rights to allow people to correct information used by algorithms in a similar way to changing incorrect information in a credit report. The proposal never got congressional approval.

Pasquale thinks the former president’s proposed solution could fix “really basic errors.” But the role algorithms play in society is beyond this policy prescription. To inform domestic policy, he is monitoring the European Union’s General Data Protection Regulation, which takes effect in May 2018 and would create a legal right to challenge decisions made by algorithms, including in the criminal justice system.

Another potential solution, Berk says, could be modeled on the way the Food and Drug Administration regulates pharmaceuticals. In Berk’s proposal, an algorithm’s developer “would be required to submit the code and any data used to evaluate the code” to the new agency for testing, similar to how prescription drugs are tested. The agency’s process would strike a balance that permits public inspection of algorithms while protecting intellectual property.

While the merits of these proposals are debated, policymakers also have ethical questions to grapple with. Chief among them are the trade-offs between accuracy and fairness. However, “until the various parties expressing strong opinions about the merits of criminal justice risk assessments clarify what they mean by ‘fairness,’ no progress can possibly be made,” Berk says.

Furthermore, Berk says that “even if an algorithm is equally accurate for all, more blacks and males will be classified as high risk” because African-Americans and men are more likely to be arrested for a violent crime.
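Berk’s observation follows from simple arithmetic about base rates, as the hypothetical calculation below illustrates: if one group is rearrested at a higher rate, an equally accurate tool will still label more of that group high risk. The population sizes, base rates and accuracy figure here are invented for illustration.

```python
# Hypothetical numbers illustrating Berk's point: a tool that is equally
# accurate for every group will still classify more people as high risk
# in the group with the higher rearrest base rate. Here "equally accurate"
# means it correctly classifies 90 percent of both future reoffenders and
# non-reoffenders in each group.

population = 1000                                  # people screened per group
accuracy = 0.90                                    # same for both groups
base_rates = {"group 1": 0.30, "group 2": 0.10}    # share later rearrested

for group, base_rate in base_rates.items():
    rearrested = population * base_rate
    not_rearrested = population - rearrested
    # High-risk classifications = correctly flagged reoffenders
    # plus incorrectly flagged non-reoffenders.
    flagged = accuracy * rearrested + (1 - accuracy) * not_rearrested
    print(f"{group}: {flagged:.0f} of {population} classified high risk")
```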

When Berk brings up these challenging ethical trade-offs with government officials who are interested in building a risk assessment tool, he sees “a lot of hand-wringing.” Wisconsin attorney Remington notes a similar stalemate: “This issue is not resolved.”

In October, Loomis’ attorney filed a petition with the U.S. Supreme Court to overturn the state court’s decision, arguing that the use of Compas violated his 14th Amendment right to due process.


Jason Tashea is a freelance writer based in Baltimore and the founder of Justice Codes, a criminal justice and technology consultancy.


This article originally appeared in the March 2017 issue of the ABA Journal with this headline: "Calculating Crime: Attorneys are challenging the use of algorithms to help determine bail, sentencing and parole decisions."
