
California Innocence Project harnesses generative AI for work to free wrongfully convicted



Updated: The journey from conviction to exoneration can take decades, and the men and women who have been freed by the California Innocence Project collectively have spent more than 570 years in prison. Time is of the essence, yet the pro bono group has finite resources.

So when the organization’s then-managing attorney Michael Semanchik beta-tested Casetext’s AI legal assistant, CoCounsel, before its March release, he saw its potential to save hours of work. He says he was impressed by how quickly the technology identified patterns in legal documents for several of the group’s ongoing cases, including inconsistencies in witness statements. As the technology evolves, he hopes to use it to surface the cases with the strongest evidence of innocence.

“It’s always taken a human eye to read through the case file and go through every single page and figure out what was missed,” says Semanchik, who left the California Innocence Project on July 28 to become executive director of a new group called the Innocence Center where he intends to keep using the software. “We are spending a lot of our resources and time trying to figure out which cases deserve investigation. If AI can just tell me which ones to focus on, we can focus on the investigation and litigation of getting people out of prison.”

Since OpenAI’s generative AI chatbot ChatGPT was released in November 2022, the technology’s dangers and risks have dominated coverage, with tech CEOs and experts, including OpenAI CEO Sam Altman and Bill Gates, warning that AI poses an existential threat. But for some access-to-justice advocates, it’s clear how legal aid organizations can use the tech to better serve people without easy access to legal services.

“I think that this is a profound opportunity for the legal profession to live up to its ideals,” Pablo Arredondo, the co-founder and chief technology officer of Casetext, says of AI’s potential to close the justice gap. Thomson Reuters announced in June that it would purchase Casetext for a reported $650 million.

Several major law firms are using generative AI chatbots, including CoCounsel and Harvey, for their work. But as more and more lawyers adopt the new technology to maintain a competitive edge, Arredondo sees an opportunity to deliver speedy and affordable legal services to the public. Low-income Americans receive no or insufficient legal help for more than 90% of their civil legal problems related to basic needs such as housing, education, health care, income and safety, according to the Legal Services Corporation’s 2022 Justice Gap report.

CoCounsel is built on OpenAI’s GPT-4 technology and handles legal research, deposition preparation and document review. The California Innocence Project isn’t the only organization using it. Greg Siskind, an immigration lawyer who filed a class action lawsuit on behalf of more than 100,000 Ukrainian refugees seeking work authorization, said at ABA Techshow 2023 that CoCounsel helped him create a 20-page memo with links to relevant cases, citations and summaries.

During his time at the California Innocence Project, Semanchik used CoCounsel to draft emails and memos and for legal research. He says the technology could be most valuable in the future at the case review stage when the group decides which prisoners in Southern California have new and compelling evidence.

According to the National Registry of Exonerations 2022 Annual Report, there had been 3,284 exonerations in the United States since 1989. The registry recorded 233 exonerations in 2022, with the wrongfully convicted spending an average of 9.6 years in prison, 2,245 years in total. According to the report, innocence organizations and conviction integrity units at prosecutors’ offices were responsible for 171 of those exonerations, nearly three-quarters of the total.

The work is painstaking. The California Innocence Project, a clinical program at the California Western School of Law, has a full-time staff of only 10; with interns and clinic students, it usually has between 25 and 30 people working on its cases. The group screens between 800 and 1,000 cases each year, reviewing police reports, court filings and trial transcripts; it also investigates about 150 cases and actively litigates up to 60 cases annually.

CoCounsel helped Semanchik narrow his focus to issues he commonly sees in wrongful conviction cases, such as witness misidentification. It also helped him comb through documents in databases he set up for several of the group’s ongoing cases to identify inconsistencies in witness statements or testimony. Semanchik says the technology isn’t yet at a place where it can review a case at the intake stage and assess the strength of new evidence, but he hopes it will get there in time.
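
Casetext hasn’t published CoCounsel’s internals, but the general workflow Semanchik describes (pairing statements and asking a GPT-4-class model to flag contradictions) can be sketched in a few lines against OpenAI’s API. This is a minimal illustration, not Casetext’s implementation; the statements and prompt wording below are invented for the example.

```python
# Minimal sketch: asking a GPT-4-class model to flag inconsistencies
# between two witness statements. Illustrative only -- not CoCounsel's
# actual pipeline; the statements and prompt are invented.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

statement_a = "I saw the suspect leave the bar around 11 p.m. in a red sedan."
statement_b = "The witness testified the suspect was still inside at midnight."

response = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {"role": "system",
         "content": "You are a legal assistant. Compare witness statements "
                    "and list any factual inconsistencies, quoting the text."},
        {"role": "user",
         "content": f"Statement A: {statement_a}\nStatement B: {statement_b}"},
    ],
    temperature=0,  # deterministic output for review work
)

print(response.choices[0].message.content)
```

In a real review workflow, a script like this would run over every pair of related statements in a case database and route flagged contradictions to an attorney, which matches the human-in-the-loop approach Semanchik describes.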

“Maybe we get to the point where it’s able to just take a look at an entire case, flag some of the wrongful conviction causes that we normally see and send it over for attorney review so we’re not spinning our wheels trying to find the good cases; the good cases come to us through AI,” Semanchik says.

Warning signs

During his time at the California Innocence Project, Semanchik did not go all-in on CoCounsel; he says he drew from a “closed universe of facts,” a limited set of case documents he ran through the software. But he is aware that human bias is baked into some large language models.

“That’s something that we need to be on the lookout for and protect against as best we can,” he says.

AI bias has long been a concern in the criminal justice system. In a 2016 investigation, ProPublica analyzed COMPAS, risk assessment software whose recidivism algorithm state courts used to estimate how likely defendants were to reoffend. The algorithm produced false positives for Black defendants at nearly twice the rate (45%) that it did for white defendants (23%). The investigation found the tool was no better at risk assessment than a group of random people polled on the internet.
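
The disparity ProPublica reported is a gap in false positive rates: among defendants who did not go on to reoffend, the share the tool nonetheless labeled high risk. A short Python sketch makes the metric concrete; the counts below are invented for illustration, while the 45% and 23% figures come from the reporting above.

```python
# False positive rate by group: of the people who did NOT reoffend,
# what fraction did the tool still label high risk?
# Counts are invented for illustration; see ProPublica's published
# data for the real figures behind the 45% vs. 23% finding.

def false_positive_rate(flagged_high_risk: int, did_not_reoffend: int) -> float:
    """FPR = high-risk labels among those who never reoffended."""
    return flagged_high_risk / did_not_reoffend

groups = {
    "Black defendants": false_positive_rate(450, 1000),  # -> 0.45
    "white defendants": false_positive_rate(230, 1000),  # -> 0.23
}

for group, fpr in groups.items():
    print(f"{group}: false positive rate = {fpr:.0%}")
```

A tool can match another on overall accuracy and still fail this check, which is why fairness audits compare error rates group by group rather than relying on a single accuracy number.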

Statistician and machine learning expert Kristian Lum, a research associate professor at the University of Chicago Data Science Institute, is “cautiously optimistic” about generative AI’s potential to help underresourced legal aid organizations. But because large language models have been beset by gender and racial bias, she says the technology could affect the people those organizations serve.

“I hope the organizations that are using [generative AI] are aware of some of the pitfalls and are doing some analysis to make sure it’s not missing whole categories of clients or leaving some clients behind,” Lum says.

E-discovery company Relativity is using generative AI in Translate, a feature of its RelativityOne software, which it makes freely available to legal aid organizations through its Justice for Change program.

Aron Ahmadia, Relativity’s senior director of applied science and AI, says the company is not presently using GPT-based chatbots but is “working to explore how we can leverage the latest generation of generative AI capabilities, particularly GPT-4.”

He notes, however, that the company is aware of the risks. Relativity has published AI principles on its website for responsible use, and it tests the models it uses for potential bias.

Relativity’s program manager of social impact, Johnathan Hill, spearheads the Justice for Change program, which has supported innocence projects in states including New York and Hawaii. In January, a judge in Hilo, Hawaii, vacated the conviction of Albert Ian Schweitzer, who was convicted in 2000 of the 1991 rape and murder of Dana Ireland. New DNA evidence exonerated Schweitzer after he challenged his conviction, citing false testimony and a false confession, as well as ineffective assistance of counsel.

Hill says RelativityOne helped the Innocence Project sift through thousands of documents, including jailhouse testimony, as it built a timeline for the case. The Illinois Innocence Project used the company’s Translate plugin for its Latino Innocence Initiative, translating hours of witness testimony and interviews from Spanish to English, he says.
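
Relativity’s Translate plugin is proprietary, so the sketch below only illustrates the same task in generic form: batch-translating transcript segments from Spanish to English with a GPT-4-class model via OpenAI’s API. The segments and prompt are invented; this is not Relativity’s API.

```python
# Generic sketch of batch transcript translation (Spanish -> English).
# Not Relativity's Translate plugin; segments and prompt are invented.
from openai import OpenAI

client = OpenAI()

segments = [
    "El testigo dijo que no vio al acusado esa noche.",
    "La entrevista duró aproximadamente dos horas.",
]

for i, segment in enumerate(segments, start=1):
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system",
             "content": "Translate the following Spanish court transcript "
                        "segment into English. Preserve meaning exactly."},
            {"role": "user", "content": segment},
        ],
        temperature=0,  # favor literal, repeatable translations
    )
    print(f"Segment {i}: {response.choices[0].message.content}")
```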

Even though concerns about accuracy, bias and misinformation swirl around AI, Arredondo says that isn’t what keeps him up at night. He’s more concerned that the chance to use the tech to open up the justice system will be squandered.

“There are people in jail right now who should not be, and that’s a profound problem,” Arredondo says, citing the oft-quoted maxim “Justice delayed is justice denied.” “We have a moral obligation to help as many people as we can.”

Updated Aug. 15 at 5:58 p.m. to indicate throughout that Michael Semanchik is the former managing attorney of the California Innocence Project. Updated Sept. 1 at 12:34 p.m. to add that the California Innocence Project is a clinical program at the California Western School of Law.
