Legal Technology

Judicata adds color to case law, highlights artificial-intelligence barriers


When doing legal research, most lawyers must first figure out which cases are relevant and which precedents judges still enforce.

Case-law website Judicata on Wednesday unveiled a new color-coding system to help researchers find which cases to cite and which to avoid. Using machine learning to analyze cases, as well as human editors to check the work, Judicata utilizes a system reminiscent of a law student’s highlighted textbook.

Background colors indicate the strength of a particular case, its relevance to a user’s case, and whether its holding is still good law or has been disputed or even overturned.

The color-coding process was quite time-consuming. In an interview, Itai Gurari, chief executive of San Francisco-based Judicata, discussed the challenges of making practical use of artificial intelligence.

“It’s very much a slow, plodding progress,” says Gurari. The Judicata site tested its technology with a narrow focus on California employment law, then grew to cover all California civil cases. Gurari says he’s considering either adding federal civil cases or expanding to New York or other jurisdictions. In the next few weeks, he intends to offer analytics on judges and lawyers, a feature that legal publishers are adopting quickly.

Gurari says that the color highlighting feature offers a quick read on important parts of a decision, something that lawyers should find valuable. “Law school teaches lawyers to bury the lead,” Gurari says. “They teach the acronym IRAC: issue, rule, analysis, conclusion. Lawyers will tell you what’s going on, and only at the end tell you what the decision is.”

Judicata’s coding of court citations uses green highlights to indicate that a cited case is good law and binding precedent. Following the traffic-light analogy, a yellow-highlighted link cautions that the case is not binding, while red flags cases that are no longer good law. Still, rulings defy coding like a simple stoplight: orange indicates a possible precedent, and blue citations are offered simply as context.
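As a rough illustration of the scheme described above, the color coding can be thought of as a lookup from a citation’s treatment to a highlight color. The treatment labels and function below are hypothetical stand-ins, not Judicata’s actual data model or API.

```python
# Illustrative sketch only: a hypothetical mapping of citation treatment to
# Judicata's highlight colors as described in the article. Label names are
# assumptions, not Judicata's real schema.

HIGHLIGHT_COLORS = {
    "binding_good_law": "green",     # cited case is good law and binding precedent
    "not_binding": "yellow",         # persuasive only; proceed with caution
    "no_longer_good_law": "red",     # overruled or otherwise bad law
    "possible_precedent": "orange",  # may serve as precedent, depending on the facts
    "context_only": "blue",          # cited simply for background, not authority
}

def highlight(treatment: str) -> str:
    """Return the highlight color for a citation's treatment label."""
    return HIGHLIGHT_COLORS.get(treatment, "none")

print(highlight("binding_good_law"))  # green
```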

“It’s not just highlighting for color,” says product manager Beth Hoover in an interview. “It helps convey what’s happening in the case.” The Judicata software must read the text for contextual cues like “properly pleaded material facts,” “at odds with,” “argued,” “rejected” or “distinguished” to sort out whether the cited case is at issue. “There’s a lot of information in the law,” Hoover says, “and nobody should have to redo the work every time.”
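To make the idea concrete, a toy version of cue detection might simply scan an opinion for the phrases Hoover mentions. This is only a sketch: Judicata’s actual analysis combines machine learning with human review, and the function here is a hypothetical illustration of the kind of signals involved.

```python
# Illustrative sketch only: naive keyword matching for the contextual cues
# quoted in the article. Not Judicata's method, which uses machine learning
# plus lawyer review.

CUE_PHRASES = {
    "properly pleaded material facts",
    "at odds with",
    "argued",
    "rejected",
    "distinguished",
}

def find_cues(opinion_text: str) -> list[str]:
    """Return the cue phrases that appear in a passage of an opinion."""
    text = opinion_text.lower()
    return [phrase for phrase in CUE_PHRASES if phrase in text]

sample = "The court rejected that argument as at odds with settled precedent."
print(find_cues(sample))  # e.g. ['rejected', 'at odds with'] (order may vary)
```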

Judicata highlighting.

In Judicata’s announcement of the color feature, Hoover wrote that highlighting is applied to “many of the most viewed and important cases on Judicata.” The scope is limited because Gurari or other staff lawyers must review the computer analysis of each case for accuracy.

“For just the basic parsing of the sentence structure, the state of the art today is only about 94 percent accurate,” Gurari says. “That’s based on documents written at a ninth or 10th grade level. In California, legal opinions are written at a 19th or 20th grade level.”

The multiple checks involved in comparing citations can lower the computer’s pass rate to nearly 50 percent—not good enough for writing a brief. As such, human lawyers must complete the “Shepardizing” tasks, and scaling the business would require hiring a team of lawyer editors, Gurari says.
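A back-of-the-envelope calculation shows why the pass rate falls so far: if each of several checks is roughly 94 percent accurate and the errors are independent, the odds that every check is right shrink quickly. The step counts below are illustrative assumptions, not figures from Judicata.

```python
# Rough compounding illustration (an assumption, not Judicata's math):
# the probability that every one of N independent 94%-accurate steps
# succeeds is 0.94 ** N.

per_step_accuracy = 0.94
for steps in (1, 5, 11):
    print(steps, round(per_step_accuracy ** steps, 2))
# 1 0.94
# 5 0.73
# 11 0.51  -- roughly the "nearly 50 percent" pass rate Gurari describes
```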

“The quality of the data is critical,” Gurari says. “Think about autonomous cars. They need maps that are very granular–that show where the curbs are, where the streetlights are. And errors can be very costly.”

Computers improve their track record with training, though. In the next few months, Judicata plans to let users search the site for causes of action or parties involved in a case, or upload their briefs for a similar contextual analysis.

But there’s no single way to present case law, which adds to the difficulty of providing relevant results. “Think how different each judge and county can be, even in California,” Hoover says. “One reason we spent so much time on accurate parsing is we could use that for other documents to make much more intelligent suggestions.”

To expand beyond California jurisdictions, the machine learning tool will need additional tweaking. While computers follow a schema–a framework for classifying their data–judges don’t. If courts could agree on a labeling framework, editors still would have to verify that clerks were following it.

All of this slows progress for Judicata, which was established in 2012 on a “freemium” revenue model. Gurari says 1,000 California lawyers use an “affordable” subscription service; occasional users still have free access to search cases.

“This is not a replacement but a supplement to West or Lexis,” Gurari says. “Our goal is to be fairly strategic regarding the insights we provide.”

Gurari offers no timetable for when the site will become self-supporting. Judicata is still drawing down $8 million in funding from high-profile backers like former Square operations chief Keith Rabois and PayPal co-founder Peter Thiel.

“There are very few companies in Silicon Valley that can afford to build for five years without revenue,” Gurari said. “We were fairly unique in that sense.”
