Feature

Some law schools already are using ChatGPT to teach legal research and writing


Robot breaking a student's pencil (Photo illustration by Sara Wadford/ABA Journal)

ChatGPT, an artificial intelligence chatbot that can speak and write like humans, can be weak on facts. But it may already be a better wordsmith than some attorneys, says David Kemp, an adjunct professor at Rutgers Law School.

“If you’re asking it to organize several concepts or are struggling to explain something in a way that’s really understandable, it can help,” says Kemp, who also is the managing editor of Oyez, a multimedia website focused on U.S. Supreme Court opinions.

David Kemp is an adjunct professor at Rutgers Law School.

The technology seems to prefer active voice, as does Kemp. He introduced ChatGPT in an advanced legal writing class and plans to include it in a summer course about emerging technology.

Various law schools are following suit. Legal writing faculty interviewed by the ABA Journal agree that ChatGPT's writing can model good sentence and paragraph structure. But some fear it could get in the way of students developing good writing skills of their own.

“If students do not know how to produce their own well-written analysis, they will not pass the bar exam,” says April Dawson, a professor and associate dean of technology and innovation at the North Carolina Central University School of Law.

Additionally, using tools such as ChatGPT for graded assignments may be an ethics violation if students are not producing their own work, Dawson adds.

As for accuracy, some academics think ChatGPT could improve with time.

“It doesn’t have access to legal research platforms at the moment like LexisNexis and Westlaw, so it doesn’t know caselaw that only exists in those databases,” says Ashley Armstrong, an assistant clinical professor at the University of Connecticut School of Law.

Ashley Armstrong is an assistant clinical professor at the University of Connecticut School of Law.

Armstrong wrote an academic paper titled “Who’s Afraid of ChatGPT? An Examination of ChatGPT’s Implications for Legal Writing.” For her research, she asked the chatbot to perform a series of legal research and writing tasks, and she says some of the responses were impressive.

For instance, her paper noted that ChatGPT was able to identify “logical flaws” in contract clauses. Additionally, she wrote, it did a “pretty good job” of summarizing facts and wrote text that sounded lawyerly.

But accuracy was an issue, including in its answers to questions she submitted about Connecticut’s Recreational Land Use Statute.

“I asked it to give me 10 cases I should look into. It did, all of which don’t exist,” says Armstrong, who used Westlaw and LexisNexis to check the cites provided.

Dyane O’Leary, an associate professor of legal writing at Suffolk University Law School, recently gave students in an upper-division practice skills class an assignment to draft an email from a law clerk advising a judge on whether a motion should be granted. In class, after students did their research, they put the same legal question to ChatGPT and evaluated whether its responses amounted to reliable research.


“A student noted that the ChatGPT answers were great at fluff,” says O’Leary, who heads the law school’s legal innovation and technology concentration. “As a class, we discussed that it had a lot of words in the right ballpark, but on this particular prompt, the answer was wrong.”

At Northwestern University Pritzker School of Law, Daniel Linna Jr. gave students in his class focused on the law of AI and robotics an assignment to sign up for ChatGPT, try it out and share their thoughts on the discussion board.

“Almost everyone recognized it’s bad with facts but really good at writing prose,” says Linna, a senior lecturer.

Linna has a joint appointment as director of law and technology at the law school and the university’s Robert R. McCormick School of Engineering and Applied Science, and he is also a former equity partner at Honigman Miller Schwartz and Cohn. He says law firms already use tools powered by technology that is similar to ChatGPT.

“I have no doubt that lawyers who use these tools are drafting better contracts,” says Linna, who is also an affiliated faculty member at CodeX: the Stanford Center for Legal Informatics. “As we improve the tools, they will help us write better contracts faster. It’s not just about efficiency; it’s about drafting terms that improve the speed of getting the deal done, which adds value for clients.”

This story originally appeared on ABAJournal.com on March 6.
