Lawyers should take these precautions when using artificial intelligence, Florida ethics opinion says
Generative artificial intelligence—which can generate new content based on a prompt—has the potential to “dramatically improve the efficiency of a lawyer’s practice,” but it also can pose ethical concerns, according to a Florida Bar ethics opinion approved Jan. 19.
One of the pitfalls is that generative AI can “hallucinate” or create “inaccurate answers that sound convincing,” the opinion says, citing an October 2023 article from ABAJournal.com.
Because of these concerns, lawyers using generative AI should develop policies for reasonable oversight of the technology, the opinion says.
“Lawyers are ultimately responsible for the work product that they create regardless of whether that work product was originally drafted or researched by a nonlawyer or generative AI,” the opinion says.
The opinion also warns that lawyers may not delegate any act that could constitute the practice of law to generative AI. Acts that can’t be delegated would include the negotiation of claims or other functions that require a lawyer’s personal judgment and participation.
Nonlawyers are allowed to conduct initial interviews with a prospective client, but using an “overly welcoming” generative AI chatbot for this function could pose problems, the opinion says. The chatbot could wrongly offer legal advice, fail to identify itself as a chatbot at the outset, and fail to include disclaimers limiting formation of an attorney-client relationship.
The opinion also says that lawyers should:
• Preserve the confidentiality of client information. Self-learning AI programs continue to develop responses with new input. The danger is that a client’s information revealed in a lawyer query could be stored in the AI program and revealed in response to future inquiries by third parties.
“It is recommended that a lawyer obtain the affected client’s informed consent prior to utilizing a third-party generative AI program if the utilization would involve the disclosure of any confidential information,” the opinion says.
In-house generative AI programs may mitigate confidentiality concerns. If a third party does not host or store AI data, a lawyer is not required to obtain the client’s informed consent, the opinion concludes.
• Ensure that fees and costs are reasonable. Generative AI programs may make a lawyer’s work more efficient, but lawyers should not use the efficiency to falsely inflate claims of billable time.
“Lawyers may want to consider adopting contingent fee arrangements or flat billing rates for specific services, so that the benefits of increased efficiency accrue to the lawyer and client alike,” the opinion says.
• Comply with applicable ethics and advertising regulations. Lawyers can’t advertise that their generative AI is better than technology used by other lawyers unless the claim is verifiable. Lawyers who use AI chatbots for advertising and intake will be responsible if the chatbot provides misleading information to prospective clients or if its communications are “inappropriately intrusive or coercive,” the opinion says.
The opinion warns that AI is still in its infancy, and the ethical concerns addressed should not be treated as exhaustive.
Florida Bar News, Reuters and Bloomberg Law covered the ethics opinion.
Reuters noted that California has issued AI guidance that is likewise grounded in ethics obligations, and that bar associations in at least six other states are considering recommendations for lawyers’ responsible use of AI, the article says.
Bloomberg Law spoke with Brian David Burgoon, chair of the ethics committee that developed the opinion.
“It’s a game-changer in the practice of law,” Burgoon told Bloomberg Law.
AI can provide a competitive edge to lawyers who use it responsibly, but ethical guidance is needed, he said.
“There’s good tools out there, but there are some bad problems that can come with them,” he told the publication.