Your Voice

Narcissistic Lawyers and Artificial Intelligence: A band of dysfunctional brothers?


Jennifer Gibbs


“Your honor, no artificial intelligence was used to generate content in this document. However, we would also like to disclose that there is someone in the courtroom who exhibits highly narcissistic traits and, therefore, is likely to lie, project, confabulate and inflate the truth.” With the rise of narcissism, and with society celebrating and rewarding narcissistic traits, is this where we are headed? Should we consider such disclosure requirements?

There are striking and eerie similarities between narcissism and artificial intelligence, raising the question of whether the legal system needs warnings and disclosure requirements (or at least judge- and lawyer-specific training) when dealing with lawyers plagued by the increasingly prevalent disorder known as narcissism.

Disclosure

It all began with one bad trip. In the now-infamous case of Mata v. Avianca, lawyers submitted a brief that relied upon six “hallucinated” legal authorities.

In that case, the lawyers employed ChatGPT to conduct legal research and identify cases that supported their client’s legal position. The lawyers admitted that they did not read the cited cases (which were not real) and apparently assumed that ChatGPT was a search engine. The Avianca case captivated the technology world, where debate has intensified over the dangers posed by AI, including claims that it is an existential threat to humanity, and it has also caught the attention of lawyers and judges worldwide.

In response, several courts issued standing orders to prevent such conduct by parties and their counsel, requiring attorneys and pro se litigants to file a certificate indicating whether any portion of their filings was drafted using generative AI tools.

Existing rules already address competent representation and candor toward the tribunal, so what sits at the heart of these standing orders regarding AI is simple: we just don’t trust it. And we shouldn’t. The same could be said of narcissists, of whom there appears to be no shortage in the legal community.

Human intelligence vs. artificial intelligence

In evaluating human intelligence versus AI, most (humans) agree that human intelligence is superior in contexts and tasks that require empathy, judgment, intuition, subtle forms of communication and imagination. Human intelligence is also often considered better at adapting to new and unexpected situations and at bringing ethical and moral considerations into decision-making.

But when human intelligence is combined with narcissistic personality disorder, or even with highly narcissistic traits, the distinctions become less clear.

Empathy and emotional intelligence

Empathy is typically regarded as a uniquely human trait that includes the ability to understand the emotions of others, to monitor oneself and to maintain and regulate self-other awareness. Emotional intelligence allows for the formation of relationships because most human beings can understand and respond to emotions, show empathy and react accordingly.

In the context of AI, empathy, like the intelligence itself, is artificial. Artificial empathy can mimic and mirror certain aspects of human behavior, but it cannot feel emotions and certainly cannot understand them.

One of the hallmark traits of narcissism is a lack of empathy. And although some narcissists are able to scan people and imitate their behavior, narcissists do not feel or understand the emotions of others, exactly like AI.

In the legal context, attorneys who exhibit a lack of empathy are much more likely to face a malpractice claim and could be perceived as a major financial risk by law firms and malpractice insurers. Empathy is also an important part of a lawyer’s analytical skills, and an increase in empathy among individual lawyers may benefit the overall image of the profession.

Judgment and candor

One aspect of human intelligence that most think cannot be artificially constructed is judgment. In the legal context, judgment and discernment refer to the ability to evaluate the evidence presented in court, assess the credibility of witnesses and reach a sound decision in the case at hand. Discernment and judgment often involve careful consideration of all relevant facts and legal principles to arrive at a fair and just outcome.

AI systems and models, however, tend to engage in certain behaviors, such as manipulation, deception or power-seeking, often without any prompting intended to induce such behavior, which is both frightening and surprising. For example, there is evidence that current-generation AI models have a propensity to be sycophantic and to tell users what they want to hear rather than the truth.

Similarly, when AI systems reach the limits of their training data, they stray beyond it and begin to “hallucinate,” producing confident falsehoods. Some have even compared AI hallucinations to the delusions seen in psychotic disorders. These hallucinations have triggered an influx of standing orders regarding the disclosure of AI use in legal tribunals all over the world.

And speaking of AI hallucinations: when unable to provide an answer, narcissists will lie, confabulate and inflate the truth. A narcissist’s constant preoccupation with the fantasy world not only reinforces their sense of grandiosity but essentially replaces their genuine self with an imaginary, idealized version. Sort of like one’s own imaginary friend.

Because the law holds lawyers to a more demanding standard of conduct than other people when it comes to honesty, ethics and fiduciary duties, narcissistic lawyers can become as much of a problem for the legal process as hallucinating AI systems, and arguably more.

Disciplined lawyers

Statistics show that solo practitioners make up 30% of the legal profession but receive 56% of disciplinary measures, and lawyers at large firms make up 10% of the profession but receive only 2% of disciplinary measures.

Although each of these statistics is attributable to various factors, it is not a far stretch to speculate about the disciplined lawyers who are narcissists or have highly narcissistic traits. For solo practitioners, one likely cause of sanctioned conduct is the absence of a proper dataset: colleagues with emotional intelligence, empathy and ethics who model and reinforce norms. Without that training data to draw on, the AI-like intelligence “hallucinates” proper behavior, landing the attorney (and often the client) in hot water.

Takeaways

Human intelligence is considered superior to AI in many ways, and the court system currently appears to agree. And although AI has imitating abilities, its capacity to hallucinate, manipulate and deceive means it is best used in closed management systems in which the rules are clear and not influenced by outside forces.

Narcissists (and those with highly narcissistic traits) employ a similar type of behavior in which emotional intelligence, empathy, judgment and candor are simply mimicked.

Thus, absent a screening tool to identify narcissism (although there are AI models in the works for that), educating the public and the tribunals about the traits, characteristics and dangers of narcissistic lawyers is more important than ever in maintaining the dignity and civility of the civil and criminal justice systems.


Jennifer Gibbs is a partner in the Dallas office of Zelle, where her practice focuses on first-party property insurance coverage disputes resulting from catastrophic losses, such as hail, wind, ice, water damage, collapse, fire and explosions.




