Sunday, December 3, 2023
The Fifth Circuit has published for comment the first federal appellate rule on the use of artificial intelligence (AI) in filings. It would require that the person filing “certify that no generative artificial intelligence program was used in drafting the document presented for filing, or to the extent such a program was used, all generated text, including all citations and legal analysis, has been reviewed for accuracy and approved by a human” as part of the certificate of compliance.
A number of recent incidents have highlighted the danger of relying on AI. As has been widely reported, two New York lawyers were sanctioned for using AI to draft a brief containing seemingly valid citations that precisely made the points they wanted to make for the court but turned out to be entirely fabricated. Recently, the Washington Post reported that a young, overextended Colorado attorney who relied on AI for a brief that also included fictitious citations was sanctioned by the court and fired from his position at the law firm. In another instance described in the same article, a Los Angeles law firm was called out for a similar offense by opposing counsel and fined $999 by the court; it blamed a young lawyer, who resigned from the firm after the fictitious cases were discovered.
The Washington Post article quoted Suresh Venkatasubramanian, a Brown University computer scientist, as saying that what is “surprising is that [AI programs] ever produce anything remotely accurate.” He explained to the Post that these programs are designed to mimic conversation by developing seemingly realistic responses to whatever inquiry is submitted. The program recognizes that a legal brief includes citations to precedent, but it does not read or synthesize the actual cases, so it invents its own.
The topic was part of the discussion with state chief justices at a National Center for State Courts meeting I was privileged to moderate just before Thanksgiving. One chief justice told me in private conversation afterwards that she was surprised it happens at all, because she could not imagine a lawyer filing a brief that relies on a case the attorney had not read.
The Fifth Circuit’s proposed rule appears to make that the standard. Within the Fifth Circuit, Judge Brantley Starr of the U.S. District Court for the Northern District of Texas has already amended the rules for filings in his court to require a certificate attesting that the filing contains nothing drafted by AI or that a human being checked any language drafted by AI for accuracy. The judge’s rule calls AI platforms “incredibly powerful” with many uses in the law, such as “form divorces, discovery requests, suggested errors in documents, anticipated questions at oral argument.” One thing he insists they are not useful for is legal briefs.
Judge Starr explains that, at least as currently devised, AI is “prone to hallucinations and bias.” To put it plainly, he says “they make stuff up—even quotes and citations.”
Judge Starr also worries about bias in the programming. He explains that “attorneys swear an oath to set aside their personal prejudices, biases, and beliefs to faithfully uphold the law and represent their clients.” Neither a computer program nor those who devised it adhere to such an oath. He states, “[t]hese systems hold no allegiance to any client, the rule of law, or the laws and Constitution of the United States (or, as addressed above, the truth).”
The judge is prepared to be convinced otherwise. He has put out a challenge: “Any party believing a platform has the requisite accuracy and reliability for legal briefing may move for leave and explain why.” Until that happens, the judge “will strike any filing from a party who fails to file a certificate” and is prepared to impose Rule 11 sanctions for an inaccurate filing.
These early rule proposals are likely to proliferate, particularly because online legal research systems such as Westlaw, Lexis, and Casetext now also offer AI-based research assistance that may blur the line between lawyer and computer in unpredictable ways. Appellate practice is changing once again.