Is Disclosure and Certification of the Use of Generative AI Really Necessary?

Image: Kaylee Walstad, EDRM

[Editor’s Note: This article is to appear in Judicature, Vol. 107, No. 2, October 2023 (Forthcoming), published here with permission. Download a PDF version here. The opinions and positions are those of the authors.]

Abstract

Stories about Generative AI (“GenAI”) applications such as ChatGPT have dominated the news for much of 2023. Whether this technology is a blessing or a curse is still open to debate; what is not, however, is that GenAI is here to stay and that billions of dollars are pouring into the development of new applications. While panic spread through the educational community over fear that students would use GenAI to complete their assignments and examinations, there were also some mortifying and highly publicized misuses of GenAI in the legal profession, including attorneys filing pleadings that cited fictitious legal authority produced by GenAI hallucinations. Concerns about the misuse of GenAI in their own courtrooms prompted several judges in North America to issue individual standing orders requiring disclosure of, and certifications related to, the use of GenAI specifically, or AI more generally, in connection with legal filings. While an understandable reaction, these orders have lacked consistency in what they cover or require, have been over-broad in scope, and have the potential to cause uncertainty and confusion within the bar, as well as to chill legitimate uses of GenAI to increase access to the courts by self-represented litigants and to reduce the costs and burdens associated with legal research and writing.

Now that counsel have been warned in two highly publicized recent cases, a lawyer who does not adequately understand the risks inherent in using GenAI to produce either factual or legal content to be included in a court filing, and who fails to independently verify the accuracy of factual matters or legal authority obtained from GenAI, will have failed to represent their client competently.

Grossman, Maura R., Grimm, Paul W., and Brown, Daniel G., Is Disclosure and Certification of the Use of Generative AI Really Necessary? (August 11, 2023). Judicature, Vol. 107, No. 2, October 2023 (Forthcoming).


This article addresses issues related to the use of GenAI in the justice system and the proactive efforts of individual judges to prevent its misuse in their courtrooms. We focus on the professional lapses that prompted the courts’ reactions, provide examples of the types of orders that judges have issued, explain how GenAI applications operate, why they can hallucinate, and discuss the potential problems, including confusion, increased costs, and the potential chilling effects that accompany such standing orders. We explain why ad-hoc orders may discourage appropriate use of GenAI to make the courts more accessible and the practice of law more efficient. We argue that existing rules of practice and procedure and rules of professional conduct already prohibit this misconduct, and that existing authority contains an equivalent if not stronger deterrent for the misuse of GenAI, without the concomitant downsides of the standing orders. As an alternative, we recommend that courts consider adopting local rules—enacted after public notice and an opportunity for comment—that would apply court-wide, instead of the rapidly developing mosaic of individual standing orders for individual courtrooms. Local rules could address the problem in a more nuanced way without the unintended consequences. Finally, we explain how courts can address the public at large and pro-se litigants in particular, through their websites, to explain the proper and improper use of AI and GenAI applications in court cases. 

Read the entire paper by hovering over the embedded PDF and scrolling to advance the pages, or download the PDF at the bottom of this post:

Authors

  • Dr. Maura R. Grossman

    Maura R. Grossman, J.D., Ph.D. is a professor in the David R. Cheriton School of Computer Science at the University of Waterloo. Dr. Grossman is also an adjunct professor at Osgoode Hall Law School of York University and an affiliate faculty member of the Vector Institute of Artificial Intelligence.

  • Hon. Paul Grimm (ret.)

Paul W. Grimm is the David F. Levi Professor of the Practice of Law and Director of the Bolch Judicial Institute at Duke Law School. From December 2012 until his retirement in December 2022, he served as a district judge of the United States District Court for the District of Maryland, with chambers in Greenbelt, Maryland. From 1997 to 2012, he was a magistrate judge in the same court, serving as chief magistrate judge from 2006 through 2012. He is an elected member of the American Law Institute and has served as an adjunct professor of law at the University of Baltimore School of Law and the University of Maryland Carey School of Law, where he taught courses on evidence and discovery. He has also written extensively and taught courses for lawyers and judges in the United States and around the world on topics relating to e-discovery, technology and law, and evidence. Judge Grimm served on the Advisory Committee for the Federal Rules of Civil Procedure from 2009 to 2015 and chaired its discovery subcommittee, which crafted, in part, the 2015 amendments to the Federal Rules of Civil Procedure. He graduated with an A.B. (with highest honors) from the University of California–Davis in 1973. He received his J.D., magna cum laude and Order of the Coif, from the University of New Mexico in 1976, and an LL.M. (Master of Judicial Studies) from Duke University in 2016. Judge Grimm served both on active duty and in the Army Reserve as a Judge Advocate General’s Corps officer and retired at the rank of lieutenant colonel.

  • Dr. Daniel G. Brown

    Daniel G. Brown, Ph.D., is a professor in the David R. Cheriton School of Computer Science at the University of Waterloo. He performs research on computational creativity, music information retrieval, and bioinformatics.