“That escalated quickly!” A Texas judge has published a “Mandatory Certification Regarding Generative Artificial Intelligence” on his website, requiring attorneys to certify that their filings were either drafted without generative AI or checked for accuracy by a human.
Judge Brantley Starr of the U.S. District Court for the Northern District of Texas has published the following at the top of the Judge-Specific Requirements section of his website:
Mandatory Certification Regarding Generative Artificial Intelligence
“All attorneys appearing before the Court must file on the docket a certificate attesting either that no portion of the filing was drafted by generative artificial intelligence (such as ChatGPT, Harvey.AI, or Google Bard) or that any language drafted by generative artificial intelligence was checked for accuracy, using print reporters or traditional legal databases, by a human being. These platforms are incredibly powerful and have many uses in the law: form divorces, discovery requests, suggested errors in documents, anticipated questions at oral argument. But legal briefing is not one of them. Here’s why. These platforms in their current states are prone to hallucinations and bias. On hallucinations, they make stuff up—even quotes and citations. Another issue is reliability or bias. While attorneys swear an oath to set aside their personal prejudices, biases, and beliefs to faithfully uphold the law and represent their clients, generative artificial intelligence is the product of programming devised by humans who did not have to swear such an oath. As such, these systems hold no allegiance to any client, the rule of law, or the laws and Constitution of the United States (or, as addressed above, the truth). Unbound by any sense of duty, honor, or justice, such programs act according to computer code rather than conviction, based on programming rather than principle. Any party believing a platform has the requisite accuracy and reliability for legal briefing may move for leave and explain why. Accordingly, the Court will strike any filing from an attorney who fails to file a certificate on the docket attesting that the attorney has read the Court’s judge-specific requirements and understands that he or she will be held responsible under Rule 11 for the contents of any filing that he or she signs and submits to the Court, regardless of whether generative artificial intelligence drafted any portion of that filing.”
He also provides a link to “A template Certificate Regarding Judge-Specific Requirements”, which automatically downloads a Word document (so be prepared for that). The template enables the attorney to enter the parties, case number, and attorney name to certify compliance with “all judge-specific requirements for Judge Brantley Starr, United States District Judge for the Northern District of Texas”.
It has only been a matter of days since it was reported that Steven Schwartz, an attorney at Levidow, Levidow & Oberman with over three decades of experience, submitted a filing that cited six bogus cases provided by ChatGPT, after attempting to confirm they were real cases…by asking ChatGPT. I guess one judge’s “Mandatory Certification Regarding Generative Artificial Intelligence” is his way of making sure that doesn’t happen in his court.
So, what do you think? Do you expect more judges to announce similar requirements? Please share any comments you might have or if you’d like to know more about a particular topic.
Disclaimer: The views represented herein are exclusively the views of the authors and speakers themselves, and do not necessarily represent the views held by my employer, my partners or my clients. eDiscovery Today is made available solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscovery Today should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.