AI Hallucination Cases: A Compiled List

How many AI hallucination cases do you think there are? 15? 25? 45? The answer is WAY higher than that, according to this compiled list!

Someone had to start compiling a list of AI hallucination cases at some point, and that “someone” is Damien Charlotin, an independent practitioner at Pelekan Data Consulting in Paris(!). He’s also a Research Fellow with HEC’s Smart Law Hub, where he directs research teams on academic projects related to large language models (LLMs) and other AI topics.

Charlotin has created a database here that tracks legal decisions in cases where generative AI produced hallucinated content – typically fake citations, but also other types of arguments – or at least where GenAI is suspected to be responsible for fake content. It’s not just limited to the US – there are several international cases in there too. He also notes that “it is a work in progress and will expand as new examples emerge.”

Indeed, it will! How many cases have been identified so far (as of last night)? 149! There were 39 new cases identified in May alone!

Oh, and believe it or not, the Mata v. Avianca case isn’t the oldest one on the list! There was a ruling eight days earlier, on June 14, 2023, in Scott v. Federal National Mortgage Association, in which the pro se plaintiff submitted an opposition to the defendant’s motion to dismiss containing several fake case citations and quotations, which the Court surmised were generated by AI.

Charlotin’s list includes the case name (with a link to the actual case ruling involving AI hallucinations in most cases), the court/jurisdiction, the date of the ruling, the type of party using AI (e.g., lawyer, pro se litigant, expert), the AI tool implicated (if known), the nature of the hallucination, the outcome/sanction, the monetary penalty (if any), and, for most cases, a link to additional details. You can also download a current list of the cases in CSV format.
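Since the database can be downloaded as a CSV, you can run your own tallies on it. Here’s a minimal Python sketch, assuming the file has been saved locally as ai_hallucination_cases.csv and that it contains a ruling-date column – the filename and column name below are illustrative guesses, so check the actual headers in your download:

```python
# Minimal sketch: load the downloaded CSV and count cases per month.
# "ai_hallucination_cases.csv" and the "Date" column name are assumptions --
# check the actual filename and headers in the file you download.
import pandas as pd

df = pd.read_csv("ai_hallucination_cases.csv")

# Parse the ruling date and tally new cases by month.
df["Date"] = pd.to_datetime(df["Date"], errors="coerce")
cases_per_month = df.groupby(df["Date"].dt.to_period("M")).size()
print(cases_per_month.tail(6))
```

A quick month-by-month count like this makes it easy to see the kind of acceleration described below.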

I say “current” because the list is expanding – rapidly. My colleague, Judge Andrew Peck, brought the list to my attention on Thursday (hat tip!), and at the time he sent me the link, there were 138 cases on the list, so we’ve seen 11 new cases added to the list in three days!

You can contact Charlotin here if you know of a case that should be included. He also says on his main page: “If I don’t reply within 5 minutes, I’m either dead, or I don’t like you (you’ll soon find out).”

People had asked me a few times recently if I knew of a comprehensive list of AI hallucination cases. Now I do – and you do – thanks to Charlotin and Judge Peck! Here’s a link to it!

So, what do you think? Are you surprised there are so many AI hallucination cases? Please share any comments you might have or if you’d like to know more about a particular topic.

Image created using Microsoft Designer, with the term “robot lawyer slipping on a small banana peel in court”.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by my employer, my partners or my clients. eDiscovery Today is made available solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscovery Today should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

