AI hallucination cases are not slowing down. Far from it. In fact, they're speeding up, and speeding up a lot.
Since I first covered the AI hallucination cases site maintained by Damien Charlotin on June 9th this year (a little over six months ago), the number of reported cases with AI hallucinations has risen (as of yesterday) from 149 to 690! That’s 541 cases identified in a little over six months!
Some of you might think: “We’re just doing a better job of identifying the cases now.” Perhaps, but then why haven’t we identified a bunch of additional cases from last year, or from 2023, when Mata v. Avianca was one of the first cases identified?
The numbers for the past three years are startling. In 2023, there were only 15 AI hallucination cases identified. Last year, that number more than tripled to 53 AI hallucination cases.
Could that number triple again? Tripling doesn’t even come close. This year, a whopping 622 cases have been identified so far – and the year isn’t even over yet. That’s more than 11 times the number of cases from last year! This graph illustrates just how dramatic the rise has been.

As I mentioned in this post, sanctions (even severe sanctions) have provided no deterrent. Back then (just over two months ago), the number of AI hallucination cases was “only” 439. That means 251 new cases have been identified in just the last couple of months.
The numbers will continue to rise until we change our thinking about these AI hallucination filings and address their root cause. This is an automation bias problem – training and education are the only way to make a difference.
So, what do you think will reduce the number of AI hallucination cases? Please share any comments you might have, or let me know if you’d like to hear more about a particular topic.
Image created using Microsoft Designer, using the term “bewildered robots driving into a lake in a car while looking at GPS”.
Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by my employer, my partners or my clients. eDiscovery Today is made available solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscovery Today should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.