Suit Filed Against ChatGPT for Libel Due to Hallucination: Artificial Intelligence Trends

Hat tip to Ralph Losey for reporting on what is likely the first suit filed against ChatGPT for libel, after it reportedly accused of fraud a party who wasn’t even part of the case.

As Ralph notes in his excellent e-Discovery Team® blog (News Flash – First Suit Filed Against ChatGPT For Hallucinatory Libel, available here), the allegations are that ChatGPT (version not identified) libeled the plaintiff by hallucinating and telling a journalist about a lawsuit that did not, and does not, exist. In this suit filed in Georgia, the plaintiff, Mark Walters (actually a radio host and founder of Armed American Radio, whose website describes him as “the loudest voice in America fighting for gun rights”), had supposedly been sued for “breach of fiduciary duty, fraud, and other claims arising from Walters’ misappropriation of SAF’s funds and assets for his own benefit, and his manipulation of SAF’s financial records and bank statements to conceal his activities.”

Here’s a link to the libel complaint from Ralph’s site.


This information was provided by ChatGPT to journalist Fred Riehl when he asked it on May 4 to summarize Second Amendment Foundation v. Ferguson, a case filed in Washington federal court accusing the state’s Attorney General Bob Ferguson of abusing his power with regard to the gun rights foundation. According to the complaint, Riehl provided “a (correct) URL of a link to the complaint on the Second Amendment Foundation’s web site.”

As Walters’ complaint states, Riehl asked ChatGPT to provide a summary of the accusations in the complaint and ChatGPT responded that the document “[I]s a legal complaint filed by Alan Gottlieb, the founder and executive vice president of the Second Amendment Foundation (SAF), against Mark Walters, who is accused of defrauding and embezzling funds from the SAF. The complaint alleges that Walters, who served as the organization’s treasurer and chief financial officer, misappropriated funds for personal expenses without authorization or reimbursement, manipulated financial records and bank statements to conceal his activities, and failed to provide accurate and timely financial reports and disclosures to the SAF’s leadership. The plaintiff seeks various forms of relief, including the recovery of misappropriated funds, damages for breach of fiduciary duty and fraud, and removal of Walters from his position as a member of the SAF’s board of directors.”

The problem? As Walters’ complaint states, “Every statement of fact in the summary pertaining to Walters is false,” including that he was a party to the Washington case at all, or that he had any employment or official relationship with SAF, much less that he misappropriated its funds.

Walters’ complaint states that Riehl asked ChatGPT to provide him with a copy of the portion of the complaint related to Walters, which it did, and that he then asked ChatGPT to provide the entire text of the complaint, which it also did (that text is Exhibit 1 to Walters’ complaint). Riehl contacted Alan Gottlieb (the plaintiff in the Washington case) regarding ChatGPT’s allegations concerning Walters, and Gottlieb confirmed that they were false.


Walters seeks general and punitive damages in an amount to be determined at trial.

I looked, but couldn’t find any article in which Riehl reported on the information provided by ChatGPT, and Riehl is not a party to the lawsuit, per the complaint linked above.

As I said, this is likely the first suit filed against ChatGPT for libel (at least, that’s what is being widely reported). A mayor in Australia threatened to sue OpenAI for defamation over false claims that he served time in prison for bribery, but I don’t know whether he ever did. It will be interesting to see whether OpenAI will rely on its strongly worded indemnification clause for protection here.

So, what do you think? Are you surprised about the suit filed against ChatGPT for libel? Or are you surprised it took this long? Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the authors and speakers themselves, and do not necessarily represent the views held by my employer, my partners or my clients. eDiscovery Today is made available solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscovery Today should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.


  1. Australia: OpenAI received a further extension beyond the original 28 days to respond to Hood’s concerns. If not satisfactory, then Hood would be able to take the company to court under Australian law.

    But while Australian law is friendly to defamation plaintiffs, Hood’s chances of success are by no means clear. It is a complicated set of issues: how the problem even arose in the first place, what options exist for protecting one’s reputation, and how one can avoid accidentally defaming someone with generative AI.

    I will try to explain the Hood case, the Walters case, etc. in a detailed blog post next week. But my summary points are available in the “Comments” section to Ralph’s blog post.

  2. Thanks, Gregory! I had a feeling you would have more info about Hood’s case! 🙂 I look forward to reading your in-depth discussion of the cases!
