
Anthropic Expert Accused of Using AI-Fabricated Source: Artificial Intelligence Trends

Just another fake citation story? This time, it's an Anthropic expert who's accused of using an AI-fabricated source in her declaration.

As reported by Reuters (Anthropic expert accused of using AI-fabricated source in copyright case, written by Blake Brittain and available here), a lawyer representing Universal Music Group, Concord and ABKCO told California Magistrate Judge Susan van Keulen at a hearing that an Anthropic data scientist cited a nonexistent academic article to bolster the company's argument in a dispute over evidence. The underlying lawsuit accuses Anthropic of misusing the music companies' lyrics to train its chatbot Claude.

Judge van Keulen asked Anthropic to respond by Thursday to the accusation, which the company said appeared to be an inadvertent citation error. She rejected the music companies' request to immediately question the expert but said the allegation presented "a very serious and grave issue," and that there was "a world of difference between a missed citation and a hallucination generated by AI."


The expert’s filing cited an article from the journal American Statistician to argue for specific parameters for determining how often Claude reproduces copyrighted song lyrics, which Anthropic calls a “rare event.”

The music companies’ attorney, Matt Oppenheim of Oppenheim + Zebrak, said during the hearing that he confirmed with one of the supposed authors and the journal itself that the article did not exist. He called the citation a “complete fabrication.”

Oppenheim said he did not presume the expert, Olivia Chen, intentionally fabricated the citation, “but we do believe it is likely that Ms. Chen used Anthropic’s AI tool Claude to develop her argument and authority to support it.”

Anthropic attorney Sy Damle of Latham & Watkins complained at the hearing that the plaintiffs were “sandbagging” them by not raising the accusation earlier. He said the citation was incorrect but appeared to refer to the correct article.


The relevant link in the filing directs to a separate American Statistician article with a different title and different authors.

“Clearly, there was something that was a mis-citation, and that’s what we believe right now,” Damle said.

No comment from Chen (at least as of the Reuters article). But, hey, if Anthropic isn't going to trust its own tool, who will? 🤣

Seriously, though, it will be interesting to see how this develops, since Anthropic's counsel seems to be trying to position it as an erroneous link rather than an AI-generated citation. Errors are errors and links should be checked, but most of the other fake citation filings have involved multiple fabricated case citations that were obvious AI errors. This one seems a bit less obvious since it's a single citation to a nonexistent paper, not a fake case. However, if Chen admits it was AI-generated, that changes things.

Hat tip to Judge Andrew Peck and Kelly Twigger for the simultaneous heads-up on this story!

In the meantime, Bob Ambrogi reports two different new fake case citation occurrences in his blog here. And Kelly shared a new standing order via Jeff Bazinet in the Southern District of Texas (my own backyard!) stating: “Any attorney or self-represented litigant who signs a pleading, written motion, or other paper submitted to the Court will be held responsible for the contents of that filing under Rule 11, regardless of whether generative artificial intelligence drafted any portion of that filing.” So, the Courts are getting fed up, even as they are telling lawyers something they should already know. 🙄

So, what do you think? Was the source by the Anthropic expert AI-generated? Will we ever see an end to fake AI-generated sources in filings? Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the authors and speakers themselves, and do not necessarily represent the views held by my employer, my partners or my clients. eDiscovery Today is made available solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscovery Today should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.


