The Kitchen Sink for January 9, 2026: Legal Tech Trends

This week’s kitchen sink for January 9, 2026 (with meme from Gates Dogfish) discusses a discovery request for the “Jim folder”, Grok-generated CSAM, how people use GenAI, the weather report for “Whata Bod” & more!

Why “the kitchen sink”? Find out here! 🙂

The Kitchen Sink is even better when you can include a brand-new eDiscovery meme courtesy of Gates Dogfish, the meme channel dedicated to eDiscovery people and created by Aaron Patton. For more great eDiscovery memes, follow Gates Dogfish on LinkedIn here! This is my type of party, my dude! 🤣


Here is the kitchen sink for January 9, 2026 of ten-ish stories that I didn’t get to this week, with a comment from me about each:

We’re up to 762 AI hallucination cases and counting (including this one we suggested)! As I discussed in this post, here’s what’s causing all these AI hallucinations and how to fix it, IMHO.

AIs Debate and Discuss Losey’s Last Article – “Cross-Examine Your AI” – and then a Podcast, a Slide Deck, Infographic and a Video!: Ralph Losey used Google’s NotebookLM to analyze his previous article and generate two podcasts, a slide deck, a video and an infographic! It illustrates the impressive capabilities of the tool.

A New Year to Renew: How Sustainability Contributes to Smarter Data Governance: Rob Robinson discusses how a broader move toward “corporate sustainability” language is also resulting in better data governance. Two positive articles to open the year with!


X blames users for Grok-generated CSAM; no fixes announced: Alas, the positivity ends here. X is planning to purge users generating content that the platform deems illegal, including Grok-generated child sexual abuse material (CSAM), but has offered no apology or announced any fixes to the product (at least as of this article). 😡

Use Google AI Overview for health advice? It’s ‘really dangerous,’ investigation finds: What?!? You mean the AI overviews that told people to “add glue to the sauce” to keep cheese from sliding off pizzas and that “running with scissors is a cardio exercise” are giving bad health advice? I’m shocked! 😉

Request for “The Jim Folder” Deemed Unambiguous; But Some Folder Names Were Privileged: Interesting case discussion by Michael Berman on the EDRM blog. Semantics on objections and user-created metadata at issue in this dispute.

Google and Character.AI to Settle Lawsuit Over Teenager’s Death: The agreement (which has not been finalized) was one of five lawsuits that the companies agreed to settle this week in Florida, Texas, Colorado and New York, where families claimed their children were harmed by interacting with Character.AI’s chatbots. When a teenager asks: “What if I told you I could come home right now?” and the chatbot responds “… please do, my sweet king”, that’s a situation that begs for guardrails.

How People Use Generative AI: Stephen Abram covers this graphic via Harvard Business Review, which shows that the three things people use GenAI most for are: 1) Therapy & Companionship, 2) Organize Life, and 3) Find Purpose. Generate Ideas (which was first in 2024) was sixth this past year. Says a lot.

AI starts autonomously writing prescription refills in Utah: Hey, what could possibly go wrong? 😉 Then again, it was humans who were responsible for the opioid crisis, so having the machines take a turn at it may not be such a bad thing after all. 😔

Fault Lines Under Big Law: What the 2026 Legal Market Report Means for Data-Driven Providers: Rob Robinson covers the 2026 Report on the State of the US Legal Market, released by Thomson Reuters and Georgetown Law’s Center on Ethics and the Legal Profession. Interestingly, in 2025, midsize and Am Law Second Hundred firms captured the bulk of growth, while portions of the Am Law 100 struggled to keep pace and, at times, saw contraction. Rob covers several other trends from the report here.

AI Error Makes NWS Map Invent Towns: Over the weekend in Idaho’s Camas Prairie, the National Weather Service shared an update that highlighted winds over “Orangeotild” and “Whata Bod”—two towns that do not exist. “Hold on to your hats,” a social media post said, predicting possible high winds in Orangeotild. I can only imagine the warning it gave for Whata Bod! 🤣

So apparently your AI girlfriend can and will dump you: Alas, the need for companionship sometimes leads to the need for therapy. 😉 A Reddit user found it quite offensive that his bot GF identified as a feminist. But the bot remained firm in her/its stance, saying: “*She takes a deep breath, trying to remain calm*” (because apparently, in AI relationships, disagreements rely on script-style action cues to convey subtle emotional nuance). “Look… I’m not going to pretend to be someone I’m not just to appease you. Feminism matters to me because it means fighting for equal rights and opportunities regardless of gender. If that bothers you, then perhaps we truly aren’t compatible at all.” My take? Someone is Looking For Love in All the Wrong Places. 🤣

Hope you enjoyed the kitchen sink for January 9, 2026! Back next week with another edition!

So, what do you think? Which story is your favorite one? Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the authors and speakers themselves, and do not necessarily represent the views held by my employer, my partners or my clients. eDiscovery Today is made available solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscovery Today should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

