Here’s the kitchen sink for March 22, 2024 of ten stories that I didn’t get to this week – with another brand-new meme from Gates Dogfish!
Why “the kitchen sink”? Find out here! 🙂
The Kitchen Sink is even better when you can include a brand-new eDiscovery meme courtesy of Gates Dogfish, the meme channel dedicated to eDiscovery people and created by Aaron Patton of Trustpoint.One. For more great eDiscovery memes, follow Gates Dogfish on LinkedIn here! Aaron’s meme this week provides both a history lesson and an illustration of how clauses in ESI protocols can become archaic. 😀
Here is the kitchen sink for March 22, 2024 of ten stories that I didn’t get to this week, with a comment from me about each:
Hackers can read private AI-assistant chats even though they’re encrypted: Whomp, whomp! Hate to start off with a downer, but there it is. As the article states: “Someone with a passive adversary-in-the-middle position—meaning an adversary who can monitor the data packets passing between an AI assistant and the user—can infer the specific topic of 55 percent of all captured responses, usually with high word accuracy. The attack can deduce responses with perfect word accuracy 29 percent of the time.” The only one NOT affected? Google Gemini.
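The mechanism behind that attack is worth a quick illustration. When an assistant streams its reply one token per encrypted packet, encryption hides the content but not the size, so a passive observer can recover the sequence of token lengths and use it to infer the topic. Here's a minimal sketch of the idea (my own illustrative code, not the researchers' actual method; the 16-byte per-record overhead is an assumed placeholder):

```python
# Illustrative sketch of the token-length side channel: each streamed
# token becomes one encrypted record whose ciphertext length tracks the
# plaintext token length, so packet sizes leak a "length fingerprint"
# of the reply even though the text itself stays encrypted.

def token_lengths_visible_to_eavesdropper(tokens, overhead=16):
    # Assume roughly constant per-record encryption overhead
    # (16 bytes here is a placeholder, not a measured value).
    packet_sizes = [len(t.encode()) + overhead for t in tokens]
    # The eavesdropper subtracts the constant overhead to recover
    # each token's length without ever decrypting anything.
    return [size - overhead for size in packet_sizes]

reply = ["The", " patient", " should", " consult", " a", " doctor"]
print(token_lengths_visible_to_eavesdropper(reply))
# → [3, 8, 7, 8, 2, 7]  (the reply's length fingerprint)
```

Batching tokens or padding records to a fixed size (reportedly what Gemini does differently) destroys this fingerprint, which is consistent with Gemini being the one assistant not affected.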
Law360: “Judge Applauds Attys’ ‘Very Awesome’ Use Of Google AI Bot”: A good story for Google is followed by a bad one from Michael Berman on the EDRM blog. In “a sprawling consumer action accusing [Google] of using its advertising product, Google Analytics, to illicitly track, gather and monetize consumers’ private health data without their consent anytime they visit a healthcare provider’s website, through the use of a software cookie on a device,” apparently “Google Bard responded with lengthy explanations of why Google Analytics should not be used by such healthcare providers due to privacy concerns, according to the complaint.” The judge in the case called that “very awesome”. Oops.
ASCII art elicits harmful responses from 5 major AI chatbots: Why haven’t AI chatbots accounted for such a simple hack? ¯\_(ツ)_/¯
‘Discovery may move forward immediately’: Judge allows major negligence lawsuit against Apple to advance after complaint says AirTags have helped ruin lives: The (lengthy) title says it all – here are two cases where AirTags were used to stalk and kill people (sadly).
U.S. Unveils Historic Sanctions Against Intellexa Spyware for Endangering Privacy and National Security: Russia-based? China? Nope, they’re based in Greece, as Rob Robinson reports in his excellent ComplexDiscovery blog. Believe it or not, this is “the U.S. government’s first foray into directly targeting the murky spyware industry”!
YouTube will require disclosure of AI-manipulated videos from creators: And that will certainly stop it because people who like to manipulate videos always follow the rules. 😀
OpenAI’s chatbot store is filling up with spam: Another lack-of-moderation story to go with the one I covered yesterday. “TechCrunch found that the GPT Store, OpenAI’s official marketplace for GPTs, is flooded with bizarre, potentially copyright-infringing GPTs…[that] serve as little more than funnels to third-party paid services”. 😮
Lawmakers pass milestone privacy bill overshadowed by TikTok fever: Wow, I would have completely missed this if Tom O’Connor hadn’t shared this story on LinkedIn. The data-privacy bill, which was introduced together with the TikTok ban bill and passed Wednesday, would stop certain companies from selling information to “foreign adversaries,” including China. It passed the House on Wednesday, 414-0! When does that ever happen?
Stochastic Parrots: the hidden bias of large language model AI: Polly want a chatbot response? Ralph Losey discusses the perfect term to describe what large language models do – and what they do not know (i.e., the meaning of their own responses) – in this transcribed video on the EDRM blog.
Microsoft’s Copilot Sets a New Course in AI-Driven Cybersecurity: Rob Robinson covered this announcement yesterday: “The tech titan’s latest offering, Copilot for Security, is the first GenAI solution custom-built for the cybersecurity domain, signaling a paradigm shift in how enterprises safeguard their digital assets.” Given the first, third and fifth items in this week’s “kitchen sink” – we need a paradigm shift! Will this deliver? We’ll see.
Hope you enjoyed the kitchen sink for March 22, 2024! Back next week with another edition!
So, what do you think? Is this useful as an end-of-the-week wrap-up? Please share any comments you might have, or let me know if you’d like to hear more about a particular topic.
Image Copyright © Twentieth Century Fox
Disclaimer: The views represented herein are exclusively the views of the authors and speakers themselves, and do not necessarily represent the views held by my employer, my partners or my clients. eDiscovery Today is made available solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscovery Today should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.
Discover more from eDiscovery Today by Doug Austin