Those social media algorithms that recommend content based on your activity could come under scrutiny, with two SCOTUS cases set for oral arguments early next year.
Both cases address whether recommender systems are covered by the liability protections of Section 230 of Title 47 of the United States Code, which was enacted as part of the Communications Decency Act and generally provides immunity to website platforms with respect to third-party content. The cases are:
Gonzalez v. Google LLC (Docket 21–1333):
Among those killed in the series of coordinated terrorism attacks carried out by ISIS in Paris in November 2015, which claimed at least 130 lives, was American Nohemi Gonzalez, a 23-year-old student. Her family sought legal remedies against Google, the parent company of YouTube. Their suit argued that, through its recommendation system that tailors content based on user profiles, YouTube led users toward ISIS recruitment videos and was therefore partially responsible for Nohemi’s death. Google defended itself by relying on Section 230, passed as part of the Telecommunications Act of 1996, which provides immunity for content published on an Internet service provider’s platform by third-party users. A lower court ruled in favor of Google, and the decision was upheld by the Ninth Circuit Court of Appeals.
In their appeal to the Supreme Court, the family focused on the YouTube algorithm that tailors delivered content to what it believes will interest the end user, arguing that even though this is done automatically, it is a form of moderation that Section 230 does not fully cover. They wrote in their petition to the Supreme Court: “Whether Section 230 applies to these algorithm-generated recommendations is of enormous practical importance. Interactive computer services constantly direct such recommendations, in one form or another, at virtually every adult and child in the United States who uses social media.”
Twitter Inc. v. Taamneh (Docket 21–1496):
Jordanian citizen Nawras Alassaf died in 2017 during an ISIS-affiliated attack in Istanbul. Arguing that the companies failed to control terrorist content on their sites, Alassaf’s family sued Twitter, Google and Facebook.
On appeal, the Ninth Circuit did not consider Section 230 protections in the case and held that Twitter, Google and Facebook could be liable under the Anti-Terrorism Act (18 U.S.C. § 2333). Twitter then petitioned SCOTUS for review, arguing that the Ninth Circuit’s decision improperly expanded the scope of the Act.
The Supreme Court granted certiorari in both cases in October 2022, and both are currently scheduled for oral arguments in February 2023 (February 21 and 22, respectively). The Gonzalez case has already attracted a ton of amicus briefs, and it will be interesting to see whether these two SCOTUS cases lead to limits on the protections provided by Section 230!
Hat tip to Nick Wittenberg of Deloitte for the story!
So, what do you think? Will these two SCOTUS cases change the scope of protections for sites with recommendation algorithms? Please share any comments you might have, or let me know if you’d like to know more about a particular topic.
Disclaimer: The views represented herein are exclusively the views of the authors and speakers themselves, and do not necessarily represent the views held by my employer, my partners or my clients. eDiscovery Today is made available solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscovery Today should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.