Elissa D. Hecker - Editor

Supreme Court Rules on Twitter and Google Cases

By Barry Skidelsky

Barry Skidelsky, a former EASL Section Chair who organized and moderated a CLE panel on the Regulation of Social Media and Online Content at the NYS Bar Association's Annual Meeting in January 2023, is a NYC-based attorney and consultant with a business background and expertise in entertainment, media, telecommunications, and technology. Barry can be contacted at bskidelsky@mindspring.com or 212-832-4800.


This article addresses a companion pair of United States Supreme Court rulings released on May 18, 2023 in the Twitter and Google technology law cases, which were among those discussed during the New York State Bar Association's Annual Meeting held in January.[1]


The country's highest court sided with the tech industry and digital rights groups. In Twitter v. Taamneh et al (https://www.supremecourt.gov/opinions/22pdf/21-1496_d18f.pdf), it ruled that the social media platform is not subject to claims that it aided and abetted terrorism for hosting tweets from ISIS. Relying in part on the Twitter decision, it also disposed of the companion case Gonzalez et al v. Google (https://www.supremecourt.gov/opinions/22pdf/21-1333_6j7a.pdf), which concerned YouTube's content moderation policies and practices, without reaching the merits.


In the Google case, the Court declined to address the application of Section 230 of the Communications Decency Act (47 U.S.C. § 230), the provision that largely shields online platforms and websites from lawsuits over content posted by third-party users.


This means that Section 230 remains unchanged for now. However, as attendees at EASL's 2023 Annual Meeting and readers of the EASL Section's Journal know,[2] this issue, among others, is teed up for Supreme Court review in the NetChoice litigations, which challenge state laws enacted in Florida and Texas (NetChoice et al v. Moody and NetChoice v. Paxton, respectively) that sought to restrict how online platforms moderate content and speech.


Now that the Twitter and Google cases have been decided, it is widely expected that the Supreme Court will soon move forward on these NetChoice cases, taking these most recent rulings into account.[3] It remains to be seen if or how Section 230 will next be addressed by the Supreme Court. To date, the high court has indicated that any Section 230 reform is best left to Congress rather than to the judiciary, yet multiple legislative efforts to repeal or reform the law (which was first enacted in 1996 and is widely seen as a significant driver of today's internet-based ecosystem) have failed because of First Amendment concerns, partisan politics, and other reasons.


The Supreme Court's Twitter decision was unanimous and written by Justice Thomas, who said that social media platforms are little different from other digital technologies. He wrote: "It might be that bad actors like ISIS are able to use platforms like defendants' for illegal and sometimes terrible ends, but the same could be said of cell phones, email or the internet generally." This reflects the Court's struggles during oral argument to grapple with the complexities of internet or online speech, including how to identify what speech deserves protection or triggers liability.


The companion Google case was addressed by a brief, unsigned per curiam opinion that stated in relevant part: "Much (if not all) of plaintiffs' complaint seems to fail under either our decision in Twitter or the Ninth Circuit's unchallenged holdings below. We therefore decline to address the application of §230 to a complaint that appears to state little, if any, plausible claim for relief. Instead, we vacate the judgment below and remand the case to the Ninth Circuit to consider plaintiffs' complaint in light of our decision in Twitter."


Google and other tech companies have expressed concerns that a restrictive interpretation of §230 would increase the legal risks associated with ranking, sorting and curating online content (all basic features of the modern internet), and would result in websites seeking to play it safe by either removing far more content than is necessary or by giving up on content moderation entirely, thus allowing even more harmful material on digital technology platforms and websites.


Amicus filings in support of Google's position suggested that the stakes were not limited to the tech giant's algorithms (which arguably represent the company's own speech, not that of others), but could also affect virtually anything on the web that might be construed as making a recommendation. That might mean that even average internet users who volunteer as moderators on various sites could face significant legal risks, as argued in an amicus filing made by Reddit.


The original co-authors of §230, Oregon Democratic Sen. Ron Wyden and former California Republican Rep. Chris Cox, argued to the Court that Congress's intent in passing the law was to give websites broad discretion to moderate content as they saw fit. They chose a light-touch regulatory approach to allow a nascent digital transformation of global proportions to blossom, but the question now is how the law (which traditionally has lagged behind technological change) can possibly get ahead of that transformation for the benefit of all stakeholders.


In the end, it remains to be seen if, when, and how Congress and/or the courts may better address the societal impacts of these and other emerging (yet rapidly evolving) technology-related legal issues, including artificial intelligence, the internet of things, algorithmic discrimination, universal broadband, data privacy, cybersecurity, online voting, and who knows what else may come down the line. One thing is clear: each of us will be affected in every corner of our lives as we try to move ahead in this brave new world. Just in case, I'm keeping my pencils sharpened!


[1] See Spring 2023 NYSBA Entertainment, Arts and Sports Law Journal (Vol. 34, No. 1) for Transcripts and Addendum.

[2] See Fall 2022 NYSBA Entertainment, Arts and Sports Law Journal (Vol. 33, No. 3) for the underlying on-topic article.

[3] Likewise, it is expected that the Twitter and Google cases will be taken into account in deciding Volokh et al v. James, which challenges New York's Online Hate Speech Law (GBL § 394-ccc). See the Addendum cited in fn 1. As reported, on February 14, 2023, the SDNY issued an Opinion and Order granting a preliminary injunction to prevent the new law from taking effect; a Notice of Appeal to the Second Circuit challenging the preliminary injunction was filed soon thereafter by the NYS Attorney General. Later, on April 3, 2023, the SDNY court granted a stay of the action until 30 days after conclusion of the subject appeal (which appeal is currently pending as of this writing).
