The 26 Words that Created the Internet
The Supreme Court recently heard Gonzalez v. Google, a case about Section 230 of the Communications Decency Act. Section 230 protects providers of interactive computer services from liability resulting from other people’s content. This case was supposed to be a blockbuster event about the future of the internet, but it seems to have fallen flat.
The conservative supermajority on the Court seemed poised to take aim at Big Tech, a target of ideological consensus among several of the justices, especially Justice Alito. In this case, however, they essentially punted the Section 230 question to Congress.
The case arises from the tragic Paris terrorist attacks of November 2015, in which ISIS members killed 130 people. Nohemi Gonzalez, who was in Paris as part of an exchange program, was among those killed. Her family subsequently pursued legal action against Google for its alleged role in the attacks, arguing that YouTube, which Google owns, played an essential role in radicalizing the perpetrators.
The main contention in this case was the recommendation algorithm YouTube uses to display videos a viewer might enjoy. Under Section 230, YouTube is not treated as the publisher or speaker of the videos users upload to its platform, and so is generally not liable for them.
Section 230 is an exceptionally broad statute, and Big Tech firms have invoked it as a defense against virtually every content moderation complaint under the sun. Google, Facebook, and Microsoft all argue that Section 230 is crucial to content moderation: far too much content is uploaded to YouTube daily for human review, so the companies need workable algorithms for sorting it.
The arguments before the Court focused on whether YouTube's specific "recommendation" feature crossed the line separating a neutral host of third-party content from a publisher of it.
The Court seemed extremely hesitant to rule against Section 230, especially in such a broad way. The Gonzalez lawyers struggled to show that YouTube's algorithm was the causal mechanism of radicalization. Justice Kavanaugh backed the Gonzalez legal team into a corner, explaining that it did not make sense for the Supreme Court to take on Section 230; using the statute to punish the internet service providers it is supposed to protect, he contended, is backwards.
One argument against Section 230 that was not presented in this case is the one promoted by conservatives. Pointing to the recent release of the "Twitter Files," a series of internal Twitter documents, conservatives claim that they have been disproportionately censored for holding minority views. Elon Musk released the documents to show how the company had previously moderated conservative voices unfairly. Under the breadth of Section 230, companies are legally allowed to moderate this way, as they can simply claim that their algorithms filtered out controversial or hateful speech.
The Justices performed well given their age in this case, with Justice Kagan cracking a joke that they were not the best experts on the internet. They had a decent grasp of the technology at hand and did not seem lost during the arguments about algorithms and thumbnails. One question at oral argument was whether YouTube's display of thumbnails for recommended videos counted as publishing content. The Justices seemed unpersuaded that recommending content to users could make a company liable for that content; the only scenario in which they seemed even mildly receptive to this line of reasoning was an algorithm that was inherently discriminatory.
The future of sites like YouTube and Facebook seems secure for now, but this case is only the beginning of a reckoning over the liability of Big Tech giants. The Court found Congress to be a more qualified and more appropriate forum to wrestle with this question. If the Court affirms Section 230, Congress and the relevant agencies would be left to craft new legislation on content moderation. Without such legislation, these Big Tech firms have free rein to silence voices on either side of the political spectrum.
While many experts predicted a sweeping ruling, the Justices appeared restrained in their treatment of the case. They exercised caution with Section 230, likely reasoning that a sweeping ruling would cause extreme chaos and damage internet service providers to the point that their platforms would no longer be usable. Stripped of Section 230's protections, many websites like YouTube might have to shut down temporarily while they reworked their moderation systems. The sheer volume of content on sites like Twitter and YouTube makes moderation impossible without algorithmic sorting.
Tailoring the user experience to each user's preferences should be protected under Section 230. Without this protection, services like YouTube would be much harder to use and far less helpful. Blaming an algorithm for tailoring third-party content to a user seems beyond harsh in this case. While companies such as Google and Facebook should not be entirely shielded from liability for content on their platforms, that harm does not outweigh the massive benefit of these services. Congress is a much more appropriate place to determine when these tech giants should be liable and how to regulate them.
Gonzalez v. Google may not have lived up to its billing, but it offered important insight into the conservative supermajority's reluctance to rule against Big Tech firms.
Andreas Rivera Young is a Junior at Brown University, concentrating in Political Science and History. He is a staff writer for the Brown Undergraduate Law Review and can be contacted at andreas_rivera_young@brown.edu.