
Gonzalez v. Google (9th Cir. 2021)

In Progress · Mixed Outcome

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    June 22, 2021
  • Outcome
    Decision - Procedural Outcome, Affirmed in Part, Reversed in Part
  • Case Number
    18-16700, 18-17192 & 19-15043
  • Region & Country
    United States, North America
  • Judicial Body
    Appellate Court
  • Type of Law
    Civil Law, Criminal Law, Constitutional Law, Telecommunication Law
  • Themes
    Content Moderation, Content Regulation / Censorship, Digital Rights, Intermediary Liability, Licensing / Media Regulation
  • Tags
    Google, Facebook, YouTube, Twitter/X, Terrorism


Case Analysis

Case Summary and Outcome

The United States Court of Appeals for the Ninth Circuit affirmed two rulings, and reversed a third, concerning social media platforms’ responsibility for three different terrorist attacks, holding that interactive computer services (such as Google, YouTube, Twitter, and Facebook) neither create nor develop content and cannot be held liable for content posted by third parties. The plaintiffs, families of US nationals who lost their lives in three acts of terrorism perpetrated by ISIS, argued that Google, YouTube, Twitter, and Facebook should be liable for allowing ISIS to use their platforms and disseminate its messages, and for sharing ad revenue with the terrorist organization. The defendants argued that they were merely providers of interactive computer services and enjoyed immunity for content they displayed but did not produce. The courts of first instance ruled in favor of the defendants in all three cases. On appeal, the Ninth Circuit analyzed the three judgments together and agreed that Google, YouTube, Twitter, and Facebook only provide the public with access to their platforms and thus are not exposed to liability for content posted by third parties. It further held that aiding and abetting an act of international terrorism requires more than the provision of material support to a terrorist organization. However, in one of the cases, the Court held that the defendants provided services that were central to ISIS’s growth and expansion and that the plaintiffs adequately stated a claim for secondary liability.


Facts

The self-proclaimed Islamic State, or ISIS, is a terrorist organization operating largely in Iraq, Syria, Libya, and other regions. In November 2015, three ISIS terrorists fired into the crowd at a café in Paris and killed Nohemi Gonzalez. ISIS carried out other suicide bombings and mass shootings in Paris that day, including the massacre at the Bataclan Theatre. The day after the Paris attacks, ISIS claimed responsibility by issuing a written statement and releasing a YouTube video. In January 2017, an individual affiliated with and trained by ISIS carried out a shooting massacre at a nightclub in Istanbul, killing Nawras Alassaf and 38 other people. In December 2015, two shooters fired indiscriminately into a crowd in San Bernardino, California, killing Sierra Clayborn, Tin Nguyen, Nicholas Thalasinos, and 11 other people. One of the shooters had declared allegiance and loyalty on her Facebook page to the former ISIS leader, Abu Bakr al-Baghdadi. Two days after the attack, ISIS issued a statement that read: “two followers of Islamic State attacked several days ago a center in San Bernardino in California, we pray to God to accept them as Martyrs.”

The plaintiffs are families of the US national victims Nohemi Gonzalez, Nawras Alassaf, Sierra Clayborn, Tin Nguyen, and Nicholas Thalasinos, who lost their lives in the three acts of terrorism perpetrated by ISIS in Paris, Istanbul, and California. They alleged that Google (as the owner of YouTube), Twitter, and Facebook were directly and secondarily liable for the murders because they allowed ISIS to use their platforms, amplified their messages, and shared ad revenue with the terrorist organization.

In all three cases, the district court granted the defendants’ motions to dismiss because the plaintiffs failed to adequately allege proximate cause and to plausibly allege a secondary liability claim under the Anti-Terrorism Act (ATA). In the Gonzalez case specifically, the district court also ruled that the defendants enjoyed immunity for content they shared but did not produce. The United States Court of Appeals for the Ninth Circuit addressed the three appeals together in this case.


Decision Overview

Circuit Judge Christen delivered the opinion of the United States Court of Appeals for the Ninth Circuit. The main issue before the Court was whether Google (as the owner of YouTube), Twitter, and Facebook were directly and/or secondarily liable for the murders of Gonzalez, Alassaf, Clayborn, Nguyen, and Thalasinos, the plaintiffs’ relatives, in light of the use that ISIS, the terrorist organization responsible for the attacks in which they died, made of these platforms, as well as the companies’ use of machine-learning algorithms to promote or recommend content and their ad revenue-sharing policies.

The plaintiffs claimed that the defendants were directly and secondarily liable for the murders because their social media platforms allowed ISIS to post videos and other content to communicate the terrorist group’s message, radicalize new recruits, and generally further its mission. The families of the victims also claimed that Google placed paid advertisements in proximity to ISIS-created content and shared the resulting ad revenue with the terrorist organization.

Specifically, the Gonzalez plaintiffs’ complaint alleged that YouTube “has become an essential and integral part of ISIS’s program of terrorism” [p. 3] and that ISIS uses YouTube to recruit members, plan terrorist attacks, issue terrorist threats, instill fear, and intimidate civilian populations. They further alleged that Google used computer algorithms to match and suggest content to users based on their viewing history and, consequently, had “recommended ISIS videos to users” and enabled them to “locate other videos and accounts related to ISIS.” [p. 3]

The Gonzalez plaintiffs also noted that Google’s practice is to share a percentage of the revenue it generates from ads with the users who post the videos. They explained that “each YouTube video must be reviewed and approved by Google before Google will permit advertisements to be placed with that video” and that, consequently, “Google has reviewed and approved ISIS videos for advertising,” [p. 21] and shared a percentage of the revenue generated from those advertisements with ISIS. In their view, Google directly committed acts of international terrorism by providing material support to foreign terrorist organizations, thereby financing terrorism.

The Gonzalez plaintiffs further argued that Google was aware of ISIS’s presence on YouTube, had received complaints about its content, had the ability to remove ISIS content from YouTube, and had “suspended or blocked selected ISIS-related accounts at various times.” [p. 4] Despite this knowledge and control, they alleged, Google “did not make substantial or sustained efforts to ensure that ISIS would not re-establish the accounts using new identifiers.” [p. 4] In sum, they alleged that Google aided and abetted international terrorism and provided material support and resources to it by allowing ISIS to use YouTube.

The Taamneh plaintiffs similarly alleged that Google, Twitter, and Facebook were a critical part of ISIS’s growth, as the platforms allowed the organization to recruit members, issue terrorist threats, spread propaganda, raise funds, instill fear, and intimidate civilian populations. They alleged that the defendants knowingly permitted ISIS and its members and affiliates to use their platforms, and reviewed ISIS’s use of them only in response to third-party complaints. Consequently, in their view, the defendants aided and abetted an act of international terrorism and provided material support to ISIS by allowing the terrorist organization to use their social media platforms.

The Clayborn plaintiffs likewise alleged that Twitter, Facebook, and Google aided and abetted international terrorism and provided material support to international terrorists, in violation of the Anti-Terrorism Act (ATA), by allowing ISIS to use their platforms.

The defendants relied on section 230 of the Communications Decency Act, part of the Telecommunications Act of 1996, which immunizes providers of interactive computer services against liability arising from content created by third parties. They argued that, under this provision, no provider of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider. Additionally, the defendants claimed that there was no proximate cause between their actions and the harm suffered by the plaintiffs, whose relatives died in the terrorist attacks.

The Court first recalled that the Anti-Terrorism Act allows United States nationals to recover damages for injuries suffered by reason of an act of international terrorism. Second, regarding section 230, it explained that, to avoid chilling speech, “Congress made a policy choice not to deter harmful online speech through the separate route of imposing tort liability on companies that serve as intermediaries for other parties’ potentially injurious messages.” [p. 8] The Court added that “this limitation of liability had the dual purposes of promoting the free exchange of information and ideas over the Internet and encouraging voluntary monitoring for offensive or obscene material.” [p. 10]

In the Court’s opinion, Google is an interactive computer service provider that supplies a platform and communication services, and is not an “information content provider” because it does not make a material contribution to the content it displays. Although the plaintiffs argued that Google made a material contribution to the unlawfulness of ISIS content by pairing it with selected advertising and other videos, the Court considered that “as long as a third party willingly provides the essential published content, the interactive computer service provider receives full immunity regardless of the specific editing or selection process.” [p. 15]

Furthermore, it said that “these functions —recommendations and notifications— are tools meant to facilitate the communication and content of others, not content in and of themselves.” [p. 15] To underscore this point, the Court referred to the Second Circuit’s decision in Force v. Facebook, which held that even if Facebook’s algorithms made content more visible or available, this did not amount to developing the underlying information. The Court also acknowledged that websites are leveraging new technologies to detect, flag, and remove large volumes of criminal content such as child pornography, but considered that, in any case, it is for Congress to decide whether more regulation is needed. It recognized that “technology has dramatically outpaced congressional oversight” and that legislators could not have imagined “the level of sophistication algorithms have achieved,” [p. 32] and highlighted that “whether social media companies should continue to enjoy immunity for the third-party content they publish, and whether their use of algorithms ought to be regulated, are pressing questions that Congress should address.” [p. 33]

As for the revenue-sharing claims, the Court found that section 230 did not immunize Google because those claims did not relate to the published content. However, for the Court, the plaintiffs’ complaint failed to plausibly allege that Google directly perpetrated an act of international terrorism because “the provision of material support to a terrorist organization does not invariably equate to an act of international terrorism.” [p. 21] Furthermore, “taking as true the allegation that Google shared advertising revenue with ISIS as part of its AdSense program, that action does not permit the inference that Google’s actions objectively appear to have been intended to intimidate or coerce civilians, or to influence or affect governments.” [p. 21]

When analyzing the secondary liability claim, the Court also found that the Gonzalez plaintiffs failed to state a claim because “aiding and abetting an act of international terrorism requires more than the provision of material support to a designated terrorist organization.” [p. 24] In this sense, the Court held that the defendant did not substantially assist the act of terrorism that injured the plaintiffs because i) not all assistance is equally important and the plaintiffs provided no information about the amount of assistance provided by Google, and ii) Google had no intent to finance, promote, or carry out ISIS’s terrorist acts, nor did it share any of ISIS’s objectives.

Confusingly, in the Taamneh appeal, the Court found that the defendants provided services that were central to ISIS’s growth and expansion. Here, the Court agreed with the plaintiffs that the platforms were essential to that growth because, without them, the terrorist organization would have had no means of radicalizing recruits beyond its territorial borders.

As for Clayborn, the result was the same as in Gonzalez. Here, the Court noted that the link between the San Bernardino attack in California and ISIS was not clear: even if the plaintiffs managed to demonstrate some connection between the shooters and ISIS, more was needed to plausibly allege a cognizable claim for aiding-and-abetting liability. For the Court, the main difference from the other two cases was that here ISIS did not adopt the attack as its own: it merely approved of the shooting after learning it had occurred, and there was no basis to infer that the attack was implicitly authorized by ISIS.

In the end, the Court concluded that Google enjoyed immunity for most of the Gonzalez plaintiffs’ claims and that the plaintiffs failed to state an actionable claim as to their remaining theories of liability under the ATA. Regarding the Taamneh case, the Court held that the first instance court erred by ruling that the plaintiffs failed to state a claim for aiding-and-abetting liability under the ATA. For Clayborn, the Court concluded that the first instance court correctly held that the plaintiffs failed to plausibly plead their claim for aiding-and-abetting liability. Consequently, the Court affirmed the judgments in Gonzalez and Clayborn, rejecting the plaintiffs’ claims and upholding the grant of the defendants’ motions to dismiss, and reversed and remanded for further proceedings in Taamneh.

The judgment includes a concurring opinion by Judge Berzon and a dissenting opinion by Judge Gould.

Judge Berzon expressed that she wanted to “join the growing chorus of voices calling for a more limited reading” [p. 33] of the immunity awarded to publishers and providers of computer services. In her view, immunity should not include activities that promote or recommend content to users through the use of machine learning. Berzon said that “a website’s decisions to moderate content, restrict users, or allow third parties full freedom to post content and interact with each other all, therefore, fall squarely within the actions of a publisher shielded from liability under section 230. But the conduct of the website operators here— like the conduct of most social media website operators today—goes very much further. The platforms’ algorithms suggest new connections between people and groups and recommend long lists of content, targeted at specific users.” [p. 34]

In her opinion, “the algorithms used by YouTube do not merely publish user content. Instead, they amplify and direct such content, including violent ISIS propaganda, to people the algorithm determines to be interested in or susceptible to those messages and thus willing to stay on the platform to watch more.” [p. 34] Consequently, she held that “these types of targeted recommendations and affirmative promotion of connections and interactions among otherwise independent users are well outside the scope of traditional publication” and should not enjoy immunity. [p. 34] However, she acknowledged that “our case law squarely and irrefutably holds otherwise,” [p. 35] so she concurred in full with the majority opinion.

In his dissenting opinion, Judge Gould considered that the majority wrongly addressed the scope of immunity. For him, the law “was not intended to immunize, nor does its literal language suggest that it immunizes, companies providing interactive computer services from liability for serious harms knowingly caused by their conduct” and, in this case, it could be inferred that “social media companies were aware of the risks to the public.” [p. 40] He considered that the main issue was not that the social media companies republished harmful propaganda from ISIS; the problem was “the algorithms devised by these companies to keep eyes focused on their websites.” [p. 40]

Judge Gould held that YouTube magnified and amplified ISIS’s communications by “joining them with similar messages, in a way that contributed to the ISIS terrorists’ message beyond what would be done by considering them alone.” [p. 41] He argued that “disseminating its terrorist messages through its propaganda videos was a proximate cause of the terrorist attacks at issue here” and that there was “some direct relationship between the asserted injuries of the plaintiffs’ families and the defendant social media companies’ conduct.” [p. 41] Judge Gould concluded that “providing the channels of communication for inflammatory videos should be considered substantial assistance to the primary violations of terrorist shootings or bombings.” [p. 41]

Furthermore, Judge Gould asserted that even if Google and YouTube cannot be held liable for the mere content of ISIS’s posts, he concluded that the law “in no way provides immunity for other conduct of Google or YouTube or Facebook or Twitter that goes beyond merely publishing the post,” [p. 42] such as amplifying and encouraging similar views. Gould also said that “[t]hough websites using neutral tools like algorithms are generally immunized by Section 230, I would hold that where the website (1) knowingly amplifies a message designed to recruit individuals for a criminal purpose, and (2) the dissemination of that message materially contributes to a centralized cause giving rise to a probability of grave harm, then the tools can no longer be considered neutral.” [p. 43]

Finally, Gould concluded in his dissenting vote that “regulation of social media companies would best be handled by the political branches of our government, the Congress and the Executive Branch, but in the case of sustained inaction by them, the federal courts are able to provide a forum responding to injustices that need to be addressed by our justice system.” [p. 39]


Decision Direction


Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Mixed Outcome

This judgment deals with very complex and contested issues, such as machine learning, algorithms, social media, and terrorism. On the one hand, it could be argued that it expands freedom of expression because it upheld section 230 of the Communications Decency Act, which provides immunity from liability to interactive computer service providers such as Google, YouTube, Twitter, and Facebook for content they publish but do not produce. On the other hand, the analysis of targeted recommendations, and of how they amplify certain content (specifically content related to terrorist organizations), is quite limited and confusing. In this sense, it is not clear why the standards and thresholds set in Taamneh were not applied in Gonzalez: if the defendants’ contribution in Taamneh was substantial to ISIS’s terrorist attack, the same should have been true in Gonzalez. Ultimately, the Court’s decision to reverse in Taamneh contracts expression, as it widens the scope of social media platforms’ liability.

Global Perspective


Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

National standards, law or jurisprudence

Case Significance


Case significance refers to how influential the case is and how its significance changes over time.

This case did not set a binding or persuasive precedent either within or outside its jurisdiction. The significance of this case is undetermined at this point in time.
