Global Freedom of Expression

Fields v. Twitter

Case Status: Closed
Decision Direction: Expands Expression

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    November 18, 2016
  • Outcome
    Decision - Procedural Outcome, Dismissed
  • Case Number
    16-cv-00213-WHO
  • Region & Country
    United States, North America
  • Judicial Body
    First Instance Court
  • Type of Law
    Constitutional Law
  • Themes
    Content Regulation / Censorship
  • Tags
    Twitter/X

Content Attribution Policy

Global Freedom of Expression is an academic initiative and, therefore, we encourage you to share and republish excerpts of our content so long as they are not used for commercial purposes and you respect the following policy:

  • Attribute Columbia Global Freedom of Expression as the source.
  • Link to the original URL of the specific case analysis, publication, update, blog or landing page of the downloadable content you are referencing.

Attribution, copyright, and license information for media used by Global Freedom of Expression is available on our Credits page.

Case Analysis

Case Summary and Outcome

The U.S. District Court for the Northern District of California held that an amended complaint seeking to hold Twitter, Inc. liable for providing material support to the Islamic State group (IS) by allowing its members to have accounts was barred by the Communications Decency Act (CDA). The plaintiffs argued that Twitter was liable under the Anti-Terrorism Act for providing material support to IS by allowing IS members to sign up for Twitter accounts, and that this support resulted in the killing of their family members. The Court reasoned that Twitter’s provision of accounts to IS members is publishing activity, just like monitoring, reviewing, and editing content. Therefore, even though the plaintiffs’ claims remained inherently tied to IS’ use of Twitter to spread propaganda, the platform was protected from liability for third-party speech under section 230(c)(1) of the CDA.


Facts

In November 2015, two American government contractors were killed in Jordan by Abu Zaid. The Islamic State group (IS), an Islamic militant organization, claimed responsibility for the murders on the social media platform Twitter.

The families of the deceased filed suit against Twitter, Inc., seeking to hold the company liable for its supposed role in the deaths. The plaintiffs argued that Twitter was responsible for the content posted by the terrorist organization. Because the Communications Decency Act protects internet platforms from liability for third-party speech, the first complaint was dismissed as barred by the Act, with leave to amend.

In the amended complaint, the plaintiffs argued that their claims were not barred by the Communications Decency Act because Twitter, Inc. had provided material support to IS, that this support was the proximate cause of the deaths of the American contractors, and that Twitter could therefore be held liable under the Anti-Terrorism Act. The plaintiffs did not allege that IS used Twitter to recruit or communicate with Abu Zaid, or that Abu Zaid had viewed IS propaganda on Twitter or even had a Twitter account. Rather, they argued that the mere fact that Twitter allowed members of IS to have accounts was equivalent to providing the terrorist organization with material support.


Decision Overview

Judge Orrick dismissed the amended complaint as barred by the Communications Decency Act (CDA). He noted that the first complaint had been dismissed because it sought to hold Twitter, Inc. liable as a publisher of the terrorist organization’s speech. Despite the re-pleading, Judge Orrick found that the plaintiffs had failed to overcome the protections of the CDA: “[The plaintiffs] seek to hold Twitter liable for allowing IS to use its network to spread propaganda and objectionable, destructive content. But . . . these claims are barred under the CDA.”

While the protections provided by section 230(c)(1) have been defined in broad terms, they are not unlimited. To be protected from liability by the Communications Decency Act, three elements must be met: (1) the defendant must be a provider or user of an interactive computer service; (2) the cause of action asserted by the plaintiff must treat the defendant as the “publisher or speaker” of the harmful information at issue; and (3) the information must be “provided by another information content provider”.

The Court rejected the plaintiffs’ argument that Twitter’s provision of accounts to IS members was not publishing activity and therefore fell outside the protection of the CDA. Firstly, the Court said, providing accounts, just like monitoring, reviewing, and editing content, constitutes publishing activity, because Twitter’s decision whether to provide an account is based on analyzing “some speech, idea or content expressed by the would-be account-holder”; a policy targeting such speech, ideas, or content of would-be account holders would therefore not be content neutral. Further, decisions regarding the structure and organization of Twitter are content-based decisions, and the decision to permit or prevent an account from existing is publishing activity because it reflects choices about what third-party content can appear on Twitter.

Secondly, despite the re-pleading, the Court found that the plaintiffs’ core allegations were still that Twitter knowingly failed to prevent IS from disseminating content through the Twitter platform, not that its mere provision of accounts to IS caused the harm. The Court pointed out that the Second Amended Complaint (SAC) still focused on IS’ objectionable use of Twitter and Twitter’s failure to prevent IS from using the site, not its failure to prevent IS from obtaining accounts.

The Court also dismissed the plaintiffs’ claim that Twitter should be held liable under the promissory estoppel exception to the CDA, under which an internet service may be liable if it promises not to republish content and then breaks that promise; in those circumstances a court treats the internet provider as a party in breach of a contract, rather than as a publisher or speaker. The Court distinguished the present case from the cases the plaintiffs relied on, such as one in which an internet service provider failed to warn an individual that two other people were using its website to identify and lure rape victims, a warning the website could have given without modifying any user content. The Court emphasized that the plaintiffs’ assertions were inherently tied to content, and that the use of creative headings to segregate “their content-based allegations under a proximate banner” did not change the fact that they targeted “fundamental publishing activity” within section 230(c)(1) of the CDA.

Thirdly, the Court held that the plaintiffs had not adequately alleged proximate causation: the mere fact that IS members had Twitter accounts was not enough when the plaintiffs could not show that the attacker had viewed, used, or made contact with IS Twitter accounts prior to the attack. Even if the attacker had used the direct messaging feature, where conversations would presumably be private, this would not remove the transmission of such messages from the scope of publishing activity under section 230(c)(1).

Finally, the Court dismissed the plaintiffs’ public policy argument that this should be a “minor exception” to the CDA. It stated that such a policy would require Twitter to create a likely inaccurate filtering process and would lead to content-based regulation of speech. “If the goal of the CDA is to ‘encourage the unfettered and unregulated development of free speech,’ any policy that requires interactive computer service providers to remove or filter particular content undermines this purpose,” the Court said.

The Court granted Twitter’s motion to dismiss without leave to amend.


Decision Direction

Quick Info

Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Expands Expression

By ruling that the creation of a Twitter account fell within the protection of section 230(c)(1) of the Communications Decency Act, the Court prevented a possibly harmful content-based policy that would limit the dissemination of free speech and the “free-for-all” nature of social media sites like Twitter.

Global Perspective

Quick Info

Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

National standards, law or jurisprudence

Case Significance

Quick Info

Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.


