Global Freedom of Expression

Moody v. NetChoice

In Progress · Expands Expression

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    July 1, 2024
  • Outcome
    Decision - Procedural Outcome, Reversed and Remanded
  • Case Number
22-277 and 22-555
  • Region & Country
    United States, North America
  • Judicial Body
Supreme Court (court of final appeal)
  • Type of Law
    Constitutional Law, Telecommunication Law
  • Themes
    Content Moderation
  • Tags
    YouTube, Facebook, Social Media

Content Attribution Policy

Global Freedom of Expression is an academic initiative; therefore, we encourage you to share and republish excerpts of our content so long as they are not used for commercial purposes and you respect the following policy:

  • Attribute Columbia Global Freedom of Expression as the source.
  • Link to the original URL of the specific case analysis, publication, update, blog or landing page of the downloadable content you are referencing.

Attribution, copyright, and license information for media used by Global Freedom of Expression is available on our Credits page.

Case Analysis

Case Summary and Outcome

The U.S. Supreme Court held that the Fifth and Eleventh Circuit Courts of Appeals failed to adequately analyze the facial First Amendment challenges to the constitutionality of laws enacted by Florida and Texas regulating content moderation on social media platforms. The states of Florida and Texas enacted two laws limiting content moderation on digital platforms and social networks, prohibiting “censorship” based on users’ views, and requiring companies to provide detailed explanations to users about content moderation decisions. Petitioner NetChoice—a coalition of companies that own digital platforms and social networks—filed a facial challenge to these laws, arguing that they violated the First Amendment. The states of Florida and Texas argued that the rules were designed to balance the marketplace of ideas on social networks. Two separate district courts granted preliminary injunctions halting the enforcement of both laws. The Eleventh Circuit Court affirmed the injunction against the law in Florida, holding that it likely violated the First Amendment. At the same time, the Fifth Circuit Court vacated the injunction against the law in Texas, arguing that content moderation is a form of censorship, not a form of speech protected by the First Amendment.

The U.S. Supreme Court reversed both circuit courts’ decisions and remanded the cases for further proceedings, emphasizing that the First Amendment protects the editorial discretion of social media platforms. The Court considered that a facial First Amendment challenge to a statute requires the petitioner to demonstrate that a substantial portion of the statute’s applications is unconstitutional. In this regard, the Court held that the broad scope of the contested laws, which deal with various activities beyond content moderation, required a more precise analysis to determine their constitutionality. In addition, the Court held that altering the private editorial decisions of social media platforms violated the First Amendment, which prohibits the government from imposing its own prerogatives regarding the proper balance of ideas on the private sector.


Facts

In 2021, the states of Florida and Texas, in the United States of America, issued laws regulating the activities of social media companies and internet platforms (Florida’s S. B. 7072 and Texas’ H. B. 20). While the regulations issued in each of the two states differ in some respects, “both curtail the platforms’ capacity to engage in content moderation—to filter, prioritize, and label the varied third-party messages, videos, and other content their users wish to post.” [p. 1] The states claimed these laws were necessary to “correct” a perceived political bias resulting in the “silencing” of conservative voices on the platforms. In addition, the laws contain rules, or disclosure provisions, requiring social media companies to provide greater transparency on their policies and detailed explanations to their users regarding their content moderation decisions, such as the removal or modification of uploaded content.

Both laws raise fundamental questions about the role of social media companies in the public sphere and if and how the government may regulate social media companies under the First Amendment of the US Constitution. For instance, Texas House Bill 20 (HB 20) characterizes large platforms (defined as having more than 50 million monthly active users) as “Common Carriers” due to their market dominance and role as a veritable digital public forum, serving society much as utilities or telecommunications companies do. Such a classification would allow the State to impose non-discrimination obligations on them. Hence, the legislature sought to prohibit large platforms from “censoring” content based on viewpoint. This requirement, termed a “must-carry provision,” effectively prevents platforms from removing objectionable content, including content which violates their community standards.

Florida’s S. B. 7072 similarly considered the platforms to be common carriers and regulated the ability of platforms to curate or moderate content by prohibiting the deplatforming of political candidates and “journalistic enterprises,” and the use of post-prioritization or shadow-banning algorithms on content related to political candidates. S.B. 7072 also contained extensive disclosure provisions and user data access requirements.   

Petitioners NetChoice LLC and the Computer & Communications Industry Association (hereinafter NetChoice)—registered trade associations in the United States whose members include companies such as YouTube and Facebook (now Meta)—filed facial challenges against the aforementioned state laws, based on the First Amendment of the U.S. Constitution, which protects freedom of speech. NetChoice argued that content moderation is an exercise of editorial discretion, much like that undertaken by a newspaper, and is therefore protected speech. Moreover, it contended that the intent behind the legislation—to correct a perceived bias—made it a content-based regulation by the government, which should subject the entire act to strict scrutiny. Further, it argued there was no legitimate state interest in ensuring equal access to speech on private social media platforms. One of the facial actions challenged the Florida statute, and the other challenged the Texas statute.[1]

In both cases, district courts issued preliminary injunctions, at the initial stage of the proceedings, to stop the enforcement of the statutes. Both courts held that the petitioners’ claims “[were] likely to succeed because the statute infringes on the constitutionally protected editorial judgment of NetChoice’s members about what material they will display.” [p. 2]

The injunction against Florida’s S. B. 7072 was appealed to the Eleventh Circuit Court, which upheld the district court’s decision in NetChoice v. Attorney General, State of Florida. The Eleventh Circuit Court held that S. B. 7072 was unlikely to survive heightened First Amendment scrutiny. It affirmed that the statute’s restrictions on corporate content moderation potentially impinged on the platforms’ editorial discretion, which is protected by the right to free speech. Moreover, the court held that it would be very difficult for the state of Florida to overcome First Amendment scrutiny regarding the companies’ obligation to provide detailed explanations to their users for every content moderation decision they issue. The court concluded that “the obligation to explain millions of decisions per day is unduly burdensome and likely to chill platforms’ protected speech.” [p. 2]

The State of Florida filed a writ of certiorari to the U.S. Supreme Court (SCOTUS) arguing that the Eleventh Circuit Court’s decision prevents it from ensuring diverse perspectives and ideas on the digital platforms owned by the petitioner.

On the other hand, the injunction against Texas’ H.B. 20 was appealed before the Fifth Circuit Court. In NetChoice v. Paxton, the Circuit Court held that the district court was wrong and reversed the lower court’s decision. The Circuit Court held that “the platforms’ content-moderation activities are not speech at all, and so do not implicate the First Amendment.” [p. 2] In addition, the court stated that Texas could regulate the companies that owned social media platforms to “protect diversity of ideas,” even if their activities involved speech. [p. 2] The court also held that the companies’ obligation to provide explanations regarding their content moderation decisions did not impose an undue burden because the petitioners would only have to expand the user complaint and appeals processes that already existed before the law.

NetChoice filed a writ of certiorari to the U.S. Supreme Court arguing that the Fifth Circuit Court’s decision violated its First Amendment right to freedom of speech.

Due to the conflicting decisions issued by the circuit courts about laws regulating content moderation on major digital platforms, the U.S. Supreme Court granted both petitions for certiorari and addressed them in a single judgment.

[1] According to Justice Alito’s opinion in this decision, a facial challenge to a statute implies “that the Florida and Texas statutes facially violate the First Amendment, meaning that they cannot be applied to anyone at any time under any circumstances without violating the Constitution.” [p. 75]


Decision Overview

Justice Elena Kagan delivered the unanimous opinion for the United States Supreme Court. SCOTUS had to decide whether the conflicting decisions issued by the Eleventh and Fifth circuit courts properly analyzed the facial First Amendment challenges to the laws (S. B. 7072 and H. B. 20) regulating major internet and social media platforms and their content moderation practices.

NetChoice argued that the contested laws violated the First Amendment by restricting the platforms’ ability to moderate content in accordance with their own rules, community standards, or editorial decisions. It claimed that content moderation is a form of editorial judgment, and that these laws interfered with its right to choose which content it should show and which it should exclude—an activity that is protected under the First Amendment. According to NetChoice, forcing platforms to provide detailed explanations for every content moderation decision was an undue burden that violated free speech.

For their part, the defendant states of Texas and Florida argued that their laws were necessary to correct the imbalance of viewpoints present on social media platforms and to protect diversity in the marketplace of ideas. The defendants explained that platforms only host content, and regulating content moderation did not violate the First Amendment because, in their view, this activity is not a form of protected speech. The defendants claimed that the laws were designed to ensure that users have access to a wide range of viewpoints and opinions and to compel platforms to provide detailed explanations for their content moderation decisions. They considered that this last measure did not impose an undue burden on the platforms because it simply expanded the existing complaint and appeal processes.

First, the Court examined the requirements for a facial challenge under the First Amendment. It explained that in this type of case, the plaintiff must show that a substantial portion of the law’s applications are unconstitutional because they restrict free speech. SCOTUS clarified that facial challenges to a law seek to invalidate it in its entirety and under all circumstances, and therefore must be subjected to strict scrutiny by the courts.

Next, the Court noted that neither circuit court addressed the procedural requirements of a facial challenge under the First Amendment. It also highlighted that neither the parties nor the Fifth and Eleventh circuit courts considered or analyzed the broad range of activities regulated by S. B. 7072 and H. B. 20, nor did they compare the allegedly unconstitutional provisions with those that were not. On the contrary, the Court noted that, at this early stage, the parties and the courts focused only on the application of the two laws with respect to content moderation on major social media platforms—specifically their content filtering, tagging services, and prioritization (e.g., how news stories are displayed on Facebook or what videos appear on YouTube’s home page)—without considering the rest of the laws’ regulations.

For SCOTUS, it was unclear at this early stage of the proceedings what the scope of the contested laws was. It explained that to properly understand these state laws, one must first analyze the scope of their regulations regarding social media sites. The Court found that while it appears that these laws applied to more than just Facebook’s News Feed, and similar services offered by other social network giants, it was not yet clear whether they applied to other services—such as direct messaging—or how they affected other platforms and their functions. For this reason, the Court held that the lower courts should have determined more precisely the substantive scope of these laws before issuing injunctions.

Moreover, the Court explained that in a facial challenge, it is necessary to identify which aspects of the laws violate the First Amendment and compare them to aspects that do not. Considering this, SCOTUS said that analyzing content moderation regulations on digital platforms meant asking, for each affected platform or function, whether the laws impinged on the free speech and editorial discretion protected by the First Amendment. The Court also held that courts should have evaluated whether these requirements imposed an undue burden on free speech—referring to regulations requiring social media platforms to explain their content moderation decisions.

However, the Court concluded that since it only reviews cases, and does not make initial determinations, it could not make the aforementioned analysis of the detailed regulatory scope of the contested rules. Therefore, SCOTUS opined that since the circuit courts failed to conduct this regulatory analysis thoroughly, their decisions should be vacated and the cases remanded for further review.

Second, the Court had to consider how the First Amendment right to free speech related to the contested content moderation laws, to provide parameters for the lower courts on how to decide the cases. The Court established that it was necessary to analyze relevant prior First Amendment decisions because it found that the Fifth Circuit Court’s decision seriously erred in its interpretation of the right to free speech.

Consequently, the Court examined its case law on content moderation and the First Amendment. Citing the cases Miami Herald Publishing Co. v. Tornillo, Pacific Gas & Elec. Co. v. Public Util. Comm’n of Cal., Turner Broadcasting System, Inc. v. FCC, and Hurley v. Irish-American Gay, Lesbian and Bisexual Group of Boston, Inc., SCOTUS recalled that it has repeatedly held that forcing someone to provide a venue for the views of others, against their will, implicates the First Amendment right to free speech, only if the affected party alleges or demonstrates that their expressive activity or expression could be altered or disrupted by that imposition.

Within this body of precedent, the Court held that the most important decision was Hurley. In this judgment, SCOTUS held that the state of Massachusetts could not force the organizers of a St. Patrick’s Day parade to include a gay and lesbian group that wished to disseminate its “pride” slogans, because doing so would alter the expressive content of the parade, and the decision to include or exclude that message belonged solely to the organizers. Considering this, the Court concluded that “the government may not, in supposed pursuit of better expressive balance, alter a private speaker’s own editorial choices about the mix of speech it wants to convey.” [p. 26]

Subsequently, the Court said—citing Brown v. Entertainment Merchants Assn—that while technology evolves, the fundamental principles of the First Amendment remain constant. Consistent with the opinion laid out in Hurley, the Court highlighted that the First Amendment protects entities that compile and select the speech of others to create their own product and that the right to free speech prevents such entities from being forced to include messages they would prefer to exclude. The Court also emphasized that the government cannot justify compelled speech by asserting its interest in balancing the marketplace of ideas.

SCOTUS explained that digital platforms, such as Facebook and YouTube, provide users with a personalized stream of posts, using algorithms that prioritize content based on the users’ prior interests and activities—among other factors, which these platforms privately regulate. Similarly, the Court noted that social media companies have rules that specify what content is not allowed on their platforms and use algorithms to identify trustworthy content or suppress objectionable content. The Court went on to say that “the Texas law targets those expressive choices—in particular, by forcing the major platforms to present and promote content on their feeds that they regard as objectionable.” [p. 30]

Moreover, the Court stated that H. B. 20 was unlikely to overcome the petitioner’s facial challenge because, even at this early stage of the process, it was clear that the law severely limits the platforms’ ability to moderate content. SCOTUS held that “when the platforms use their Standards and Guidelines to decide which third-party content those feeds will display, or how the display will be ordered and organized, they are making expressive choices. And because that is true, they receive First Amendment protection.” [p. 32] Thus, the Court concluded that the Texas law restricted corporate control of content by barring platforms from “censoring” it based on the users’ point of view. From the Court’s perspective, this meant that platforms could not remove, tag, or delete posts they disapprove of, which would significantly affect the platforms’ editorial decisions.

The Court recalled that in Hurley it held that these types of regulations interfered with the right to free speech. Accordingly, SCOTUS held that social media platforms, just like publishers and parade organizers,  have the right to select content to create a distinctive expressive offering and that the Texas law would alter this right by forcing platforms to display content they deem objectionable. Hence, SCOTUS affirmed that Texas could not prohibit platforms such as Facebook and YouTube from removing posts that contravened their private rules.

Afterward, the Court recalled that Texas argued that the purpose of the law was to foster a better online environment, on major platforms, for plurality and diversity in the market of ideas. Regarding this argument, the Court opined that such interest did not support the constitutional validity of the law. According to it, “Texas does not like the way those platforms are selecting and moderating content, and wants them to create a different expressive product, communicating different values and priorities. But under the First Amendment, that is a preference Texas may not impose.” [p. 35]

Referring to United States v. O’Brien, SCOTUS concluded that the Texas law could not overcome even the lower standard of scrutiny that requires a law to advance a “substantial governmental interest” that is “unrelated to the suppression of free expression.” Moreover, the Court held that “a State may not interfere with private actors’ speech to advance its own vision of ideological balance.” [p. 33] It also clarified that, while states may desire a space where the public has access to diverse viewpoints, “the way the First Amendment achieves that goal is by preventing the government from tilting public debate in a preferred direction.” [p. 33]

Finally, SCOTUS highlighted that states are not allowed to prohibit speech to rebalance the marketplace of ideas, as this goal was not compatible with the First Amendment. On this point, it remarked that “on the spectrum of dangers to free expression, there are few greater than allowing the government to change the speech of private actors in order to achieve its own conception of speech nirvana.” [p. 33 and 34]

Considering the aforementioned arguments, SCOTUS unanimously decided to “vacate the judgments of the Courts of Appeals for the Fifth and Eleventh Circuits and remand the cases for further proceedings consistent with this opinion.” [p. 37]

Concurring and dissenting opinions

Justice Barrett

Justice Barrett joined the Court’s opinion and added a concurring opinion. For her, the Eleventh Circuit Court correctly interpreted the First Amendment’s protection of editorial discretion, while the Fifth Circuit Court did not. In addition, she highlighted the dangers of a facial challenge in this case, suggesting that NetChoice should instead pursue as-applied challenges to specific features—such as Facebook’s News Feed and YouTube’s homepage—rather than encompassing multiple platforms and features in a single challenge.

In turn, Justice Barrett highlighted the complexity of determining how the First Amendment applied to digital platforms, especially when taking into consideration the use of algorithms and artificial intelligence tools to moderate content. Justice Barrett also mentioned that the corporate structure and ownership of platforms can affect constitutional analysis. In this regard, she explained that foreign ownership and control over content moderation decisions could affect the applicability of the First Amendment.

Finally, she concluded that these complexities reinforce the need to address challenges in a specific and applied manner, rather than attempting to resolve all issues in a single facial challenge. Relatedly, the justice held that “while the governing constitutional principles are straightforward, applying them in one fell swoop to the entire social-media universe is not.” [p. 41]

Justice Jackson

Justice Jackson also issued a concurring opinion. She explained that both cases raised a complex conflict between the contested state laws and the First Amendment rights of social media platforms. The justice emphasized that “not every potential action taken by a social media company will qualify as expression protected under the First Amendment. But not every hypothesized regulation of such a company’s operations will necessarily be able to withstand the force of the First Amendment’s protections either.” [p. 42]

Nonetheless, Justice Jackson held that considering the early stage of the cases, the facial validity of the challenged state laws could not be properly assessed. She further emphasized that, when reviewing these cases, lower courts must be specific in their analysis, evaluating not only the regulated entities but also whether their activities constitute speech protected by the First Amendment.

Finally, Justice Jackson cautioned that further factual development was necessary before fully addressing these legal challenges: “[F]aced with difficult constitutional issues arising in new contexts on undeveloped records, this Court should strive to avoid deciding more than is necessary.” [p. 43 and 44]

Justice Alito

Justice Alito issued an opinion concurring in the judgment, joined by Justices Thomas and Gorsuch. According to him, “NetChoice failed to prove that the Florida and Texas laws they challenged [were] facially unconstitutional. Everything else in the opinion of the Court is nonbinding dicta.” [p. 63]

Justice Alito criticized the majority’s decision to opine about the specific applications of the contested laws and to categorize them as erroneous. He explained that the broad ambition of the Court’s majority to provide “guidance on whether one part of the Texas law [was] unconstitutional as applied to two features of two of the many platforms that it reaches—namely, Facebook’s News Feed and YouTube’s homepage—[was] unnecessary and unjustified.” [p. 63] For Justice Alito, these issues should be resolved in the context of specific court proceedings and not in a facial challenge.

Next, Justice Alito explained that social media platforms have become the “modern public square” and have a significant impact on people’s communication and daily lives. [p. 66] He stated that platforms, such as Facebook and YouTube, handle massive amounts of data and use algorithms to moderate content, which raises new questions about freedom of expression. Additionally, Justice Alito held that the petitioner did not provide sufficient information about how NetChoice’s members moderate content and which platforms were affected by the laws. Without this information, Justice Alito said, courts could not assess whether the laws had legitimate First Amendment implications.

Finally, he concluded that “the only binding holding in these decisions is that NetChoice has yet to prove that the Florida and Texas laws they challenged are facially unconstitutional. Because the majority opinion ventures far beyond the question we must decide, I concur only in the judgment.” [p. 96]

Justice Thomas

Justice Thomas, in addition to joining Justice Alito and the majority’s opinion, also issued a concurring opinion of his own. He agreed with “the Court’s decision to vacate and remand because NetChoice has not established that Texas’s H. B. 20 and Florida’s S. B. 7072 are facially unconstitutional.” [p. 45] However, Justice Thomas disagreed with the Court’s majority when it provided opinions about these statutes. To him, that was unnecessary and based on an incomplete record of the cases.

Justice Thomas also criticized the Court’s approach when it selected only some specific platform features, such as Facebook’s News Feed and YouTube’s homepage, for its analysis, while it ignored other potential applications. Moreover, Justice Thomas recommended that SCOTUS abandon the practice of accepting facial challenges, as they exceeded the authority constitutionally granted to federal courts.


Decision Direction

Quick Info

Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Expands Expression

The Supreme Court’s decision expands freedom of expression by reaffirming the First Amendment’s protection of editorial discretion in favor of social media platforms. This ruling, following an established case law pattern, hinders government interference in the editorial choices of private entities, thereby strengthening safeguards against compelled speech on online social media platforms. This decision upholds the principle that the government cannot compel platforms to host content in ways that contradict their editorial policies—in the name of balancing the marketplace of ideas. Thus, it broadens the scope of protected speech in online spaces.

It is now up to the lower courts to assess the facial challenge and consider the more complex questions surrounding the scope of the laws and which aspects of the content moderation and disclosure provisions violate the First Amendment. There are still many pending issues, not least whether certain forms of algorithmic sorting may be deemed non-expressive and hence open to regulation. The rejection of this facial challenge has led some court watchers to posit that it could lead to a range of as-applied challenges to many of the aspects of the provisions, which will make their way through the courts over the coming years. Time will tell whether this will result in better laws, but the time has come for courts to apply First Amendment jurisprudence to evolving technologies.  


Although the Supreme Court did not refer to them, two amicus briefs submitted to the Court are worth highlighting for their arguments, from the U.S. and international perspectives, against the laws. Their understanding of the scope of the right to freedom of expression in online contexts is certainly illustrative and instructive.

Brief of The Knight First Amendment Institute at Columbia University as Amicus Curiae in Support of Neither Party

The Knight First Amendment Institute filed an amicus brief before the Supreme Court of the United States in support of neither party. According to the Institute, the parties offered radically opposed theories of how the First Amendment should apply to social media regulations. The government’s vision, as characterized by the Knight Institute, “contend[ed] that the platforms’ content-moderation decisions do not implicate the First Amendment at all.” [p. 11] If accepted, the Institute considered, it would give governments sweeping authority over online spaces and hinder the articulation of distinctive online communities. Regarding NetChoice’s vision, the Institute argued that its theory—by which any regulation of content moderation should be subjected to strict scrutiny or outright declared unconstitutional—“would make it nearly impossible for governments to enact even carefully drawn laws that serve First Amendment values.” [p. 11] In light of this framework, the Knight Institute held, nonetheless, that the Florida and Texas must-carry provisions were unconstitutional.

According to the Knight Institute, social media platforms exercise editorial judgments in several contexts—for example, when enforcing community standards to moderate content or by attaching labels to user-generated content. Relying on key case law by SCOTUS (Tornillo, Hurley, Pacific Gas, and Turner, among others), it showed that the First Amendment protects editorial judgment against compelled speech, as a way of preserving speakers’ autonomy and safeguarding the “public discourse from government intervention that might have distorted democratic self-governance.” [p. 18] Moreover, the Knight Institute argued that restrictions on editorial judgment—depending on whether they were content-neutral or content-based—should be subjected to intermediate or strict scrutiny to assess their validity under the First Amendment.

As such, the Knight Institute held that SCOTUS did not have to analyze whether the Florida and Texas must-carry provisions were content-based or not, as said regulations failed even intermediate scrutiny “because they override the platforms’ exercise of editorial discretion.” [p. 27] As said in the amicus brief, “[t]hese provisions force platforms to publish a vast array of speech they do not want to publish, and that they view as inconsistent with the expressive communities they are trying to foster.” [p. 27] The Knight Institute considered that the aforementioned must-carry provisions could not survive because the interest the government claimed they served (“assuring that the public has access to a multiplicity of information sources” [p. 29]) was not compelling, since it was only generically posited. The Knight Institute argued that it was not evident how the contested provisions advanced their asserted interest or how they would redress specific harms. Furthermore, the Institute said that the provisions were not specifically tailored to their purpose and that by preventing social media platforms from moderating content, the law “serves no legitimate governmental interest at all; it serves only to silence the platforms and impoverish public discourse,” [p. 29-30] undermining the free exchange of ideas—an effect opposite to the interest pursued by the provision.

Brief of Amici Curiae Article 19: Global Campaign for Free Expression, International Justice Clinic at University of California-Irvine School of Law, and Open Net Association, Inc., in Support of Respondents in No. 22-277 and Petitioners in No. 22-555

The amicus brief submitted by the International Justice Clinic at the University of California – Irvine School of Law, Article 19, and the Open Net Association argues against the Texas and Florida laws (H.B. 20 and S.B. 7072), emphasizing their incompatibility with Article 19 of the International Covenant on Civil and Political Rights (ICCPR), which the United States ratified in 1992.

First, the brief underscores that Article 19 “protects a robust freedom to seek and receive information and ideas of all kinds, through any media of one’s choice. As such, it requires that any limitation on free expression meet a three-part test focused on legality, necessity and proportionality, and legitimacy.” [p. 10] The amicus, citing General Comment No. 34 of the UN Human Rights Committee, which interprets Article 19, argued that the three-part test applies to all types of restrictions on freedom of expression, including those aimed at online platforms.

The brief argues that H.B. 20 and S.B. 7072 fail this three-part test. Specifically, it explains that the laws violate the legality prong because they are “vague,” allowing for the arbitrary and politicized enforcement of speech regulations. [p. 13] The brief points out that “officials in Texas and Florida touted these laws as tools to ensure the dissemination of certain government-preferred political viewpoints.” [p. 11] Furthermore, the amicus brief posits that these laws grant excessive discretion to state attorneys general, enabling them to pursue political motivations under the guise of enforcing the law.

Second, the amicus clarifies that the necessity and proportionality prong requires that any speech restrictions be the least intrusive means to achieve a legitimate state interest. The brief argues that the Texas and Florida laws are neither necessary nor proportionate, as they effectively prohibit all content moderation by social media platforms, which could lead to “information chaos.” [p. 16] The amicus explains that “[e]liminating all content moderation would destabilize the online marketplace of ideas, threatening the protected freedom of the audience to receive information under Article 19.” [p. 26] This would undermine the platforms’ ability to function as spaces for public discourse, violating users’ right to receive information under Article 19.

Moreover, the brief asserts that these laws lack legitimacy, as their purpose is not to protect legitimate state interests like national security or public order, but rather to ensure the dissemination of “government-preferred political viewpoints.” [p. 11] The amicus reflects that this overtly political motivation is incompatible with Article 19’s requirements for permissible speech limitations.

Further, the brief urges the Supreme Court to strike down H.B. 20 and S.B. 7072, arguing that they not only violate the First Amendment but also the United States’ international obligations under Article 19 of the ICCPR. The brief emphasizes that allowing these laws to stand would set a dangerous precedent, encouraging other countries to adopt similarly repressive measures under the guise of free expression regulation.

For all these reasons, the “Amici respectfully request that [the Supreme Court] strike down H.B. 20 and S.B. 7072 and, in the process, reaffirm the United States’s commitment to international norms protecting free expression as a fundamental human right.” [p. 11-12]

Global Perspective

Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

National standards, law or jurisprudence

  • U.S., Constitution of the United States (1789), First Amendment.
  • U.S., State of Florida, Senate Bill (S.B.) 7072 (2021)
  • U.S., State of Texas, House Bill (H.B.) 20 (2021)
  • U.S., Americans for Prosperity Foundation v. Bonta, 594 U.S. 595 (2021)
  • U.S., Miami Herald Publ'g Co. v. Tornillo, 418 U.S. 241 (1974)
  • U.S., Pacific Gas & Electric Co. v. Public Utilities Commission of California, 475 U.S. 1 (1986)
  • U.S., Turner Broadcasting Sys., Inc. v. FCC, 512 U.S. 622 (1994) (Turner I)
  • U.S., Turner Broadcasting Sys., Inc. v. FCC, 520 U.S. 180 (1997) (Turner II)
  • U.S., Hurley v. Irish-Am. Gay, Lesbian & Bisexual Grp., 515 U.S. 557 (1995)
  • U.S., PruneYard Shopping Ctr. v. Robins, 447 U.S. 74 (1980)
  • U.S., Rumsfeld v. Forum for Acad. & Inst. Rights, Inc., 547 U.S. 47 (2006)
  • U.S., Denver Area Ed. Telecomm. Consortium, Inc. v. FCC, 518 U.S. 727 (1996)
  • U.S., Brown v. Entm't Merchants Ass'n, 564 U.S. 786 (2011)
  • U.S., United States v. O'Brien, 391 U.S. 367 (1968)
  • U.S., Sorrell v. IMS Health Inc., 564 U.S. 552 (2011)
  • U.S., Buckley v. Valeo, 424 U.S. 1 (1976)
  • U.S., U.S. Telecom Ass'n v. FCC, 855 F.3d 381 (D.C. Cir. 2017)

Case Significance

Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.
