Content Moderation, Content Regulation / Censorship, Digital Rights
Moody v. NetChoice, LLC; NetChoice, LLC v. Paxton
United States
In Progress, Expands Expression
The U.S. Supreme Court held that the Fifth and Eleventh Circuit Courts of Appeals failed to adequately analyze the facial First Amendment challenges to the constitutionality of laws enacted by Florida and Texas regulating content moderation on social media platforms. The states of Florida and Texas enacted two laws limiting content moderation on digital platforms and social networks, prohibiting “censorship” based on users’ views, and requiring companies to provide detailed explanations to users about content moderation decisions. Petitioner NetChoice—a coalition of companies that own digital platforms and social networks—filed a facial challenge to these laws, arguing that they violated the First Amendment. The states of Florida and Texas argued that the rules were designed to balance the marketplace of ideas on social networks. Two separate district courts granted preliminary injunctions halting the enforcement of both laws. The Eleventh Circuit Court affirmed the injunction against the law in Florida, holding that it likely violated the First Amendment. At the same time, the Fifth Circuit Court vacated the injunction against the law in Texas, arguing that content moderation is a form of censorship, not a form of speech protected by the First Amendment.
The U.S. Supreme Court reversed both circuit courts’ decisions and remanded the cases for further proceedings, emphasizing that the First Amendment protects the editorial discretion of social media platforms. The Court considered that a facial First Amendment challenge to a statute requires the petitioner to demonstrate that a substantial portion of the statute’s applications is unconstitutional. In this regard, the Court held that the broad scope of the contested laws, which deal with various activities beyond content moderation, required a more precise analysis to determine their constitutionality. In addition, the Court held that altering the private editorial decisions of social media platforms violated the First Amendment, which prohibits the government from imposing its own prerogatives regarding the proper balance of ideas on the private sector.
In 2021, the states of Florida and Texas, in the United States of America, enacted laws regulating the activities of social media companies and internet platforms (Florida’s S. B. 7072 and Texas’ H. B. 20). While the two states’ regulations differ in some respects, “both curtail the platforms’ capacity to engage in content moderation—to filter, prioritize, and label the varied third-party messages, videos, and other content their users wish to post.” [p. 1] The states claimed these laws were necessary to “correct” a perceived political bias resulting in the “silencing” of conservative voices on the platforms. In addition, the laws contain disclosure provisions requiring social media companies to provide greater transparency about their policies and detailed explanations to their users regarding content moderation decisions, such as the removal or modification of uploaded content.
Both laws raise fundamental questions about the role of social media companies in the public sphere and whether and how the government may regulate them under the First Amendment of the US Constitution. For instance, Texas House Bill 20 (HB 20) characterizes large platforms (defined as having more than 50 million monthly active users) as “Common Carriers” due to their market dominance and role as a veritable digital public forum, serving society much as utilities or telecommunications companies do. Such a classification would allow the State to impose non-discrimination obligations on them. Hence, the legislature sought to prohibit large platforms from “censoring” content based on viewpoint. This requirement, termed a “must-carry provision,” effectively prevents platforms from removing objectionable content, including content that violates their community standards.
Florida’s S. B. 7072 similarly considered the platforms to be common carriers and regulated the ability of platforms to curate or moderate content by prohibiting the deplatforming of political candidates and “journalistic enterprises,” and the use of post-prioritization or shadow-banning algorithms on content related to political candidates. S.B. 7072 also contained extensive disclosure provisions and user data access requirements.
Petitioners NetChoice LLC and the Computer & Communications Industry Association (hereinafter NetChoice), trade associations registered in the United States whose members include companies such as YouTube and Facebook (now Meta), filed facial challenges to the aforementioned state laws under the First Amendment of the U.S. Constitution, which protects freedom of speech. NetChoice argued that content moderation is an exercise of editorial discretion, much like that undertaken by a newspaper, and is therefore protected speech. Moreover, it contended that the intent behind the legislation, to correct a perceived bias, made it a content-based regulation by the government warranting strict scrutiny of the entire act. Further, NetChoice argued that there was no legitimate state interest in ensuring equal access to speech on private social media platforms. One of the facial actions challenged the Florida statute, and the other challenged the Texas statute.[1]
In both cases, district courts issued preliminary injunctions, at the initial stage of the proceedings, to stop the enforcement of the statutes. Both courts held that the petitioners’ claims “[were] likely to succeed because the statute infringes on the constitutionally protected editorial judgment of NetChoice’s members about what material they will display.” [p. 2]
The injunction against Florida’s S. B. 7072 was appealed to the Eleventh Circuit Court, which upheld the district court’s decision in NetChoice v. Attorney General, State of Florida. The Eleventh Circuit held that S. B. 7072 was unlikely to survive heightened First Amendment scrutiny. It affirmed that the statute’s restrictions on content moderation likely impinged on the platforms’ editorial discretion, which is protected by the right to free speech. Moreover, the court held that it would be very difficult for the State of Florida to overcome First Amendment scrutiny of the companies’ obligation to provide detailed explanations to their users for every content moderation decision they issue. The court concluded that “the obligation to explain millions of decisions per day is unduly burdensome and likely to chill platforms’ protected speech.” [p. 2]
The State of Florida filed a writ of certiorari to the U.S. Supreme Court (SCOTUS) arguing that the Eleventh Circuit Court’s decision prevents it from ensuring diverse perspectives and ideas on the digital platforms owned by the petitioner.
The injunction against Texas’ H. B. 20, on the other hand, was appealed to the Fifth Circuit Court. In NetChoice v. Paxton, the Fifth Circuit disagreed with the district court and reversed its decision, holding that “the platforms’ content-moderation activities are not speech at all, and so do not implicate the First Amendment.” [p. 2] In addition, the court stated that Texas could regulate the companies that own social media platforms to “protect diversity of ideas,” even if their activities involved speech. [p. 2] The court also held that the companies’ obligation to provide explanations for their content moderation decisions did not impose an undue burden because the petitioners would only have to expand the user complaint and appeals processes that already existed before the law.
NetChoice filed a writ of certiorari to the U.S. Supreme Court arguing that the Fifth Circuit Court’s decision violated its First Amendment right to freedom of speech.
Due to the conflicting decisions issued by the circuit courts about laws regulating content moderation on major digital platforms, the U.S. Supreme Court granted both petitions for certiorari and addressed them in a single judgment.
[1] According to Justice Alito’s concurrence in this decision, a facial challenge to a statute implies “that the Florida and Texas statutes facially violate the First Amendment, meaning that they cannot be applied to anyone at any time under any circumstances without violating the Constitution.” [p. 75]
Justice Elena Kagan delivered the unanimous opinion for the United States Supreme Court. SCOTUS had to decide whether the conflicting decisions issued by the Eleventh and Fifth circuit courts properly analyzed the facial First Amendment challenges to the laws (S. B. 7072 and H. B. 20) regulating major internet and social media platforms and their content moderation practices.
NetChoice argued that the contested laws violated the First Amendment by restricting the platforms’ ability to moderate content in accordance with their own rules, community standards, or editorial decisions. It claimed that content moderation is a form of editorial judgment, and that these laws interfered with its right to choose which content it should show and which it should exclude—an activity that is protected under the First Amendment. According to NetChoice, forcing platforms to provide detailed explanations for every content moderation decision was an undue burden that violated free speech.
For their part, the defendant states of Texas and Florida argued that their laws were necessary to correct the imbalance of viewpoints present on social media platforms and to protect diversity in the marketplace of ideas. The defendants explained that platforms only host content, and that regulating content moderation did not violate the First Amendment because, in their view, this activity is not a form of protected speech. The defendants claimed that the laws were designed to ensure that users have access to a wide range of viewpoints and opinions and to compel platforms to provide detailed explanations for their content moderation decisions. They considered that this last measure did not impose an undue burden on the platforms because it simply expanded their existing complaint and appeal processes.
First, the Court examined the requirements for a facial challenge under the First Amendment. It explained that, in this type of case, the plaintiff must show that a substantial number of the law’s applications are unconstitutional because they restrict free speech. SCOTUS clarified that facial challenges seek to invalidate a law in its entirety and under all circumstances, and that courts must therefore scrutinize them rigorously.
Next, the Court noted that neither circuit court addressed the procedural requirements of a facial challenge under the First Amendment. It also highlighted that neither the parties nor the Fifth and Eleventh circuit courts considered or analyzed the broad range of activities regulated by S. B. 7072 and H. B. 20, nor did they compare the allegedly unconstitutional provisions with those that were not. On the contrary, the Court noted that, at this early stage, the parties and the courts focused only on the application of the two laws to content moderation on major social media platforms—specifically their filtering, labeling, and prioritization of content (e.g., how news stories are displayed on Facebook or what videos appear on YouTube’s home page)—without considering the rest of the laws’ regulations.
For SCOTUS, it was unclear at this early stage of the proceedings what the scope of the contested laws was. It explained that to properly understand these state laws, one must first analyze the scope of their regulations regarding social media sites. The Court found that while it appears that these laws applied to more than just Facebook’s News Feed, and similar services offered by other social network giants, it was not yet clear whether they applied to other services—such as direct messaging—or how they affected other platforms and their functions. For this reason, the Court held that the lower courts should have determined more precisely the substantive scope of these laws before issuing injunctions.
Moreover, the Court explained that in a facial challenge, it is necessary to identify which aspects of the laws violate the First Amendment and compare them to aspects that do not. Considering this, SCOTUS said that analyzing content moderation regulations on digital platforms meant asking, for each affected platform or function, whether the laws impinged on the free speech and editorial discretion protected by the First Amendment. The Court also held that courts should have evaluated whether these requirements imposed an undue burden on free speech—referring to regulations requiring social media platforms to explain their content moderation decisions.
However, the Court noted that, as a court of review rather than of first view, it could not itself undertake the aforementioned analysis of the detailed regulatory scope of the contested rules. Therefore, SCOTUS held that since the circuit courts failed to conduct this regulatory analysis thoroughly, their decisions should be vacated and the cases remanded for further review.
Second, the Court had to consider how the First Amendment right to free speech related to the contested content moderation laws, to provide parameters for the lower courts on how to decide the cases. The Court established that it was necessary to analyze relevant prior First Amendment decisions because it found that the Fifth Circuit Court’s decision seriously erred in its interpretation of the right to free speech.
Consequently, the Court examined its case law on content moderation and the First Amendment. Citing Miami Herald Publishing Co. v. Tornillo, Pacific Gas & Elec. Co. v. Public Util. Comm’n of Cal., Turner Broadcasting System, Inc. v. FCC, and Hurley v. Irish-American Gay, Lesbian and Bisexual Group of Boston, Inc., SCOTUS recalled that it has repeatedly held that forcing someone to provide a venue for the views of others, against their will, implicates the First Amendment right to free speech, provided that the affected party’s own expressive activity would be altered or disrupted by that imposition.
Within this body of precedent, the Court held that the most important decision was Hurley. In this judgment, SCOTUS held that the state of Massachusetts could not force the organizers of a St. Patrick’s Day parade to include a gay and lesbian group that wished to disseminate its “pride” slogans, because doing so would alter the expressive content of the parade, and the decision to include or exclude that message belonged solely to the organizers. Considering this, the Court concluded that “the government may not, in supposed pursuit of better expressive balance, alter a private speaker’s own editorial choices about the mix of speech it wants to convey.” [p. 26]
Subsequently, the Court said—citing Brown v. Entertainment Merchants Assn—that while technology evolves, the fundamental principles of the First Amendment remain constant. Consistent with the opinion laid out in Hurley, the Court highlighted that the First Amendment protects entities that compile and select the speech of others to create their own product and that the right to free speech prevents such entities from being forced to include messages they would prefer to exclude. The Court also emphasized that the government cannot justify compelled speech by asserting its interest in balancing the marketplace of ideas.
SCOTUS explained that digital platforms, such as Facebook and YouTube, provide users with a personalized stream of posts, using algorithms that prioritize content based on the users’ prior interests and activities—among other factors, which these platforms privately regulate. Similarly, the Court noted that social media companies have rules that specify what content is not allowed on their platforms and use algorithms to identify trustworthy content or suppress objectionable content. The Court went on to say that “the Texas law targets those expressive choices—in particular, by forcing the major platforms to present and promote content on their feeds that they regard as objectionable.” [p. 30]
Moreover, the Court stated that H. B. 20 was unlikely to overcome the petitioner’s facial challenge because, even at this early stage of the process, it was clear that the law severely limits the platforms’ ability to moderate content. SCOTUS held that “when the platforms use their Standards and Guidelines to decide which third-party content those feeds will display, or how the display will be ordered and organized, they are making expressive choices. And because that is true, they receive First Amendment protection.” [p. 32] Thus, the Court concluded that the Texas law restricted corporate control of content by barring platforms from “censoring” it based on the users’ point of view. From the Court’s perspective, this meant that platforms could not remove, tag, or delete posts they disapprove of, which would significantly affect the platforms’ editorial decisions.
The Court recalled that in Hurley it held that these types of regulations interfered with the right to free speech. Accordingly, SCOTUS held that social media platforms, just like publishers and parade organizers, have the right to select content to create a distinctive expressive offering and that the Texas law would alter this right by forcing platforms to display content they deem objectionable. Hence, SCOTUS affirmed that Texas could not prohibit platforms such as Facebook and YouTube from removing posts that contravened their private rules.
Afterward, the Court recalled Texas’s argument that the purpose of the law was to foster a better online environment on major platforms, promoting plurality and diversity in the marketplace of ideas. The Court opined that such an interest did not support the constitutional validity of the law. In its words, “Texas does not like the way those platforms are selecting and moderating content, and wants them to create a different expressive product, communicating different values and priorities. But under the First Amendment, that is a preference Texas may not impose.” [p. 35]
Referring to United States v. O’Brien, SCOTUS concluded that the Texas law could not overcome even the lower standard of scrutiny, which requires a law to advance a “substantial governmental interest” that is “unrelated to the suppression of free expression.” Moreover, the Court held that “a State may not interfere with private actors’ speech to advance its own vision of ideological balance.” [p. 33] It also clarified that, while states may desire a space where the public has access to diverse viewpoints, “the way the First Amendment achieves that goal is by preventing the government from tilting public debate in a preferred direction.” [p. 33]
Finally, SCOTUS highlighted that states are not allowed to prohibit speech to rebalance the marketplace of ideas, as this goal was not compatible with the First Amendment. On this point, it remarked that “on the spectrum of dangers to free expression, there are few greater than allowing the government to change the speech of private actors in order to achieve its own conception of speech nirvana.” [p. 33 and 34]
Considering the aforementioned arguments, SCOTUS unanimously decided to “vacate the judgments of the Courts of Appeals for the Fifth and Eleventh Circuits and remand the cases for further proceedings consistent with this opinion.” [p. 37]
Concurring and dissenting opinions
Justice Barrett
Justice Barrett joined the Court’s opinion and added a concurring opinion. For her, the Eleventh Circuit Court correctly interpreted the First Amendment’s protection of editorial discretion, while the Fifth Circuit Court did not. In addition, she highlighted the dangers of a facial challenge in this case, suggesting that NetChoice should focus on as-applied challenges to specific features—such as Facebook’s News Feed and YouTube’s homepage—rather than encompassing multiple platforms and features in a single challenge.
Justice Barrett also highlighted the complexity of determining how the First Amendment applies to digital platforms, especially given the use of algorithms and artificial intelligence tools to moderate content. She further noted that a platform’s corporate structure and ownership can affect the constitutional analysis, explaining that foreign ownership and control over content moderation decisions could affect the applicability of the First Amendment.
Finally, she concluded that these complexities reinforce the need to address challenges in a specific and applied manner, rather than attempting to resolve all issues in a single facial challenge. Relatedly, the justice held that “while the governing constitutional principles are straightforward, applying them in one fell swoop to the entire social-media universe is not.” [p. 41]
Justice Jackson
Justice Jackson also issued a concurring opinion. She explained that both cases raised a complex conflict between the contested state laws and the First Amendment rights of social media platforms. The justice emphasized that “not every potential action taken by a social media company will qualify as expression protected under the First Amendment. But not every hypothesized regulation of such a company’s operations will necessarily be able to withstand the force of the First Amendment’s protections either.” [p. 42]
Nonetheless, Justice Jackson held that considering the early stage of the cases, the facial validity of the challenged state laws could not be properly assessed. She further emphasized that, when reviewing these cases, lower courts must be specific in their analysis, evaluating not only the regulated entities but also whether their activities constitute speech protected by the First Amendment.
Finally, Justice Jackson cautioned that further factual development was necessary before fully addressing these legal challenges: “[F]aced with difficult constitutional issues arising in new contexts on undeveloped records, this Court should strive to avoid deciding more than is necessary.” [p. 43 and 44]
Justice Alito
Justice Alito issued an opinion concurring in the judgment, joined by Justices Thomas and Gorsuch. According to him, “NetChoice failed to prove that the Florida and Texas laws they challenged [were] facially unconstitutional. Everything else in the opinion of the Court is nonbinding dicta.” [p. 63]
Justice Alito criticized the majority’s decision to opine on the specific applications of the contested laws and to categorize them as erroneous. He explained that the broad ambition of the Court’s majority to provide “guidance on whether one part of the Texas law [was] unconstitutional as applied to two features of two of the many platforms that it reaches—namely, Facebook’s News Feed and YouTube’s homepage—[was] unnecessary and unjustified.” [p. 63] For Justice Alito, these issues should be resolved in the context of specific court proceedings and not in a facial challenge.
Next, Justice Alito explained that social media platforms have become the “modern public square” and have a significant impact on people’s communication and daily lives. [p. 66] He stated that platforms, such as Facebook and YouTube, handle massive amounts of data and use algorithms to moderate content, which raises new questions about freedom of expression. Additionally, Justice Alito held that the petitioner did not provide sufficient information about how NetChoice’s members moderate content and which platforms were affected by the laws. Without this information, Justice Alito said, courts could not assess whether the laws had legitimate First Amendment implications.
Finally, he concluded that “the only binding holding in these decisions is that NetChoice has yet to prove that the Florida and Texas laws they challenged are facially unconstitutional. Because the majority opinion ventures far beyond the question we must decide, I concur only in the judgment.” [p. 96]
Justice Thomas
Justice Thomas, in addition to joining Justice Alito’s opinion, issued a concurring opinion of his own. He agreed with “the Court’s decision to vacate and remand because NetChoice has not established that Texas’s H. B. 20 and Florida’s S. B. 7072 are facially unconstitutional.” [p. 45] However, Justice Thomas disagreed with the Court’s majority when it provided opinions about these statutes. To him, that was unnecessary and based on an incomplete record of the cases.
Justice Thomas also criticized the Court’s approach of selecting only some specific platform features, such as Facebook’s News Feed and YouTube’s homepage, for its analysis while ignoring other potential applications. Moreover, he recommended that SCOTUS abandon the practice of accepting facial challenges, as they exceed the authority constitutionally granted to federal courts.
Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.
The Supreme Court’s decision expands freedom of expression by reaffirming the First Amendment’s protection of the editorial discretion of social media platforms. This ruling, following an established line of case law, hinders government interference in the editorial choices of private entities, thereby strengthening safeguards against compelled speech on online social media platforms. The decision upholds the principle that the government cannot compel platforms to host content in ways that contradict their editorial policies in the name of balancing the marketplace of ideas. Thus, it broadens the scope of protected speech in online spaces.
It is now up to the lower courts to assess the facial challenge and consider the more complex questions surrounding the scope of the laws and which aspects of the content moderation and disclosure provisions violate the First Amendment. There are still many pending issues, not least whether certain forms of algorithmic sorting may be deemed non-expressive and hence open to regulation. The rejection of this facial challenge has led some court watchers to posit that it could lead to a range of as-applied challenges to many of the aspects of the provisions, which will make their way through the courts over the coming years. Time will tell whether this will result in better laws, but the time has come for courts to apply First Amendment jurisprudence to evolving technologies.