Content Moderation, Content Regulation / Censorship, Digital Rights
NetChoice v. Paxton
United States
Case Status: On Appeal
Decision Direction: Contracts Expression
The US Court of Appeals for the Fifth Circuit vacated a preliminary injunction that had blocked the implementation of a Texas bill restricting social media content moderation. The Texas legislature had passed a law prohibiting large social media platforms from censoring users’ posts based on viewpoint. The district court had found that the bill violated the platforms’ editorial discretion, protected under the First Amendment. The Court of Appeals held that content moderation did not constitute First Amendment-protected speech and that the bill was therefore constitutional. This decision diverges from the Eleventh Circuit’s ruling on a similar Florida law, which held that prohibiting content moderation violated the First Amendment.
This decision was appealed to the U.S. Supreme Court which handed down an opinion on July 1, 2024 in Moody v. NetChoice.
In August 2021, the state legislature in Texas, US passed a law, House Bill 20 (HB 20), prohibiting large (defined as having more than 50 million monthly active users) social media platforms from “censoring” content on the basis of different viewpoints. The legislature characterized social-media platforms as “common carriers [by virtue of their market dominance], … affected with a public interest, … central public forums for public debate, and have enjoyed governmental support in the United States” [p. 3].
Section 7 of the Bill states: “A social media platform may not censor a user, a user’s expression, or a user’s ability to receive the expression of another person based on: (1) the viewpoint of the user or another person; (2) the viewpoint represented in the user’s expression or another person’s expression; or (3) a user’s geographic location in this state or any part of this state.” It does include qualifications, permitting censorship if it is authorized by federal law, if the content involves sexual exploitation or harassment, or if it incites criminal activity or violence against a person based on the “race, color, disability, religion, national origin or ancestry, age, sex or status as a peace officer or judge”. The provision allows users to sue a platform that violates Section 7, but only for declaratory and injunctive relief and the recovery of costs and attorneys’ fees, not damages.
Section 2 requires that the social-media platforms make three categories of disclosure: they must “disclose how they moderate and promote content and publish an ‘acceptable use policy’”; they must publish a ‘biannual transparency report”; and they must maintain a “complaint-and-appeal system” for users [p. 5].
In September 2021, before the bill took effect, NetChoice – a trade association representing social media platforms – filed a case against Texas. NetChoice argued that HB 20 was facially unconstitutional because the First Amendment protects the activity of moderating content, and that the law therefore could not constitutionally be applied to anyone.
On December 1, 2021, the district court issued a preliminary injunction finding that sections 7 and 2 were facially unconstitutional. It disagreed with the State that social media platforms were “common carriers”, found that the platforms “engage in ‘some level of editorial discretion’ by managing and arranging content, and viewpoint-based censorship is part of that editorial discretion”, and held that the disclosure obligations were “inordinately burdensome” and would “chill the social media platforms’ speech” [p. 6]. The district court also held that HB 20 “discriminates based on content and speaker because it permits censorship of some content … and only applies to large social media platforms” [p. 6]. After finding that the bill could not withstand “any level of heightened scrutiny”, the district court issued a preliminary injunction.
The State of Texas appealed and sought a stay of the preliminary injunction. The Court of Appeals granted the stay, but on May 31, 2022, the Supreme Court vacated the stay.
Judge Andrew S. Oldham delivered the judgment of the Court, in which Judge Edith H. Jones concurred. Judge Leslie H. Southwick dissented in part.
The platforms argued that HB 20 prevented them from censoring “pro-Nazi speech, terrorist propaganda, [and] Holocaust denial[s]” [p. 15]. They submitted that section 7 “interferes with their speech by infringing their ‘right to exercise editorial discretion’” [p. 34].
The State argued that the purpose of HB 20 was to counter the platforms’ censorship of “pure political speech” and they gave examples of censorship of “fifteen prominent celebrities and political figures – including five holding federal elected office” as well as of “discriminat[ion] against Americans and in favor of foreign adversaries” [p. 14].
The Court noted that NetChoice had sought a declaration of constitutional invalidity before HB 20 was enacted and that “[t]o put it mildly, pre-enforcement facial challenges to legislative acts are ‘disfavored for several reasons’” [p. 8-9]. It explained that, generally, courts do not have the power to invalidate a law in entirety. However, it acknowledged that, following Americans for Prosperity Found. v. Bonta, in First Amendment matters, the Supreme Court “recognized a second type of facial challenge, whereby a law may be invalidated as overbroad if a substantial number of its applications are unconstitutional, judged in relation to the statute’s plainly legitimate sweep” [p. 11]. It quoted the Virginia v. Hicks case which had explained that the rationale behind this overbreadth doctrine is that “[m]any persons, rather than undertake the considerable burden (and sometimes risk) of vindicating their rights through case-by-case litigation, will choose simply to abstain from protected speech – harming not only themselves but society as a whole, which is deprived of an uninhibited market of ideas” [p. 11-12]. The Court held that this overbreadth doctrine did not apply to section 7 of HB 20 because “the primary concern of overbreadth doctrine is to avoid chilling speech … [b]ut Section 7 does not chill speech; instead, it chills censorship” [p. 12]. It added that “HB 20’s prohibitions on censorship will cultivate rather than stifle the marketplace of ideas that justifies the overbreadth doctrine in the first place” [p. 12]. The Court rejected NetChoice’s reliance on Supreme Court jurisprudence and added that the overbreadth doctrine was designed to “protect third parties who cannot ‘undertake the considerable burden’ of as-applied litigation and whose speech is therefore likely to be chilled by an overbroad law” [p. 13]. 
The Court held that as NetChoice represented all platforms affected by HB 20 there were no third parties and that as the platforms are “large, well-heeled corporations that have hired an armada of attorneys from some of the best law firms in the world to protect their censorship rights”, they did not meet the criteria of those who should benefit from the doctrine. The Court also emphasized that as HB 20 permitted only declaratory and injunctive relief – and not damages – the bill could not chill their speech to the extent that a “facial remedy is justified” [p. 14].
In examining the platforms’ First Amendment claim, the Court noted that the First Amendment “prevents the government from enacting laws ‘abridging the freedom of speech, or of the press’” and discussed the history of free speech jurisprudence [p. 16]. It characterized section 7 as a provision that “protects Texans’ ability to freely express a diverse set of opinions through one of the most important communications mediums used in that State” [p. 19]. It added that “no amount of doctrinal gymnastics can turn the First Amendment’s protection for free speech into protections for free censoring” [p. 20].
The Court provided an analysis of Supreme Court precedent to illustrate why it rejected the platforms’ arguments. It accepted that the jurisprudence holds that “the State may not force a private speaker to speak someone’s else message (sic)” but added that “the State can regulate conduct in a way that requires private entities to host, transmit, or otherwise facilitate speech” [p. 20]. In referring to the cases of Miami Herald Publishing Co. v. Tornillo, PruneYard Shopping Center v. Robins, Hurley v. Irish-Am. Gay, Lesbian & Bisexual Group, Pacific Gas & Electric Co. v. Public Utilities Commission of California, and Rumsfeld v. Forum for Academic and Institutional Rights (FAIR), the Court stated that, to bring a challenge under the First Amendment, a party must demonstrate that an impugned law “either (a) compels the host to speak or (b) restricts the host’s own speech” [p. 27]. It held that, in the present case, the platforms did not demonstrate either of those.
In respect of “compelled speech”, the Court distinguished the present case from Miami Herald, holding that the platforms “exercise virtually no editorial control or judgment” because they rely on algorithms and do not review the majority of content that is posted on their platforms [p. 28]. With reference to the case of Biden v. Knight First Amend. Inst., the Court stated that “Platforms are also ‘unlike newspapers’ in that they ‘hold themselves out as organizations that focus on distributing the speech of the broader public’” [p. 28-29]. The Court also rejected the platforms’ reliance on Hurley, stating that platforms do not “host” speech in the same way that a parade host does because they “permit any user who agrees to their boilerplate terms of service to communicate on any topic, at any time, and for any reason” [p. 31]. It stressed that the majority of the content on the platforms is not “meaningfully reviewed or edited in any way” [p. 31]. The Court referred to FAIR and noted that a platform’s decision to censor speech does not count as “expressive” unless it explains its reason for the censorship: without that factor “an observer might just as easily infer that the user himself deleted the post and chose to speak elsewhere” [p. 32].
The Court rejected the platforms’ argument that section 7 interfered with their right to exercise editorial discretion, finding that the Supreme Court jurisprudence did not “carve out ‘editorial discretion’ as a special category of First-Amendment-protected expression” [p. 34]. It also found that the platforms’ activity does not constitute editorial discretion at all because that requires “reputational and legal responsibility for the content [an entity] edits” and the platforms do not claim responsibility for their hosted content [p. 37]. Editorial discretion also “involves ‘selection and presentation’ of content before that content is hosted, published, or disseminated” [p. 37].
The Court held that section 7 does not prohibit the platforms from speaking, because they “have virtually unlimited space for speech” because of the nature of digital platforms and because “Platforms are free to say whatever they want to distance themselves from the speech they host” [p. 32-33].
Accordingly, the Court held that section 7 was constitutional.
The Court also identified that 47 U.S.C. § 230 [on intermediary liability] requires that “the Platforms ‘shall [not] be treated as the publisher or speaker’ of content developed by other users” [p. 39] and that this “reflects Congress’s judgment that the Platforms do not operate like traditional publishers and are not ‘speak[ing]’ when they host user-submitted content” [p. 39]. The Court stated that “Congress enacted Section 230 in 1996 to ease uncertainty regarding online platforms’ exposure to defamation liability for the content they host” [p. 39]. The Court commented that the platforms’ position in this case “is a marked shift” from their claims in the past that “they are simple conduits for user speech and that whatever might look like editorial control is in fact the blind operation of ‘neutral tools’” [p. 42].
The Court examined the common carrier doctrine which “vests States with the power to impose nondiscrimination obligations on communication and transportation providers that hold themselves out to serve all members of the public without individualized bargaining” [p. 44]. The Court held that the platforms were such entities, on which section 7 imposed a non-discrimination requirement. The Court discussed the history of the doctrine, noting that it has its roots in transportation systems and that the first telecommunication common carrier laws obliged telegraph companies to transmit messages from any individual. It identified two key questions to ask in determining whether an entity was a common carrier or not: “did the carrier hold itself out to serve any member of the public without individualized bargaining?” [p. 48]; and was the entity “affected with a public interest” (including whether it had a significant market share)? [p. 49] The Court held that the platforms were communications firms, and hold themselves out to serve the public because they offer the same conditions to all their users. The Court rejected the platforms’ argument that they were not common carriers because they required their users to accept their terms and conditions. It also rejected the platforms’ argument that they were not open to the public because they prevent the use of certain expression, on the grounds that telephone companies can censor obscene content but remain common carriers. It also commented that common carriers have – in the past – engaged in “viewpoint-based discrimination” and so the platforms’ contention that their own viewpoint-based censorship exempts them from common carrier status must be wrong.
The Court accepted the State’s position that the platforms are “affected with a public interest” and, with reference to the case of Packingham v. North Carolina, mentioned the large numbers of the public who use social media for communication, learning about current events, and finding employment opportunities. The Court referred to Supreme Court cases which “reflect the modern intuition that the Platforms are the forum for political discussion and debate, and exclusion from the Platforms amounts to exclusion from the public discourse” [p. 56-57]. The Court described each platform as having an “effective monopoly over its particular niche of online discourse” and rejected the platforms’ argument that censored individuals can find other places to post [p. 57]. In further rejecting the platforms’ common carrier arguments, the Court found that it is not necessary that the government has “contributed to a carrier’s monopoly, such as by licensing a legal monopoly or acquiring property for the carrier through eminent domain” [p. 58] and reiterated that it was reasonable for Texas to find that social media market dominance meant it was a common carrier. It also rejected the platforms’ submission that common carriers had to “carry” a physical thing, finding that the distinction between physical carriage and data processing had no support.
The Court examined the platforms’ arguments that even if section 7 did convey common carrier status on the platforms the provision remains unconstitutional. It held that the Supreme Court had expressly rejected the dissent in Nebbia v. New York that “a state may not by legislative fiat convert a private business into a public utility” and so it could therefore not find section 7 unconstitutional [p. 63]. It also held that because the platforms host and facilitate “other people’s speech” they are not akin to newspapers “but instead indispensable conduits for transporting information” [p. 64]. The Court concluded that “it’s bizarre to posit that the Platforms provide much of the key communications infrastructure on which the social and economic life of this Nation depends, and yet conclude each and every communication transmitted through that infrastructure still somehow implicates the Platforms’ own speech for First Amendment purposes” [p. 64].
The Court held that – even if it did find that the Platforms’ First Amendment rights were violated – they “would still not be entitled to facial pre-enforcement relief” [p. 64]. It found that the law was content- and viewpoint-neutral and so would require only intermediate scrutiny. It rejected the platforms’ submissions that the Texas legislature targeted them because of their “disagreement with those Platforms’ partisan censorship efforts”, finding no evidence of “the Texas legislature’s alleged improper motives” [p. 68]. The Court then held that section 7 did satisfy intermediate scrutiny because its “regulation of viewpoint-based censorship” advances “important governmental interests unrelated to the suppression of free speech and does not burden substantially more speech than necessary to further those interests” [p. 68]. It found that “protecting the free exchange of ideas and information” was an important government interest and that section 7 does not suppress free speech (rejecting the platforms’ argument that it curtailed their censorship as speech). The Court held that the platforms’ argument that Texas could create its own “government-run social media platform” was “bizarre” because the State would be “unlikely to be able to reproduce that network and create a similarly valuable communication medium” [p. 71]. It also noted that the platforms had provided no possible alternative which would burden speech less than section 7 does.
The Court held that it was legitimate for section 7 to focus only on large social media platforms, commenting that “Texas reasonably determined that the largest social media platforms’ market dominance and network effects make them uniquely in need of regulation to protect the widespread dissemination of information” and that “regulating smaller platforms would intrude more substantially on private property rights and perhaps create unique constitutional problems of its own” [p. 72].
In respect of section 2, the Court referred to Zauderer v. Office of Disciplinary Counsel, in finding that the disclosure obligations in that provision were constitutional because they were not unduly burdensome on the platforms. The Court held that these obligations did not “unduly burden (or ‘chill’) protected speech” [p. 75]. The Court reiterated that the platforms’ “editorial process” is not the same as a “traditional publisher’s” [p. 78].
The Court then compared the present case to a similar one, brought against Florida’s Senate Bill 7072 in the Eleventh Circuit Court of Appeals, NetChoice v. Attorney General. A challenge to that law had led to a preliminary injunction against the enforcement of its provisions, but the Court declined to follow that opinion. It identified differences between Texas’s HB 20 and Florida’s SB 7072: SB 7072 “only targets censorship of speech by [or about] political candidates and journalistic enterprises” whereas HB 20 “applies to all speakers equally, instead of singling out political candidates and journalists for favored treatment”; SB 7072 does target platforms’ own speech by preventing them from “post[ing] an addendum to any content or material posted by a user” and limiting their ability to amend their own rules; and SB 7072 allows for the collection of fines whereas the remedies in HB 20 do not include damages.
The Court disagreed with the findings on the merits in the Florida case. It stated that, unlike the Eleventh Circuit, it did “not think the Supreme Court has recognized ‘editorial discretion’ as an independent category of First-Amendment-protected expression” and disagreed that “Platforms’ censorship is akin to the [Supreme Court’s] ‘editorial judgment’” [p. 82]. It disagreed with the Eleventh Circuit finding that “the common carrier doctrine does not support the constitutionality of imposing nondiscrimination obligations on the Platforms” [p. 82]. It also disagreed with that Court’s application of the Miami Herald, Pacific Gas and Electric, Turner and Hurley cases, stating that the cases do not, in fact, mention an “editorial-judgment principle” and that the Supreme Court has contradicted that concept in cases which the Eleventh Circuit “addressed only as an afterthought” [p. 83]. It described the distinction the Eleventh Circuit identified between disseminating speech as a business and as incidental to a private company’s activities as “turn[ing] law, logic, and history on their heads” [p. 84]. The Court also disagreed that the platforms’ activity would have fallen into the category of “editorial judgment” even if there was Supreme Court jurisprudence for that category. The Court noted that the Eleventh Circuit “tried to equate the Platforms’ censorship with the editorial processes of newspapers and cable operators” and had reasoned that, because the platforms have specific goals in curating communities of users, their censorship was protected by the First Amendment. However, the Court described the Eleventh Circuit’s reasoning as circular: it had found that the platforms “have a right to censor because they exercise editorial judgment, and they exercise editorial judgment because they censor” [p. 87].
It also said the Eleventh Circuit’s “common carrier” reasoning was circular because it had held that “a firm can’t become a common carrier unless the law already recognizes it as such, and the law may only recognize it as such if it’s already a common carrier” [p. 89].
In conclusion, the Court held that it “rejects the Platforms’ attempt to extract a freewheeling censorship right from the Constitution’s free speech guarantee” [p. 90]. It stressed that “the Platforms are not newspapers. Their censorship is not speech” and that they were not entitled to the remedy they sought, “pre-enforcement facial relief” [p. 90]. Accordingly, the Court held that HB 20 was constitutional, vacated the preliminary injunction, and remanded the case for further proceedings.
In his partly concurring and partly dissenting opinion, Judge Southwick would have concluded that social media platforms’ content moderation is First-Amendment-protected expression. He stated that he and the majority “simply disagree about whether speech is involved in this case” [p. 94] and that “[t]he majority’s perceived censorship is my perceived editing” [p. 95]. The Judge agreed with the majority that “a successful facial challenge to a state law is difficult” and that social media platforms are “firms of tremendous public importance” [p. 93]. Although describing the platforms as acting to “blatantly censor the views of those with whom they disagree, leaving no equivalent platform available to the speakers they scorn” and as having taken “aggressive, inconsistent positions” on the applicability of the First Amendment to their activities, the Judge emphasized that “[t]he legal issues before us, though, must be separated from any disquiet irrelevant to the application of the First Amendment” [p. 93]. The Judge described the majority as “forcing the picture of what the Platforms do into a frame that is too small” and noted that while he did “not celebrate the excesses”, the First Amendment envisions “wide-ranging, free-wheeling, unlimited variety of expression” [p. 94]. The Judge stated that “[t]he First Amendment, though, is what protects the curating, moderating, or whatever else we call the Platforms’ interaction with what others are trying to say” [p. 94]. He acknowledged that the novel nature of social media meant there is little precedent – “[t]hese activities native to the digital age have no clear ancestral home within our First Amendment precedent” [p. 96] – and described the closest precedent as that “establishing the right of newspapers to control what they do and do not print” [p. 94].
Judge Southwick examined the same jurisprudence as the majority – Miami Herald, Hurley, Pacific Gas & Electric, PruneYard, Turner, FAIR – and explained why his interpretations differed. In examining the Supreme Court’s use of “editorial discretion”, he noted that, in respect of Miami Herald, the majority had not acknowledged that the Supreme Court had “recognized the selection process [of what to publish in a newspaper] itself as First Amendment expression” [p. 97] and that, in respect of Turner, he emphasized that both transmitting a message and exercising editorial discretion were First-Amendment-protected speech. Judge Southwick concluded that Miami Herald was the most applicable to the present case and stated that “[w]hen the Platforms curate their users’ feeds, which are the behaviors prohibited in Section 7 of HB 20, they are exercising their editorial discretion [which] is a type of First Amendment-protected activity recognized in Miami Herald, PG&E, Turner and Hurley” [p. 103]. The Judge stated that the majority “disregards the Supreme Court’s recognition that there may be more than one type of First Amendment activity … when … an article is selected and printed in a newspaper” [p. 103]. The Judge also disagreed with the majority’s view that the platforms’ moderation does not constitute editorial conduct because it occurs after publication: for social media platforms, “the majority of decisions on moderating what has been posted can only be made, as a practical matter, after the appearance of the content on the Platform” [p. 105].
Judge Southwick would have found that section 7 does not survive even intermediate scrutiny. He referred to the Florida case, NetChoice v. Attorney General, agreeing that private actors have “a First Amendment right to be ‘unfair’ – which is to say, a right to have and express their own points of view” [p. 109].
Judge Southwick also disagreed with the majority’s “common carrier” analysis, saying that designating social media platforms as common carriers is likely not appropriate, and that his interpretation of the jurisprudence is that “common carriers retain their First Amendment protections for their own speech” [p. 110-111].
Judge Southwick concluded that “when the social media Platforms who are in the business of speech make decisions about which speech is permitted, featured, promoted, boosted, monetized, and more, they are engaging in activity to which First Amendment protection attaches” [p. 113]. He added that “[b]alance and fairness would be preferable, but the First Amendment does not require it” [p. 113].
Limiting platforms’ ability to moderate content may undermine society’s capacity to deal with hate speech and hand governments the power to decide what content is available to users.