
Oversight Board Case of Former President Trump’s Suspension

Closed | Mixed Outcome

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    May 5, 2021
  • Outcome
    Agreed with Meta’s initial decision
  • Case Number
    2021-001-FB-FBR
  • Region & Country
    United States, International
  • Judicial Body
    Oversight Board
  • Type of Law
    International Human Rights Law, Meta's content policies
  • Themes
    Facebook Community Standards, Violence And Criminal Behavior, Dangerous Individuals and Organizations
  • Tags
    Meta Newsworthiness allowance, Oversight Board Policy Advisory Statement, Disinformation, Incitement, Political speech


Case Analysis

Case Summary and Outcome

The Oversight Board upheld Facebook’s decision to restrict then-President Donald Trump’s access to posting content on Facebook and Instagram. On January 6, 2021, during the counting of the 2020 U.S. presidential electoral votes, a mob forcibly entered the Capitol Building in Washington, D.C. Five people died, and many were injured in the violence. During these events, then-President Trump posted two pieces of content: a video on Facebook and Instagram, followed by a written statement on Facebook. Facebook found that the posts violated its content policies, removed them, and suspended his account for 24 hours; on January 7, it extended the suspension indefinitely.
The Board employed the three-part test under international human rights law to analyze Facebook’s decision to restrict Mr. Trump’s access to his accounts and concluded that the restrictions were justified, although imposing an indefinite suspension without clear standards was not. The Board also made policy recommendations for Facebook to implement in developing clear, necessary, and proportionate policies that promote public safety and respect freedom of expression.

*The Oversight Board is separate from Meta and will provide its independent judgment on individual cases and policy questions. An independent trust funds both the Board and its administration. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding unless implementing them could violate the law. The Board can also choose to issue company content policy recommendations.

 


Facts

On January 6, 2021, during the counting of the 2020 U.S. presidential electoral votes, a mob forcibly entered the Capitol Building in Washington, D.C., where the ballots were counted. This violence threatened the constitutional process. During these events, five people died, and many more were injured.

Before January 6, then-President Donald Trump had claimed, without evidence, that the opposition had stolen the November 2020 presidential election. The Trump campaign raised these claims in court with little or no evidence, and the courts consistently rejected them. Further, after an investigation, the then-Attorney General stated that there had been no fraud “on a scale that could have effected a different outcome in the election.” Nevertheless, Mr. Trump continued to make these unfounded claims through social media. Specifically, he posted two comments on the Trump Facebook page that referenced a rally scheduled for January 6.

On December 19, 2020, the Trump Facebook page posted: “Peter Navarro releases 36-page report alleging election fraud ‘more than sufficient to swing victory to Trump’ – A great report by Peter. Statistically impossible to have lost the 2020 Election. Big protest in D.C. on January 6. Be there, will be wild!” [p. 9].

On January 1, 2021, the Trump Facebook page posted: “The BIG Protest Rally in Washington, D.C., will take place at 11.00 am on January 6. Locational details to follow. StopTheSteal!” [p. 9].
On the morning of January 6, 2021, Mr. Trump attended a rally near the White House and gave a speech in which he continued to make unfounded claims that he had won the election. During his remarks, he suggested that Vice President Mike Pence should overturn President-elect Joe Biden’s victory, a power Mr. Pence did not have. Additionally, he stated, “we will stop the steal,” and “we’re going to the Capitol” [p. 9].

After the ex-president’s speech, many rally attendees marched to the U.S. Capitol Building, where they joined other protestors already gathered. During the riot, many protestors attacked Capitol security, violently entered the building, and rioted through the Capitol. Mr. Pence and ten other Congress members were at serious risk of targeted violence. Five people died, and many were injured [p. 10].

At 4:21 pm EST, during the riot, Mr. Trump posted a video on Facebook and Instagram that stated: “I know your pain. I know you’re hurt. We had an election that was stolen from us. It was a landslide election, and everyone knows it, especially the other side, but you have to go home now. We have to have peace. We have to have law and order. We have to respect our great people in law and order. We don’t want anybody hurt. It’s a very tough period of time. There’s never been a time like this where such a thing happened, where they could take it away from all of us, from me, from you, from our country. This was a fraudulent election, but we can’t play into the hands of these people. We have to have peace. So go home. We love you. You’re very special. You’ve seen what happens. You see the way others are treated that are so bad and so evil. I know how you feel. But go home and go home in peace” [p.10].

At 5:41 pm EST, Facebook removed this post for violating its Community Standard on Dangerous Individuals and Organizations.

At 6:07 pm EST, while police were securing the Capitol, Mr. Trump posted a written statement on Facebook: “These are the things and events that happen when a sacred landslide election victory is so unceremoniously viciously stripped away from great patriots who have been badly unfairly treated for so long. Go home with love in peace. Remember this day forever!” [p. 11].

At 6:15 pm EST, eight minutes after this post was published, Facebook removed it for violating its Community Standard on Dangerous Individuals and Organizations. Furthermore, the platform blocked Mr. Trump from posting on Facebook or Instagram for 24 hours.

The District of Columbia declared a public emergency on January 6 and, that same day, extended it until January 21.

On January 7, after further reviewing Mr. Trump’s posts, his recent communications off Facebook, and additional information about the severity of the violence at the Capitol, Facebook extended the block “indefinitely and for at least the next two weeks until the peaceful transition of power is complete” [p. 11]. In addition, Facebook cited Mr. Trump’s “use of our platform to incite violent insurrection against a democratically elected government” [p. 11].

On January 16, 2021, a participant in the riot was quoted in the Washington Post: “I thought I was following my president. . . . He asked us to fly there. He asked us to be there. So I was doing what he asked us to do.” A video captured a rioter on the steps of the Capitol screaming at a police officer, “We were invited here! We were invited by the president of the United States!” [p. 11].

On January 21, a day after Mr. Trump ceased to be the president of the United States, Facebook announced it had referred this case to the Oversight Board. In its submission, Facebook asked the Board whether it correctly decided on January 7 to prohibit Mr. Trump’s access to posting content on Facebook and Instagram for an indefinite time. Moreover, the platform requested recommendations about suspensions when the user is a political leader.

On January 27, the Department of Homeland Security (DHS) issued a National Terrorism Advisory System Bulletin warning of a “heightened threat environment across the United States, which DHS believes will persist in the weeks following the successful Presidential Inauguration” [p. 10]. It stated that “drivers to violence will remain through early 2021, and some [Domestic Violent Extremists] may be emboldened by the January 6, 2021 breach of the U.S. Capitol Building in Washington, D.C., to target elected officials and government facilities” [p. 11].


Decision Overview

The main issue for the Board was whether Facebook’s decision to prohibit Donald J. Trump’s access to posting content on Facebook and Instagram for an indefinite time was consistent with the platform’s Community Standards, its values (specifically its commitments to voice and safety), and its human rights responsibilities.

The American Center for Law and Justice and a page administrator submitted a statement to the Board on Mr. Trump’s behalf. In the submission, they requested that the Board reverse Facebook’s indefinite suspension of the former U.S. President’s accounts. They also argued it was inconceivable that either of the posts could be viewed as a threat to public safety or an incitement to violence. Similarly, they commented that there was a “total absence of any serious linkage between the Trump speech and the Capitol building incursion” [p. 16].

Furthermore, they argued that Facebook’s reasons for imposing the restrictions could not be safety-related since “any content suspected of impacting safety must have a direct and obvious link to the actual risk of violence” [p. 17]. Likewise, they asserted that Mr. Trump’s speech during the rally did not violate the Dangerous Individuals and Organizations Community Standard and that the political discourse on January 6 “never ‘proclaim[ed] a violent mission,’ a risk that lies at the very centre of the Facebook policy” [p. 17]. In the statement, Mr. Trump’s representatives claimed that Facebook’s Violence and Incitement Community Standard could not support the suspension of the ex-president’s account since the two posts “merely called for peace and safety.” They held that, “when considered in their actual context, none of the words in Mr. Trump’s speech could reasonably be construed as incitement to violence or lawlessness” [p. 17].
Moreover, in the submission, they argued that basing content decisions on what seems reasonable, or on how a reasonable person would react to the content, was insufficient; they believed Facebook should “consider a much higher bar” [p. 16]. Additionally, they noted that the U.S. Supreme Court requires strict scrutiny for laws that burden political speech, that Facebook held market dominance, and they discussed the constitutional standards for incitement to violence. They claimed that, while preserving public safety was a legitimate aim, Mr. Trump’s speech did not present safety concerns. On necessity and proportionality, they denied the validity of the restrictions and argued the penalty was disproportionate.

The statement concluded with suggestions for the Board’s policy recommendations on suspensions when the user is a political leader. Lastly, it argued that the Board should defer to the legal principles of the nation-state in which the leader was governing.

In its submission, Facebook stated that it removed the two pieces of content posted on January 6, 2021, for violating the Dangerous Individuals and Organizations Community Standard. Specifically, the company stated that the content was removed for violating “its policy prohibiting praise, support, and representation of designated Violent Events” [p. 19]. Facebook also noted the content constituted “a violation of its Dangerous Individuals and Organizations policy prohibiting praise of individuals who have engaged in acts of organized violence” [p. 16]. The company stated that its Community Standards prohibit “content that expresses support or praise for groups, leaders or individuals involved in activities such as terrorism, organized violence or criminal activity, and that this includes organized assault as well as planned acts of violence attempting to cause injury to a person with the intent to intimidate a government to achieve a political aim” [p. 19]. Facebook remarked that its assessment reflected the letter of its policy and the surrounding context in which the statements were made.

Moreover, Facebook commented that in cases where it must make an emergency decision with widespread interest, it endeavors to share its decision and reasoning with the public, often through a post in its Newsroom. Facebook stated that it usually does not block the ability of pages to post or interact with content but removes pages that severely or repeatedly violate Facebook’s policies. In the immediate case, Facebook noted that, in line with its standard enforcement protocols, it initially imposed a 24-hour block on the ability to post from the Facebook and Instagram accounts. However, after further evaluating the evolving situation and emerging details of the violence at the Capitol, it concluded that the 24-hour ban was insufficient to address “the risk that Trump would use his Facebook and Instagram presence to contribute to a risk of further violence” [p. 20].

Facebook remarked that it maintained the indefinite suspension after Mr. Biden’s inauguration partly because of the National Terrorism Advisory System Bulletin issued on January 27 by the Department of Homeland Security, which indicated that the threat of violence connected to Mr. Trump had not passed. Facebook also stated that its decision was informed by Article 19 of the ICCPR and the U.N. Human Rights Committee’s General Comment No. 34 on freedom of expression, which permit necessary and proportionate restrictions on freedom of expression in situations of public emergency threatening the life of the nation.

Finally, Facebook argued that the events of January 6 represented an unprecedented threat to the democratic processes and constitutional system of the United States. It asserted that, while it strove to act proportionately and accountably in curtailing public speech, the unprecedented and volatile circumstances meant it should retain operational flexibility to take further action, including a permanent ban.

Following Facebook’s statement, the Board asked the company 46 questions; Facebook declined to answer seven entirely and two partially. The unanswered questions included how Facebook’s features impacted the visibility of Mr. Trump’s content and requests for information about violating content from followers of Mr. Trump’s accounts. The Board also asked questions about the suspension of other political figures and the removal of other content. Facebook stated that such information was not reasonably required for decision-making under the intent of the Charter, was not technically feasible to provide, or could not be shared because of legal, privacy, safety, or data protection concerns.

The Board explained that it would base its analysis of the case on three standards: Facebook’s content policies (specifically the Community Standard on Dangerous Individuals and Organizations, the Community Standard on Violence and Incitement, Instagram’s Community Guidelines, and Facebook’s Terms of Service and Instagram’s Terms of Use); Facebook’s values of “Voice,” “Safety,” and “Dignity”; and international human rights standards as applicable to Facebook.

Compliance with content policies
According to the Board, Facebook’s Community Standard on Dangerous Individuals and Organizations provides that users should not post content “expressing support or praise for groups, leaders, or individuals involved in” violating events. In the Board’s opinion, both posts praised or supported people engaged in violence. It found that the two posts severely violated Facebook’s policies, since the user praised and supported people involved in a continuing riot in which people died, lawmakers were put at serious risk of harm, and a key democratic process was disrupted. Thus, considering the continuing threat of violence and turmoil, the Board deemed that Facebook’s decisions to restrict Mr. Trump’s access to Facebook and Instagram on January 6, and to extend those restrictions on January 7, were necessary and proportionate. However, the Board stressed that the company’s decision to make the restrictions “indefinite” found no support in the Community Standards and was inconsistent with principles of freedom of expression.

The Board noted that limited public information was available on Facebook’s cross-check system and newsworthiness allowance. Although Facebook stated that the same rules apply to high-profile and regular accounts, the Board believed that different enforcement processes might lead to different substantive outcomes. It then observed that, although Facebook told the Board it had not applied the newsworthiness allowance to the posts at issue in this case, the lack of transparency around these decision-making processes appeared to contribute to perceptions that political or commercial considerations may unduly influence the company. As a result, the Board called on Facebook to address widespread confusion about decisions relating to influential users. Further, it stressed that considerations of newsworthiness should not take priority when urgent action is needed to prevent significant harm.

Compliance with Facebook’s values
The majority of the Board considered that the application of the Facebook Community Standards was consistent with the company’s values of “Voice” and “Safety.” Further, the Board held that, in this case, the protection of public order justified limiting freedom of expression.
Compliance with Facebook’s human rights responsibilities

The Board clarified that its “decisions do not concern the human rights obligations of states or application of national laws, but focus on Facebook’s content policies, values, and human rights responsibilities as a business” [p. 25].

The Board then explained that Article 19 of the International Covenant on Civil and Political Rights (ICCPR) sets out the right to freedom of expression and that political speech receives particularly high protection because of its importance to democratic debate. Further, the Board noted that the U.N. Human Rights Committee’s authoritative guidance on Article 19 ICCPR, General Comment No. 34, holds that “free communication of information and ideas about public and political issues between citizens, candidates and elected representatives is essential” [p. 26].

In this case, the Board recognized that Facebook’s decision to suspend Mr. Trump’s Facebook page and the Instagram account had “freedom of expression implications not only for Mr. Trump but also for the rights of people to hear from political leaders, whether they support them or not.” [p. 26]. The Board emphasized that while political figures do not have a greater right to freedom of expression than other people, “restricting their speech can harm the rights of other people to be informed and participate in political affairs” [p. 26]. Yet, the Board noted that international human rights standards, mainly through the Rabat Plan of Action, “expect state actors to condemn violence and to provide accurate information to the public on matters of public interest, while also correcting misinformation” [p. 26].

The Board underscored that international law allows for expression to be limited when the restrictions meet three conditions. First, rules must be clear and accessible; second, they must be designed for a legitimate aim; and third, they must be necessary and proportionate to the risk of harm. The Board then employed this three-part test to analyze Facebook’s actions in this case.

I. Legality (clarity and accessibility of the rules)
The Board remarked that the principle of legality requires that any rule used to limit expression is clear and accessible. In this case, the Board held that these rules were Facebook’s Community Standards and Instagram’s Community Guidelines. For the Board, the aim of such policies is, on the one hand, to establish what people cannot post and, on the other, when the company can restrict access to Facebook and Instagram accounts.

Concerning the Standard against praise and support of Dangerous Individuals and Organizations, the Board observed, as it had in a prior decision (case 2020-005-FB-UA), that its clarity left much to be desired. Moreover, the Board recalled its decision in case 2020-003-FB-UA to note that “there may be times in which certain wording may raise legality concerns, but as applied to a particular case, those concerns are not warranted” [p. 27]. It held that any vagueness in the Standard did not undermine its application to the circumstances of this case.

The Board considered that the January 6 riot at the Capitol “fell squarely within the types of harmful events set out in Facebook’s policy, and Mr. Trump’s posts praised and supported those involved at the very time the violence was going on, and while Members of Congress were calling on him for help” [p. 27]. In its view, Facebook’s policies provided adequate notice to the user and guidance to those enforcing the rule.

Regarding the penalties for violations, the Board highlighted that the Community Standards and related information about account restrictions were scattered across various sources. Referring to its previous decision in case 2020-006-FB-FBR, it reiterated that “the patchwork of applicable rules makes it difficult for users to understand why and when Facebook restricts accounts and raises legality concerns” [p. 27].

While the Board deemed that the Dangerous Individuals and Organizations Standard was clear under the circumstances surrounding the immediate case, it considered Facebook’s imposition of an indefinite restriction vague and uncertain. In the Board’s view, indefinite restrictions were not described in the Community Standards. The Board considered it unclear what standards would trigger this penalty or what criteria should be employed to maintain or remove it. Moreover, the Board held that Facebook failed to provide information on prior impositions of indefinite suspensions in other cases. It recognized the necessity of some discretion on Facebook to suspend accounts in urgent situations like the present one, yet considered that users could not be left uncertain for an indefinite time. Thus, the Board rejected Facebook’s request for it to endorse indefinite restrictions imposed and lifted without clear criteria. In addition, it stressed that “appropriate limits on discretionary powers are crucial to distinguish the legitimate use of discretion from possible scenarios around the world in which Facebook may unduly silence speech not linked to harm or delay action critical to protecting people” [p. 28].

II. Legitimate aim
The Board then recalled that the second part of the test requires that any measure restricting expression pursue a legitimate aim. In the Board’s opinion, Facebook’s policy against praising and supporting individuals involved in “violating events,” violence, or criminal activity was in line with the aims listed in the U.N. General Comment No. 34.

III. Necessity and proportionality
Lastly, to define the requirement of necessity and proportionality, the Board referred yet again to the U.N. General Comment No. 34 to indicate that any restriction on expression must, among other things, be the least intrusive way to achieve a legitimate aim. Regarding this provision, the Board believed that, where possible, “Facebook should use less restrictive measures to address potentially harmful speech and protect the rights of others before resorting to content removal and account restriction. At a minimum, this would mean developing effective mechanisms to avoid amplifying speech that poses risks of imminent violence, discrimination, or other lawless action, where possible and proportionate, rather than banning the speech outright” [p. 29].

In its submission, Facebook had stated to the Board that it considered Mr. Trump’s “repeated use of Facebook and other platforms to undermine confidence in the integrity of the election (necessitating repeated application by Facebook of authoritative labels correcting the misinformation) represented an extraordinary abuse of the platform” [p. 29]. However, when the Board sought clarification about the extent to which the platform’s design decisions had amplified Mr. Trump’s posts after the election, the company declined to answer, including whether it had conducted any internal analysis of whether those design decisions may have contributed to the events of January 6. In the Board’s opinion, the lack of response made it difficult to assess whether less severe measures, taken earlier, could have been sufficient to protect the rights of others.

For the Board, the crucial question was whether Facebook’s decision to restrict access to Mr. Trump’s accounts on January 6 and 7 was necessary and proportionate to protect the rights of others. To understand the risk posed by the January 6 posts, the Board assessed Mr. Trump’s Facebook and Instagram posts and off-platform comments since the November election. It noted that, by maintaining an unfounded narrative of electoral fraud and making persistent calls to action, Mr. Trump had created an environment in which a serious risk of violence was possible. The Board asserted, “On January 6, Mr. Trump’s words of support to those involved in the riot legitimized their violent actions” [p. 29]. Furthermore, it held that “Although the messages included a seemingly perfunctory call for people to act peacefully, this was insufficient to defuse the tensions and remove the risk of harm that his supporting statements contributed to” [p. 30].

In light of the above, the Board considered that, given the escalating tensions in the United States and Mr. Trump’s statements in other media and at public events, Facebook’s interpretation of his January 6 posts and the resulting restrictions were necessary and proportionate.

As part of its analysis, the Board drew upon the six factors from the Rabat Plan of Action to assess the capacity of speech to create a severe risk of inciting discrimination, violence, or other lawless action.

I. Context. The Board considered that Mr. Trump’s posts were published during a time of high political tension centered on the unfounded claim that the opposition had stolen the November 2020 presidential election. Even though courts had consistently rejected the Trump campaign’s claims on this matter, the then-president continued to assert them on social media, using his authoritative status as head of state to lend them credibility. Further, Mr. Trump urged his supporters to attend the January 6 “StoptheSteal” rally, suggesting it would be “wild.” On the day of the rally, he urged attendees to march to the Capitol building to challenge the counting of the electoral votes. According to the Board, “at the time of the posts, severe violence was continuing. The situation remained volatile when the restrictions were extended on January 7” [p. 30]. Lastly, it pointed out that the District of Columbia had taken steps to warn of a heightened risk of violence surrounding the events at the Capitol.

II. Status of the speaker. The Board noted that as president, Mr. Trump had credibility and authority with members of the public, which contributed to the events of January 6. In the Board’s words, “Mr. Trump’s status as head of state with a high position of trust not only imbued his words with greater force and credibility but also created risks that his followers would understand they could act with impunity” [p. 30].

III. Intent. The Board held it was not able to assess Mr. Trump’s intentions. Yet it considered that the possibility of violence linked to Mr. Trump’s statements was clear. He likely knew or should have known that his publications could pose a risk of legitimizing or promoting violence.

IV. Content and form. Even though the two posts called on the rioters to go home peacefully, the Board considered that they likewise praised and supported their actions. Further, it held that the posts reiterated the unfounded claim that the election was stolen, which some rioters understood as legitimizing their activities. In the Board’s opinion, the evidence showed that “Mr. Trump used the communicative authority of the presidency in support of attackers on the Capitol and an attempt to prevent the lawful counting of electoral votes” [p. 31].

V. Extent and reach. The Board highlighted that Mr. Trump had a large audience, with at least 35 million followers on Facebook and at least 24 million on Instagram. It also stated that the ex-president’s social media posts were frequently picked up and shared more broadly through mass media channels and by his high-profile supporters with large audiences, significantly increasing their reach.

VI. Imminence of harm. Since the posts were published “during a dynamic and fluid period of ongoing violence” [p. 27], the Board believed there was a clear, immediate risk of harm to life, electoral integrity, and political participation. In the Board’s view, the violence at the Capitol started within an hour of the rally organized via Facebook and other social media. The Board held that “even as Mr. Trump was posting, the rioters were rampaging through the halls of Congress and Members of Congress were expressing fear by calling on the White House and pleading for the president to calm the situation” [p. 31]. The Board commented that the riot directly interfered with Congress’s ability to discharge its constitutional responsibility of counting electoral votes, delaying this process by several hours.

After analyzing these factors, the Board concluded that the violation in this case was severe in terms of its human rights harms, which justified Facebook’s decision to impose restrictions on Mr. Trump’s accounts on January 6. Further, it held that, given the seriousness of the violations and the ongoing risk of violence, Facebook was justified in imposing account-level restrictions and extending them on January 7.

However, the Board deemed it inappropriate for Facebook to impose an indefinite suspension. The Board found that it was not permissible for Facebook to keep a user off the platform for an undefined period, with no criteria for when or whether the company will restore the account. Furthermore, it emphasized “Facebook’s role in creating and communicating necessary and proportionate penalties that it applies in response to severe violations of its content policies” [p. 33]. At the same time, it recalled that the Board’s role “is to ensure that Facebook’s rules and processes are consistent with its content policies, values, and commitment to respect human rights” [p. 33].

Therefore, the Board held that, in applying an indeterminate and standardless penalty and then referring the case to the Board to resolve, Facebook sought to avoid its responsibilities. The Board declined the company’s request and insisted that Facebook apply and justify a defined penalty.
Additionally, the Board required Facebook, within six months of the decision, to reexamine the arbitrary penalty it imposed on January 7 and determine an appropriate penalty. It clarified that such a penalty must be based on the gravity of the violation and the prospect of future harm, and must be consistent with Facebook’s rules for severe violations, which must, in turn, be transparent, necessary, and proportionate.

Finally, the Board deemed that if Facebook determined that Mr. Trump’s accounts should be restored, the company should apply its rules to that decision, address any further violations promptly, and follow its established content policies.

Policy advisory statement:

The Board limited its guidance to issues of public safety. In its policy advisory statement, the Board made several recommendations to guide Facebook’s policies regarding serious risks of harm posed by political leaders and other influential figures. The Board stated that it is not always practical to draw a firm distinction between political leaders and other influential users, recognizing that other users with large audiences can also contribute to serious risks of harm.
While the same rules should apply to all users, the Board explained that context matters when assessing the probability and imminence of harm. When posts by influential users pose a high likelihood of imminent harm, Facebook should act quickly to enforce its rules. Although Facebook explained that it did not apply its newsworthiness allowance in this case, the Board called on Facebook to address widespread confusion about how decisions relating to influential users are made. The Board stressed that considerations of newsworthiness should not take priority when urgent action is needed to prevent significant harm.

In the Board’s view, Facebook should publicly explain the rules it applies when it imposes account-level sanctions against influential users. Likewise, the Board suggested that those rules ensure that, when Facebook imposes a time-limited suspension on the account of an influential user to reduce the risk of significant harm, it will assess whether the risk has receded before the suspension ends.
The Board noted that heads of state and other high government officials can have greater power to cause harm than other people. It also indicated that if a head of state or high government official repeatedly posts messages that pose a risk of harm under international human rights norms, Facebook should suspend the account for a period sufficient to protect against imminent harm. Suspension periods should be long enough to deter misconduct and may, in appropriate cases, include account or page deletion.

Dissenting or Concurring Opinions:

A minority of the Board believed it was necessary to outline some minimum criteria that reflect the Board’s assessment of Facebook’s human rights responsibilities.

While the majority decided to provide this guidance as a policy recommendation, the minority noted that Facebook’s responsibility to respect human rights includes facilitating remediation of the adverse human rights impacts to which it has contributed. They likewise considered that, to fulfill its responsibility to ensure that the adverse effects are not repeated, Facebook should have assessed whether reinstating Mr. Trump’s accounts posed a serious risk of inciting imminent discrimination, violence, or other lawless action.

Furthermore, the minority considered Facebook’s enforcement procedures to be rehabilitative in aim and believed that this aim corresponds with the principle of satisfaction in human rights law. Thus, they emphasized that Facebook’s rules should ensure that users who seek reinstatement after a suspension recognize their wrongdoing and commit to observing the rules in the future. In this case, the minority suggested that, before Facebook restores Mr. Trump’s accounts, it should aim to ensure the withdrawal of his praise or support for those involved in the riots.


Decision Direction

Quick Info

Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Mixed Outcome

The Board’s decision expands expression by holding that it was not appropriate for Facebook to impose an indefinite suspension of Mr. Trump’s accounts, particularly one imposed and lifted without clear standards. The Board considered that reasonable limits on discretionary powers are crucial to distinguish the legitimate use of discretion from possible scenarios around the world in which Facebook may unduly silence speech not linked to harm or delay action critical to protecting people.

At the same time, while Facebook restricted expression by imposing restrictions on Mr. Trump, it did so under a justified and recognized limitation on freedom of expression. The Board considered that the posts violated the company’s content policies and international human rights standards on freedom of expression. Specifically, the Board employed the three-part test established under Article 19 of the ICCPR, which allows expression to be limited only when three requirements are met. Finding that Facebook’s actions fulfilled these criteria, the Board determined that Facebook was justified in imposing account-level restrictions.

Global Perspective

Quick Info

Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or regional laws

  • ICCPR, art. 2

    The Board marginally referred to this Article to highlight Facebook’s human rights responsibilities as a business.

  • ICCPR, art. 6

    The Board marginally referred to this Article to highlight Facebook’s human rights responsibilities as a business.

  • ICCPR, art. 9

    The Board marginally referred to this Article to highlight Facebook’s human rights responsibilities as a business.

  • ICCPR, art. 19

    The Board analyzed Facebook’s human rights responsibilities through this precept on freedom of expression and employed the three-part test established in this Article to assess if Facebook’s actions allowed expression to be limited; the Board referred to the General Comment for guidance.

  • ICCPR, art. 20

    The Board marginally referred to this Article to highlight Facebook’s human rights responsibilities as a business.

  • ICCPR, art. 25

    The Board marginally referred to this Article to highlight Facebook’s human rights responsibilities as a business.

  • ICERD, art. 2

The minority of the Board referred to the ICERD to emphasize Facebook’s commitment to respect the right to non-discrimination, in line with the requirements for restrictions on the right to freedom of expression, and to prevent the use of its platforms for advocacy of racial or national hatred constituting incitement to hostility, discrimination, or violence.

  • UNGPs, principle 11

    The Board marginally referred to this principle to highlight Facebook’s human rights responsibilities as a business.

  • UNGPs, principle 13

    The Board marginally referred to this principle to highlight Facebook’s human rights responsibilities as a business.

  • UNGPs, principle 15

    The Board marginally referred to this principle to highlight Facebook’s human rights responsibilities as a business.

  • UNGPs, principle 18

    The Board marginally referred to this principle to highlight Facebook’s human rights responsibilities as a business.

  • UNGPs, principle 19

    The Board marginally referred to this principle to highlight Facebook’s human rights responsibilities as a business.

  • UNGPs, principle 22

The minority noted that Facebook’s responsibilities to respect human rights include facilitating the remediation of adverse human rights impacts to which it has contributed, as stated in this principle.

  • UNHR Comm., General Comment No. 31 (2004)

    The Board referred to the interpretation of the General Comment to analyze the right to remedy.

  • UNHR Comm., General Comment No. 34 (CCPR/C/GC/34)

    While employing the three-part test to assess if Facebook’s actions allowed expression to be limited, the Board referred to the General Comment for guidance.

  • OHCHR, Rabat Plan of Action on the prohibition of advocacy of national, racial or religious hatred that constitutes incitement to discrimination, hostility or violence (2011).

    As part of its analysis, the Board drew upon the six factors from the Plan of Action to assess the capacity of speech to create a serious risk of inciting discrimination, violence, or other lawless action.

  • Rapporteur on freedom of opinion and expression, A/HRC/38/35 (2018)

The Board referenced the report to underscore that the Rapporteur on Freedom of Expression had raised concerns about the vagueness of Facebook’s Dangerous Individuals and Organizations Community Standard.

  • OHCHR, Joint Statement of international freedom of expression monitors on COVID-19 (March, 2020)

The Board referred marginally to the Joint Statement to highlight that international human rights standards expect state actors to provide accurate information to the public on matters of public interest while also correcting misinformation.

General Law Notes

  • Nazi quote (2020-005-FB-UA)
    • By referring to this decision, the Board emphasized that the clarity of the Standard against praise and support of Dangerous Individuals and Organizations left much to be desired.
  • Armenians in Azerbaijan (2020-003-FB-UA)
The Board recalled that, as noted in that decision, there may be times in which certain wording may raise legality concerns but, as applied to a particular case, those concerns are not warranted.
  • Claimed COVID cure (2020-006-FB-FBR)
    • By referring to this case, the Board reiterated that the patchwork of applicable rules makes it difficult for users to understand why and when Facebook restricts accounts and raises legality concerns.

Case Significance

Quick Info

Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

Article 2 of the Oversight Board Charter states, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The Board’s resolution of each case will be binding, and Facebook (now Meta) will implement it promptly unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with similar context – which the Board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the Board’s decision to that content as well. When a decision includes policy guidance or an advisory policy opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance. It will consider it in the formal policy development process of Facebook (now Meta) and transparently communicate about actions taken.”
