Global Freedom of Expression

Oversight Board Case of a Brazilian General’s Speech

Closed · Mixed Outcome

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    June 22, 2023
  • Outcome
    Overturned Meta’s initial decision
  • Case Number
    2023-001-FB-UA
  • Region & Country
    Brazil, Latin America and Caribbean
  • Judicial Body
    Oversight Board
  • Type of Law
    Meta's content policies
  • Themes
    Facebook Community Standards, Violence and Criminal Behavior, Violence and Incitement
  • Tags
    Oversight Board Content Policy Recommendation, Incitement, Political speech, Facebook, Social Media


Case Analysis

Case Summary and Outcome

The Oversight Board overturned Meta’s original decision to leave up a Facebook video that incited violence by calling for the storming of government buildings to protest the results of the 2022 Brazilian Presidential election. Amid the post-election political unrest, a Facebook user reported the content, but Meta concluded that it did not violate its guidelines. After evaluating the matter, the Board held that Meta had breached its own Violence and Incitement policy and its human rights responsibilities. The Board also issued two recommendations to help the company prevent and address similar events in the future.

*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.


Facts

In October 2022, Brazil held its Presidential election and Luiz Inácio Lula da Silva (Lula) defeated incumbent President Jair Bolsonaro with 50.9% of the votes. That same month, just after the electoral results, a video surfaced online calling on people to besiege and raid government buildings located in the Three Powers Plaza in Brasília. The video also featured a prominent Brazilian general supporting Bolsonaro’s re-election and encouraging people to take control of the National Congress and the Supreme Court.

On December 12, Lula’s victory was confirmed, which led protesters supporting Bolsonaro to attack police departments and attempt to bomb locations near Brasília’s airport.

The election period had been marked by a “heightened risk of political violence.” Meta had beforehand designated Brazil a “temporary high-risk location” and had decided to remove any content encouraging people to invade government buildings.

On January 1, 2023, Lula was sworn in as the new President. The risk of violence, however, did not decrease with his inauguration, and civil unrest, protests, and encampments took place, predominantly in front of military bases. On January 3, a user posted the original video on Facebook, where it “was played over 18,000 times and was not shared” [p. 6].

That day, a user reported the content for violating Meta’s Violence and Incitement Community Standard. A moderator reviewed the complaint and concluded that the video did not violate Meta’s guidelines. The user filed an appeal, but a second content reviewer upheld the original decision. By January 4, four other users had reported the same content six times. Five different moderators, all with the linguistic and cultural expertise to analyze Brazilian content, reviewed the complaints and concluded that the content did not violate the guidelines.

On January 8, supporters of Bolsonaro broke into the National Congress, the Supreme Court, and the presidential offices. They used violence to intimidate individuals and destroyed property. Around 1,400 people were arrested. On January 9, Meta declared the actions of the previous day a “violating event under its Dangerous Individuals and Organizations policy” and decided to remove any content supporting those actions.

As a result of the above, the Oversight Board selected this case for review.


Decision Overview

The main issue before the Board was whether Meta’s decision to leave up the content, a Facebook video that incited violence by calling for the storming of government buildings to protest the 2022 Brazilian Presidential election results, was consistent with the company’s Violence and Incitement Community Standard and its human rights responsibilities.

In his appeal to the Board, the user who reported the content stated that Meta’s decisions when reviewing complaints are “always the same, that it doesn’t violate the Community Standards.” [p. 11] According to him, it was people “who do not accept the results of elections” [p. 11] who acted on the incitement found in the video.

In its referral of the case to the Board, Meta affirmed that its platforms are “important places for political discourse, especially around elections.” [p. 11] The company also noted that during political crises it is common for people to express dissent through calls for violence that are, in most cases, not serious. To remove content from its platforms, therefore, Meta has to be certain that “there is a genuine risk of physical harm or direct threats to public safety.” [p. 10]

Additionally, Meta explained that it had established “risk evaluation and mitigation measures” to address election-related matters. As part of its pre-election preparation, Meta worked with Brazil’s Superior Electoral Court to ensure that only reliable information reached the public, which led to an increase in reports of violating content. Meta also raised awareness of its Community Standards, particularly regarding misinformation, prohibited paid advertising on its platforms, and limited message forwarding on WhatsApp, which helped curb the spread of information. As part of its post-election preparation, Meta’s teams categorized content into groups to aid policy mitigation and, under the company’s Community Standards, directed users to “authoritative information about the Brazil elections.” [p. 18]

Meta noted that its Election Operations Centre team covered Brazil’s elections only from September to November 2022, so it could not help track violating content at the time the case content was posted on Facebook. However, Meta designated the post-election unrest as “violating events” under its Crisis Policy Protocol, which “help[ed] the company assess how best to mitigate content risks.” [p. 13] By adopting risk evaluation measures from the beginning of the elections, Meta explained, it had acknowledged the heightened risk of violence in Brazil both before and after the video was posted.

Despite these arguments, Meta admitted that its repeated decisions not to remove the Facebook video were an error and acknowledged that the content entailed a heightened risk of offline harm. Even if the calls to “besiege” government buildings alone did not constitute a serious call for violence, the company observed that the combination of the Brazilian general’s speech and the sequence of images in the video made the incitement to violence explicit.

As for the implementation of its risk mitigation measures, Meta admitted to a “possible shortcoming.” According to the company’s internal teams, the content reviewers did not detect any violation of Meta’s guidelines because they “may have misunderstood the user’s intent,” “made a wrong decision despite the correct guidelines being in place,” or “may not have seen the violation in the video.” [p. 17] However, Meta did not explain why the video was not escalated to subject matter experts for deeper analysis at such a critical period.

Following Meta’s statements, the Board asked the company fifteen questions, and Meta answered all but two. The first unanswered question concerned the “relationship between political advertising and misinformation.” [p. 13] The second concerned “the number of removals of Pages and accounts while the Election Operation Centre for the 2022 Brazil elections was in place,” [p. 13] to which the company only responded that it had already publicly shared the number of content takedowns. Meta considered that compiling all the information for the Board would take time and resources. The company also added that it did not have “prevalence data” on claims made during the election period to share with the Board, because its “enforcement systems are set up to monitor and track based on the policies that they violate.” [p. 13]

To determine whether the content in question complied with Meta’s content policies and standards, the Board focused its analysis on content rules, enforcement action and transparency.

Compliance with Meta’s Content Policies

  • Content rules

Regarding content rules, the Board noted that Meta’s Community Standard on Violence and Incitement prohibits posting on Meta’s platforms “statements of intent or advocacy, calls to action, or aspirational or conditional statements to forcibly enter locations where there are temporary signals of a heightened risk of violence.” [p. 15]

According to the Board, the two “high-risk” designations that must exist simultaneously for there to be a violation of Meta’s policy were both met. It noted that Meta had designated Brazil a temporary high-risk location from September 1, 2022, until February 22, 2023, based on the company’s “assessment of increased risk of violence associated with ongoing civil and election-related unrest.” [p. 16] The company, the Board noted, had additionally designated the Three Powers Plaza area a high-risk location “by virtue of being places of work or residence of high-risk persons or their families.” [p. 15] The Board considered that “while Meta’s value of ‘Voice’ is particularly relevant in electoral processes,” [p. 15] the company should balance that value against its value of “Safety.”

The Board noted that Meta’s internal teams are informed when a location is designated a temporary high-risk location, which results in content being subject to proactive review “before users report it.” That being so, the Board found that Meta’s decision to keep the content on its platforms was a clear breach of its own policy.

  • Enforcement action

Regarding enforcement actions, the Board expressed concern about Meta’s moderators. Given that content similar to the January 3 video had already been published and had fueled civil unrest in Brazil, the Board questioned how the moderators “repeatedly assessed” the contested content as not violating Meta’s guidelines.

The Board highlighted that, when asked for information on election-related claims, Meta argued that it did not have any “prevalence data” on the matter. The Board also concluded that Meta had not complied with a previous recommendation from the Knin Cartoon case, in which “the Board urged Meta to provide more clarity on how content gets escalated to subject matter experts.” [p. 10]

  • Transparency

On the topic of transparency, the Board acknowledged Meta’s significant efforts to preserve the integrity of Brazil’s elections. It also recognized, as it had previously in the Myanmar Bot case, the company’s efforts to provide platforms for political discourse around elections, and, as in the Former President Trump’s Suspension case, the need to balance content moderation with the protection of freedom of expression and other human rights.

However, the Board noted that Meta never provided it with detailed information on the “metrics measuring the success of [Meta’s] election integrity efforts generally,” [p. 19] and that the company was never transparent about how it conducted its risk evaluation measures.

Consequently, the Board demanded more transparency from Meta in order to assess whether the company’s measures were sufficient, especially in the context of a transition of power. The Board had previously made a recommendation in the Tigray Communication Affairs Bureau case highlighting “Meta’s responsibility to establish a principled and transparent system for moderating content in conflict zones to mitigate the risks of its platforms being used to incite violence.” The Board concluded that Meta had not complied with that recommendation in the immediate case.

Compliance with Meta’s human rights responsibilities

To assess the company’s compliance with its human rights responsibilities and to determine whether Meta’s decision to leave the content up was justified, the Board employed the three-part test set out in Article 19 of the International Covenant on Civil and Political Rights (ICCPR).

  • Legality

The principle of legality requires that restrictions on expression be provided by law. In the immediate case, the Board noted that Meta’s guidelines prohibit any content calling for forcible entry into certain high-risk locations, and that the “exact conditions under which the prohibition is triggered” [p. 21] were clear. This, for the Board, met the accessibility requirement of the legality test. Hence, the Board found that the legality requirement was satisfied.

  • Legitimate aim

To satisfy this part of the test, a restriction must pursue one of the legitimate aims provided for in Article 19. The Board noted that the purpose behind Meta’s guidelines is the protection of national security and public order: under the Violence and Incitement policy, Meta may remove content to prevent potential offline harm and direct threats to public safety. Therefore, the Board found that the legitimate aim requirement was met.

  • Necessity and proportionality

This part of the test requires demonstrating the precise nature of the threat and the necessity and proportionality of the restriction imposed, in particular by establishing a direct and immediate connection between the expression and the threat. The Board relied on the Rabat Plan of Action to explain why removing the disputed content from Facebook was a justified restriction of freedom of expression. It found that the criteria for speech restrictions were all met, since the video clearly revealed “the intent of the speaker, the content of the speech and its reach, as well as the likelihood of imminent harm resulting in the political context of Brazil at that time.” [p. 22] In light of this, the Board stressed that removing the video from Facebook would have been consistent with Meta’s human rights responsibilities.

Accordingly, the Board held that Meta’s initial decision was inconsistent with both the company’s Community Standards and its human rights responsibilities. For these reasons, the Board overturned Meta’s decision to leave the content up.

Recommendations

The Board presented two recommendations regarding Meta’s Community Standards, aimed at improving enforcement accuracy and treating users fairly. Regarding enforcement, the Board recommended that Meta develop a framework for evaluating the company’s election integrity efforts. It also recommended that Meta clarify in its Transparency Centre that, in addition to the Crisis Policy Protocol, the company runs other protocols to prevent and address potential risks of harm arising in electoral contexts or other high-risk events.


Decision Direction

Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Mixed Outcome

Although the Oversight Board’s decision contracted expression by overturning Meta’s decision to leave the content up, it affirmed that users of Meta’s platforms should be able to express their ideas in electoral contexts, but not when that expression constitutes incitement to imminent violence. The Board’s decision took a balanced approach that protected the right to freedom of expression under Article 19 of the ICCPR while ensuring that content inciting violence is promptly taken down to prevent the likelihood of offline harm.

Global Perspective

Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or Regional Laws

  • ICCPR, art. 6

    The Board noted that its analysis of this case was informed by this standard on the right to life.

  • ICCPR, art. 19

    The Board analyzed Meta’s human rights responsibilities through this precept on freedom of expression. It employed the three-part test established in this Article to assess whether Meta’s actions aligned with those responsibilities.

  • ICCPR, art. 20

    The Board analyzed Meta’s human rights responsibilities through this precept prohibiting advocacy of hatred that constitutes incitement to discrimination, hostility, or violence.

  • ICCPR, art. 21

    The Board analyzed Meta’s human rights responsibilities through this precept on the right to peaceful assembly.

  • ICCPR, art. 25

    The Board analyzed Meta’s human rights responsibilities through this precept on the right to participate in public affairs and the right to vote.

  • UNHR Comm., General Comment No. 34 (CCPR/C/GC/34)

    While employing the three-part test to assess whether limiting expression was permissible, the Board referred to this General Comment for guidance.

  • UNHR Comm., General Comment No. 37 (2020)

    The Board referred to this instrument to underscore the heightened level of protection afforded to the legitimate exercise of the people’s rights to freedom of expression and protest.

  • OSB, Former President Trump’s suspension, 2021-001-FB-FBR (2021)

    The Board referred to this case to underscore Meta’s human rights responsibilities in electoral contexts to allow political expression while avoiding serious risks to other human rights.

  • OSB, Myanmar Bot, 2021-007-FB-UA (2021)

    The Board referred to this decision to highlight the protection of political speech during periods of political crisis and the relationship between harmful online content and offline violence.

  • OSB, Knin Cartoon, 2022-001-FB-UA (2022)

    The Board cited this case to note that Meta’s escalation channels are insufficiently clear and effective.

  • OSB, Tigray Communication Affairs Bureau, 2022-006-FB-MR (2022)

    The Board cited this case to highlight “Meta’s responsibility to establish a principled and transparent system for moderating content in conflict zones to mitigate the risks of its platforms being used to incite violence.”

Case Significance

Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”

