Global Freedom of Expression

Oversight Board Case of COVID Lockdowns in Brazil

Closed

Expands Expression

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    August 19, 2021
  • Outcome
    Agreed with Meta’s initial decision
  • Region & Country
    Brazil, International
  • Judicial Body
    Oversight Board
  • Type of Law
    Meta's content policies, International Human Rights Law
  • Themes
    Facebook Community Standards, Violence and Criminal Behavior, Violence and Incitement
  • Tags
    Oversight Board Policy Advisory Statement, COVID-19


Case Analysis

Case Summary and Outcome

On August 19, 2021, the Oversight Board upheld Facebook’s (now Meta) decision to leave up a post on Facebook by a state-level medical council in Brazil that claimed that COVID-19 lockdowns were ineffective and had been condemned by the World Health Organization (WHO). The Board found that Facebook’s decision to keep the content on the platform was consistent with its Community Standard on Violence and Incitement. The Board found that while the content contained some inaccurate information which raised concerns considering the severity of the pandemic in Brazil and the council’s status as a public institution, it did not create a risk of imminent harm.

*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.


In March 2021, the Facebook page of a state-level medical council in Brazil posted a picture of a written notice in Portuguese entitled “Public note against lockdown.” The publication claimed that the COVID-19 lockdowns were ineffective, against the fundamental rights in the Constitution, and condemned by the WHO. It also included an alleged quote from Dr. David Nabarro, one of the WHO’s special envoys for COVID-19, stating that “the lockdown does not save lives and makes poor people much poorer” [p. 4]. 

The notice also claimed that the Brazilian state of Amazonas had an upsurge in the number of deaths and hospital admissions after lockdown as proof of the failure of lockdown restrictions. Additionally, it asserted that lockdowns would lead to increased mental disorders, alcohol and drug abuse, and economic damage, amongst other things. It concluded that effective preventive measures against COVID-19 included education campaigns about hygiene measures, the use of masks, social distancing, vaccination, and extensive monitoring by the government, but never the decision to adopt lockdowns. 

The page had more than 10,000 followers, and the content was viewed around 32,000 times and shared around 270 times. No users reported the content. While the content remained on the platform, Facebook referred the case to the Board. 

Decision Overview

The main issue before the Board was whether Facebook’s decision to leave up the state-level medical council’s post, which claimed that lockdowns were ineffective and had been condemned by the WHO, was consistent with the company’s Community Standard on Violence and Incitement, its values, and its human rights responsibilities. 

Facebook confirmed to the Board that it had sent the user a notification that the case had been referred to the Board and provided the user the opportunity to submit information on the issue. The user, however, did not submit a statement. While the Board noted that the notification sent by Facebook allowed the user to submit information, it expressed concern that the company did not give the user enough information to prepare a meaningful statement. It further remarked that the notifications Facebook showed the user stated the general topics the case related to but did not explain why the content was referred to the Board or which policies the content might be enforced against.

In its statement, Facebook noted that it took no action against the content and stated the case was challenging because the “content does not violate Facebook’s policies, but could still be read by some people as advocacy for taking certain safety measures during the pandemic” [p. 9]. It explained that an internal team at the company familiar with the region noted reports from the press about the case content and flagged the case for review. However, the reviewers determined that the content did not violate Facebook’s policies. Moreover, the company claimed that the content did not meet the standard of risk of imminent violence or physical harm. 

Compliance with Community Standards

The Board noted that the information published in the post referring to the inefficiency of lockdowns and the WHO’s alleged condemnation of such measures was not entirely accurate. Specifically, the Board referred to a fragment of the quote included in the post from Dr. David Nabarro that read: “lockdown does not save lives”. It explained that while Dr. Nabarro had stated that the WHO did “not advocate lockdowns as a primary means of control of this virus” and that they have the consequence of “making poor people an awful lot poorer,” he did not declare that lockdowns do not save lives. The Board nonetheless agreed with Facebook that the threshold of “imminent harm” had not been met: the WHO and “other health experts” had advised the company to “remove claims advocating against specific health practices, such as social distancing” [p. 11], but not statements advocating against lockdowns.

The Board believed that the company should have considered the local context and the situation resulting from the pandemic in Brazil when evaluating the risk of imminent physical harm. In the same vein, the Board voiced its concern regarding the spread of COVID-19 misinformation in the country and its possible consequences, such as how eroding people’s trust in public information about appropriate measures to counter the pandemic could increase the risk of users adopting risky behaviors. The Board acknowledged that this would justify a more nuanced approach by Facebook in the country, intensifying its efforts to counter misinformation. Yet, it found that the post in question did not meet the threshold of imminent harm because it discussed a measure that public health authorities had not recommended unconditionally and emphasized the importance of other measures to counter the spread of COVID-19, including social distancing. 

Compliance with Facebook’s values

Regarding Facebook’s values, the Board found that the platform’s decision to take no action against the content in question was consistent with its value of “Voice”. The content did not pose a sufficiently imminent danger to the value of “Safety” to justify displacing “Voice”.

Compliance with Facebook’s human rights responsibilities

Concerning Facebook’s human rights responsibilities under international standards, the Board highlighted that Article 19 of the International Covenant on Civil and Political Rights (ICCPR) provides broad protection for expression of all kinds. The Board noted that even though medical councils do not have the authority to impose measures such as lockdowns, they are part of the state government administration and could exert influence over the authorities deciding whether to adopt measures to counter the spread of COVID-19. The Board held that since the post in question was shared by the Facebook page of a medical council in Brazil, there was a general increased interest in its views as an institution on public health issues. 

The Board proceeded to apply the three-part test set out in Article 19 to assess whether removing the post would be justified under Facebook’s human rights responsibilities. 

I. Legality (clarity and accessibility of the rules)

The Board found that the rules on applying the Violence and Incitement policy to health misinformation were not sufficiently accessible to the public. While it recognized that the Help Center article provided useful information for users to understand how the policy was enforced, the Board noted that it contained several sources of rules outside the Community Standards that were not “made accessible to the public”. 

II. Legitimate aim

The Board then briefly analyzed whether the restriction on freedom of expression pursued a “legitimate aim”. In this regard, the Board held that the ICCPR lists legitimate aims in Article 19, para. 3, which include the protection of the rights of others as well as the protection of public health. 

III. Necessity and proportionality

Finally, the Board assessed whether content removal was necessary to protect public health and the right to health, in line with Facebook’s human rights responsibilities. In the Board’s view, since the content was shared by the page of a body within the state government administration, it could influence other public authorities and the behavior of the general public. The Board noted that it was relevant for Facebook to consider whether a page or account was administered by a public institution “because those institutions should not make, sponsor, encourage or further disseminate statements which they know or reasonably should know to be false or which demonstrate a reckless disregard for verifiable information” [p. 15]. 

The Board remarked that public authorities should verify the information they provide to the public. Moreover, it noted that such duty is not lost when disseminated false information is not directly related to its statutory responsibilities. 

Furthermore, the Board remarked that the content was not used by the council as a basis for adopting public health measures that could create risks, since the council did not have the authority to decide these matters. The Board recalled its case decision 2020-006-FB-FBR, where it recommended that Facebook consider less intrusive measures than removal for misinformation that may lead to forms of physical harm that are not imminent. In the immediate case, the Board deemed Facebook’s decision to keep the content on the platform justified, given that the threshold of imminent physical harm was not met. Nevertheless, the Board underscored that disseminating misinformation on public health could undermine trust in public information and the effectiveness of specific measures that may be essential in certain contexts. In the Board’s view, Facebook should have provided the public with more context about the statements of Dr. Nabarro and the WHO’s stance on lockdowns. 

Policy advisory statement:  

The Board recommended that Facebook implement the Board’s recommendation from case decision 2020-006-FB-FBR by undertaking an analysis to identify measures less intrusive than removing content. The least intrusive measure should be used where content related to COVID-19 distorts the advice of international health authorities and where a potential for physical harm is identified but is not imminent. Additionally, the Board urged the platform to prioritize fact-checking of content flagged as health misinformation from public authorities, considering the local context. Finally, the Board recommended that Facebook provide more transparency within the False News Community Standard regarding when content is eligible for fact-checking, including whether public institutions’ accounts are subject to fact-checking.

Decision Direction


Expands Expression

The Oversight Board’s decision expands expression by establishing that although the analyzed content contained some inaccurate information, it did not create a risk of imminent harm. The Board’s analysis of alternative measures to content removal strengthens expression while protecting other rights that may come into tension, since it favors addressing distorted health-related information without resorting to censorship. 

Global Perspective


Table of Authorities

Related International and/or regional laws

  • ICCPR, art. 6

    The Board referred to this Article to highlight Facebook’s human rights responsibilities as a business regarding the right to life.

  • ICCPR, art. 19

    The Board analyzed Facebook’s human rights responsibilities through this precept on freedom of expression. It employed the three-part test established in this Article to assess if Facebook’s actions allowed expression to be limited.

  • ICESCR, art. 12

    The Board referred to this article to highlight that everyone has the right to the enjoyment of the highest attainable standard of physical and mental health.

  • UN Human Rights Committee, General Comment No. 34 (CCPR/C/GC/34)

    While employing the three-part test to assess if Facebook’s actions allowed expression to be limited, the Board referred to the General Comment for guidance.

General Law Notes

Oversight Board decisions:

  • Claimed COVID cure (2020-006-FB-FBR)
    • By referring to this case, the Board noted that it had found it difficult for users to understand what content relating to health misinformation is prohibited under Facebook’s Community Standards, considering the “patchwork” of relevant rules.

Case Significance


The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes: “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”
