Global Freedom of Expression

Oversight Board Case of a Claimed COVID Cure

Case Status: Closed
Decision Direction: Expands Expression

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    January 28, 2021
  • Outcome
    Overturned Meta’s initial decision
  • Case Number
    2020-006-FB-FBR
  • Region & Country
    France, Europe and Central Asia
  • Judicial Body
    Oversight Board
  • Type of Law
    International Human Rights Law, Meta's content policies
  • Themes
    Facebook Community Standards, Violence and Criminal Behavior, Violence and Incitement
  • Tags
    Oversight Board Enforcement Recommendation, Oversight Board Transparency Recommendation, Oversight Board on Meta Interstitials, Oversight Board Policy Advisory Statement, COVID-19


Case Analysis

Case Summary and Outcome

On January 28, 2021, the Oversight Board overturned Facebook’s (now Meta) decision to remove a user’s Facebook post criticizing the Agence Nationale de Sécurité du Médicament for refusing to authorize hydroxychloroquine combined with azithromycin for use against COVID-19. Facebook removed the post because it considered that the content breached the platform’s misinformation and imminent harm rule, part of its Violence and Incitement Community Standard. The Board considered that Facebook had failed to demonstrate how the post contributed to imminent physical harm. The Oversight Board also argued that Facebook’s misinformation and imminent harm rule was too vague, making it “difficult for users to understand what content is prohibited”. According to the Board, Facebook likewise failed to prove that it had chosen the least intrusive measure to balance freedom of expression with the protection of public health.

*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.


Facts

In October 2020, a Facebook user “posted a video and accompanying text in French in a Facebook public group related to COVID-19. The video and text alleged a scandal at the Agence Nationale de Sécurité du Médicament (the French agency responsible for regulating health products) which refused to authorize hydroxychloroquine combined with azithromycin for use against COVID-19, but authorized and promoted remdesivir” [p. 1].

The user claimed that Didier Raoult’s cure was being used in other parts of the world to treat COVID-19. Raoult is a French “professor of microbiology at the Faculty of Medicine of Marseille” who “directs the ‘Institut Hospitalo-Universitaire Méditerranée Infection’ (IHU) in Marseille” [p. 2]. The user’s post questioned the risk to society in letting doctors prescribe a “harmless drug” in an emergency once the first symptoms of COVID-19 appeared.

The user’s post, which was published in a Facebook public group about COVID-19 with more than 500,000 members, “received about 50,000 views, about 800-900 reactions (the majority of which were “angry” followed by “like”), 200-300 comments on the post made by 100-200 different people and was shared by 500-600 people” [p. 2].

After reviewing the post, Facebook removed the content, claiming that it violated the company’s “misinformation and imminent harm rule under its Violence and Incitement Community Standard”. According to Facebook, the decision to remove the content was based on two main reasons: “(1) the post claimed a cure for COVID-19 exists, which is refuted by the World Health Organization (WHO) and other credible health authorities, and (2) leading experts have told Facebook that content claiming that there is a guaranteed cure or treatment for COVID-19 could lead people to ignore preventive health guidance or attempt to self-medicate” [p. 6].

Facebook referred the case to the Oversight Board as an example of the challenges of addressing the risk of offline harm caused by misinformation regarding the COVID-19 pandemic.


Decision Overview

The Oversight Board analyzed whether the user’s post, which questioned the Agence Nationale de Sécurité du Médicament’s refusal to authorize hydroxychloroquine combined with azithromycin to treat COVID-19, violated the misinformation and imminent harm rule under Facebook’s Violence and Incitement Community Standard. Likewise, applying a three-part test, the Board assessed whether Facebook’s decision to remove the content complied with Human Rights Standards on freedom of expression.

Facebook confirmed to the Oversight Board that it had notified the user of the opportunity to file a statement regarding this case; however, the user did not submit one.

In its submission, Facebook argued that it removed the content because it “contributed to the risk of imminent physical harm during a global pandemic” [p. 6]. The company explained that leading experts told Facebook “that content claiming that there is a guaranteed cure or treatment for COVID-19 could lead people to ignore preventive health guidance or attempt to self-medicate” [p. 6], hence the need to limit this type of content.

Compliance with Community Standards

The Board found that Facebook failed to prove how the user’s post “contributed to imminent harm in this case”. According to the Board, an analysis of several contextual factors is needed to decide when misinformation “contributes to Facebook’s own standard of ‘imminent’ harm”. The prohibition of misinformation that contributes to imminent violence or physical harm is contained “within the Community Standard on Violence and Incitement”. The policy rationale on violence and incitement states that “it aims ‘to prevent potential offline harm that may be related to content on Facebook’”.

In the Board’s view, “the status of the speaker and their credibility, the reach of their speech, the precision of the language used, and the availability of the alleged treatment to the vulnerable audience, were all contextual factors that needed to be analyzed in order to conclude whether the post really contributed to imminent harm” [p. 8].  

The Oversight Board noted that the post did not encourage people “to buy or take certain drugs without a medical prescription”; instead, the removed content was “geared towards pressuring a governmental agency to change its policy” [p. 8]. The Board acknowledged that the combination of antibiotics and anti-malarial medicines, allegedly considered to be a cure, could be harmful. However, since the medications in question were “not available without a prescription in France,” the Board questioned the extent of the imminent harm the removed content could allegedly create.

The Board considered that since French authorities had not approved the alleged cure, it was unclear how other users “would be inclined to disregard health precautions for a cure they cannot access”. In light of these facts, the Board opined that, absent an analysis of these contextual factors by Facebook, nothing would “support a finding that this particular post would meet [Facebook’s] own imminent harm standard” [p. 9].

Compliance with Facebook’s Values

The Board explained that Facebook’s Community Standards establish “Voice” as a fundamental value, since the platform aims to “create a place for expression and give people a voice.” Moreover, it highlighted that the value of “Voice” can only be circumscribed when it conflicts with another core value, such as “Safety” in the present case. “Safety” takes precedence over “Voice” when content “that threatens people has the potential to intimidate, exclude or silence others”. The Board ultimately found that the safety concerns were not “sufficient” to restrict “Voice” and remove the post.

Compliance with Human Rights Standards on Freedom of Expression 

When analyzing whether Facebook’s decision to remove the content complied with Human Rights Standards on freedom of expression, the Board cited Article 19 para. 2 of the ICCPR, which “provides broad protection for expression of ‘all kinds’”. The Oversight Board also observed that the UN Human Rights Committee has remarked that “the value of expression is particularly high when discussing matters of public concern”. Likewise, it underscored that the removed post was a critique of government policy regarding medical issues: “The user raises a matter of public concern […]”. The Board argued that the fact that the opinion reflected a minority view did not make it less worthy of protection.

The Board pointed out that, under Article 19, para. 3 of the ICCPR, restrictions to freedom of expression, such as removing online content, are allowed, as long as three conditions are met: legality, legitimacy and necessity. Thus, the Board analyzed Facebook’s decision to remove the user’s post using this three-part test. 

  • Legality

Upon analyzing the legality of the measure issued by Facebook, the Board argued that the misinformation and imminent harm rule was “inappropriately vague”, since the “rule contains no definition of ‘misinformation’”. The Board then cited the UN Special Rapporteur on Freedom of Opinion and Expression to underscore that “vague and highly subjective terms such as ‘unfounded,’ ‘biased,’ ‘false,’ and ‘fake’ do not adequately describe the content that is prohibited” [p. 10]. The Oversight Board argued that vague or ambiguous prohibitions could empower authorities to define what is true and false, which fosters self-censorship.

The Board also noted that over time “Facebook has announced multiple COVID-19 policy changes through its Newsroom without reflecting those changes in the current Community Standards”. For example, on March 25, 2020, Facebook maintained that it would “remove COVID-19 related misinformation that could contribute to imminent physical harm”, thus implying a different threshold than the one laid out by the misinformation and imminent harm rule (“could contribute” vs. “actually contributes”).

Taking into consideration the lack of a definition of key concepts such as misinformation, as well as the differing standards Facebook has laid out regarding imminent harm, the Board deemed the rule inappropriately vague, making it “difficult for users to understand what content is prohibited” [p. 11]. Therefore, the measure issued by Facebook against the user’s post did not meet the legality test.

  • Legitimate Aim

Upon analyzing the legitimacy of Facebook’s decision to remove the post, the Oversight Board considered that the measure pursued a legitimate aim since it sought to protect “public health during a global pandemic”. The protection of public health is listed in Article 19, para. 3 of the ICCPR as a legitimate ground for restricting freedom of expression.

  • Necessity and Proportionality

Regarding the necessity of the measure issued by Facebook against the post, the Board pointed out that Facebook had to demonstrate that it chose the least intrusive measure “to address the legitimate public interest objective”. The Board highlighted that Facebook has a diverse set of tools, beyond content removal, to address false news, such as: “the disruption of economic incentives for people and pages that promote misinformation; the reduction of the distribution of content rated false by independent fact checkers; and the ability to counter misinformation by providing users with additional context and information about a particular post, including through Facebook’s COVID-19 Information Center” [p. 13]. Since Facebook did not explain why it considered that removing content was the least intrusive measure for protecting public health, the Board found that, in this case, Facebook’s measure failed the necessity test. 

In view of these reasons, the Oversight Board overturned “Facebook’s decision to remove the post in question”. 

Policy Advisory Statements

The Board made several Policy Advisory Statements. It urged “Facebook to set out a clear and accessible Community Standard on health misinformation, consolidating and clarifying existing rules in one place (including defining key terms such as misinformation)”. 

The Board also considered that “Facebook should adopt less intrusive enforcement measures for policies on health misinformation,” such as labeling posts with alerts for users, highlighting “the disputed nature of the post’s content and provid[ing] links to the views of the World Health Organization and national health authorities” [p. 14]. Other measures, such as “preventing interactions or sharing, to reduce organic and algorithmically driven amplification” or “downranking content, to prevent visibility in other users’ newsfeeds” [p. 14], could also be considered. Users should be notified when such measures are applied and should be able to appeal them.

Lastly, the Board recommended that Facebook improve “its transparency reporting on health misinformation content moderation” by publishing a report “on how the Community Standards have been enforced during the COVID-19 global health crisis” [p. 15]. 

 


Decision Direction

Quick Info

Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Expands Expression

In its assessment of Facebook’s misinformation and imminent harm rule, and of the platform’s application of it, the Oversight Board provides robust protection of freedom of expression, aligned with international Human Rights Standards, by arguing that restrictions on this right should not be vague, overly broad, or ambiguous. The Board also took into consideration the special protection afforded to discourse on matters of public concern. The Board’s recommendation that Facebook adopt less intrusive content moderation measures also strengthens freedom of expression against censorship.

Global Perspective

Quick Info

Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or regional laws

  • ICCPR, art. 19

    The Board analyzed Facebook’s human rights responsibilities through this provision on freedom of expression and employed the three-part test established in this Article to assess whether Facebook’s measure was a valid restriction on freedom of expression.

  • UNHR Comm., General Comment No. 34 (CCPR/C/GC/34)

    The Board referred to this general comment to underscore that “the value of expression is particularly high when discussing matters of public concern”.

  • United Nations Guiding Principles on Business and Human Rights (2011)

    The Board referred to this instrument to highlight Facebook’s human rights responsibilities as a business.

  • UN Special Rapporteur on freedom of opinion and expression, Research Paper 1/2019 on Elections in the Digital Age (2019)

    The Board referenced the report to underscore that the Rapporteur on Freedom of Expression had raised concerns about vague and subjective terms, such as “biased”, being used in “legislative and regulatory initiatives to restrict ‘fake news’ and disinformation”.

  • UN Special Rapporteur on freedom of opinion and expression, A/74/486 (2019)

    The Board referenced the report to underscore that the Rapporteur on Freedom of Expression has highlighted the importance of ensuring that measures restricting freedom of expression actually achieve their intended goal and are not ineffective or counterproductive.

  • UN Special Rapporteur on freedom of opinion and expression, A/HRC/38/35 (2018)

    The Board referenced the report to underscore that the Rapporteur on Freedom of Expression has pointed out that rule-making should be accompanied by “detailed hypotheticals that illustrate the nuances of interpretation and application of [these] rules” to provide further clarity for users.

Case Significance

Quick Info

Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”


