Global Freedom of Expression

Oversight Board Case of Protest in India Against France

Case Status: Closed
Decision Direction: Expands Expression

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    February 12, 2021
  • Outcome
    Overturned Meta’s initial decision
  • Case Number
    2020-007-FB-FBR
  • Region & Country
International
  • Judicial Body
    Oversight Board
  • Type of Law
    International Human Rights Law, Meta's content policies
  • Themes
Political Expression, Facebook Community Standards, Violence and Criminal Behavior, Violence and Incitement
  • Tags
    Oversight Board Policy Advisory Statement, Meme


Case Analysis

Case Summary and Outcome

On February 12, 2021, the Oversight Board overturned Facebook’s (now Meta) decision to remove a Facebook user’s post that contained a meme featuring an image from a Turkish television show depicting a character in leather armor holding a sheathed sword. A text overlay said the sword should be “taken out of its sheath” if “kafirs” speak against the Prophet. The accompanying text referred to the President of France, Emmanuel Macron, as the devil and called for a boycott of French products. After reviewing the content, Facebook considered the user’s post a veiled threat that breached its Violence and Incitement Community Standard. However, in its decision, the Board determined that the post was not a call for physical harm, nor did the context surrounding its publication suggest that it was likely to lead to violent acts. The Board also highlighted that speech on religious and political matters is protected under international human rights law.

*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.


Facts

In October 2020, a Facebook user posted a meme in which a character from the Turkish TV show Diriliş: Ertuğrul is shown in leather armor and with a sheathed sword. The text overlay, in Hindi, read — according to Facebook’s English translation — “if the tongue of the kafir starts against the Prophet, then the sword should be taken out of the sheath” [p. 4]. The accompanying text of the post stated in English “that the Prophet is the user’s identity, dignity, honor and life, and contained the acronym ‘PBUH’ (peace be upon him)” [p. 4]. The user included hashtags referring to the President of France, Emmanuel Macron, as the devil and called for a boycott of French products. 

The content was posted “in a public group that describes itself as a forum for providing information for Indian Muslims” [p. 3] and “was viewed about 30,000 times, received less than 1,000 comments and was shared fewer than 1,000 times” [p. 4].

Facebook (now Meta) removed the content in November 2020, arguing that the post violated the Violence and Incitement Community Standard. According to the company, “kafir” is a pejorative term for nonbelievers. Facebook considered that the image and text constituted “a veiled threat of violence against ‘kafirs’” [p. 4].

Although the post had previously been reported twice, for Hate Speech and for Violence and Incitement respectively, Facebook decided to remove the content only after it “received information from a third-party partner that this content had the potential to contribute to violence” [p. 4]. Facebook confirmed that this third party was not linked to any particular state but belonged to a trusted partner network that includes “non-governmental organizations, humanitarian organizations, non-profit organizations, and other international organizations” [p. 4].

After the content was flagged by this third party, Facebook “sought additional contextual information from its local public policy team, which agreed with the third-party partner that the post was potentially threatening” [p. 4].

On November 19, 2020, Facebook referred the case to the Oversight Board, citing the challenges of issuing a decision on the matter, since “the content highlighted tensions between what it considered religious speech and a possible threat of violence, even if not made explicit” [p. 4].


Decision Overview

The Oversight Board analyzed whether Facebook was correct to remove the user’s post for violating the company’s Violence and Incitement Community Standard. The post depicted a man in leather armor with a sheathed sword, carried a text overlay saying the sword should be “taken out of its sheath” if “kafirs” speak against the Prophet, and was accompanied by text calling for a boycott of French products. The Board also assessed, through a three-part test, whether Facebook’s removal of this content complied with human rights standards on freedom of expression.

Although the Board gave the user 15 days to submit a statement regarding this case, the Board “received no statement from the user” [p. 6].

For its part, Facebook argued that the post was removed because it breached the company’s Violence and Incitement Community Standard, which “prohibits content that creates a ‘genuine risk of physical harm or direct threats to public safety’ including coded statements ‘where the method of violence or harm is not clearly articulated, but the threat is veiled or implicit’” [p. 6]. According to Facebook, the post contained a veiled threat against “kafirs” in its statement that “the sword should be taken out of the sheath.” The company interpreted the reference to a sword as a threatening call to action and an “implied reference to historical violence” [p. 6].

Facebook also argued that the context in which the post was published supported its removal. The company noted that the post was uploaded “at a time of religious tensions in India related to the Charlie Hebdo trials in France and elections in the Indian state of Bihar” [p. 7].

Compliance with Community Standards

The Board noted that the Violence and Incitement Community Standard “aim[s] to prevent potential offline harm that may be related to content on Facebook” [p. 5]. Moreover, it stressed that, as such, the policy restricts content that poses a threat to public safety or creates a “genuine risk of physical harm” [p. 5].

For a majority of the Board, the use of hashtags to call for a boycott of French products “was a call to non-violent protest and part of discourse on current political events” [p. 8]. Similarly, the majority considered that the use of a meme from a popular TV show was not a call for physical harm, even if the image referred to violence.

The Board then proceeded to analyze Facebook’s justifications regarding the context in which the post was uploaded to the platform. The Board noted that neither the protests in India “in reaction to President Macron’s statement following the killings in France in response to cartoon depictions of the Prophet Muhammad” [p. 8], nor the November 2020 elections in the Indian state of Bihar, had been marked by violence against persons based on their religion.

Although the Board unanimously agreed that contextual analysis is essential when moderating content with veiled threats, the majority “did not find Facebook’s contextual rationale in relation to possible violence in India in this particular case compelling” [p. 8]. Hence, the majority of the Oversight Board “supported restoring the post under the Violence and Incitement Community Standard” [p. 9].

Compliance with Facebook’s values

Facebook’s Community Standards establish “Voice” and “Safety” as fundamental values for the platform. The goal of “Voice”, described as a paramount value, is to create a place for expression in which people are “able to talk openly about the issues that matter to them, even if some may disagree or find them objectionable” [p. 5]. “Safety” highlights the importance of making Facebook a safe place, where content that threatens people or has the potential to “exclude or silence others isn’t allowed” [p. 5].

The Board acknowledged that “Safety” is particularly important considering the heightened religious tensions in India. But even in this particular context, for the majority of the Board, the user’s post “did not pose a risk to ‘Safety’ that justified displacing ‘Voice’” [p. 9]. 

Compliance with Human Rights Standards

Upon analyzing Facebook’s measures against human rights standards, the Board noted, in line with General Comment No. 34 and Article 19 of the ICCPR, that “individuals have the right to seek, receive and impart ideas and opinions of all kinds, including those that may be controversial or deeply offensive” [p. 9]. Ideas that are blasphemous or that criticize religious doctrines or leaders are also protected by freedom of speech. The Board further underscored that “[p]olitical expression is particularly important and receives heightened protection under international human rights law” [p. 9]. At the same time, it noted that freedom of expression is not an absolute right; it may be subject to limitations under international human rights law.

Taking this legal framework into consideration, and “after discussing the factors in the [UN] Rabat Plan of Action” [p. 10], the Board considered that the removed post did not constitute advocacy of religious hatred “reaching the threshold of incitement to discrimination, hostility or violence which states are required to prohibit under ICCPR Article 20, para. 2” [pp. 9-10].

Subsequently, the Board analyzed Facebook’s removal decision to assess whether it was a valid restriction on freedom of expression under Article 19, para. 3 of the ICCPR. This provision requires restrictions to be “easily understood and accessible (legality requirement), to have the purpose of advancing one of several listed objectives (legitimate aim requirement), and to be necessary and narrowly tailored to the specific objective (necessity and proportionality requirement)” [p. 10].

  • Legality 

Concerning this requirement, the Board argued that Facebook’s Violence and Incitement Community Standard is not clear enough since the “process and criteria for determining veiled threats is not explained to users in the Community Standards, making it unclear what ‘additional context’ is required to enforce the policy” [p. 10].

  • Legitimate aim

For the Board, Facebook’s measure complied with this requirement since the aim of the restriction was the protection of “the rights to life and integrity of those targeted by the post” [p. 10].

  • Necessity and Proportionality

When studying this requirement, the Board analyzed several factors before reaching the conclusion that the post was unlikely to lead to physical harm, thus rendering Facebook’s measure unnecessary. 

The majority noted that the target of the post (“kafirs”) was broadly defined and that any potential physical harm or violence was neither clearly specified nor apparently imminent. The Board also observed that the post made no veiled reference to a particular action at a specific time or location. In addition, the fact that the user was not a public figure, a state actor, or someone with influence over other people’s conduct was significant. For these reasons, a majority of the Board concluded that “physical harm was unlikely to result from this post” [p. 11].

A minority of the Board considered that the post was indeed threatening. Events like the “Charlie Hebdo killings and recent beheadings in France related to blasphemy mean this threat cannot be dismissed as unrealistic” [p. 11]. For the minority position, violence does not need to be imminent for Facebook to remove content “that threatens or intimidates those exercising their right to freedom of expression” [p. 11].

Nonetheless, most of the Board members considered that Facebook failed to properly analyze all the contextual information. Thus, the Board overturned “Facebook’s decision to take down the content, requiring the post to be restored” [p. 11].

Policy Advisory Statement

In addition to restoring the removed post, the Oversight Board urged Facebook to “provide users with additional information regarding the scope and enforcement of the Violence and Incitement Community Standard” [p. 12]. The enforcement criteria used by Facebook, in addition to being made public, “should address intent, the identity of the user and audience, and context” [p. 12].


Decision Direction


Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Expands Expression

This decision expands freedom of expression by allowing content that, although controversial or offensive, touches upon religious, political, and social issues. The Board’s assessment of this case — which argues that the removed post does not threaten violence and is unlikely to lead to harm — widens the scope of protection of speech on sensitive issues, in line with International Human Rights standards on the topic.

Global Perspective


Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or Regional Laws

  • International Covenant on Civil and Political Rights (ICCPR), Articles 19 and 20
  • UN Human Rights Committee, General Comment No. 34
  • UN Rabat Plan of Action

Case Significance


Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook [now Meta] will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook, it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook, and transparently communicating about actions taken as a result.”

Official Case Documents
