
Oversight Board Case of Pro-Navalny Protests in Russia

Case Status: Closed

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    May 26, 2021
  • Outcome
    Overturned Meta’s initial decision
  • Case Number
    2021-004-FB-UA
  • Region & Country
    Russian Federation, International
  • Judicial Body
    Oversight Board
  • Type of Law
    Meta's content policies, International Human Rights Law
  • Themes
    Safety, Bullying and Harassment
  • Tags
    Political speech, Oversight Board Policy Advisory Statement


Case Analysis

Case Summary and Outcome

On May 26, 2021, the Oversight Board overturned Facebook’s (now Meta) decision to remove a comment in which a supporter of imprisoned Russian opposition leader Alexei Navalny referred to another user as a “cowardly bot”. Facebook determined that the term “cowardly” was a negative character claim against a “private adult” and, because the content was reported by the targeted user, removed it. Although the Board concluded that the removal was in line with the Bullying and Harassment Community Standard, it considered the measure an unnecessary and disproportionate restriction on freedom of expression under international human rights standards and inconsistent with Facebook’s values.

*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.


Facts

On January 24, 2021, a Facebook user in Russia published a post “consisting of several pictures, a video, and text (root post) about the protests in support of opposition leader Alexei Navalny held in Saint Petersburg and across Russia on January 23” [p. 4]. Another user (the Protest Critic) responded to the post, claiming that the people protesting in Moscow were all school children, mentally “slow”, and “shamelessly used”. Other users defended the protesters and challenged the Protest Critic, arguing that this user was spreading nonsense and misunderstood the Navalny movement [p. 5]. The Protest Critic responded to several comments, referring to Navalny as “rotten” and a “pocket clown” and “claiming that people supporting him have no self-respect” [p. 5]. The user also referred to people who took their grandparents to the protests as “morons”.

A user who self-identified as elderly and “as having participated in the protest in Saint Petersburg” [p. 5] (the Protester) responded to the Protest Critic. The Protester “noted that there were many people at the protests, including disabled and elderly people, and that they were proud to see young people protesting” [p. 5]. The Protester also stated that the Protest Critic was mistaken in thinking that the young protesters were manipulated. Lastly, the Protester ended their comment by calling the Protest Critic a “cowardly bot” [p. 5].

The Protest Critic reported the Protester’s comment for bullying and harassment. Upon review, Facebook removed the comment, explaining that “the term ‘cowardly’ was a negative character claim against a ‘private adult’” [p. 5]. Facebook did not, however, consider the term “bot” to be a negative character claim. The Protester appealed the decision to Facebook, which upheld its determination that the comment breached its Bullying and Harassment Policy. The deleted content was “reviewed within four minutes of the Protester requesting an appeal, which according to Facebook ‘falls within the standard timeframe’ for reviewing the content on appeal” [p. 5].


Decision Overview

The Oversight Board analyzed whether Facebook’s decision to remove a comment in which a supporter of the imprisoned opposition leader Alexei Navalny called another user a “cowardly bot” was in line with the company’s Bullying and Harassment Community Standard and its values. The Board also examined, through a three-part test, whether the measure complied with Facebook’s human rights responsibilities.

In its submission to the Board, the Protester argued that their comment was not offensive and that the Protest Critic “sought to prevent people from seeing contradictory opinions and was ‘imposing their opinions in many publications’” [p. 8].

For its part, Facebook stated that “Community Standards require the removal of content that targets private adults with negative character claims whenever it is reported by the targeted person. A user is deemed to be targeted when they are referenced by name in the content.” [p. 8]. The term “cowardly”, Facebook noted, could be easily understood as a negative character claim aimed at the Protest Critic, thus it was removed. 

Compliance with Community Standards

The Board concluded that the content removal was consistent with the “Do not” rule prohibiting targeting private individuals with negative character claims. The Community Standard on Bullying and Harassment stated that: “Facebook removes negative character claims aimed at a private individual when the target reports the content. If the same content is reported by a person who is not targeted, it will not be removed” [p. 10]. In light of the context and the discussion surrounding the case, the Board found that the word “cowardly” did not appear to carry a serious or harmful connotation. However, it understood why Facebook had concluded that the Protest Critic was a private individual and that the term “cowardly” could be construed as a negative character claim.

Nevertheless, the Board noted that the negative character claim was made in a heated exchange on a matter of public interest and was no worse than the language used by the Protest Critic, who had voluntarily engaged in the debate. The Board held that the wider context should have been considered, and found that removing content because of a single term reflects a decontextualized approach that may disproportionately restrict freedom of expression. Ultimately, the Board found that Facebook’s decisions to remove content appeared to turn on a single word deemed to be a negative character claim, regardless of the context of the exchange of which the content formed a part.

Compliance with Facebook’s values

The Board found that by removing the content, Facebook failed to balance the values of “Dignity” and “Safety” against “Voice”. 

The Board stressed that political speech was central to the value of “Voice” and that it should only be limited where there are clear concerns around “Safety” or “Dignity.” Thus, it underscored that a certain level of disagreement in an online political discussion should be expected. Additionally, the Board remarked that the value of “Voice” was particularly important in countries where freedom of expression is routinely suppressed. Hence, it observed that assessing Russia’s context was crucial to the decision’s rationale.  

While the Board recognized that the values of “Safety” and “Dignity” protected users from feeling threatened, silenced, or excluded, the Protest Critic was not invited to provide a statement, so the impact of the comment on them was unknown. However, the Board stated that “analysis of the comment thread shows the user actively engaged in a contentious political discussion and felt safe to attack and insult Navalny, his supporters, and January 23 protesters” [p. 12]. Therefore, considering the tone of the overall exchange, the Board determined that the term “cowardly bot” carried only a minor risk of harm to the Protest Critic.

Compliance with Facebook’s human rights responsibilities

After applying the three-part test, the Board found that the removal of the content under the Bullying and Harassment Community Standard was inconsistent with Facebook’s human rights responsibilities.

I. Legality 

The Board highlighted that the principle of legality under international human rights law requires rules that limit expression to be clear and accessible. However, it found that the Bullying and Harassment Community Standard was organized in a way that made it difficult for users to understand what was allowed on Facebook. It explained that the policy rationale offered a broad understanding of what the Standard aimed to achieve, including making users feel safe and preventing speech that degrades or shames them. The rationale was followed by several “Do nots” and additional rules under two yellow warning signs. However, the Board observed that “it is not made clear in the Community Standards if the aims of the rationale serve simply as guidance for the specific rules that follow, or if they must be interpreted conjunctively with the rules. Furthermore, the information is organized in a seemingly random order. For example, rules applicable to private individuals precede, follow and are sometimes mixed in with rules related to public figures” [p. 13].

In addition, the Board found that the Standard failed to differentiate between bullying and harassment. It remarked that combining these distinct concepts into a single definition and corresponding set of rules could result in the removal of expression that constitutes legitimate speech. Furthermore, the Board considered that while the Bullying and Harassment Policy applied differently to various categories of individuals and groups, it failed to define those categories. Consequently, the Board concluded that the Community Standard failed the requirement of legality.

II. Legitimate aim

The Board acknowledged that the Bullying and Harassment Community Standard aimed to protect the rights of others, as established in Article 19 of the ICCPR. Yet, it underscored that users’ freedom of expression could also be undermined when forced off the platform due to bullying and harassment. While it noted that the policy aimed to deter behavior that could cause significant emotional distress and psychological harm, affecting the users’ right to health, the Board also observed that the existence of a rule’s connection to a legitimate aim was not sufficient to satisfy human rights standards on freedom of expression.

III. Necessity and proportionality

The Board concluded that Facebook’s decision to remove the content under its Bullying and Harassment Community Standard was unnecessary and disproportionate. The Board identified blocking or muting another user as a useful and less restrictive tool against abuse. It explained that analyzing the context was essential when assessing necessity and proportionality; in this case, however, moderators had failed to take into account Russia’s heated political context and the fact that both users were equally hostile in their comments. Finally, the Board considered that the removed content appeared to lack elements that often characterize bullying and harassment, such as repeated attacks or an indication of a power imbalance. In conclusion, it held that while calling someone cowardly could be a negative character claim, in this specific case the content arose from a heated political exchange on current events in Russia.

Policy Advisory Statement

The Board recommended that Facebook amend and redraft its Bullying and Harassment Community Standard to:

  • Explain the relationship between the policy rationale and the “Do nots”, as well as the other rules restricting content that follow it.
  • Differentiate between bullying and harassment and provide definitions of both terms.
  • Clearly explain to users how bullying and harassment differ from speech that only causes offense and may be protected under International Human Rights Law.
  • Format the Community Standard by the user categories currently listed in the policy and define its approach to each target user category, with examples.
  • Include illustrative examples of violating and non-violating content to clarify the policy lines drawn and how these distinctions can rest on the identity status of the target.
  • Require an assessment of the social and political context of content containing a “negative character claim” against a private adult, as this is relevant to deciding whether the content should be removed.
  • Whenever content is removed because of a negative character claim that is only a single word or phrase in a larger post, promptly notify the user of that fact so that they can repost the content without the negative character claim.


Decision Direction

Quick Info

Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Expands Expression

The decision expands expression because the Board protected political speech. The Board recognized that political speech is central to freedom of expression and that users of Facebook’s platforms should be able to disagree in online political discussions. Applying a three-part test, the Board found that the restriction on expression in this case was not consistent with human rights standards. Additionally, the decision explained to Facebook that bullying and harassment differ from speech that merely offends, and that the latter may be protected by International Human Rights Law and so must be allowed on the company’s platforms.

Global Perspective

Quick Info

Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or regional laws

  • ICCPR, Art. 19, para. 2

    The Board highlighted the heightened protection of political speech through this precept on freedom of expression.

  • ICCPR, art. 19

    The Board used Article 19 of the ICCPR as a legal basis that provides broad protection for freedom of expression through any media and regardless of frontiers. It also used it to apply the three-part test of legality (clarity), legitimacy, and necessity and proportionality.

  • UNHR Comm., General Comment No. 34 (CCPR/C/GC/34)

    The Board referred to this general comment to underscore the relevance of free communication regarding political issues.

  • United Nations Guiding Principles on Business and Human Rights (2011)

    The Board referred to this instrument to highlight Facebook’s human rights responsibilities as a business.

  • UN Special Rapporteur on freedom of opinion and expression, A/74/486 (2019)

    The Board referenced the report to underscore the obligations of companies in protecting freedom of expression and abiding by international human rights standards.

Case Significance

Quick Info

Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”

