Global Freedom of Expression

Oversight Board Case of Armenians in Azerbaijan

Closed Mixed Outcome

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    January 28, 2021
  • Outcome
    Agreed with Meta’s initial decision
  • Case Number
    2020-003-FB-UA
  • Region & Country
    International
  • Judicial Body
    Oversight Board
  • Type of Law
    Meta's content policies, International Human Rights Law
  • Themes
    Facebook Community Standards, Objectionable Content, Hate Speech
  • Tags
    Oversight Board Policy Advisory Statement, Oversight Board Transparency Recommendation


Case Analysis

Case Summary and Outcome

The Oversight Board upheld Facebook’s (now Meta) decision to remove a Facebook post in which a user, in the accompanying text, used the word “taziks” (“wash bowl” in Russian), a play on words on “azik,” a slur or derogatory term for Azerbaijanis. The user also claimed that Azerbaijanis had no history compared to Armenians. Facebook removed the post, arguing that it breached the company’s Hate Speech Community Standard. The Board agreed with Facebook, considering that the post, which was uploaded amidst a recent armed conflict between Armenia and Azerbaijan, was meant to dehumanize its target. The Board likewise considered that Facebook’s removal of the content was a restriction that complied with International Human Rights standards on freedom of expression, including the requirements that the limitation be both necessary and proportionate.

*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.


Facts

In November 2020, a Facebook user posted historical photos of churches in Baku, Azerbaijan. In the accompanying text, the user claimed, in Russian, that “Armenians built Baku and that this heritage, including the churches, has been destroyed” [p. 4]. The user also used the term т.а.з.и.к.и (“taziks”) to refer to Azerbaijanis, “who the user claimed are nomads and have no history compared to Armenians” [p. 4]. Tazik means “wash bowl” in Russian and appears to be a play on words on “azik,” a derogatory term for people from Azerbaijan.

The user also included hashtags “calling for an end to Azerbaijani aggression and vandalism” and for “the recognition of Artsakh, the Armenian name for the Nagorno-Karabakh region, which is at the center of the conflict between Armenia and Azerbaijan” [p. 4]. The content, which was posted amidst a recent armed conflict between Armenia and Azerbaijan, was viewed over 45,000 times. Facebook (now Meta) removed the post, arguing that it violated the company’s Community Standard on Hate Speech. The user submitted a request to the Oversight Board to review this decision.


Decision Overview

The Oversight Board analyzed whether Facebook’s decision to remove a user’s post containing negative remarks about Azerbaijani people, specifically by calling them “taziks,” complied with the company’s Community Standard on Hate Speech. The Board also examined, through a three-part test, whether Facebook’s measure complied with International Human Rights Law standards on freedom of expression.

In its submission to the Board, the user argued that their post “was not hate speech but was intended to demonstrate the destruction of Baku’s cultural and religious heritage” [p. 7]. The user also declared that the only reason for the content removal was “because Azerbaijani users who have ‘hate towards Armenia and Armenians’ are reporting content posted by Armenians”[p. 7].

For its part, Facebook argued that the removed post violated “its Hate Speech Community Standard because it used a slur to describe a person or group of people on the basis of a protected characteristic (national origin)” [p. 7]. Facebook considered that the use of the word “taziks”, which means “wash bowl” in Russian, was a wordplay on the word “azik”. This last word “is on [Facebook’s] internal list of slur terms, which it compiles after consultation with regional experts and civil society organizations” [p. 7].

Compliance with Community Standards

The Community Standard on Hate Speech defines hate speech as “a direct attack on people based on what [Facebook calls] protected characteristics — race, ethnicity, national origin, religious affiliation, sexual orientation, caste, sex, gender, gender identity, and serious disease or disability” [p. 5]. Slurs, described as words commonly used as insulting labels, are prohibited on the platform under this policy. Facebook says such speech is not allowed because it “creates an environment of intimidation and exclusion and in some cases may promote real-world violence” [p. 5].

In light of this prohibition on the use of slurs based on ethnicity or national origin, the Board considered that the removed content did indeed violate the company’s Community Standard on Hate Speech. According to a linguistic report commissioned by the Board, “the post implies a connection between “тазики,” or “wash basin,” and “азики,” a term often used to describe Azerbaijanis in a derogatory manner” [p. 8]. The report, therefore, supported Facebook’s understanding of this term as a slur. 

Although the Board acknowledged that there could be instances in which demeaning words can be used in an empowering way, the specific context in which the term “тазики” was used made it clear that it “was meant to dehumanize its target” [p. 8].

Compliance with Facebook’s values

Facebook’s Community Standards establish “Voice” as a paramount value for the company. The goal of this value is “to create a place for expression” in which people are “able to talk openly about the issues that matter to them, even if some may disagree or find them objectionable” [p. 5].

Nonetheless, “Voice” can be limited when it conflicts with other values. For the Board, both “Safety” and “Dignity” were relevant values to consider in deciding this matter. “Safety” prohibits threatening and intimidating expressions on Facebook that “exclude or silence others” [p. 6]. “Dignity” states that “all people are equal in dignity and rights” [p. 6]. Thus, the Board deemed that expressions which harass or degrade others should be excluded from the platform.

In light of this framework, the Board considered that the removal was “consistent with Facebook’s values of ‘Safety’ and ‘Dignity,’ which in this case displaced the value of ‘Voice’” [p. 8].  

For the Board, using a slur interfered with the values of “Dignity” and “Safety”. Moreover, the Board identified several contextual cues that were relevant to the matter. It noted that “[t]he conflict between Armenia and Azerbaijan, neighbors in the Southeast Caucasus, is of long standing” [p. 9]. From September to November 2020, several deaths were reported over the disputed territory of Nagorno-Karabakh. The context in which the post was uploaded highlights “the danger of dehumanizing slurs proliferating in a way that escalates into acts of violence” [p. 9].

Compliance with International Human Rights Standards

Upon analyzing the compliance of Facebook’s measure with Human Rights standards on freedom of expression, the Board noted that Facebook accepted “its responsibilities to respect human rights under the UN Guiding Principles on Business and Human Rights” [p. 9].

In line with what the UN Special Rapporteur on freedom of expression has said, the Board also opined that, although “companies do not have the obligations of Governments, their impact is of a sort that requires them to assess the same kind of questions about protecting their users’ right to freedom of expression” [p. 9].

The Oversight Board also underscored — under Article 19, para. 2 of the ICCPR — the existence of a particular “protection to expression on political issues, and discussion of historical claims, including as they relate to religious sites and peoples’ cultural heritage. That protection remains even where those claims may be inaccurate or contested and even when they may cause offense” [p. 9].

Considering these provisions, the Board then analyzed whether Facebook’s removal of the content was a valid restriction. Article 19, para. 3 of the ICCPR “requires limits on freedom of expression to satisfy the three-part test of legality, legitimacy, and necessity and proportionality” [p. 9].

  • Legality 

To satisfy the requirement of “legality,” any rule that restricts freedom of expression “must be clear and accessible. Individuals must have enough information to determine if and how their speech may be limited, so that they can adjust their behavior accordingly” [p. 9].

In this case, the Board considered that the restrictions included in the Hate Speech Community Standard met this requirement, since the policy specifies “that ‘slurs’ are prohibited, and that these are defined as ‘words that are inherently offensive and used as insulting labels’ in relation to a number of ‘protected characteristics’, including ethnicity and national origin” [p. 10]. The Board noted that there could be other, more contested scenarios in which this rule could raise concerns regarding the legality requirement, especially around concepts like “inherently offensive” or “insulting”. Nevertheless, for the Oversight Board, it was clear that the use of the word “taziks”, “connecting a national identity to an inanimate unclean object, plainly qualifies as an ‘insulting label’” [p. 10].

The Board also noted that the user attempted to conceal the slur from Facebook’s automated detection tools by placing periods between its letters, which “tends to confirm that the user was aware that they were using language that Facebook prohibits” [p. 10].

  • Legitimate aim

The Board stated that restrictions on freedom of expression must also pursue a legitimate aim. The ICCPR lists these aims, which include “protecting ‘the rights of others’” [p. 10]. The Board considered that Facebook’s prohibition of slurs on its platform aligns with internationally protected human rights, such as equality and non-discrimination (Art. 2, para. 1 ICCPR), the right to security (Art. 9 ICCPR), and the right to life (Art. 6 ICCPR). Thus, the Board concluded that Facebook’s measure fulfilled this requirement.

  • Necessity and Proportionality 

Necessity and proportionality, as laid out in the Human Rights Committee’s General Comment No. 34, require “Facebook to show that its restriction on freedom of expression was necessary to address the threat, in this case the threat to the rights of others, and that it was not overly broad” [p. 11].

The Board pointed out, based on a report by the UN Special Rapporteur on freedom of opinion and expression, that human rights law “allows prohibitions on ‘insults, ridicule or slander of persons or groups or justification of hatred, contempt or discrimination’ if such expression ‘clearly amounts to incitement to hatred or discrimination’ on the grounds of race, colour, descent or national or ethnic origin” [p. 11].

A majority of the Board found the slur included in the post dehumanizing and hurtful. The context in which the content was uploaded also mattered, since the post “was widely disseminated at the height of an armed conflict between the user’s State and the State whose nationals the post attacked” [p. 11]. With this in mind, the majority of the Board opined that the post had the potential to create “a discriminatory environment that undermines the freedom of others to express themselves” [p. 11].

Likewise, the Board’s majority considered that Facebook’s measure was proportionate. Less severe interventions, such as labels, warning screens, or reduced dissemination, “would not have provided the same protection” [p. 12]. The majority also noted that Facebook did not impose more severe measures, such as “suspending the user’s account, despite the user seemingly re-posting offending content several times” [p. 12]. This illustrated that, despite the content removal, the user remained free to discuss and engage in debate, as long as they abided by the Community Standards.

Hence, the majority of the Board argued that Facebook’s decision to remove the content fulfilled the necessity and proportionality requirement. 

A minority of the Board considered that removing the post was not a proportionate measure, arguing that the risks cited by the majority were remote and that alternative measures, such as warnings, reducing virality, or promoting counter-messaging, could therefore have been adopted. The minority also considered that deleting an entire post that discussed issues of public concern, simply because it used a slur, did not satisfy the necessity and proportionality requirement.

Ultimately, the Board upheld “Facebook’s decision to remove the user’s post” [p. 12].

Policy Advisory Statement

Although the Board agreed with Facebook’s decision, it also urged the company to take measures to strengthen transparency in the platform’s content moderation. It requested that Facebook always notify users of “the reasons for any enforcement of the Community Standards against them, including the specific rule Facebook is enforcing” [p. 13]. The Board noted that in this case, the user was informed that their post breached the Hate Speech Community Standard, “but was not told that the post violated the standard because it included a slur targeting national origin” [p. 12]. 


Decision Direction


Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Mixed Outcome

Although the Oversight Board ultimately agreed with the deletion of online content in this particular case, it did so because it considered that the removed post constituted a form of hate speech. This is a valid restriction that complies with several International Human Rights Law standards on non-discrimination, equality, and the protection of life and dignity. Similarly, content moderation that tackles hateful or violent language against certain groups, or in the context of armed conflicts, can foster a better environment for expression without harassment, especially for targeted groups.

Global Perspective


Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or regional laws

  • ICCPR, art. 2

    The Board referred to this provision to underscore Facebook’s human rights responsibilities regarding equality and non-discrimination.

  • ICCPR, art. 6

    The Board referred to this provision to underscore Facebook’s human rights responsibilities regarding the right to life.

  • ICCPR, art. 9

    The Board referred to this provision to underscore Facebook’s human rights responsibilities regarding the right to security.

  • UNHR Comm., General Comment No. 34 (CCPR/C/GC/34)

    The Board used General Comment No. 34 as the legal basis to apply the three-part test.

  • United Nations Guiding Principles on Business and Human Rights (2011)

    The Board referred to this instrument to highlight Facebook’s human rights responsibilities as a business.

  • UN Special Rapporteur on freedom of opinion and expression, report A/74/486 (2019)

    The Board referenced the report to underscore the responsibilities of companies in protecting freedom of expression and to highlight the prohibition of hate speech under International Human Rights Law.

Case Significance


Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”
