Global Freedom of Expression

Oversight Board Case of Myanmar Bot

Closed
Expands Expression

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    August 11, 2021
  • Outcome
    Overturned Meta’s initial decision
  • Case Number
    2021-007-FB-UA
  • Region & Country
    Myanmar, International
  • Judicial Body
    Oversight Board
  • Type of Law
    Meta's content policies, International Human Rights Law
  • Themes
    Facebook Community Standards, Objectionable Content, Hate Speech
  • Tags
    Oversight Board Policy Advisory Statement, Political speech


Case Analysis

Case Summary and Outcome

The Oversight Board overturned Facebook’s (now Meta) decision to remove a post from Facebook in which a user, who appeared to be in Myanmar, used profanity in the Burmese language to describe the Chinese government and its policy in Hong Kong. Facebook removed the content because it considered the post to violate the Hate Speech Community Standard, which prohibits profane phrases that target a person or group of people based on their race, ethnicity, or national origin. The Board concluded that the content was directed at the Chinese state rather than the Chinese people: the user used an obscenity to refer to a Chinese policy in Hong Kong as part of a political discussion on the Chinese government’s role in Myanmar. The Board therefore found that the content complied with the company’s Community Standards. The Board also noted that international human rights standards supported restoring the content, and it highlighted the importance of protecting political speech.

*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.

 


Facts

In April 2021, a Facebook user who claimed to be in Myanmar published a post in Burmese discussing the February 1, 2021, coup in Myanmar. The post examined different strategies for limiting funding to the Myanmar military. It suggested giving tax money to the Committee Representing Pyidaungsu Hluttaw (CRPH), a group of lawmakers who opposed the coup. The post generated around 500,000 views, 6,000 replies, and approximately 6,000 shares. No Facebook user reported the post.

The allegedly offensive portion of the user’s post was translated by Facebook as “Hong Kong people, because the fucking Chinese tortured them, changed their banking to the UK, and now (the Chinese) they cannot touch them” [p. 3]. The day after it was posted, Facebook removed the post for violating its “Tier 2” Hate Speech Community Standard, which forbids using “profane language or phrases with the intent to insult” [p. 4] when directed at a person or group of persons based on their race, ethnicity, or national origin.

A reshare of the original post was “automatically picked as a part of a sample and forwarded to a human reviewer to be used for classifier training” [p. 4], according to Facebook (now Meta). Facebook compiles data sets of violating and non-violating content to train its automated detection and enforcement systems to predict whether content breaches Facebook’s policies. The reviewer found that the shared post violated the Hate Speech Community Standard. Although the process was intended to generate training data for the classifier, the shared post was removed once it was determined to violate Facebook’s policies.
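The decision describes this sampling process only at a high level. As a purely illustrative aid, the minimal sketch below (in Python, using scikit-learn) shows how human-reviewed posts and their labels could, in a generic setup, become training data for a classifier that predicts whether new content violates a policy; the library, model, and example data are assumptions for illustration and do not reflect Facebook’s actual systems.

# Illustrative sketch only: the decision does not disclose Facebook's actual
# pipeline or tooling. This assumes a generic text-classification setup to
# show how sampled, human-labeled posts could train an enforcement classifier.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical sample: posts forwarded to human reviewers, with their labels
# (1 = judged violating, 0 = judged non-violating).
sampled_posts = [
    ("post text the reviewer judged to violate the hate speech policy", 1),
    ("another post judged violating by a human reviewer", 1),
    ("post text the reviewer judged not to violate any policy", 0),
    ("another post judged non-violating", 0),
    # ... many more human-reviewed examples ...
]

texts = [text for text, _ in sampled_posts]
labels = [label for _, label in sampled_posts]

# Train a simple classifier on the human decisions; in production such a model
# would flag potentially violating content rather than decide cases finally.
classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
classifier.fit(texts, labels)

# Estimate the probability that a new post violates the policy.
print(classifier.predict_proba(["some new post text"])[0][1])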

The original post was automatically flagged for review by an “administrative action bot” once the shared post was determined to violate Facebook’s standards. According to Facebook, the administrative action bot is an internal Facebook account that executes “a variety of actions throughout the enforcement system based on decisions made by humans or technology” [p. 4] and does not itself evaluate content. The original post was then examined by two human reviewers, who concurred that it was “Tier 2” hate speech, and the material was taken down. When the user appealed the removal, a fourth human reviewer at Facebook upheld it. According to Facebook, all of the content reviewers in this case belonged to a Burmese content review team at the company. The user then appealed to the Oversight Board.


Decision Overview

The Oversight Board analyzed whether Facebook’s decision to remove a post that used profanity against the Chinese government and its policy in Hong Kong aligned with the company’s Community Standards and values. The Board also assessed, through a three-part test, whether the measure complied with the company’s human rights responsibilities.

In its submission to the Board, the user, who self-identified as an activist, stated that the content was meant to “stop the brutal military regime and provide advice to democratic leaders in Myanmar. The user also reiterated the need to limit the Myanmar military regime’s funding” [p. 6].

For its part, Facebook claimed that the content was removed because the post used “profane curse words targeted at people based on their race, ethnicity and/or national origin. According to Facebook, the allegedly violating content was considered to be an attack on Chinese people” [p. 6]. According to Facebook, this amounted to a Tier 2 attack under the company’s Hate Speech policy. Facebook also argued that after the February 2021 coup “there were reports of an increasing anti-Chinese sentiment in Myanmar and that ‘several Chinese people were injured, trapped, or killed in an alleged arson attack on a Chinese-financed garment factory in Yangon, Myanmar’” [p. 7]. 

Facebook also argued that the phrase in Burmese “$တရုတ်” can be translated as “fucking Chinese” (or sout ta-yote): “Facebook’s regional team further specified that ‘$’ can be used as an abbreviation for ‘စောက်’ or ‘sout,’ which translates to ‘fucking’” [p. 6]. The word “ta-yote”, according to Facebook’s team, “is perceived culturally and linguistically as an overlap of identities/meanings between China the country and the Chinese people” [p. 6]. In this linguistic context, and since the user did not specify whether the term used was meant to refer to the country or the government of China, Facebook decided to remove the content under its Hate Speech Community Standard. 

Compliance with Community Standards

Applying Facebook’s Community Standard on Hate Speech, the Board determined that reinstating the content was appropriate. The policy forbids using “profane phrases with the purpose to offend” [p. 6] that target someone on the basis of their race, ethnicity, or national origin. The Board determined that the post, made in the context of debating the Chinese government’s role in Myanmar, was not directed at any specific individual but at the Chinese government’s policies in Hong Kong.

The Board requested public comments and two translations of the text, one from a Burmese speaker living in Myanmar and the other from a Burmese speaker living abroad. According to the public comments and the Board’s translators, the same word is used in Burmese to refer both to states and to the residents of those states. The Board argued that context is essential for interpreting the intended meaning, especially when applying Facebook’s hate speech policy. The Board noted that when the user posted the content, the Hate Speech Community Standard forbade attacks on individuals based on their national origin but did not prohibit attacks against nations.

After considering several factors, the Board decided that the post did not target Chinese individuals because of their ethnicity, race, or national origin. First, the broader post offered ideas for limiting financial engagement with the military regime and increasing funding for the CRPH. Second, rather than referring to specific individuals or to Chinese people in Myanmar, the allegedly infringing portion of the post described China’s financial policies in Hong Kong as “torture” or “persecution”. Third, although the absence of reports against a widely shared post does not by itself show that it is not violating, this post received over 500,000 views and more than 6,000 shares, yet no one reported it. Fourth, although the same term is used to refer to both a state and its people, both translators consulted by the Board indicated that in this case it referred to the state, and neither expressed uncertainty about any potential ambiguity in the reference. Fifth, according to both translators, the post used phrases that the Chinese embassy and the Myanmar government frequently use to communicate with one another. Finally, the public comments generally characterized the post, taken as a whole, as primarily a political discussion.

The Board determined that the post did not infringe Facebook’s Hate Speech Community Standard, since the profanity targeted a state and did not target individuals based on their race, ethnicity, or national origin. The Board remarked that prohibitions on attacking people on the basis of protected characteristics must not be interpreted in a way that shields governments or institutions from criticism.

The Board highlighted that Facebook had revised its Hate Speech Community Standard during the Board’s deliberations in this matter to add information on how it prohibits “concepts” connected to protected characteristics in certain circumstances. The Board did not examine how this provision would apply to the matter at issue, because it was not part of the Community Standard at the time Facebook removed the post and Facebook did not argue to the Board that it acted under the updated Standard. However, the Board did point out that a broad spectrum of expression, including political speech, could be covered by the phrase “concepts, institutions, ideas, practices, or beliefs” [p. 10].

Compliance with Facebook’s values

The Board determined that reinstating the content aligned with Facebook’s values. It remarked that although Facebook’s “Dignity” and “Safety” values were significant, the content did not put them at risk to the point where “Voice” should have been displaced, even in the context of the February 2021 coup in Myanmar. Additionally, the Board determined that the post constituted political speech, which is central to “Voice”.

Compliance with Facebook’s human rights responsibilities

The Board underscored that under the UN Guiding Principles on Business and Human Rights, Facebook has pledged to uphold international human rights standards. After applying the three-part test, the Board found that removing the content was inconsistent with the company’s human rights responsibilities.

I. Legality 

The Hate Speech Community Standard forbids using profanity to disparage anyone based on their race, ethnicity, or national origin. Due to the challenges of determining intent at scale, Facebook, according to a statement to the Board, treated “the phrase ‘fucking Chinese’ as referring to both Chinese people and the Chinese country or government, unless the user provides additional context that it refers solely to the country or government” [p. 11]. The Community Standard makes no mention of this default policy of removal.

As noted in the Board’s examination of the Hate Speech Community Standard, the user supplied additional context indicating that the post was about a state or country. The Board’s translators, the public commenters, and likely many of the more than 500,000 users who saw the post but did not report it all reached a different conclusion than Facebook’s multiple reviewers. Given this discrepancy, the Board asked whether Facebook’s internal guidelines, resources, and training for content moderators are adequate.

Having determined that the user did not breach Facebook’s Hate Speech policy, the Board did not assess whether the internal policy of defaulting to removal violates the legality principle. However, the Board expressed concern that the Community Standards are not clear about the policy of defaulting to removal when profanity may be understood as addressed either to a people or to a state. The Board considered that Facebook should, in general, make “public internal guidance that alters the interpretation of its public-facing Community Standards” [p. 11].

II. Legitimate aim

Any restriction on expression must serve one of the legitimate aims specified in the ICCPR; the “rights of others” are among these aims. Facebook claimed that its hate speech guidelines are intended to shield users from discrimination. The Board acknowledged the legitimacy of this aim.

III. Necessity and proportionality

Based on its interpretation of the content, the Board concluded that removing the post would not serve a protective function. The Board determined that default removal should not apply in this instance, since there are significant risks both in leaving harmful content up and in removing content that poses little or no risk of harm. Facebook should be especially careful to avoid removing political criticism and expression, in this case speech supporting democratic governance, even when its concerns about hate speech in Myanmar are well-founded.

The Board emphasized that Facebook’s policy of presuming that profanity paired with a reference to national origin targets people may result in over-enforcement in some linguistic contexts, such as this one, where the same word refers to both the state and the people. The Board further highlighted that the removal had effects beyond this specific case, because Facebook disclosed that the post had been used in classifier training as an example of content that violates the Hate Speech Community Standard.

Given the above, international human rights standards supported restoring the content to Facebook.

Policy advisory statement:

The Board recommended that Facebook ensure that its Internal Implementation Standards are available in the languages in which content moderators review content. If prioritization is required, Facebook should focus first on contexts where the risks to human rights are greatest.


Decision Direction


Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Expands Expression

The decision expanded expression because it protected political criticism and expression. The Board determined that users of Meta’s platforms should be able to voice their ideas and arguments against governments. The Board also concluded that context must be considered when enforcing hate speech policies, because political speech and debate warrant strong protection, and language barriers can otherwise lead to restrictions on expression without a proper basis. Moreover, since the user exercised their freedom of expression in a heated political context, Meta should have ensured that its platform remained an effective communications medium in the country.

Global Perspective


Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or regional laws

  • United Nations Guiding Principles on Business and Human Rights (2011)

    The Board used the UNGPs as the legal basis of Meta’s commitment to respect human rights.

  • International Covenant on Civil and Political Rights, art. 19

The Board analyzed Facebook’s human rights responsibilities through this provision on freedom of expression and employed the three-part test established in the article to assess whether Facebook’s measure was a valid restriction on the right.

  • UNHR Comm., General Comment No. 34 (CCPR/C/GC/34)

    The Board used General Comment No. 34 as the legal basis to apply the three-part test.

  • UN, Report of the Working Group on the issue of human rights and transnational corporations and other business enterprises, A/75/212 (2020)

    The Board used Report A/75/212 to reference the relationship between human rights and transnational corporations and other business enterprises.

General Law Notes

Oversight Board Decisions:

  • Depiction of Zwarte Piet (2021-002-FB-UA)
    • The Board stated that it disagreed with Facebook’s argument that its decision to remove the content at issue followed the rationale of the Board’s decision in Depiction of Zwarte Piet.
  • Former President Trump’s suspension (2021-001-FB-FBR)
    • The Board noted that its decision in this case recommended that Facebook “ensure adequate resourcing and expertise to assess risks of harm from influential accounts globally,” recognizing that the company should devote attention to regions with greater risks.

Case Significance


Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”
