Facebook Community Standards, Objectionable Content, Hate Speech
Oversight Board Case of Myanmar Post
Case Status: Closed. Decision Direction: Expands Expression.
On January 28, 2021, the Oversight Board overturned Facebook’s (now Meta) decision to remove a user’s Facebook post featuring two photographs of a Syrian toddler who drowned in the Mediterranean Sea, with accompanying text stating that there was “something wrong with Muslims (or Muslim men) psychologically or with their mindset”. Facebook removed the post on the grounds that it breached its Hate Speech Community Standard. The Board considered that although the post could be considered offensive, it did not advocate hatred or incite imminent harm. The Board further held that Facebook’s restriction on freedom of expression, although it pursued a legitimate aim, was not necessary, since removing the content would not protect any particular group from discrimination and was unlikely to reduce tensions.
*The Oversight Board is a separate entity from Meta and provides its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.
In October 2020, a Facebook user in Myanmar posted “two widely shared photographs of a Syrian toddler of Kurdish ethnicity who drowned in the Mediterranean Sea in September 2015” [p. 3]. In the accompanying text of the post, which was in Burmese, the user stated that “there is something wrong with Muslims (or Muslim men) psychologically or with their mindset” [p. 3]. Additionally, the user questioned the lack of response by Muslims generally to the way Uyghur Muslims have been treated in China in comparison to killings in France in retaliation for cartoon depictions of Prophet Muhammad. The post was published in a “group which describes itself as a forum for intellectual discussion” [p. 3].
Facebook (now Meta) translated the statement as “[there is] something wrong with Muslims psychologically”. The company decided to remove the content in November 2020, arguing that the post was “Tier 2 hate speech under its Community Standards”, which forbid “generalized statements of inferiority about the mental deficiencies of a person or group of people on the basis of their religion” [p. 3].
Before the post was removed, the two photographs in the post “had warning screens placed on them under the violent and graphic content Community Standard” [p. 4]. The user appealed Facebook’s decision to remove their post to the Oversight Board.
The Oversight Board analyzed whether Facebook’s decision to remove a user’s post containing remarks about Muslims complied with the company’s Hate Speech Community Standard. The Board also assessed whether Facebook’s measure complied with Human Rights standards on freedom of expression.
In its appeal to the Board, the user argued that the removed content did not include hate speech. According to the user, “their post was sarcastic and meant to compare extremist religious responses in different countries” [p. 5].
For its part, Facebook reiterated that the removed content was a Tier 2 attack under the Hate Speech Community Standard because it included “a generalization of mental deficiency regarding Muslims”. Facebook stated that the only component of the post that violated the Community Standards was the statement that something is wrong with Muslims psychologically [p. 6].
Compliance with Community Standards
The Board proceeded to analyze whether Facebook’s measure complied with the company’s Community Standard on Hate Speech. This standard does not “allow hate speech on Facebook because it creates an environment of intimidation and exclusion, and in some cases, may promote real-world violence” [p. 4]. Hate speech, as defined by Facebook, consists of attacks based on protected characteristics, such as “race, ethnicity, national origin, religious affiliation, sexual orientation, caste, sex, gender, gender identity, and serious disease or disability”, with some protection for age and immigration status [p. 4]. Attacks, which are classified into three tiers, “may be ‘violent or dehumanizing speech, harmful stereotypes, statements of inferiority, or calls for exclusion or segregation’” [p. 4].
Under Tier 2, generalizations that state inferiority related to mental deficiencies (defined as those about intellectual capacity, education, or mental health), are considered prohibited content on Facebook.
With this in mind, the Board noted that when reading the first sentence of the post, on its own, the content “might appear to be making an offensive and insulting generalization about Muslims” [p. 7]. Nevertheless, for the Board, the post should be read as a whole, considering its context.
The Board underscored the fact that experts and human rights organizations such as FORUM-ASIA, as well as a report from the UN independent international fact-finding mission on Myanmar, “have indicated that hate speech against Muslim minority groups in Myanmar is common and sometimes severe, in particular around the general election on 8 November 2020” [p. 7].
Nonetheless, the Oversight Board considered there was no indication that the statements of the removed content “referring to Muslims as mentally unwell or psychologically unstable are a significant part of anti-Muslim rhetoric in Myanmar” [p. 7].
Moreover, for the Board, there was a discrepancy between the translation provided by Facebook and the one provided by the Board’s translators: while Facebook translated the sentence as “[i]t’s indeed something’s wrong with Muslims psychologically”, the Board’s translators found it stated “[t]hose male Muslims have something wrong in their mindset” [p. 7]. Thus, the Board’s translators suggested that although the terms used could show intolerance, “they were not derogatory or violent” [p. 7].
The Board also considered that, in its context, the post should be understood “as a commentary pointing to the apparent inconsistency between Muslims’ reactions to events in France and in China” [p. 7]. In light of the above, the Board deemed that such expression was protected under the Community Standards and did not merit removal.
Compliance with Facebook’s Values
Facebook’s Community Standards establish “Voice” and “Safety” as fundamental values for the platform. The goal of “Voice”, a paramount value on the platform, is to foster an environment where users can openly discuss important matters. Nonetheless, the platform may limit “Voice” in service of several other values, including “Safety” [p. 5]. “Safety” highlights the importance of making Facebook a safe place; hence, expression that threatens people or “exclude[s] or silence[s] others isn’t allowed on Facebook” [p. 5].
The Oversight Board considered that, in this case, the platform’s decision to remove the content did not align with its own values. According to the Board, the post did not “pose a risk to ‘Safety’ that would justify displacing ‘Voice’” [p. 8], even in the context of the discrimination against Muslims prevalent in Myanmar.
Compliance with International Human Rights Standards
The Board underscored, in line with Article 19 of the ICCPR and the UN Human Rights Council’s General Comment No. 34, “that individuals have the right to seek and receive information, including controversial and deeply offensive information” [p. 8]. Likewise, the Board noted that freedom of expression is not an absolute right; on the contrary, it can be limited under certain circumstances. For example, under Article 20, para. 2 of the ICCPR, states are required to prohibit content containing “advocacy of religious hatred constituting incitement to discrimination, hostility or violence” [p. 8].
The Oversight Board considered that the post was not prohibited under Article 20, para. 2 of the ICCPR. In reaching this conclusion, the Board “considered the factors cited in the UN Rabat Plan of Action, including the context, the content of the post and the likelihood of harm” [p. 8]. Although the tone of the post was pejorative, the Board noted, it did not advocate hatred, nor did it intentionally incite imminent harm.
The Board also analyzed if Facebook’s measure to remove the content was valid under Article 19, para. 3 of the ICCPR, which requires “restrictions on expression to be defined and easily understood (legality requirement), to have the purpose of advancing one of several listed objectives (legitimate aim requirement), and to be necessary and narrowly tailored to the specific objective (necessity and proportionality requirement)” [p. 8].
For the Board, Facebook’s decision to remove the post pursued a legitimate aim: the protection of “the rights of others to life, to security of person, physical or mental injury, and to protection from discrimination” [p. 9]. This aim is especially important, as the Board opined, in the context of Myanmar where online hate speech “has been linked to serious offline harm, including accusations of potential crimes against humanity and genocide. As such, the Board recognized the importance of protecting the rights of those who may be subject to discrimination and violence, and who may even be at risk of atrocities” [p. 9].
Nonetheless, the Board argued that, while some may consider the post offensive, its removal was not necessary in order to protect the rights of others. The Board noted that the post did not articulate threats against identifiable individuals and considered “the user’s claim in their appeal that they are opposed to all forms of religious extremism” [p. 9]. The fact that the user posted the content in a group for philosophical and intellectual discussion, and called attention to the discrimination of the Uyghur Muslims in China, lent credibility to this claim by the user.
The Board emphasized the importance of open debate, “[e]ven in circumstances where discussion of religion or identity is sensitive and may cause offense” [p. 9]. The Board argued that removing the content would not protect any particular group from discrimination and was unlikely to reduce tensions.
Thus, the Board overturned Facebook’s decision, considering the platform “was incorrect to remove the content” [p. 9]. The Board required that Facebook restore the content.
Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.
This decision by the Oversight Board expands freedom of expression by allowing controversial speech on religious and political issues on Facebook. Although the Board recognized the challenges of moderating content under the category of hate speech in the specific context of Myanmar, its decision applied human rights standards on freedom of expression to assess whether the contested content truly fell under that category. This fosters a better environment for expression and debate on sensitive social issues.
Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.
The Board analyzed Facebook’s human rights responsibilities through Article 19 of the ICCPR, the precept on freedom of expression.
The Board referred to the report of the UN independent international fact-finding mission on Myanmar to underscore the situation in Myanmar regarding the use of hate speech against Muslims.
The Board referred to this instrument to highlight Facebook’s businesses’ human rights responsibilities.
The Board analyzed Facebook’s human rights responsibilities regarding freedom of expression in light of UN Human Rights Council General Comment No. 34.
The Board referred to the UN Rabat Plan of Action to analyze Facebook’s human rights responsibilities regarding incitement to hatred.
Case significance refers to how influential the case is and how its significance changes over time.
According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”