Global Freedom of Expression

Oversight Board Case of “Two-Buttons” Meme


Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    May 20, 2021
  • Outcome
    Oversight Board Decision, Overturned Meta’s initial decision
  • Case Number
    2021-005-FB-UA
  • Region & Country
    United States, North America
  • Judicial Body
    Oversight Board
  • Type of Law
    International Human Rights Law, Meta's content policies
  • Themes
    Facebook Community Standards, Objectionable Content, Hate Speech
  • Tags
    Oversight Board Policy Advisory Statement, Satire/Parody


Case Analysis

Case Summary and Outcome

On May 20, 2021, the Oversight Board overturned Facebook’s (now Meta) decision to remove a Facebook comment that included an adaptation of the “two buttons” meme. The meme depicted a sweating cartoon character, with the Turkish flag substituting for his face, in front of two buttons with corresponding statements in English: “The Armenian Genocide is a lie” and “The Armenians were terrorists that deserved it”. Facebook considered that the line “The Armenians were terrorists that deserved it” violated the company’s Hate Speech Community Standard. After analyzing the content as a whole, the Board found the comment satirical in nature: rather than mocking or discriminating against Armenians, the post criticized, and raised awareness of, the Turkish government’s contradictory denialism of the Armenian genocide. Likewise, the Board considered that Facebook’s restriction of the user’s freedom of expression was neither necessary nor proportionate under international human rights standards, since the removed content did not endorse hateful speech against Armenians; on the contrary, it criticized such speech.

*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.


Facts

On December 24, 2020, a Facebook user in the United States posted a comment “with an adaptation of the ‘daily struggle’ or ‘two buttons’ meme” [p. 4]. The meme featured a split-screen cartoon. In the first panel, the cartoon character appears with a Turkish flag substituting for his face; he seems to be sweating, with his right hand on his head. In the second panel, “there are two red buttons with corresponding statements in English: ‘The Armenian Genocide is a lie’ and ‘The Armenians were terrorists that deserved it.’ The meme was preceded by a ‘thinking face’ emoji” [p. 4]. The meme was shared on a public Facebook page “that describes itself as a forum for discussing religious matters from a secular perspective” [p. 4].

The comment responded to a post with an image “of a person wearing a niqab with overlay text in English: ‘Not all prisoners are behind bars’” [p. 4]. At the time the comment was removed, it had 423 reactions, 260 views and 149 comments.

The comment was reported by a Facebook user in Sri Lanka who considered that the content violated the Hate Speech Community Standard. On December 24, 2020, Facebook removed the meme from its platform, arguing that it violated its Community Standard on Cruel and Insensitive Content. The user appealed this decision to Facebook, which upheld the removal “but found that the content should have been removed under its Hate Speech policy” [p. 4]. For the company, the statement “‘The Armenians were terrorists that deserved it’ specifically violated the prohibition on content claiming that all members of a protected characteristic are criminals, including terrorists” [p. 4]. Facebook never informed the user that it had “upheld the decision to remove their content under a different Community Standard” [p. 4].

On December 24, 2020, the user appealed to the Oversight Board.


Decision Overview

The Oversight Board analyzed whether Facebook’s decision to remove a comment that included a meme depicting a sweating cartoon character, with the Turkish flag substituting for his face, in front of two buttons “with corresponding statements in English: ‘The Armenian Genocide is a lie’ and ‘The Armenians were terrorists that deserved it’” [p. 4], complied with the company’s Community Standard on Hate Speech and its values. The Board also examined, through a three-part test, whether Facebook’s measure complied with international human rights standards on freedom of expression.

In its submission to the Board, the affected user claimed “that their comment was not meant to offend but to point out ‘the irony of a particular historical event’” [p. 7]. The user also considered Facebook’s policies to be too restrictive.

For its part, Facebook argued that the second statement of the meme (“The Armenians were terrorists that deserved it”) was “a Tier 1 attack under the Hate Speech Community Standard” [p. 7], since it claimed that an entire ethnic and national group (both protected characteristics) were criminals, including terrorists.

Facebook considered that the exception “for content that shares hate speech to condemn it or raise awareness of it” [p. 8] should not apply. For the company, “the sweating cartoon character in the meme could be reasonably viewed as either condemning or embracing the statements” [p. 8]. 

The company also mentioned that it previously maintained an exemption to the Hate Speech Community Standard for humorous content, which it removed “in response to a Civil Rights Audit report (July 2020) and as part of its policy development” [p. 8]. However, Facebook maintained a narrower exception for satirical content, defined as content that “includes the use of irony, exaggeration, mockery and/or absurdity with the intent to expose or critique people, behaviors, or opinions, particularly in the context of political, religious, or social issues. Its purpose is to draw attention to and voice criticism about wider societal issues” [p. 8]. This exception is not mentioned in the company’s Community Standards.

Facebook argued that its decision complied with international human rights standards on freedom of expression. The company considered that its Hate Speech policy was “easily accessible” in the Community Standards [p. 9]; that the decision was intended to protect others from harm and discrimination; and that the enacted measure was both necessary and proportionate “to limit harm against Armenians” [p. 9].

Compliance with Community Standards

Facebook’s Hate Speech Community Standard defines hate speech “as a direct attack on people based on what [the company calls] protected characteristics – race, ethnicity, national origin, religious affiliation, sexual orientation, caste, sex, gender, gender identity, and serious disease or disability” [p. 5]. Tier 1 prohibits content that targets “a person or group of people on the basis of a protected characteristic with: dehumanizing speech or imagery in the form of comparisons, generalizations, or unqualified behavioral statements (in written or visual form) to or about […] criminals” [p. 5].

According to Facebook’s Community Standards, content that is meant to raise awareness of or condemn hate speech is allowed on the platform, since “speech that might otherwise violate [Facebook’s] standards can be used self-referentially or in an empowering way” [p. 6]. The exception requires users to indicate this intent clearly.

In light of this provision, the majority of the Board considered that, rather than mocking the victims of the Armenian Genocide, the user intended the meme to raise awareness of and condemn, through satire, the Turkish government’s effort “to deny the Armenian genocide while, at the same time, justifying the same historic atrocities” [p. 11]. As the Board’s majority noted, the purpose of the “two buttons” meme is “to contrast two different options to highlight potential contradictions or other connotations, rather than to indicate support for the options presented” [p. 11].

In this context, a majority of the Board considered that the removed content fell under the aforementioned exception to the Hate Speech policy, as well as under the company’s satire exception, which is not publicly available. For the Board’s majority, in line with public comment PC-10007, the meme condemned others’ speech and raised awareness, because the content “mocks the denialism common in contemporary Turkey, that simultaneously says the genocide did not happen and that victims deserved it” [p. 12].

Accordingly, for the majority of the Oversight Board, “the content fell within the policy exception in Facebook’s Hate Speech Community Standard” [p. 12]. The Board concluded that it would be wrong to remove this content “in the name of protecting Armenians, when the post is a criticism of the Turkish government, in support of Armenians” [p. 12].

For a minority of the Board, the user’s intent on the matter of the Armenian Genocide was not clear enough; hence, “[t]he user could be sharing the content to embrace the statement rather than to refute it” [p. 10]. Additionally, the minority found that the user had not properly articulated what the alleged humor intended to express. Given that the content included a harmful generalization against Armenians, the minority found that it violated the Hate Speech Community Standard.

Compliance with Facebook’s values

Facebook’s Community Standards establish “Voice” as a paramount value for the company. This value seeks to create a place for expression in which users are “able to talk openly about the issues that matter to them, even if some may disagree or find them objectionable” [p. 6]. This value may be limited in service of other values, such as “Safety” and “Dignity”. “Safety” prohibits content on the platform that threatens, intimidates, excludes or silences people, while “Dignity” states “that all people are equal in dignity and rights” [p. 6]. Expressions that harass or degrade others are forbidden on Facebook.

The majority of the Board concluded that restoring the user’s content complied with Facebook’s values. Although the Board acknowledged the Armenian community’s sensitivity regarding statements about the atrocities suffered by the Armenian people, it explained that the removed meme did not pose a risk to “Dignity” or “Safety” that would merit displacing “Voice”.

Compliance with International Human Rights Standards

Upon analyzing Facebook’s decision in regard to international human rights standards on freedom of expression, the Oversight Board noted that Article 19, para. 2 of the ICCPR provides broad protection for all kinds of political discourse. Additionally, the UN Human Rights Committee has stated that the protection of Article 19 also “extends to expression that may be considered ‘deeply offensive’” [p. 13].

In the case at hand, the Board considered that the meme was satirical in nature and “took a position on a political issue: the Turkish government’s stance on the Armenian genocide” [p. 13]. The Board cited the UN Special Rapporteur on freedom of expression to highlight that “memes that mock public figures” and “cartoons that clarify political positions […] may be considered forms of artistic expression protected under international human rights law” [p. 13].

The Board, following the Human Rights Committee, argued that expressions concerning public figures in the political domain and public institutions are highly valued. Conversely, prohibitions “of expressions with incorrect opinions or interpretations of historical facts, often justified through references to hate speech, are incompatible with Article 19 of the ICCPR, unless they amount to incitement of hostility, discrimination or violence under Article 20 of the ICCPR” [p. 13].

Far from being an absolute right, the Board opined, freedom of expression can be restricted as long as the limitations “meet the requirements of legality, legitimate aim, and necessity and proportionality” [p. 13], as laid out by Article 19, para. 3 of the ICCPR.

  • Legality

In line with General Comment 34, restrictions on freedom of expression “must be clear, precise, and publicly accessible […] Individuals must have enough information to determine if and how their speech may be limited, so that they can adjust their behavior accordingly” [p. 13-14]. The Board noted that Facebook kept an exception to its Hate Speech Community Standard for satire that was neither included in the aforementioned policy nor publicly available to users. Similarly, the Board expressed concern that Facebook “wrongfully reported to the user that they violated the Cruel and Insensitive Community Standard, when Facebook based its enforcement on the Hate Speech policy” [p. 14].

Due to these failures, the Board concluded that Facebook’s approach to user notice failed the legality test. Citing the UN Special Rapporteur on freedom of expression, it noted that “the lack of relevant information for users subject to content removal ‘creates an environment of secretive norms, inconsistent with the standards of clarity, specificity and predictability’”, which may interfere with “the individual’s ability to challenge content actions or follow up on content-related complaints” [p. 14].

  • Legitimate aim

Restrictions on freedom of expression, as General Comment 34 argues, must also pursue a legitimate aim. For the Board, Facebook’s measure was enacted to protect the rights of others, specifically the rights to equality (Article 2, para. 1, ICCPR) and nondiscrimination (Articles 1 and 2, ICERD). In that sense, the Board considered that Facebook’s restriction complied with this part of the test.

  • Necessity and proportionality

Following the Human Rights Committee’s General Comment 34, the Board noted that restrictions on freedom of expression must also “be appropriate to achieve their protective function; they must be the least intrusive instrument amongst those which might achieve their protective function; they must be proportionate to the interest to be protected” [p. 15].

When assessing whether the content removal was necessary, the Board noted that freedom of expression in Turkey faces substantial restrictions that disproportionately affect ethnic minorities living in the country, such as Armenians. In 2016, the UN Special Rapporteur on freedom of expression observed that censorship in Turkey operates in “all the places that are fundamental to democratic life: the media, educational institutions, the judiciary and the bar, government bureaucracy, political space and the vast online expanses of the digital age” [p. 15]. A 2019 follow-up report noted that the situation in the country had not improved.

The Board also referred to the European Court of Human Rights’ judgment in Dink v. Turkey to illustrate the situation in Turkey regarding freedom of expression. In 2007, Hrant Dink, a journalist of Armenian origin “who published a number of articles on the identity of Turkish citizens of Armenian origin” [p. 15], was assassinated. For the ECtHR, Turkey’s failure to take appropriate measures to protect the journalist’s life amounted to a violation of his freedom of expression.

It is in this context that a majority of the Board concluded that Facebook’s restriction of freedom of expression was a mistake. The meme, the Board reiterated, did not endorse the statements in the buttons; on the contrary, the content condemned them and raised awareness “of the government’s contradictory and self-serving position” [p. 16]. The Board also argued that the meme brought a matter of public interest to an international audience, since “the content was shared in English on a Facebook page with followers based in several countries” [p. 16].

Although the majority of the Board considered that Facebook’s limitation of freedom of expression did not meet the necessity and proportionality requirement, a minority believed the comment could “be embracing the statements contained in the meme, and thus engaging in discrimination against Armenians” [p. 16].

The Board considered that the user had a right to be informed, under Article 14, para. 3(a) of the ICCPR, and that Facebook failed in this responsibility when it incorrectly notified the user that their content breached its Community Standards without specifying which standard had been breached.

The Board ultimately decided to overturn “Facebook’s decision to remove the content and require[d] the content to be restored” [p. 17].

Policy advisory statement

The Oversight Board made several recommendations to Facebook to foster transparency within the platform. For example, the Board urged the company to “make technical arrangements to ensure that notice to users refers to the Community Standard enforced by the company [and to] [i]nclude the satire exception, which is currently not communicated to users, in the public language of the Hate Speech Community Standard” [p. 17].

The Board also requested that Facebook establish adequate procedures “to assess satirical content and relevant context properly. This includes providing content moderators with: (i) access to Facebook’s local operation teams to gather relevant cultural and background information; and (ii) sufficient time to consult with Facebook’s local operation teams and to make the assessment” [p. 18]. The Board also recommended that the company prioritize human review of appeals based on policy exceptions.


Decision Direction


Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Expands Expression

This decision provides robust protection of freedom of expression regarding matters of public interest and satirical content. The Board’s analysis of Facebook’s exceptions to the Hate Speech Community Standard underscores the importance of careful content moderation, with enough contextual cues and the capacity to comprehend irony and linguistic nuance, when assessing satirical content that reproduces problematic ideas in order to criticize them or raise awareness about them. In doing so, the Board promotes a positive environment for political speech, including memes, about relevant public matters.

Global Perspective


Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.


Case Significance


Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”


