Global Freedom of Expression

Oversight Board Case of Knin Cartoon

Closed | Mixed Outcome

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    June 17, 2022
  • Outcome
    Overturned Meta’s initial decision
  • Case Number
    2022-001-FB-UA
  • Region & Country
    Croatia, International
  • Judicial Body
    Oversight Board
  • Type of Law
    Meta's content policies, International Human Rights Law
  • Themes
Facebook Community Standards, Objectionable Content, Hate Speech, Violence and Criminal Behavior, Violence and Incitement
  • Tags
    Meta Spirit of the Policy allowance, Oversight Board Content Policy Recommendation, Oversight Board Policy Advisory Statement, Oversight Board Enforcement Recommendation


Case Analysis

Case Summary and Outcome

On June 17, 2022, the Oversight Board overturned Meta’s original decision to leave up a Facebook post in which ethnic Serbs were depicted as rats. Although Meta eventually removed the content, it initially considered that the post did not infringe the company’s Hate Speech Community Standard. It later decided that the post infringed the “spirit” but not the letter of the Standard, on the understanding that the policy did not prohibit attacks against groups identified implicitly by a protected characteristic, before concluding that the post did infringe the letter of the policy. In its decision, the Board found that the post breached both the Hate Speech Community Standard and the Violence and Incitement Community Standard: it was dehumanizing and hateful and could contribute to a climate in which people might feel justified in attacking ethnic Serbs. In the Board’s view, removing the content from the platform was necessary to address the severe harms posed by hate speech based on ethnicity and aligned with Meta’s human rights responsibilities.

*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.


Facts

In December 2021, a public Facebook page with over 50,000 followers, described as a news portal for Croatia, posted a video with a caption in Croatian. Meta translated the caption as “The Player from Čavoglave and the rats from Knin” [p. 4]. The video was an edited version of Disney’s cartoon “The Pied Piper.”

The video, which ran two minutes and 10 seconds, portrayed a city overrun by rats. While the entrance to the city in the original Disney cartoon was marked as Hamelin, the city in the edited video was labeled as the Croatian city of “Knin”. At the start of the video, the narrator described how rats and humans had lived in the royal city of Knin for many years and how the rats, having decided they wanted to live in a “pure rat country”, started harassing and persecuting the people living in the city.

The narrator then explained that when rats had taken over the city, a piper from the Croatian village of Čavoglave appeared, yet initially, the rats had not taken the piper seriously and continued with “the great rat aggression”. However, after the piper started to play a melody with his “magic flute,” the rats, captivated by the melody, started to sing “their favorite song” and followed the piper out of the city. Meta translated the lyrics of the song sung by the rats as: “What is that thing shining on Dinara, Dujić’s cockade on his head […] Freedom will rise from Dinara, it will be brought by Momčilo the warlord” [p. 5]. 

The video then depicted the city’s people closing the gate behind the piper and the rats and ended with the piper herding the rats into a tractor, which then disappeared. The narrator concluded that once the piper lured all the rats into the “magical tractor,” the rats “disappeared forever from these lands” and “everyone lived happily ever after” [p. 5]. 

The content was viewed over 380,000 times, shared over 540 times, received over 2,400 reactions, and had over 1,200 comments. Among the comments in Croatian were statements translated by Meta as: “Hate is a disease” and “Are you thinking how much damage you are doing with this and similar stupidities to your Croatian people who live in Serbia?” [p. 6]. 

Users reported the content 397 times; 362 of those reports flagged it as hate speech. Meta nevertheless did not remove the content, and several users appealed to the Board.

The Board’s decision in the immediate case was based on the appeal filed by one of these users, whose account appeared to be located in Serbia and whose report had been rejected by an automated system. After the user appealed to the Board, Meta conducted an additional human review, which again found that the content did not violate its policies.

In January 2022, when the Board identified the case for a full review, Meta determined that, while the post did not violate the letter of its Hate Speech policy, it infringed the policy’s spirit and removed the post from Facebook. Yet, when drafting an explanation of its decision for the Board, Meta concluded that the post did violate the letter of the Hate Speech policy and noted that all previous reviews were in error. 


Decision Overview

The main issue for the Board to analyze in this case was whether Meta’s original decision to keep the content on Facebook was in line with Facebook’s Hate Speech and Violence and Incitement Community Standards, as well as with the company’s values and human rights responsibilities.

The user who reported the content and appealed to the Board stated that the Pied Piper symbolized the Croatian Army, which in 1995 expelled Croatia’s ethnic Serbs, portrayed here as rats. According to the user, Meta did not assess the video correctly, since the content constituted ethnic hate speech and fostered ethnic and religious hatred in the Balkans.

In the explanation Meta provided to the Board, the company described its review process but focused on why its eventual removal of the content under the Hate Speech policy was justified. Meta explained that it had initially determined that the content did not violate the letter of the Hate Speech policy and had made a “spirit of the policy” decision to remove it.

Meta explained that its earlier determination that the content only violated the “spirit” of the Hate Speech policy was based on the assumption that the policy’s language did not prohibit attacks against groups under a protected characteristic identified implicitly. However, after additional review, the company concluded that it was more accurate to say that the policy language also prohibited attacks that implicitly identified a protected characteristic. 

The company then remarked that its eventual removal was consistent with its values of “Dignity” and “Safety” when balanced against the value of “Voice.” Meta noted that given the history of ethnic tensions and continuing discrimination against ethnic Serbs in Croatia, the video could have contributed to a risk of real-world harm. 

Meta commented that the removal was consistent with international human rights standards. It claimed that the policy was “easily accessible” on Meta’s Transparency Center website and that its decision to remove the content pursued the legitimate aim of protecting the rights of others against discrimination. Additionally, the company argued that removing the content was necessary and proportionate since leaving it up would “not allow users to freely connect with others without feeling attacked based on who they are” [p. 12]. Moreover, it considered that there were no less intrusive means of limiting such content than removal.

When the Board asked Meta whether the content violated the Violence and Incitement Community Standard, the company responded that it did not because the content did not contain threats or statements of intent to commit violence against ethnic Serbs. Thus, Meta deemed that exclusion or expulsion without violence does not constitute a violent threat. According to the company, a more overt connection tying the rats in the video to the violent and forcible displacement of Serbs would have been necessary to remove the content under the policy. 

Compliance with Meta’s content policies

Regarding Meta’s content policies, the Board focused on whether the content violated the Hate Speech Community Standard and the Violence and Incitement Standard. It considered that since the video included a visual comparison of Serbs to rats and support for expelling Serbs from Knin, the content contained two “attacks” within the definition of that term in the Hate Speech policy. 

The Board explained that implied comparisons to “[a]nimals that are culturally perceived as intellectually or physically inferior” were prohibited by Meta’s Hate Speech policy. It noted that while the caption and video did not mention ethnic Serbs by name, the references to the Serb residents of Knin were unmistakable: the content of the video in its historical context, the replacement of the name “Hamelin” with “Knin,” the lyrics sung in the video, the identification of the piper with Čavoglave (and therefore with the song by Thompson about Operation Storm), and the use of the tractor image. Moreover, the Board deemed that the comments on the post and the many user reports confirmed that this connection was abundantly clear to people who viewed the content.

Further, the Board considered that the display of implicit praise for expelling Serbs from Knin in the video constituted support for ethnic cleansing in violation of the Hate Speech Standard. The Board also considered that the content infringed the company’s Violence and Incitement policy which prohibits content that threatens others by referring to historical incidents of violence.

The Board highlighted that in the video the city is named Knin and the rats flee on a tractor, both references to Operation Storm, the 1995 military operation that reportedly resulted in the displacement, execution, and disappearance of ethnic Serb civilians. It thus stressed that the video could contribute to a climate in which people feel justified in attacking ethnic Serbs.

Moreover, the Board disagreed with Meta’s assessment that the content did not contain threats or statements of intent to commit violence, and that calls for exclusion or expulsion that do not specify means of violence may not constitute a violent threat. The Board considered that the use of the Pied Piper story was not advocacy of peaceful removal but an apparent reference to known historical incidents of violence, in particular through the imagery of the tractor. It then recalled that, as evidenced by the users who reported the post and by the public comments, in the eyes of observers the rats in the cartoon represented the ethnic Serb population of the Knin area, including those who remained there.

The Board then examined why the company had concluded on several occasions that the content did not violate Meta’s policies. It first expressed that it would have been helpful if, in its submission, Meta had focused on this at the outset rather than on why its revised decision to remove the post was correct. In the Board’s view, if the company wished to reduce the level of violating content on its platform, it needed to treat the Board’s selection of enforcement error cases as an opportunity to explore the reasons for its mistakes. 

Aware of the complexity and volume of content that human reviewers assess each day and of the difficulty of applying Facebook’s Community Standards while accounting for context, the Board believed it was important for Meta to improve its instructions to reviewers, as well as its pathways and processes for escalation.

In the Board’s view, two factors may have prevented escalation of the content in the immediate case: Meta’s failure to provide at-scale reviewers with clear thresholds for when content is “trending,” and its use of an automated review system.

Compliance with Meta’s values

The Board then found that the company’s eventual decision to remove the content was consistent with Meta’s values of “Voice,” “Dignity,” and “Safety.” The Board deemed that those targeted by dehumanizing and negative stereotypes could also see their “Voice” affected, since such stereotypes may have a silencing impact on those targeted and inhibit their participation on Facebook and Instagram. The Board considered that the continuing increase in cases of physical violence against ethnic Serbs in Croatia justified displacing the user’s “Voice” to protect the “Voice,” “Dignity,” and “Safety” of others.

Compliance with Meta’s human rights responsibilities

To determine whether the restriction on freedom of expression was justified under Meta’s human rights responsibilities, the Board applied the three-part test set out in Article 19 of the ICCPR.

I. Legality (clarity and accessibility of the rules)

The Board stated that the Hate Speech Community Standard prohibits implicit targeting of groups based on protected characteristics, including dehumanizing comparisons to animals and statements advocating or supporting exclusion. In the immediate case, however, about 40 human reviewers had decided that the content did not violate the Hate Speech Community Standard. In the Board’s view, the reviewers had consistently interpreted the policy as requiring an explicit comparison between ethnic Serbs and rats before finding a violation. The Board thus considered that the confusion throughout the process evidenced a need for clearer policy and implementation guidance.

II. Legitimate aim

The Board then recalled that the Hate Speech Community Standard and the Violence and Incitement Standard pursue the legitimate aim of protecting the rights of others, especially the “rights to equality and non-discrimination” [p. 18].

III. Necessity and proportionality

Lastly, the Board examined the requirement of necessity and proportionality. It recalled that the Facebook Hate Speech Community Standard prohibits specific forms of discriminatory expression, including comparisons to animals and calls for exclusion. Given that the content in the immediate case compared ethnic Serbs to rats, celebrated past acts of discriminatory treatment, and was dehumanizing and hateful, the Board found that removing it from the platform was necessary to address the serious danger posed by hate speech based on ethnicity.

Moreover, the Board found that removing the content from the platform was a necessary and proportionate measure, since less invasive interventions, such as warning screens or other measures to reduce dissemination, would not have provided adequate protection against the cumulative effects of leaving the content on the platform.

In conclusion, the Board determined that Meta’s decision to remove the post from the platform was consistent with Meta’s human rights responsibilities as a business.

Policy advisory statement

The Board recommended that Meta clarify the Hate Speech Community Standard and the guidance provided to reviewers in order to explain that “even implicit references to protected groups are prohibited by the policy when the reference would reasonably be understood” [p. 21]. Additionally, the Board determined that Meta should notify all users who have reported content when, on subsequent review, it changes its initial determination, and that the company should disclose to the public the results of any experiments assessing the feasibility of introducing this change.


Decision Direction

Quick Info

Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Mixed Outcome

The Oversight Board’s decision contracted expression by agreeing with Meta’s decision to remove the content. However, it did so under an exception recognized by international human rights law, since the content in question was dehumanizing and hateful: it compared ethnic Serbs to rats and celebrated past acts of discriminatory treatment. In the Board’s view, removing the content from the platform was necessary to address the severe harms posed by hate speech based on ethnicity.

Global Perspective

Quick Info

Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or Regional Laws

  • ICCPR, art. 19

    The Board highlighted the heightened protection of political speech under this provision on freedom of expression.

  • UN Human Rights Committee, General Comment No. 34 (CCPR/C/GC/34)

    The Board analyzed Facebook’s human rights responsibilities through this general comment on freedom of expression.

  • ICCPR, art. 2

    The Board referred to this provision to underscore Facebook’s human rights responsibilities regarding equality and non-discrimination.

  • ICCPR, art. 26

    The Board referred to this provision to underscore Facebook’s human rights responsibilities regarding equality and non-discrimination.

  • ICERD, art. 2

    The Board referred to this provision to underscore Facebook’s human rights responsibilities regarding equality and non-discrimination.

  • ICERD, art. 5

    The Board referred to this provision to underscore Facebook’s human rights responsibilities regarding equality and non-discrimination.

  • Committee on the Elimination of Racial Discrimination, General Recommendation No. 35, Combating Racist Hate Speech (2013)

    The Board referred to this recommendation to underscore Facebook’s human rights responsibilities regarding equality and non-discrimination.

  • United Nations Guiding Principles on Business and Human Rights (2011)

    The Board referred to this instrument to highlight Facebook’s businesses’ human rights responsibilities.

  • OHCHR, Rabat Plan of Action on the prohibition of advocacy of national, racial or religious hatred that constitutes incitement to discrimination, hostility or violence (2012)

    The Board referred to this instrument to guide its analysis of whether the content amounted to advocacy of hatred constituting incitement to discrimination, hostility, or violence.

General Law Notes

Oversight Board decisions:

  • Zwarte Piet (2021-002-FB-UA)
    • The Board recalled its decision in this case to note that “moderating content to address the cumulative harms of hate speech, even where the expression does not directly incite violence or discrimination, can be consistent with Facebook’s human rights responsibilities in certain circumstances.” Moreover, the Board explained that in this case, it had determined that “less severe interventions, such as labels, warning screens, or other measures to reduce dissemination, would not have provided adequate protection against the cumulative effects of leaving (…) content of this nature on the platform.”
  • Armenians in Azerbaijan (2020-003-FB-UA) 
    • By citing its decision in this case, the Board remarked that the context of language which targeted a group based on national origin during conflict might create an environment in which acts of discrimination and violence are more likely.
  • South Africa Slur (2021-011-FB-UA)
    • The Board referenced its decision in this case to highlight that it is in line with Meta’s human rights responsibilities to prohibit “some discriminatory expression” even “absent any requirement that the expression incite violence or discriminatory acts”.

Case Significance

Quick Info

Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”
