Global Freedom of Expression

Oversight Board Case of Depiction of Zwarte Piet

Closed · Mixed Outcome

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    April 13, 2021
  • Outcome
    Agreed with Meta’s initial decision
  • Case Number
    2021-002-FB-UA
  • Region
    International
  • Judicial Body
    Oversight Board
  • Type of Law
    Meta's content policies, International Human Rights Law
  • Themes
    Facebook Community Standards, Objectionable Content, Hate Speech
  • Tags
    Meta Newsworthiness allowance, Oversight Board Policy Advisory Statement

Content Attribution Policy

Global Freedom of Expression is an academic initiative and therefore, we encourage you to share and republish excerpts of our content so long as they are not used for commercial purposes and you respect the following policy:

  • Attribute Columbia Global Freedom of Expression as the source.
  • Link to the original URL of the specific case analysis, publication, update, blog, or landing page of the downloadable content you are referencing.

Attribution, copyright, and license information for media used by Global Freedom of Expression is available on our Credits page.

Case Analysis

Case Summary and Outcome

On April 13, 2021, the Oversight Board upheld Facebook’s (now Meta) decision to remove content that violated the express prohibition on posting caricatures of Black people in the form of blackface contained in its Hate Speech Community Standard. The case originated after a Facebook user in the Netherlands shared a post on their timeline that included text in Dutch and a 17-second video. The video showed a young child meeting three adults, one dressed to portray “Sinterklaas” and two portraying “Zwarte Piet,” also referred to as “Black Pete.” The two adults portraying Zwarte Piet had their faces painted black and wore Afro wigs under hats and colorful renaissance-style clothes. All the people in the video appeared to be white, including those with their faces painted black. Facebook removed the post for violating its Hate Speech Community Standard.

In its decision, the Board considered that while Zwarte Piet represents a cultural tradition shared by many Dutch people without apparent racist intent, the use of blackface is widely recognized as a harmful racial stereotype. A majority of the Board saw sufficient evidence of harm to justify removing the content. They argued that allowing such posts to accumulate on Facebook would help create a discriminatory environment for Black people that would be degrading and harassing. They believed that the impacts of blackface justified Facebook’s policy and that removing the content was consistent with the company’s human rights responsibilities.

*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.


Facts

On December 5, 2020, a Facebook user in the Netherlands shared a post on their timeline that included text in Dutch and a 17-second video. As translated into English, the post’s caption stated: “‘happy child!’ and ‘thanks, Sinterklaas and Zwarte Piets’” [p.3]. The video depicted a young child meeting three adults, one dressed to portray Sinterklaas and two portraying Zwarte Piet, also referred to as “Black Pete.” The two adults portraying Zwarte Piet had their faces painted black and wore Afro wigs under hats and colorful renaissance-style clothes.

In the video, festive music played in the background as the child shook hands with Sinterklaas and one Zwarte Piet. The other Zwarte Piet placed a hat on the child’s head and said to the child in Dutch: “[l]ook here, and I found your hat. Do you want to put it on? You’ll be looking like an actual Pete! Let me see. Look….” [p.4].

The post was viewed fewer than 1,000 times, received fewer than ten comments, and had fewer than 50 reactions. Other users did not share the content. Subsequently, the post was reported by a Facebook user for violating Facebook’s Hate Speech Community Standard. 

On December 6, 2020, Facebook removed the post for violating its Hate Speech Community Standard. The company determined that the portrayals of Zwarte Piet in the video violated its policy prohibiting caricatures of Black people in the form of blackface. Facebook notified the user that their post infringed its policy. 

On December 7, 2020, after Facebook rejected the user’s appeal against its decision to remove the content, the user submitted an appeal to the Oversight Board.


Decision Overview

The main issue for the Board to analyze was whether Facebook’s decision to remove the video violated the prohibition on posting caricatures of Black people in the form of blackface contained in its Hate Speech Community Standard and whether the post’s removal was in line with the company’s values and its human rights responsibilities. 

In their appeal, the user stated to the Board that the post was meant for their child and that they wanted the content back up on Facebook. They also noted that color did not matter in this case because, in their view, Zwarte Piet was important to children. 

In its response, Facebook claimed it had removed the post as a Tier 1 attack under the Hate Speech Community Standard, specifically for violating its rule prohibiting harmful stereotypes and dehumanizing generalizations in visual form, which included caricatures of Black people in the form of blackface. 

The company also noted that the portrayals of Zwarte Piet “insult, discriminate, exclude, and dehumanize Black people by representing them as inferior and even subhuman” [p.7] because the figure’s characteristics are “exaggerated and unreal.” Moreover, it stated that Zwarte Piet is “a servile character whose typical behavior includes clumsiness, buffoonery, and speaking poorly” [p.7].

Additionally, Facebook claimed that because “[t]he two people in the video were dressed in the typical Black Pete costume — their faces were painted in blackface and they wore Afro-wigs,” its decision to remove the content was consistent with its blackface policy. 

Compliance with Community Standards

After presenting the parties’ arguments, the Board proceeded to analyze whether the company’s decision to remove the post complied with the Facebook Community Standards. To do so, it highlighted that Facebook enforces its Community Standard on Hate Speech by identifying a “direct attack” and a “protected characteristic” upon which the attack is based. It then explained that the Standard includes race and ethnicity among the list of protected characteristics. Further, it recalled that in this case, Facebook had notified the user that their content violated the Hate Speech Community Standard; however, the user was not informed that the post had been removed specifically under the blackface rule.

The Board noted the user claimed they intended to share a celebration of a festive tradition and that it had no reason to believe this view was not sincerely held. Yet, it stated that the Hate Speech Community Standard, including the rule on blackface, is structured to presume that any use of blackface is inherently a discriminatory attack. On this basis, the Board considered Facebook’s action to remove the content was consistent with its content policies. 

 

Compliance with Facebook’s values

Regarding whether the company had complied with Facebook’s values of “Voice,” “Safety,” and “Dignity” by removing this content, the Board deemed that the use of blackface, including portrayals of Zwarte Piet, was widely agreed to be degrading towards Black people. Specifically concerning the value of “Voice,” the Board considered that the user’s video did not constitute political speech or a matter of public concern.

Compliance with Facebook’s human rights responsibilities

Concerning removing the user’s content under the Community Standard on Hate Speech, the majority of the Board found the decision to be consistent with Facebook’s human rights responsibilities. They explained that Facebook’s rule on blackface resulted from a broader process to build a policy on harmful stereotypes. For the majority, this was in line with international standards for ongoing human rights due diligence to evolve the company’s operations and policies. 

The Board then proceeded to analyze whether the decision was in line with Article 19 of the ICCPR, reiterating that the UN Human Rights Committee, in its General Comment No. 34, has made clear that the protection of Article 19 extends to expression that may be considered “deeply offensive.”

Further, the Board noted that the right to participate in cultural life, protected under Article 15 of the ICESCR, was also relevant in the present case since participating in the Sinterklaas festival and posting related content on Facebook could be understood as taking part in the cultural life of the Netherlands. It emphasized that the right to freedom of expression and participation in cultural life should be enjoyed without discrimination based on race or ethnicity. Yet, the Board recognized that while the right to freedom of expression is fundamental, it may be restricted. Thus, it set out to apply the three-part test to determine whether Facebook’s decision complied with the restrictions permitted under international human rights law.

I. Legality

Regarding the requirement of legality, the Board found that Facebook’s Hate Speech Community Standard was sufficiently clear and precise to put users on notice that content featuring blackface would be removed unless a relevant exception was engaged. It further stated that the company had sought to raise awareness of the potential effects of this policy change in the Netherlands by releasing a video in Dutch ahead of the Sinterklaas festival in November 2020. 

II. Legitimate aim

The Board then considered that the restriction pursued the legitimate aim of protecting the rights of others, since Facebook sought to prevent discrimination in equal access to the platform for expression.

III. Necessity and proportionality

Finally, while examining whether the application of the rule on blackface in this case was necessary to protect the rights of Black people to equality and non-discrimination, particularly for children, the Board found that it was consistent with Facebook’s responsibility to adopt policies to avoid causing or contributing to adverse human rights impacts. It recalled its decision in case 2020-003-FB-UA to highlight that moderating content to address the cumulative harms of hate speech, even when the expression does not directly incite violence or discrimination, can be consistent with Facebook’s human rights responsibilities in certain circumstances. For the majority, the proliferation of degrading caricatures of Black people on Facebook creates an environment in which acts of violence are more likely to be tolerated and discrimination is reproduced in society.

The majority of the Board further noted that repeated negative stereotypes about an already marginalized minority, including images shared on social media, have a psychological impact on individuals. Additionally, the majority found the removal to be proportionate, since less severe interventions, such as labels, warning screens, or other measures to reduce dissemination, would not have provided adequate protection against the cumulative effects of leaving content of this nature on the platform. They also noted that the challenge of assessing intent when enforcing content moderation at scale would require a case-by-case examination that would give rise to a risk of significant uncertainty, which weighed in favor of a general rule that was more easily enforced. The majority further remarked that Facebook’s prohibition was not a blanket ban and that the availability of human review would be essential for proper enforcement. As an example, they stated that the newsworthiness allowance would enable Facebook to permit otherwise violating content on the platform where the public interest in the expression outweighs the risk of harm.

In conclusion, the Oversight Board upheld Facebook’s decision to remove the user’s content since it violated the express prohibition on posting caricatures of Black people in the form of blackface contained in its Hate Speech Community Standard. The majority of the Board found that removing the content complied with Facebook’s Community Standards, its values, and international human rights responsibilities. 

Policy advisory statement: 

The Board recommended that Facebook link the Hate Speech Community Standard rule prohibiting blackface to its reasoning for the rule. Additionally, it urged the company to ensure that users are always notified of the reasons for any enforcement of the Community Standards against them, including the specific rule Facebook is enforcing.

Dissenting or Concurring Opinions:  

A minority of the Board found insufficient evidence to link the content in question to the harm allegedly being reduced by removing it. They noted that Facebook’s value of “Voice” specifically protects objectionable content and that, while blackface is offensive, depictions on Facebook will not always cause harm to others. Further, they argued that restricting expression on the grounds of cumulative harm could be hard to distinguish from attempts to protect individuals from subjective feelings of offense.


Decision Direction

Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Mixed Outcome

The Board’s decision contracted expression by upholding Facebook’s decision to remove the content. However, the Board considered that, had the post remained online, it would have contributed to a discriminatory environment for Black people that would be degrading and harassing. Given that Facebook’s actions fulfilled the criteria established under Article 19 of the ICCPR, which allows certain restrictions on freedom of expression, the Board determined that the impacts of blackface justified Facebook’s policy and that removing the content was consistent with the company’s human rights responsibilities.

Global Perspective

Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or regional laws

  • ICCPR, art. 19

    The Board analyzed Facebook’s human rights responsibilities through this provision on freedom of expression. It applied the three-part test established in this Article to assess whether Facebook’s restriction on expression was permissible.

  • ICCPR, art. 2

    The Board referred to this article to highlight that all should enjoy the right to freedom of expression and participation in cultural life without discrimination based on race or ethnicity.

  • ICESCR, art. 12

    The Board referred to this article to highlight that everyone should enjoy the highest attainable standard of physical and mental health.

  • ICESCR, art. 15

    The Board referred to this article as the international source of the right to participate in cultural life.

  • CRC, art. 2

    The Board cited the standard to argue that, for Black people, the cumulative effect of repeated exposure to images such as the one portrayed in this case and being on the receiving end of violence and discrimination may impact self-esteem and health, particularly for children.

  • CRC, art. 6

    The Board cited the standard to argue that, for Black people, the cumulative effect of repeated exposure to images such as the one portrayed in this case and being on the receiving end of violence and discrimination, may impact self-esteem and health, particularly for children.

  • CERD, art. 2
  • UNHR Comm., General Comment No. 34 (CCPR/C/GC/34)

    While applying the three-part test to assess whether Facebook’s restriction on expression was permissible, the Board referred to the General Comment for guidance.

  • Rapporteur on freedom of opinion and expression, A/HRC/38/35 (2018)

    The Board cited the report to highlight that the scale and complexity of social media companies addressing hateful expression present long-term challenges and may lead companies to restrict such expression even if it is not clearly linked to adverse outcomes.

  • UN Special Rapporteur on freedom of opinion and expression, A/74/486 (2019)

    The Board quoted the Special Rapporteur report to stress that companies may remove hate speech that falls below the threshold of incitement to discrimination or violence.

  • CESCR, General Comment No. 21

    To underscore that even a deeply rooted cultural tradition does not justify discriminatory practices and stereotypes, the Board referenced the General Comment.

  • Report of the Working Group of Experts on People of African Descent, A/HRC/30/56/Add.1 (2015)

    The Board cited the report to stress that many people of African descent experience Zwarte Piet as a vestige of slavery and to highlight that even a deeply rooted cultural tradition does not justify discriminatory practices and stereotypes.

  • UN Special Rapporteur on racism, report A/HRC/44/57/Add.2 (2020)

    The Board cited the report to stress that many people of African descent experience Zwarte Piet as a vestige of slavery and to highlight that even a deeply rooted cultural tradition does not justify discriminatory practices and stereotypes.

  • CERD Committee, Concluding Observations on the Netherlands (CERD/C/NLD/CO/19-21) (2015)

    The Board referred to a remark that Zwarte Piet “is experienced by many people of African descent as a vestige of slavery” and is connected to structural racism in the country.

General Law Notes

Oversight Board decisions:

  • Armenians in Azerbaijan (2020-003-FB-UA)
    • By referring to this case, the Board reiterated that moderating content to address the cumulative harms of hate speech, even where the expression does not directly incite violence or discrimination, can be consistent with Facebook’s human rights responsibilities in certain circumstances.

Case Significance

Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes: “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook, it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”
