Global Freedom of Expression

Oversight Board Case of Wampum Belt

Case Status: Closed
Decision Direction: Expands Expression

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
December 9, 2021
  • Outcome
    Overturned Meta’s initial decision
  • Case Number
    2021-012-FB-UA
  • Region & Country
    International
  • Judicial Body
    Oversight Board
  • Type of Law
    Meta’s content policies, International Human Rights Law
  • Themes
    Objectionable Content, Hate Speech
  • Tags
    Meta Spirit of the Policy allowance, Oversight Board Policy Advisory Statement, Oversight Board Enforcement Recommendation

Case Analysis

Case Summary and Outcome

The Oversight Board overturned Meta’s original decision to remove a Facebook post in which an Indigenous North American artist shared a picture of a wampum belt, a North American Indigenous form of woven art. The artwork, titled “Kill the Indian/Save the Man”, referenced the May 2021 discovery of unmarked graves at a former residential school for Indigenous children in Canada. Meta removed the post under its Hate Speech Community Standard. The Board found that the content was covered by allowances to the Hate Speech policy because it was a clear example of “counter-speech”, in which hate speech is referenced or reappropriated to resist oppression and discrimination. The Board also expressed concern about how Meta’s content moderation system assesses critical art, and about the impact of enforcement mistakes on the communities that bear their burden.

*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.


Facts

In August 2021, an Indigenous North American artist posted a picture of a wampum belt with a caption in English that read: “A wampum belt is a North American Indigenous art form in which shells are woven together to form images, recording stories and agreements” [p. 4]. The belt included references related to the “Kamloops story”, the May 2021 discovery of unmarked graves at a former residential school for Indigenous children in British Columbia, Canada. Authorities confirmed 200 probable burial sites in the area and “the Truth and Reconciliation Commission concluded that at least 4,100 students died while attending the schools, many from mistreatment or neglect, others from disease or accident” [p. 5].

The artwork’s title was “Kill the Indian/Save the Man”, and it contained a list of phrases that corresponded to the series of images depicted on the belt: “Theft of the Innocent, Evil Posing as Saviours, Residential School/Concentration Camp, Waiting for Discovery, Bring Our Children Home” [p. 4]. The text in the post described the meaning of the artwork and the history of wampum belts and their purpose as a means of education. The user concluded by apologizing for any pain the artwork could cause to survivors of the residential school system, stating that the “sole purpose is to bring awareness to this horrific story” [p. 5].

Meta’s automated systems identified the content as potentially violating the Facebook Community Standard on Hate Speech the day after it was posted, and a human moderator subsequently decided that the content violated the policy. After the user appealed the decision, a second human reviewer confirmed that the post violated the Standard. No users reported the content. After the Board selected the case, Meta identified the removal as an “enforcement error” and restored the content on August 27. However, Meta did not notify the user of the restoration until September 30, a delay the company attributed to human error.


Decision Overview

The Oversight Board analyzed whether Meta was correct to remove a post from Facebook under its Hate Speech Community Standard. The post, uploaded by an Indigenous North American artist, depicted a wampum belt titled “Kill the Indian/Save the Man” and referenced the “Kamloops story”, the May 2021 discovery of unmarked graves at a former residential school for Indigenous children in Canada.

In its submission to the Board, the user stated “that their post was showcasing a piece of traditional artwork documenting history, and that it had nothing to do with hate speech” [p. 8]. Furthermore, the user argued that Meta’s removal of the content was an act of censorship against history that needed to be seen.

For its part, Meta told the Board that “the phrase ‘Kill the Indian’ constituted a Tier 1 attack under the Facebook Community Standard on Hate Speech, which prohibits ‘violent speech’ targeting people on the basis of a protected characteristic, including race or ethnicity”. However, the company recognized that removing the content was a mistake, since the Standard “permits sharing someone else’s hate speech to ‘condemn it or raise awareness’” [p. 8]. On further analysis, Meta acknowledged that the contested phrase “originated in the forced assimilation of Indigenous children” [p. 8] and that the user had stated their purpose was to raise awareness of the Kamloops story.

Compliance with Community Standards

The Board determined that the removed content was an example of “counter-speech”, “where hate speech is referenced or reappropriated in the struggle against oppression and discrimination” [p. 10]. Accordingly, it concluded that the content did not constitute hate speech and therefore did not violate the Facebook Community Standard on Hate Speech. Further, it highlighted that the Standard allows “content that includes someone else’s hate speech to condemn it or raise awareness” [p. 10].

The Board found that Meta’s reviewers could have determined from the content and its context alone that the post did not violate the Standard; the user did not need to state expressly that they were raising awareness. The post explained the significance of the artwork and its meaning. The Board noted that while the words “Kill the Indian”, in isolation, could constitute hate speech, in light of the context surrounding the post it was clear that the phrase was used to raise awareness of and condemn hatred and discrimination. In the Board’s view, reviewers should have assessed the deleted content more carefully, since the user had taken care to explain the Kamloops story and the cultural significance of the wampum belt. Meta agreed that its original decision to remove this content was contrary to the Facebook Community Standards and was an “enforcement error.”

Furthermore, the Board noted that Facebook’s internal “Known Questions,” which form part of the guidance given to human moderators, contain no instructions “on how to assess evidence of intent in artistic content quoting or using hate speech terms, or in content discussing human rights violations, where such content is covered by the policy allowances” [p. 11].

Compliance with Meta’s values

The Board determined that the decision to remove this content was inconsistent with Meta’s values of “Voice” and “Dignity” and did not serve the value of “Safety”. It found that counter-speech is an expression of “Voice”, and that art seeking to illuminate the horrors of past atrocities and educate people about their lasting impact is one of the most important and powerful expressions of that value. The Board suggested the company ensure that its content policies and moderation practices account for and protect counter-speech. Moreover, the Board expressed concern that Meta’s moderation processes did not identify and adequately protect “the ability of people who face marginalization or discrimination to express themselves through counter-speech” [p. 12].

Compliance with Meta’s human rights responsibilities

The Board concluded that the removal of this post violated Meta’s human rights responsibilities as a business. Meta has committed to respecting human rights under the UN Guiding Principles on Business and Human Rights (UNGPs). The Board noted that this was its first case concerning artistic expression and its first case concerning expression where the user self-identifies as an Indigenous person. 

The Board cited several international human rights standards protecting expression in the form of art, such as the report of the UN Special Rapporteur in the field of cultural rights on artistic freedom and creativity (A/HRC/23/34). It also referred to human rights instruments that guarantee minorities and vulnerable groups protection from discrimination in the exercise of their right to freedom of expression, such as the UN Declaration on the Rights of Indigenous Peoples and a report of the UN Special Rapporteur on freedom of opinion and expression (A/HRC/38/35). The Board emphasized that the content in this case engaged other rights as well, such as the rights of persons belonging to national, ethnic, or linguistic minorities to enjoy, in community with other members of their group, their own culture. The Board applied Article 19 of the ICCPR and employed a three-part test to analyze whether Meta’s removal of the content complied with international human rights standards.

  • Legality 

The Board explained that the Community Standard on Hate Speech allows content that condemns hate speech or raises awareness of it. It noted that the policy was clear and accessible enough for the user to act accordingly. However, since two human moderators failed to apply the policy allowances correctly in this case, the Board observed that further internal guidance to moderators might be required.

  • Legitimate aim

The Board recognized that Facebook’s Hate Speech Community Standard pursued the legitimate aim of protecting rights such as equality and non-discrimination, freedom of expression, and the right to physical integrity. In the Board’s view, the restriction on freedom of expression in this case therefore pursued one of the legitimate aims listed in Article 19, para. 3 of the ICCPR.

  • Necessity and proportionality

In this case, the Board concluded that the removal of the content was not necessary. It expressed concern about Meta’s content moderation system, noting that every post wrongly removed harms freedom of expression and that the causes of erroneous content review must be continually assessed and examined. Moreover, the Board considered that Meta should investigate the root causes of the mistake in this case and evaluate, in a broader context, how effectively counter-speech is moderated.

Since members of marginalized groups have raised concerns about the rate and impact of false-positive removals for several years, the Board told Meta that the company must demonstrate that it has undertaken human rights due diligence to ensure its systems are operating fairly and are not exacerbating historical and ongoing oppression. Additionally, because two human reviewers had wrongly removed the content, the Board deemed that Meta’s guidance to moderators assessing counter-speech might be insufficient. It also considered it likely that reviewers lacked the time or training needed to prevent the kind of mistake seen in this case, especially concerning content permitted under policy allowances. In the Board’s view, “guidance should include clear instructions to evaluate content in its entirety and support moderators in more accurately assessing context to determine evidence of intent and meaning” [p. 17].

According to the Board, Meta was clearly responsible for performing human rights due diligence. In the same vein, it believed that the company needed to identify “any adverse impacts of content moderation on artistic expression and the political expression of Indigenous peoples countering discrimination” [p. 17], and that Meta “should further identify how it will prevent, mitigate and account for its efforts to address those adverse impacts” [p. 17].

Policy Advisory Statement

The Board recommended that Meta “provide users with timely and accurate notice of any company action being taken on the content their appeal relates to” [p. 18]. Where applicable, the notice should acknowledge that the action resulted from the Oversight Board’s review process, and the messages sent to users should be shared with the Board. Meta should also “study the impacts of modified approaches to secondary review on reviewer accuracy and throughput” [p. 18]. The Board requested “an evaluation of accuracy rates when content moderators are informed that they are engaged in secondary review, so they know the initial determination was contested” [p. 18]. Users should be able to provide relevant context that may help reviewers evaluate their content, and the results of this accuracy assessment should be shared with the Board.

Finally, the Board recommended that Meta “conduct accuracy assessments focused on Hate Speech policy allowances that cover artistic expression and expression about human rights violations (e.g., condemnation, awareness raising, self-referential use, empowering use)” [p. 18]. These assessments should consider how a reviewer’s location affects their ability to accurately assess hate speech and counter-speech from the same or different regions. The results should be shared with the Board.


Decision Direction

Quick Info

Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Expands Expression

The decision expanded expression because the Oversight Board directly protected artistic expression and expression about human rights violations, such as condemnation, awareness raising, self-referential use, and empowering use, on Meta’s platforms. Moreover, the Board cited human rights instruments that guarantee minorities and vulnerable groups protection from discrimination in exercising their right to freedom of expression. It concluded that “counter-speech” must be protected so that users can reference examples of hate speech in the struggle against oppression and discrimination.

Global Perspective

Quick Info

Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or regional laws

  • United Nations Guiding Principles on Business and Human Rights (2011)

    The Board used the UNGPs as the legal basis of Meta’s commitment to respect human rights.

  • ICCPR, art. 19

    The Board used Article 19 of the ICCPR as a legal basis providing broad protection for freedom of expression through any media and regardless of frontiers. It was also used to apply the three-part test of legality (clarity), legitimacy, and necessity and proportionality.

  • UNHR Comm., General Comment No. 34 (CCPR/C/GC/34)

    The Board used General Comment No. 34 as the legal basis to apply the three-part test.

  • International Convention on the Elimination of All Forms of Racial Discrimination, art. 5

    The Board used Article 5 of the ICERD as a legal basis for the protection from discrimination in the exercise of the right to freedom of expression.

  • UN Special Rapporteur on freedom of opinion and expression, A/74/486 (2019)

    The Board used Report A/74/486 as a legal basis for the protection from discrimination in the exercise of the right to freedom of expression.

  • UN Special Rapporteur on freedom of opinion and expression, A/HRC/38/35 (2018)

    The Board used Report A/HRC/38/35 to emphasize social media platforms’ responsibilities under human rights standards.

  • ICCPR, arts. 2 and 26

    The Board used Articles 2 and 26 of the ICCPR as a legal basis for the relationship between the rights to equality and non-discrimination and freedom of expression.

  • ICCPR, art. 27

    The Board used Article 27 of the ICCPR as a legal basis for the rights of persons belonging to national, ethnic or linguistic minorities to enjoy, in community with other members of their group, their own culture.

  • ICESCR, art. 15

    The Board used Article 15 of the ICESCR as a legal basis for the right to participate in cultural life and enjoy the arts.

  • UN Declaration on the Rights of Indigenous Peoples

    The Board used Articles 7, 8 and 19 of the UN Declaration on the Rights of Indigenous Peoples as a legal basis for the rights of Indigenous Peoples.

  • UN Special Rapporteur in the field of cultural rights, report on artistic freedom and creativity, A/HRC/23/34, 2013

    The Board used Report A/HRC/23/34 as authority on the protection of artistic freedom and creativity.

Other national standards, law or jurisprudence

  • Oversight Board Decision 2020-005-FB-UA

Case Significance

Quick Info

Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”
