Global Freedom of Expression

Oversight Board Case of South African Slurs

Closed | Mixed Outcome

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    September 28, 2021
  • Outcome
    Oversight Board Decision, Agreed with Meta’s initial decision
  • Case Number
    2021-011-FB-UA
  • Region & Country
    South Africa, Africa
  • Judicial Body
    Oversight Board
  • Type of Law
    International Human Rights Law, Meta's content policies
  • Themes
    Facebook Community Standards, Objectionable Content, Hate Speech
  • Tags
    Oversight Board Enforcement Recommendation, Oversight Board Policy Advisory Statement, Racism


Case Analysis

Case Summary and Outcome

On September 28, 2021, the Oversight Board upheld Facebook’s (now Meta) decision to remove a post discussing South African society under its Hate Speech Community Standard. The post, published in a public Facebook group, discussed “multi-racialism” in South Africa and argued that poverty, homelessness, and landlessness had increased for black people in the country since 1994. Among other things, it stated that white people hold and control most of the wealth and that, in contrast, wealthy black people may own some companies but do not control them. The post concluded with “[y]ou are a ‘sophisticated slave,’ ‘a clever black,’ ‘’n goeie k***ir’ or ‘House n***er.’” The Board found that in the South African context, the slur contained in the post was degrading, excluding, and harmful to the people it targeted. Therefore, the Board found that Facebook acted according to its Community Standard on Hate Speech when it decided to remove this content.

*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.


Facts

In May 2021, a Facebook user shared a post, in English, in a public group that described itself as focused on unlocking minds. The post discussed “multi-racialism” in South Africa and argued that poverty, homelessness, and landlessness had increased for black people in South Africa since 1994. It remarked that white people held most of the wealth and that wealthy black people owned some companies but did not control them. It also stated that if “you think” sharing neighborhoods, language, and schools with white people makes you “deputy-white” then “you need to have your head examined.” The post then concluded with “[y]ou are a ‘sophisticated slave,’ ‘a clever black,’ ‘’n goeie kaffir’ or ‘House nigger’” (hereafter redacted as “k***ir” and “n***er”) [p. 1].

The post was viewed more than 1,000 times and shared over 40 times. A Facebook user reported it for violating Facebook’s Hate Speech Community Standard. Following a review by a moderator, Facebook removed the post under its Hate Speech policy, which prohibits content that “describes or negatively targets people with slurs, where slurs are defined as words that are inherently offensive and used as insulting labels” [p. 4] based on their race, ethnicity, and/or national origin. The company stated that while its prohibition against slurs was global, the designation of terms on its internal slurs list was market-specific. In particular, it highlighted that both “k***ir” and “n***er” were on Facebook’s list of prohibited slurs for the Sub-Saharan African market.

Facebook then notified the user that their post violated Facebook’s Hate Speech Community Standard. The user appealed the decision to Facebook, and, following a second review by a moderator, Facebook confirmed that the post violated the policy. The user then submitted an appeal to the Oversight Board.


Decision Overview

The Board’s main issue to analyze was whether Facebook’s decision to remove the post complied with the company’s Hate Speech Community Standard, its values, and its human rights responsibilities.

The user stated in their appeal to the Board that they “did not write about any group to be targeted for hatred or for its members to be ill-treated in any way by members of a different group” [p. 6]. The user argued that their post sought to encourage members of a particular group to introspect and re-evaluate their priorities and attitudes. Additionally, they remarked that there was nothing in the post or “in its spirit or intent” that would foster hate speech.

Facebook held that its decision to remove the content was based on the Hate Speech Community Standard, specifically for violating its policy prohibiting the use of slurs targeted at people based on race, ethnicity, and/or national origin. The company noted that it “prohibits content containing slurs, which are inherently offensive and used as insulting labels unless the user demonstrates that that content ‘was shared to condemn, to discuss, to raise awareness of the slur, or the slur is used self-referentially or in an empowering way’” [p. 6]. Facebook argued that these exceptions did not apply in the immediate case. Further, the company argued that the post addressed itself to “Clever Blacks,” a phrase used to criticize Black South Africans perceived as overly eager to appear clever or intelligent. Facebook also stated that the post used the words “k***ir” and “n***er,” both contained in its confidential list of prohibited slurs. Moreover, the company claimed that the word “k***ir” was deemed “South Africa’s most charged epithet” and had historically been used by white people in South Africa as a derogatory term for black people. Facebook added that the Black community had never reclaimed the term. The company also stated that the word “n***er” was likewise highly offensive in South Africa, but that the Black community had reclaimed it for use in a positive sense.

Compliance with Community Standards

The Board explained that the Hate Speech Community Standard prohibits attacks based on protected characteristics. In this case, the Board noted that the post targeted a group of black South Africans. The Board considered that the user’s critique discussed this group’s presumed economic, educational, and professional status and privilege. It deemed that the term “k***ir,” preceded by the Afrikaans word for “good,” had a clear historical association that carried significant weight in South Africa. Therefore, it found that Facebook acted according to its Community Standard on Hate Speech when it decided to remove this content.

Compliance with Facebook’s values

The Board found Facebook’s value of “Voice” particularly significant to political discourse regarding racial and socio-economic equality in South Africa. It considered that arguments about the distribution of wealth, racial division, and inequality were highly relevant, especially in a society still transitioning from apartheid towards greater equality. At the same time, the Board highlighted that the “Voice” of people targeted by slurs could also be affected, as derogatory terms can have a silencing impact on those targeted and inhibit their participation on the platform.

The Board also analyzed the values of “Dignity” and “Safety.” It found that using the slur “k***ir” in South Africa could be degrading, excluding, and harmful to the people it targeted. The Board considered it possible for the user to engage in political and socio-economic discussions on Facebook in ways that appealed to the emotions of their audience without resorting to the slur. Thus, it considered that limiting the user’s “Voice” was justified to protect the “Voice,” “Dignity,” and “Safety” of others.

Compliance with Facebook’s human rights responsibilities

The Board then proceeded to examine whether the removal of the content complied with Facebook’s human rights responsibilities as a business. It based its analysis on Article 19 of the International Covenant on Civil and Political Rights (ICCPR), which provides broad protection for political expression and debate. Further, it relied on Article 5 of the International Convention on the Elimination of All Forms of Racial Discrimination (ICERD), which guarantees the right to freedom of expression without racial discrimination. The Committee on the Elimination of Racial Discrimination, in General Recommendation 35, has emphasized the importance of this right in assisting “vulnerable groups in redressing the balance of power among the components of society” and in offering “alternative views and counterpoints” [p. 11] in discussions.

To assess whether Facebook’s decision to remove the content constituted a permissible restriction on expression, the Board analyzed the company’s determination under the three-part test established by Article 19 of the ICCPR.

I. Legality (clarity and accessibility of the rules)

First, the Board held that Facebook had met the legality requirement in this case: since the term “k***ir” was widely understood as a charged racial epithet, the slur prohibition was sufficiently clear as applied to this post.

II. Legitimate aim

The Board then stated that the slur prohibition sought to protect people’s rights to equality and non-discrimination, among other rights, which constituted legitimate aims.

III. Necessity and proportionality

Finally, the Board assessed whether removing the content was necessary and proportionate to achieve a protective function. It highlighted that Facebook’s Hate Speech Community Standard prohibits certain discriminatory expressions, including slurs. Moreover, the Board remarked that the historical and social context was crucial in this case, since the use of the word “k***ir” was closely linked with discrimination and the history of apartheid in South Africa. It thus found the removal to be a necessary and proportionate restriction on the user’s expression.

Additionally, the Board discussed concerns raised by stakeholders about the company attempting to determine users’ racial identities. The Board agreed that Facebook’s gathering or maintaining of data on users’ perceived racial identities presented serious privacy concerns. Regarding intent, the Board considered that while the user said they wished to encourage introspection, the post invoked a racial slur with charged historical implications to criticize some black South Africans. The Board concluded that removing the content was appropriate to achieve a protective function.

Policy Advisory Statement

To ensure procedural fairness for users, the Board recommended that Facebook notify users of the specific rule within the Hate Speech Community Standard that has been violated in the language in which they use Facebook. 


Decision Direction

Quick Info

Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Mixed Outcome

The Oversight Board contracts expression by agreeing with Facebook’s decision to remove the post. Nevertheless, while acknowledging the importance of expression that discusses relevant and challenging socio-economic and political issues in South Africa, the Board highlighted that slurs can have a silencing impact on the people they target and inhibit their participation on the platform.

Global Perspective

Quick Info

Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or regional laws

  • ICCPR, art. 2

The Board referred to this Article to highlight Facebook’s human rights responsibilities as a business, particularly in relation to the right to non-discrimination.

  • ICCPR, art. 19

The Board analyzed Facebook’s human rights responsibilities through this provision on freedom of expression and employed the three-part test established in this Article to assess whether the removal was a permissible restriction on expression.

  • UNHR Comm., General Comment No. 34 (CCPR/C/GC/34)

While employing the three-part test to assess whether the removal was a permissible restriction on expression, the Board referred to this General Comment for guidance.

  • ICERD, Article 5

The Board referred to this Article to highlight Facebook’s human rights responsibilities as a business, particularly the rights to equality and non-discrimination, in light of General Recommendation 35 (2013) of the Committee on the Elimination of Racial Discrimination.

  • OSB, Armenians in Azerbaijan, 2020-003-FB-UA (2021)

    In its policy advisory statement, the Board referred to this case to exemplify how Facebook should notify users of the specific rule within the Hate Speech Community Standard that has been violated in the language in which they use Facebook.

  • OSB, Depiction of Zwarte Piet, 2021-002-FB-UA (2021)

    In its policy advisory statement, the Board referred to this case to exemplify how Facebook should notify users of the specific rule within the Hate Speech Community Standard that has been violated in the language in which they use Facebook.

Case Significance

Quick Info

Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”
