Oversight Board Case of Planet of the Apes Racism

Closed | Mixed Outcome

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    November 16, 2023
  • Outcome
    Oversight Board Decision, Overturned Meta’s initial decision
  • Case Number
    2023-035-FB-UA
  • Region & Country
    France, Europe and Central Asia
  • Judicial Body
    Oversight Board
  • Type of Law
    Meta's content policies
  • Themes
    Facebook Community Standards, Objectionable Content, Hate Speech/Hateful Conduct
  • Tags
    Facebook, Racism, Oversight Board Enforcement Recommendation


Case Analysis

Case Summary and Outcome

The Oversight Board issued a summary decision overturning Meta’s original decision to leave up a Facebook post that included a video and caption comparing Black men to apes. While Meta ultimately removed the post after the Board accepted the case, the Board found the content was a clear example of dehumanizing hate speech intended to denigrate individuals based on race. The case highlighted Meta’s inconsistent enforcement of its own policies, despite a specific prohibition against comparing Black people to apes. The Board emphasized that such enforcement failures risk normalizing discriminatory content, further marginalizing racialized groups, and potentially contributing to offline harm. It reiterated the need for Meta to improve reviewer guidance and fully implement prior recommendations, such as those issued in the Knin cartoon case, to reduce error rates in hate speech moderation.

*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. The Board issues full decisions and summary decisions. Decisions, except summary decisions, are binding unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies. Summary decisions are a transparency mechanism, providing information to the public on Meta’s decision making and the Board’s recommendations relating to cases where Meta reversed its original decision on its own accord, after receiving notice from the Board about the appeal.*


Facts

In January 2023, a Facebook user posted a video showing a car driving through neighborhoods at night, ending with a group of Black men appearing and chasing the car. The English caption stated: “France has fell like planet of the friggin apes over there rioting in the streets running amok savages.” The post received fewer than 500 views and was reported by another Facebook user.

According to Meta’s Hate Speech policy, content that dehumanizes individuals based on a protected characteristic—for example, by comparing them to animals perceived as intellectually or physically inferior—should be removed. Despite this, Meta initially declined to remove the post. The reporting user subsequently appealed Meta’s decision to the Oversight Board. As noted in the decision, “After the Board brought this case to Meta’s attention, the company determined that the content violated the Hate Speech Community Standard and its original decision to leave the content up was incorrect.” [p. 1]


Decision Overview

On 16 November 2023, the Oversight Board issued a summary decision. The central issue was whether Meta’s initial decision to leave up a Facebook post that compared Black men to apes was compatible with its content policies and human rights responsibilities.

The Board considered that the content unequivocally used dehumanizing hate speech to denigrate a group based on race and should have been removed immediately. It opined that, unlike cases requiring contextual analysis to distinguish hate speech from legitimate commentary, the language in this post was clearly discriminatory and degrading.

The Board noted the case underscored persistent problems in Meta’s enforcement of its content policies, particularly regarding hate speech. It emphasized that such enforcement failures allow discriminatory content to remain on the platform, contributing to the marginalization of minority groups and potentially leading to offline harm—especially in contexts of heightened hostility toward immigrants or racialized communities.

The Board also referenced its prior recommendation from the Knin Cartoon decision, which called on Meta to clarify its Hate Speech Community Standard and reviewer guidance to ensure that even implicit references to protected groups could be recognized as violations. The Board stressed the need for more robust and accurate enforcement mechanisms to reduce error rates in hate speech moderation.

The Oversight Board ultimately overturned Meta’s original decision to leave the post up. It acknowledged Meta’s corrective action in removing the content after the case was brought to its attention but reiterated that such enforcement errors must be addressed systemically.


Decision Direction

Quick Info

Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Mixed Outcome

This decision has a mixed outcome on expression. The Board held that content using dehumanizing language fell outside the scope of protected speech under Meta’s content policies and international human rights standards. It found the post served no legitimate purpose and was intended solely to denigrate a racial group, reinforcing harmful stereotypes. The Board emphasized that inconsistent enforcement of Meta’s Hate Speech policy allowed discriminatory content to persist, contributing to the marginalization of minority communities and risking real-world harm. The decision underscores the importance of clear policies and robust enforcement to ensure hate speech is effectively removed, in line with Meta’s human rights responsibilities.

Global Perspective

Quick Info

Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Case Significance

Quick Info

Case significance refers to how influential the case is and how its significance changes over time.

This case did not set a binding or persuasive precedent either within or outside its jurisdiction. Its significance is undetermined at this time.

Official Case Documents

