Oversight Board Case of Negative Stereotypes of African Americans

Closed Mixed Outcome

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    April 18, 2024
  • Outcome
    Oversight Board Decision, Overturned Meta’s initial decision
  • Case Number
    2024-021-FB-UA, 2024-022-FB-UA, 2024-023-FB-UA
  • Region & Country
    United States, North America
  • Judicial Body
    Oversight Board
  • Type of Law
    International Human Rights Law, Meta's content policies
  • Themes
    Facebook Community Standards, Objectionable Content, Hate Speech/Hateful Conduct, Safety, Bullying and Harassment
  • Tags
    Facebook, Oversight Board Enforcement Recommendation, Oversight Board Content Policy Recommendation, Racism


Case Analysis

Case Summary and Outcome

On April 18, 2024, the Oversight Board issued a summary decision overturning Meta’s original decisions to leave up three Facebook posts containing racist images that reinforced offensive stereotypes about African Americans, including depictions of absent fathers, reliance on welfare, and looting. The Board found that the case demonstrated Meta’s failure to properly enforce its Hate Speech and Bullying and Harassment policies, highlighted how such lapses disproportionately harm protected groups, and urged Meta to explicitly prohibit implicit discriminatory content that audiences could reasonably interpret as harmful. Meta reversed its decisions after the Board notified it of the users’ appeals.

*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. The Board issues full decisions and summary decisions. Decisions, except summary decisions, are binding unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies. Summary decisions are a transparency mechanism, providing information to the public on Meta’s decision making and the Board’s recommendations relating to cases where Meta reversed its original decision on its own accord, after receiving notice from the Board about the appeal.


Facts

In late 2023, the Oversight Board received three appeals regarding three different Facebook posts containing negative material about African Americans.

The first post contained a computer-generated image depicting a shop named “Loot” on fire, with cartoon Black figures stealing merchandise and fleeing the store. The user captioned it as the “next Pixar film.” The post received around 500,000 views.

The second post was also a computer-generated image, this one parodying a movie poster titled “EBT” (a reference to the U.S. social welfare system) and showing a Black woman with exaggerated features pushing a shopping cart filled with Cheetos. The names of Trayvon Martin and George Floyd, African American victims of police brutality, replaced the actors’ names at the top of the poster.

The third post was a meme falsely claiming that Adobe had developed software to detect photoshopped images. It paired an image of a woman with colorful markings on her face (implying digital manipulation) with an image of a Black family eating a meal, in which the father and the food were covered in similar markings, suggesting they had been artificially added. The post thus reinforced the stereotype of absent father figures in Black families. It was viewed over 14 million times.

Despite user reports, Meta initially kept all three posts on Facebook, prompting the reporting users to escalate their appeals to the Oversight Board.


Decision Overview

The main issue before the Board was whether Meta’s original decision to leave these three posts up was compatible with its content policies and human rights obligations.

In their appeals to the Board, the users argued that the posts perpetuated harmful racial stereotypes against African Americans.

For its part, Meta removed all three posts after being notified by the Board of the appeals. Meta justified the removals under its Hate Speech policy, which bans attacks against people on the basis of race and ethnicity, including dehumanizing speech, comparisons to thieves, mocking victims of hate crimes, and implications of inferiority.

Additionally, the second post violated Meta’s Bullying and Harassment policy by including the names of Trayvon Martin and George Floyd. This policy prohibits celebrating or mocking deaths, and Meta stated that the post trivialized their killings by suggesting they were starring in a fictional film.

The Board noted that these cases demonstrated Meta’s failure to enforce its Hate Speech and Bullying and Harassment policies effectively, a failure underscored by the fact that two of the posts had received a significant number of views. The Board emphasized the disproportionate harm such under-enforcement inflicts on protected groups and their right to freedom from discrimination on Meta’s platforms, and urged Meta to strategically prioritize combating hate speech against marginalized communities.

Moreover, the Board reiterated its recommendation from the Knin Cartoon decision, calling on Meta to clarify its Hate Speech policy and internal guidance to prohibit implicit discriminatory references, particularly when audiences could reasonably interpret them as such. Meta reported only partial implementation of this recommendation.

Ultimately, the Board overturned Meta’s original decisions to retain the three posts but acknowledged Meta’s correction of the initial errors.


Decision Direction

Quick Info

Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Mixed Outcome

This decision represents a mixed outcome for freedom of expression, balancing necessary restrictions with human rights protections. While it contracts the boundaries of permissible speech by mandating stricter removal of harmful racial stereotypes, including implicit or coded content, these limitations could arguably constitute legitimate and proportionate measures under international human rights law in line with Article 19(3) of the ICCPR. The ruling affirms that such restrictions are justified to prevent discrimination and protect human dignity, particularly for marginalized groups vulnerable to systemic harm.

Global Perspective

Quick Info

Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or regional laws

  • OSB, Knin Cartoon, 2022-001-FB-UA (2022)

    The Board recalled recommendation no. 1 from this decision calling on Meta to clarify its Hate Speech policy and internal guidance to prohibit implicit discriminatory references, particularly when audiences could reasonably interpret them as such.

Case Significance

Quick Info

Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”
