Themes: Facebook Community Standards, Hate Speech/Hateful Conduct, Safety, Bullying and Harassment, Referral to Facebook Community Standards
Region: United States
Status: Closed
Outcome: Mixed Outcome
The Oversight Board issued a summary decision overturning Meta's original decision to leave up a Facebook comment featuring antisemitic and racist imagery, including the claim that Jewish people control the media and a depiction comparing Black people to a monkey. While Meta initially failed to act, it removed the content after the Board brought the case to its attention, acknowledging that it violated the Hate Speech policy. The Board concluded that the case highlighted ongoing enforcement failures and stressed the need for Meta to fully implement prior recommendations aimed at reducing errors that allow harmful stereotypes and dehumanizing content to remain on the platform.
The Oversight Board is a separate entity from Meta and provides its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. The Board issues full decisions and summary decisions. Decisions, except summary decisions, are binding unless implementing them could violate the law. The Board can also choose to issue recommendations on the company's content policies. Summary decisions are a transparency mechanism, providing information to the public on Meta's decision making and the Board's recommendations relating to cases where Meta reversed its original decision of its own accord, after receiving notice from the Board about the appeal.
In May 2023, a user posted a comment containing a caricature of a Jewish man with an exaggerated hooked nose, wearing a Star of David inscribed with the word “Jude”—resembling the badges Jewish people were forced to wear during the Holocaust. The caricature showed the man holding an old-fashioned music box labelled “media,” with a monkey on his shoulder labelled “BLM” (Black Lives Matter). The comment received fewer than 100 views.
Under Meta’s Hate Speech policy, content that references harmful stereotypes historically used to intimidate—such as the claim that Jewish people control the media—is prohibited, as is dehumanizing imagery, including depictions that compare Black people to apes. Despite the apparent policy violation, Meta initially left the content on Facebook.
A user who reported the comment appealed Meta’s decision to the Oversight Board. After the Board brought the case to Meta’s attention, the company acknowledged that the content violated its Hate Speech policy and that its original decision to leave it online was incorrect. Consequently, the company removed the post.
On 22 November 2023, the Oversight Board issued a summary decision. The central issue was whether Meta’s decision to leave up content that claimed Jewish people control the media and compared Black people to monkeys was consistent with its content policies and human rights responsibilities.
In their appeal, the user emphasized the antisemitic and racist nature of the post. Meta initially left the content online but reversed its decision and removed the post for violating its Hate Speech policy after the Board notified it of the appeal.
The Board considered that this case highlighted enforcement failures in applying the Hate Speech policy, particularly regarding harmful content targeting marginalized groups. It emphasized the need to correct such errors promptly to mitigate resulting harm.
The Board reiterated its recommendation from the Knin Cartoon decision, urging Meta to revise its Hate Speech policy and reviewer guidance to explicitly prohibit implicit references to protected groups. Meta has only partially implemented this recommendation.
The Board also recalled its recommendation from the Breast Cancer Symptoms and Nudity decision, calling on Meta to reduce enforcement errors by analyzing a statistically representative sample of reversed automated removal decisions. While Meta claimed to have done this, it did not produce evidence to substantiate the claim.
Consequently, the Board called on Meta to fully implement these recommendations, particularly those aimed at reducing the prevalence of harmful stereotypes and dehumanizing imagery.
Accordingly, the Oversight Board overturned Meta’s original decision to leave the content online and acknowledged the company’s subsequent correction of its error.
This decision yields a mixed outcome for freedom of expression. The Board found that the content in question—depicting antisemitic tropes and racist imagery comparing Black people to monkeys—constituted dehumanizing hate speech that violated Meta’s policy. The Board underscored that protecting expression must not come at the expense of tolerating hate speech that targets marginalized groups and reiterated that the enforcement of the Hate Speech policy remains inconsistent. This decision calls on Meta to fully implement prior Board recommendations to reduce enforcement errors and ensure that speech inciting hatred and dehumanization is effectively addressed and promptly removed.