Facebook Community Standards, Objectionable Content, Hate Speech/Hateful Conduct
Oversight Board Case of a Post Comparing Gazans to a “Savage Horde”
Palestine
Closed, Mixed Outcome
The Oversight Board issued a summary decision on April 18, 2024, overturning Meta’s original decision to leave up a Facebook post that claimed Hamas reflected the innermost desires of Gaza’s population and compared Gazans to a “savage horde.” The Board found that the case revealed errors in Meta’s enforcement of its Hate Speech policy, particularly in addressing content targeting protected characteristics, since the post involved the clear dehumanization of an entire population (a direct attack on protected groups), with especially severe consequences during armed conflict. After the Board notified Meta of the appeal, the company reversed its decision and removed the post.
*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. The Board issues full decisions and summary decisions. Decisions, except summary decisions, are binding unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies. Summary decisions are a transparency mechanism, providing information to the public on Meta’s decision making and the Board’s recommendations relating to cases where Meta reversed its original decision on its own accord, after receiving notice from the Board about the appeal.
In December 2023, a Facebook user reposted an image featuring text claiming that Gaza’s general public were not victims of Hamas and that the group instead reflected the innermost desires of “a savage horde.” The user’s caption endorsed the post with the words “the truth.” The content received fewer than 500 views.
A user reported the post, claiming that it violated Meta’s Hate Speech policy, which prohibits attacks on groups based on protected characteristics, including ethnicity and nationality, and explicitly bans comparisons to sub-humanity, such as labeling people “savages.” The post targeted Palestinians in Gaza based on these protected characteristics.
Despite the report, Meta initially left the post up, prompting the reporting user to appeal the decision to the Board.
The main issue before the Board was whether Meta’s decision to retain the post dehumanizing Gazans was compatible with Meta’s values, content policies and human rights obligations.
In their appeal to the Board, the reporting user stated that the content generalized and dehumanized Gaza’s population. After the Board notified Meta of the appeal, the company removed the post, recognizing that it violated its Hate Speech policy.
The Board noted that this case illustrated errors in Meta’s enforcement of its Hate Speech policy against content targeting protected groups. It also emphasized the heightened impact of such errors during armed conflicts and the need for stronger content moderation.
Furthermore, the Board drew a comparison between this case and the Knin Cartoon decision, which involved hate speech implicitly comparing an ethnic group to rats. While the content in Knin Cartoon required historical and cultural context to be fully understood, the Board found that the post in this case tied dehumanizing language directly to an entire population, making it a clear attack based on protected characteristics.
The Board also referenced its earlier recommendation in the Knin Cartoon decision, urging Meta to clarify, within the Hate Speech policy itself, that implicit hate speech is prohibited when the reference would be reasonably understood. Meta reported implementing this recommendation by adding a clarification to its Community Standards stating that “implicit hate speech will be removed if escalated by at-scale reviewers to expert review, where Meta can reasonably understand the user’s intent.” However, the Board considered the implementation only partial, as the update appeared solely in the introductory section of the Community Standards, not within the Hate Speech policy itself. It stressed that full implementation was needed to reduce enforcement errors.
Ultimately, the Board overturned Meta’s original decision to leave up the content and acknowledged the company’s subsequent correction of that error.
Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.
While this decision by the Oversight Board restricts freedom of expression, it does so in compliance with international human rights law, as the limitation aims to prevent hate speech targeting individuals based on ethnicity and nationality.
Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.
The Board compared the content in the case at hand with the content in Knin Cartoon. Additionally, the Board recalled recommendation no. 1 from that decision.
Case significance refers to how influential the case is and how its significance changes over time.
According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”