Facebook Community Standards, Objectionable Content, Hate Speech/Hateful Conduct
Oversight Board Case of South Africa Slurs
South Africa
Case Status: Closed
Decision Direction: Contracts Expression
The Oversight Board issued a summary decision overturning Meta’s initial choice to leave up a video montage featuring antisemitic, racist, homophobic, and transphobic memes. The content included claims of Jewish control over media institutions, praise for the Nazi military, derogatory remarks about interracial relationships, comparisons of Black people to gorillas, slurs targeting Black and LGBTQIA+ individuals, and calls for violence against these communities. Although Meta initially failed to act, it removed the content after the Board brought the case to its attention, acknowledging that it violated the Hate Speech policy. In taking up the appeal, the Board raised serious concerns about Meta’s inconsistent enforcement of its policies, particularly in response to slurs and dehumanizing content.
The Oversight Board is a separate entity from Meta and provides its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. The Board issues full decisions and summary decisions. Decisions, except summary decisions, are binding unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies. Summary decisions are a transparency mechanism, providing information to the public on Meta’s decision making and the Board’s recommendations relating to cases where Meta reversed its original decision on its own accord, after receiving notice from the Board about the appeal.
In August 2023, a Facebook user posted a video with antisemitic, racist, homophobic, and transphobic memes. These memes “alleged Jewish control of the media, praised the Nazi military, compared Black people to gorillas, displayed anti-Black and anti-LGBTQIA+ slurs, and advocated for violence against these communities.” [p. 1] The caption claimed that the post would get the user’s profile suspended but that it would be “worth it” and asked viewers to “show these degenerates your utter contempt.” The post received around 4,000 views and fewer than 50 reports.
The content violated multiple aspects of Meta’s Hate Speech policy, including promoting harmful antisemitic stereotypes by implying Jewish control of the media, using dehumanizing imagery comparing Black people to gorillas, and displaying racialized slurs such as the n-word. The caption further encouraged contempt toward targeted groups and urged users to download the video.
A user filed an appeal to the Oversight Board (OSB) regarding the Facebook post. Although Meta initially left the post online, it removed the content after the Board brought the case to its attention, acknowledging violations of its Hate Speech Policy.
The Oversight Board issued a summary decision on March 7, 2024. The main issue it had to decide was whether a video with memes containing antisemitic, racist, homophobic, and transphobic content violated Meta’s Hate Speech policy. Since the company initially decided to leave up the contested video and later reversed that decision, acknowledging it violated the aforementioned policy, the Board resolved the matter through a summary decision.
The OSB first recalled the South Africa Slurs decision, where it examined the Hate Speech policy in relation to the use of degrading racial slurs. It then referred to the Knin Cartoon case, where it analyzed the policy in relation to content referring to a group of people as sub-human. In both cases, it brought to Meta’s attention enforcement issues related to its Hate Speech policy.
The Board then noted that in the present case, the Facebook post contained multiple violations of the Hate Speech policy, since it included slurs against Black people and accused Jewish people of controlling the media. Additionally, the OSB noted that the user was aware that their content violated Meta’s policies and constituted hateful speech, as they acknowledged as much in the post’s caption. However, the content was not removed until the Board “identified the case for review based on another user’s appeal.” [p. 3] The OSB underscored that it had previously issued other summary decisions highlighting Meta’s difficulty in enforcing its Hate Speech policy, such as the Planet of the Apes Racism and Media Conspiracy Cartoon cases—concerning hate speech against Black and Jewish people, respectively.
It further noted that Meta’s enforcement shortcomings extended to content targeting the LGBTQIA+ community and other marginalized communities, as previously flagged to Meta in the Post in Polish Targeting Trans People decision. On this point, the Board expressed concern that Meta was not living up to its values regarding the safety of marginalized communities and urged Meta “to close enforcement gaps under the Hate Speech Community Standard.” [p. 4]
Thus, the Oversight Board overturned Meta’s original decision to keep the content on Facebook and acknowledged Meta’s correction of the initial error after the Board brought the case to the company’s attention.
Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.
This decision contracts expression by upholding the removal of content containing slurs, dehumanizing imagery, and calls for violence, which Meta ultimately acknowledged as violating its Hate Speech policy. While the Oversight Board issued a summary decision and did not conduct a full assessment, it reiterated concerns raised in previous cases about Meta’s inconsistent enforcement of its own rules—particularly its failure to promptly remove harmful content. However, the decision does not explicitly engage with international human rights standards or apply the Rabat Plan of Action’s threshold test to assess whether the restriction was necessary and proportionate. As a result, it misses an opportunity to ground the outcome in a broader freedom of expression framework under international law.