Facebook Community Standards, Safety, Human Exploitation
Oversight Board Case of Raising Awareness About Human Trafficking in Thailand
Thailand
Closed; Expands Expression
The Oversight Board issued a summary decision overturning Meta’s original decision to remove a Facebook post raising awareness about human trafficking in Thailand. The user’s appeal highlighted shortcomings in Meta’s content moderation systems in recognizing sensitive context. After the Board notified Meta of the appeal, the company reversed its decision and restored the content. The Board emphasized that the case illustrates the need for moderation systems capable of understanding posts intended to raise awareness, including those that use satire, irony, or sarcasm.
The Oversight Board is a separate entity from Meta and provides independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. It issues full decisions and summary decisions; decisions, except summary decisions, are binding unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies. Summary decisions are a transparency mechanism, providing information to the public on Meta’s decision-making and the Board’s recommendations in cases where Meta reversed its original decision of its own accord after receiving notice from the Board about the appeal.
A Facebook user posted a screenshot of messages from a business that appeared to be recruiting individuals to be trafficked to Myanmar. The Thai-language caption described common tactics used to lure victims and included ironic statements such as, “If you want to be a victim of human trafficking, don’t wait.”
Meta initially removed the content for violating its Human Exploitation policy, which prohibits content that recruits or facilitates human trafficking, defined as “the business of depriving someone of liberty for profit.” However, the policy permits content that condemns or raises awareness about human trafficking.
The user appealed the removal to the Oversight Board. When the Board brought the case to Meta’s attention, the company restored the content, acknowledging that the removal was an enforcement error.
On 22 November 2023, the Oversight Board issued a summary decision. The central issue was whether removing a Facebook post raising awareness about human trafficking in Thailand aligned with Meta’s content policies and human rights responsibilities.
The user did not submit a statement with their appeal. After being notified, Meta reversed its original decision and restored the content, explaining that while the screenshots alone could appear to violate the policy, the broader context of the post made clear that it was not violating.
The Board decided to proceed with the case, emphasizing that it highlighted the critical importance of moderation systems that can recognize context—especially when content uses irony, sarcasm, or satire to raise awareness. It recalled its recommendation from the Breast Cancer Symptoms and Nudity decision, urging Meta to analyze a representative sample of reversed automated removals to identify and address systemic errors. While Meta claimed to have implemented this process, it did not publish information demonstrating its effectiveness.
The Board also reiterated two recommendations from the Two Buttons Meme decision: first, that Meta ensure adequate procedures are in place to analyze satire and contextual cues; second, that appeals citing policy exceptions be prioritized for human review. Meta reported implementing the first recommendation, though no public evidence was provided. Regarding the second, the company stated it is finalizing mechanisms to allow users to indicate when their appeals relate to policy exceptions.
Accordingly, the Board overturned Meta’s original removal decision and welcomed the company’s correction of its error.
Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.
This decision expands expression. It reinforces the importance of context in content moderation, particularly for posts using irony or satire to raise awareness about serious issues like human trafficking. The Board underscored the need for moderation systems that do not unduly restrict speech intended to inform or condemn harmful practices. The decision affirms that content aimed at awareness-raising, even when conveyed through unconventional forms of expression, should be protected in line with Meta’s human rights responsibilities.