Facebook Community Standards, Objectionable Content, Hate Speech/Hateful Conduct, Violence And Criminal Behavior, Dangerous Individuals and Organizations
Oversight Board Cases of Posts Referencing Al-Shabaab
Somalia
Closed; Expands Expression
The Oversight Board issued a summary decision overturning Meta’s initial decision to remove two Facebook posts referencing the terrorist organization Al-Shabaab. Meta removed both posts for violating its Dangerous Organizations and Individuals (DOI) policy. However, after the Board notified Meta of the users’ appeals, the company reassessed the cases, restored the content, and acknowledged the posts did not support or praise the group. The Board found these cases illustrated an over-enforcement of the DOI policy, particularly in countries affected by armed conflict and terrorism. It emphasized that such enforcement errors can undermine legitimate efforts to report on, condemn, or raise awareness about terrorist groups and their human rights abuses. The Board also reiterated earlier recommendations for Meta to improve the clarity and enforcement of its DOI policy exceptions by providing illustrative examples of permitted content and auditing enforcement errors to support more accurate moderation.
The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. The Board issues full decisions and summary decisions. Decisions, except summary decisions, are binding unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies. Summary decisions are a transparency mechanism, providing information to the public on Meta’s decision making and the Board’s recommendations relating to cases where Meta reversed its original decision on its own accord, after receiving notice from the Board about the appeal.
In July 2023, a Facebook user, who appeared to be affiliated with a news outlet, posted a photo showing military equipment near soldiers’ feet, with a caption reporting that Somali government forces and local residents had fought and killed Al-Shabaab fighters in the Mudug region.
Around the same time, another Facebook user posted two photos: one showing a woman painting a blue pillar black, and another displaying the Al-Shabaab emblem painted in black on the pillar. The accompanying caption read, “The terrorists that used to hide have come out of their holes, and the world has finally seen them.”
Harakat al-Shabaab al-Mujahideen, commonly known as Al-Shabaab (“the Youth”), is an Islamist terrorist group affiliated with al-Qa’ida. Its goal is to overthrow the Somali government, and while its operations are primarily based in Somalia, it has carried out terrorist attacks in neighboring countries.
Both posts were initially removed by Meta for violating its Dangerous Organizations and Individuals (DOI) policy, which prohibits content that praises or supports designated entities. However, the policy allows content that reports on, condemns, or neutrally discusses such groups or their activities. The users appealed the removals to the Oversight Board. After the Board brought the appeals to Meta’s attention, the company reversed its decisions and restored the content.
The Oversight Board issued a summary decision addressing whether Meta’s removal of two Facebook posts mentioning the terrorist group Al-Shabaab was compatible with the company’s policies and human rights responsibilities. Through this decision, the Board highlighted the over-enforcement of Meta’s DOI policy in contexts affected by armed conflict and terrorism. Such enforcement errors risk silencing legitimate efforts to report on, criticize, or expose the actions of terrorist groups, including possible human rights violations and atrocities.
In their submissions to the Board, the first user explained that their post aimed to report on government operations against Al-Shabaab, while the second user stated that their post was intended to condemn and raise awareness of the group’s activities.
Following notification of the appeals, Meta reviewed the cases and reversed its original decisions, determining that the posts did not violate the DOI policy. While both posts referenced Al-Shabaab—a designated organization under the DOI policy—Meta acknowledged they neither praised nor supported the group.
The Board considered that these cases reflected an over-enforcement of the DOI policy in a context of ongoing armed conflict and terrorism. It stressed that such errors risk undermining legitimate efforts to report on, condemn, or raise public awareness about terrorist groups and their human rights abuses.
The Board also recalled its earlier recommendation in the Mention of the Taliban in News Reporting decision, urging Meta to assess the accuracy of its enforcement under the DOI policy and identify the root causes of such errors. It reiterated a recommendation from the Shared Al Jazeera Post case requesting that Meta provide clearer criteria and illustrative examples to distinguish permitted neutral discussion and news reporting from prohibited praise or support. Additionally, it referenced the Breast Cancer Symptoms and Nudity decision, where it recommended that Meta implement an internal audit process to analyze reversed automated removals and learn from enforcement mistakes. Meta has made progress on the first two recommendations and indicated it already conducts work aligned with the third, though it has not published implementation details.
The Oversight Board held that Meta should not have removed the posts and acknowledged the company’s correction of its initial error.
Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.
This decision expands expression. It affirms users’ ability to report on, criticize, and raise awareness about terrorist organizations—particularly in conflict-affected contexts—without fear of removal. The Board emphasized that an overly broad enforcement of Meta’s policies risks suppressing legitimate speech, including content that serves the public interest by documenting violence or human rights abuses.