Themes: Facebook Community Standards, Violence and Criminal Behavior, Dangerous Individuals and Organizations
Case: Oversight Board Case of Girls’ Education in Afghanistan
Region and Country: United States
Status: Closed
Decision Direction: Expands Expression
On December 8, 2023, the Oversight Board overturned, in a summary decision, Meta’s decision to remove a Facebook post expressing concern about the education of girls in Afghanistan after the Taliban takeover. The user urged people to raise their concerns about the importance of girls’ education and highlighted the negative consequences of failing to do so. Meta originally removed the content, arguing that it violated the Dangerous Organizations and Individuals policy, which prohibits content supporting designated dangerous groups such as the Taliban. After the Board notified Meta of the case, the company reversed its original decision, holding that the content did not violate any policy and that the post’s removal was an error. The Board noted that this case was an example of an enforcement error under the Dangerous Organizations and Individuals policy, which allows content that criticizes designated organizations or individuals or discusses them in a neutral way, and that such errors could impair political commentary on the impact of the Taliban takeover on girls’ education.
*The Oversight Board is a separate entity from Meta that provides independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. It issues full decisions and summary decisions. Full decisions are binding unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies. Summary decisions are a transparency mechanism, informing the public about Meta’s decision-making and the Board’s recommendations in cases where Meta reversed its original decision of its own accord after receiving notice from the Board about the appeal.
In July 2023, a Facebook user in Afghanistan published a post in Pashto about the importance of educating girls in the country. They called on people “to continue raising their concerns” and highlighted the negative consequences of “failing to take such concerns to the Taliban.” [p. 1]
Meta removed the post, arguing that it violated its Dangerous Organizations and Individuals policy. Although the policy prohibits content that supports dangerous individuals or organizations, such as the Taliban, it allows content that criticizes them or discusses them in a neutral way. After the Board brought the case to the company’s attention, Meta restored the post, concluding that its original decision had been a mistake.
The main issue before the Oversight Board was whether removing a Facebook post discussing girls’ education in Afghanistan after the Taliban takeover was consistent with Meta’s policies and human rights obligations.
The Board observed that this case illustrated an enforcement error under Meta’s Dangerous Organizations and Individuals policy. It held that such errors “can have a negative impact on users’ capacities to share political commentary on Meta’s platforms.” [p. 2] In the case at hand, Meta’s enforcement decision hindered discussions regarding “women’s education in Afghanistan after the Taliban takeover.” [p. 2]
The Board noted that it had previously recommended that the company “add criteria and illustrative examples to its Dangerous Organizations and Individuals policy to increase understanding of exceptions for neutral discussion, condemnation and news reporting,” [p. 2] as stated in the Shared Al Jazeera Post case. Meta said that this recommendation had been fully implemented, as reported in its Q2 2023 quarterly update. The Board also recalled its recommendation that Meta implement internal audit mechanisms to examine samples of automated content removal decisions, so the company can correct and learn from enforcement errors, as laid out in the Breast Cancer Symptoms and Nudity decision. According to Meta, the company has already implemented this recommendation but has not published any information to demonstrate this claim.
The Board overturned Meta’s original decision to remove the content and acknowledged Meta’s correction of its error. Furthermore, the Board urged Meta to reduce enforcement errors by accelerating the implementation of still-open recommendations.
Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.
By overturning Meta’s original decision to remove the post, the Oversight Board expands expression, allowing users to share political commentary without interference. Given Afghanistan’s context and the role of social media in the country, the Board’s decision fosters a healthier online environment in which discussions about matters of public interest benefit from open debate. By urging Meta to refine its content moderation systems and reduce enforcement errors, the decision also helps ensure that users can rely on social media platforms to discuss sensitive issues.
Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.
Oversight Board Case of Shared Al Jazeera Post: The Board recalled this case to recommend that Meta add criteria and illustrative examples regarding exceptions to the Dangerous Organizations and Individuals policy.
Oversight Board Case of Breast Cancer Symptoms and Nudity: The Board cited this decision to urge Meta to implement internal audit mechanisms to reduce enforcement errors.