On February 27, 2024, the Oversight Board issued a summary decision overturning Meta’s original decision to remove an Instagram post that encouraged Syrians to resist the regime of Bashar Al-Assad. An Instagram user had posted a video of a prominent figure in the Syrian resistance against the regime. Meta originally removed the post under its Dangerous Organizations and Individuals policy. The user appealed the decision, claiming the post was meant to gather support for the Syrian resistance, not to glorify any designated organization or individual. After the Board brought the case to Meta’s attention, the company reversed its original decision. The Board noted that the case highlighted enforcement errors under the Dangerous Organizations and Individuals policy, since Meta had removed content that did not refer to any designated organization or individual.
*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. Decisions, except summary decisions, are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.
In August 2023, an Instagram user posted a video of Abul Baset al-Sarout—a Syrian football player, activist, and prominent figure in the resistance against Syrian president Bashar Al-Assad. In the video, al-Sarout, who was killed in 2019, is heard stating in Arabic: “We have one liberated neighborhood in Syria, we are a thorn in this regime, we will return to this neighborhood,” and that “the revolution continues,” encouraging Syrians to resist the regime. The video garnered approximately 30,000 views.
Meta removed the post, arguing that it breached its Dangerous Organizations and Individuals (DOI) policy, which prohibits any content representing or supporting designated organizations and individuals. The user appealed Meta’s decision to the Oversight Board, asserting that the post sought to raise awareness and mobilize support for the Syrian resistance, not to promote or glorify any designated dangerous entity or individual.
After the Board notified Meta of the user’s appeal, the company reversed its original decision and restored the content, finding that it did not violate the DOI policy. The Board then issued a summary decision. Summary decisions examine cases in which Meta has reversed its original decision. Although they carry no precedential value, they aim to provide transparency about Meta’s corrections and call attention to areas in which the company could improve.
The main issue before the Oversight Board was whether a post containing a video of a prominent figure of the Syrian resistance, calling on people to resist the government, violated Meta’s Dangerous Organizations and Individuals Policy.
In their appeal to the Board, the user argued that their social media account served to disseminate key information about, and foster engagement with, the Syrian protest movement, and that the content in question did not violate any of Instagram’s guidelines.
Meta, for its part, reversed its original decision and restored the video after it was notified of the appeal. The company determined that the content did not refer to any designated organization or individual and, therefore, did not violate its policies.
The Board highlighted that this case exemplified the incorrect removal of content that did not refer to any designated organizations or individuals, and the over-enforcement of the DOI policy. The Board expressed concern over the frequency of those enforcement errors and urged Meta to ensure that reducing such errors is a high priority.
Furthermore, the Board recalled one of its recommendations from the Mention of the Taliban in news reporting decision, which encouraged Meta to enhance the capacity allocated to its high-impact false positive override (HIPO) review system to ensure that content moderation decisions caused by enforcement errors receive additional human review. Meta reported progress on implementing this recommendation. The Board also reiterated a recommendation from the Öcalan’s Isolation case urging Meta to “evaluate automated moderation processes for enforcement of the Dangerous Organizations and Individuals policy.” [p. 2] Meta, however, declined to implement it.
The Board stressed that full adoption of these recommendations could reduce the rate of enforcement errors under the DOI policy. The Board overturned Meta’s original decision to remove the video and acknowledged Meta’s correction of the initial error.
Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.
By overturning Meta’s original decision to remove the post, the Oversight Board expands expression in matters of public interest. The Board’s recommendations regarding enforcement errors foster a better environment for online expression and highlight the risks that automated review systems pose to content that might require a contextual and nuanced analysis.
Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.
The Board recalled recommendation no. 7 from the Mention of the Taliban in news reporting decision to highlight the importance of additional human review in some content moderation decisions.
The Board recalled recommendation no. 2 from the Öcalan’s Isolation decision to urge Meta to evaluate its automated content moderation systems.