Global Freedom of Expression

Oversight Board Case of Öcalan’s Isolation

Case Status: Closed
Decision Direction: Expands Expression

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    July 8, 2021
  • Outcome
    Oversight Board Decision, Overturned Meta’s initial decision
  • Case Number
    2021-006-IG-UA
  • Region & Country
    Turkey, Europe and Central Asia
  • Judicial Body
    Oversight Board
  • Type of Law
    International Human Rights Law, Meta's content policies
  • Themes
    Facebook Community Standards, Violence And Criminal Behavior, Dangerous Individuals and Organizations, Instagram Community Guidelines, Referral to Facebook Community Standards
  • Tags
    Oversight Board Policy Advisory Statement, Meta Spirit of the Policy allowance, Oversight Board Content Policy Recommendation, Oversight Board Enforcement Recommendation, Oversight Board Transparency Recommendation, Terrorism

Content Attribution Policy

Global Freedom of Expression is an academic initiative; therefore, we encourage you to share and republish excerpts of our content so long as they are not used for commercial purposes and you respect the following policy:

  • Attribute Columbia Global Freedom of Expression as the source.
  • Link to the original URL of the specific case analysis, publication, update, blog, or landing page of the downloadable content you are referencing.

Attribution, copyright, and license information for media used by Global Freedom of Expression is available on our Credits page.

This case is available in additional languages: Español, Français, and العربية.

Case Analysis

Case Summary and Outcome

On July 8, 2021, the Oversight Board overturned Facebook’s (now Meta) original decision to remove an Instagram post encouraging people to discuss the solitary confinement of Abdullah Öcalan, a founding member of the Kurdistan Workers’ Party (PKK). When the user appealed the company’s decision and the Board selected the case for review, Facebook found that a piece of internal guidance on the Dangerous Individuals and Organizations policy was “inadvertently not transferred” to a new review system and therefore decided to restore the content. While analyzing the company’s original decision, the Board found that the content should never have been removed. It determined that the user did not advocate violence but sought to highlight human rights concerns about Öcalan’s prolonged solitary confinement. Thus, the Board concluded that the post was unlikely to result in harm, and its removal was not necessary or proportionate under international human rights standards.

*The Oversight Board is a separate entity from Meta and provides independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.


Facts

The PKK was founded in the 1970s with the aim of establishing an independent Kurdish state in southeastern Turkey, Syria, and Iraq.

On January 25, 2021, an Instagram user posted a picture of Abdullah Öcalan, a founding member of the PKK, who had been imprisoned in Turkey since his arrest and sentencing in 1999 for carrying out violent acts aimed at the secession of a part of Turkey’s territory. The uploaded content included the text: “y’all ready for this conversation” in English. The picture’s caption said it was time to talk about ending Öcalan’s isolation in prison. Additionally, the user encouraged readers to “engage in conversation about Öcalan’s imprisonment and the inhumane nature of solitary confinement, including through hunger strikes, protests, legal action, op-eds, reading groups, and memes” [p. 5]. 

On February 12, 2021, the post was removed after a human moderator found that it violated the company’s policy on Dangerous Individuals and Organizations. The user appealed the decision, yet Facebook replied that it was final and could not be reviewed because of a temporary reduction in capacity due to COVID-19. Nevertheless, a second moderator reviewed the content and confirmed the first decision. Consequently, the user appealed to the Oversight Board.

Facebook restored the content to Instagram on April 23, 2021, after the Board selected the case for review. The company notified the Board that it had found a piece of internal guidance, developed in 2017, on the Dangerous Individuals and Organizations policy that allowed discussion or debate about the conditions of confinement for individuals designated as dangerous. It explained that in 2018 the guidance was “inadvertently not transferred” to a new review system and thus had not been applied in this case. Upon discovering the error, the company restored the content.


Decision Overview

The main issue for the Board to analyze was whether Facebook’s original decision to remove the Instagram post encouraging people to discuss the solitary confinement of Abdullah Öcalan complied with the company’s content policies, values, and human rights responsibilities.

In their appeal to the Board, the user explained that they posted the content to spur discussion about Öcalan’s philosophy and to end his isolation. They believed that banning discussion of Öcalan prevents conversations that could lead to a peaceful settlement for the Kurdish people in the Middle East. They also stated that they did not wish to promote violence but believed there should be no ban on posting pictures of Öcalan on Instagram.

In its submission, Facebook explained that it initially concluded that the content was a call to action in support of Öcalan and the PKK, thus violating the Dangerous Individuals and Organizations policy. Facebook further remarked that, after the Board had selected the case for review, it re-evaluated the content against its policies and found that in 2017 it had developed internal guidance allowing content that calls for the freedom of a terrorist when it is shared in a way that advocates for peace or debates the terrorist’s incarceration. However, it stated that it had inadvertently failed to transfer this guidance when it switched to a new review system in 2018. After applying the guidance to this case, Facebook found that the content fell within it and restored the post.

Compliance with Facebook’s content policies

The Board noted that Instagram’s Community Guidelines, which included a link to Facebook’s Community Standard on Dangerous Individuals and Organizations, stated that Instagram is not a place to support or praise terrorism, organized crime, or hate groups. It then highlighted that the Community Standards apply to Instagram as they do to Facebook. The Board explained that the Dangerous Individuals and Organizations Community Standard seeks to “prevent and disrupt real-world harm, [by] not allow[ing] any organizations or individuals that proclaim a violent mission or are engaged in violence to have a presence on Facebook.” Moreover, the Board pointed out that the Standard, as enforced at the time, established that Facebook could remove content that voiced support or praise for groups, leaders, or individuals involved in these activities.

Following a request from the Board, Facebook shared its internal guidance for content moderators regarding the meaning of “support” of designated individuals and organizations. In it, a “call to action in support” was defined as a “call to direct an audience to do something to further a designated dangerous organization or its cause” [p. 11]. Yet the Board highlighted that this information was not included in the public-facing Community Standards at the time the content was posted, nor was it contained in the update published on June 23, 2021.

Further, the Board considered the misplaced, non-public guidance created in 2017, which made clear that discussion of the conditions of a designated dangerous individual’s confinement was permitted and did not constitute support. The Board noted that Facebook’s policy of defaulting towards removing content showing “support” for designated individuals, while keeping key exceptions hidden from the public, allowed the mistake to go unnoticed by the company for approximately three years without any accountability. In the instant case, the Board determined that even without the discovery of the misplaced guidance, the content should not have been removed for “support,” since the user only encouraged people to discuss Öcalan’s solitary confinement through hunger strikes, protests, legal action, op-eds, reading groups, and memes. Thus, the Board considered that the removal did not serve the policy’s aim of preventing and disrupting real-world harm.

Compliance with Facebook’s values

The Board found that Facebook’s decision to remove the content did not comply with Facebook’s values of “Voice” and “Safety.” It explained that the user sought to highlight possible human rights violations, and that challenging such violations was central to the value of “Voice”. Additionally, the Board noted that while the value of “Safety” was notionally engaged given that the content concerned a designated dangerous individual, removing the content did not address any clear “Safety” concern since the content did not include language that incited or advocated for the use of violence. Instead, it considered that Facebook’s decision illegitimately suppressed a person’s voice raising a human rights concern.

Compliance with Facebook’s human rights responsibilities

The Board recalled that discussing the conditions of an individual’s detention, and alleged violations of their human rights in custody, is a form of expression protected by Article 19 of the ICCPR. It also remarked that international bodies had raised human rights concerns about prolonged solitary confinement. To determine whether the restriction on freedom of expression was justified, the Board applied the three-part test set out in Article 19 of the ICCPR.

I. Legality 

The Board cited human rights standards that state that restrictions on expression should be formulated with sufficient precision so that individuals understand what is prohibited and act accordingly. In the Board’s view, while Facebook had provided reviewers with extensive internal and confidential guidance to interpret the company’s content policies, it had failed to reflect essential rules on what is excluded from Facebook’s definition of support in the public-facing Community Standards.

The Board noted that Facebook has since publicly defined the terms “representation,” “praise,” and “support” in the Community Standard on Dangerous Individuals and Organizations. It observed, however, that the UN Special Rapporteur on freedom of expression has described social media platforms’ prohibitions on both “praise” and “support” as “excessively vague” (A/HRC/38/35, para. 26; see also General Comment No. 34, para. 46) [p. 14].

II. Legitimate aim

The Board considered that since Facebook had reversed its original decision following the Board’s selection of the case, the company did not seek to justify the removal as pursuing a legitimate aim but instead framed it as an error.

III. Necessity and proportionality

In the Board’s view, by reversing its decision following the Board’s selection of the case, Facebook implicitly acknowledged that removing the content was not necessary or proportionate. It further stressed that the breadth of the term “support” in the Community Standards, combined with the misplacement of the internal guidance on what that term excluded, led to an unnecessary and disproportionate removal. Additionally, the Board considered that “there was no demonstrable intent of inciting violence or likelihood that leaving this statement or others like it on the platform would result in harm” [p. 16].

Right to remedy

The Board expressed several concerns indicating that Facebook had failed to respect the right to remedy, in contravention of its Corporate Human Rights Policy. It explained that the user was told an appeal was not available due to COVID-19, yet a second review was later carried out. While it recognized that the content was eventually restored, the Board expressed concern that a considerable number of removals may have occurred that should not have, because Facebook lost the internal guidance allowing discussion of the conditions of confinement of designated individuals. Similarly, the Board believed that Facebook’s transparency reporting was not sufficient to meaningfully assess whether the type of error identified in this case reflected a systemic problem.

Policy advisory statement

The Board, among other things, recommended Facebook: “1. Restore the misplaced 2017 guidance to the Internal Implementation Standards and Known Questions (the internal guidance for content moderators). 2. Evaluate automated moderation processes for enforcement of the Dangerous Individuals and Organizations policy. Where necessary, Facebook should update classifiers to exclude training data from prior enforcement errors that resulted from failures to apply the 2017 guidance. 3. Publish the results of the ongoing review process to determine if any other policies were lost, including descriptions of all lost policies, the period they were lost for, and steps taken to restore them. 5. Add to the Dangerous Individuals and Organizations policy a clear explanation of what “support” excludes. Users should be free to discuss alleged violations and abuses of the human rights of members of designated organizations. 8. Ensure internal guidance and training are provided to content moderators on any new policy […] 9. Ensure that users are notified when their content is removed” [p. 19-20].

Similarly, the Board also suggested to Facebook that “notification[s] should note whether the removal is due to a government request or due to a violation of the Community Standards, or due to a government claiming a national law has been violated (and the jurisdictional reach of any removal). 10. Clarify to Instagram users that Facebook’s Community Standards apply to Instagram in the same way they apply to Facebook. 12. Include more comprehensive information on error rates for enforcing rules on “praise” and “support” of dangerous individuals and organizations, broken down by region and language” [p. 20-21].


Decision Direction


Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Expands Expression

The decision expanded expression because the Board determined that users of Facebook’s platforms should be able to generate discussions regarding human rights violations and abuses related to terrorism and counter-terrorism, even when those discussions name people or organizations designated as dangerous under the company’s policy. The Board encouraged Facebook to protect content on public matters in countries where national legal and institutional protections for human rights, especially freedom of expression, are weak. Finally, the decision expands expression by asking the company to support users’ discussions of the rights of detained people, who may be unable to advocate effectively for their own rights.

Global Perspective


Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or regional laws

  • United Nations Guiding Principles on Business and Human Rights (2011)

    The Board used the UNGPs as the legal basis of Facebook’s commitment to respect human rights.

  • ICCPR, Art. 19, para. 3

    The Board used Article 19 of the ICCPR as a legal basis providing broad protection for freedom of expression through any media and regardless of frontiers. It also used Article 19 to apply the three-part test of legality (clarity), legitimacy, and necessity and proportionality.

  • UN General Assembly, Declaration on Human Rights Defenders, UN Doc. A/RES/53/144 (1999), article 6(c)

    The Board used Article 6(c) as a reference to the right to study, discuss, form and hold opinions on the observance of human rights and fundamental freedoms.

  • UNHR Comm., General Comment No. 34 (CCPR/C/GC/34)

    The Board used General Comment No. 34 as the legal basis to apply the three-part test.

  • UN Special Rapporteur on freedom of opinion and expression, A/HRC/38/35 (2018)

    The Board used Report A/HRC/38/35 to argue that social media platforms’ prohibitions on “praise” and “support” are “excessively vague.”

  • UN Special Rapporteur on freedom of opinion and expression, A/74/486 (2019)

    The Board used Report A/74/486 as the legal basis to analyze the right to remedy.

  • ICCPR, art. 2

    The Board used Article 2 of the ICCPR as the legal basis to analyze the right to remedy.

  • UNHR Comm., General Comment No. 31 (2004)

    The Board used General Comment No. 31 as the legal basis to analyze the right to remedy.

  • UN Standard Minimum Rules for the Treatment of Prisoners (the Nelson Mandela Rules), A/RES/70/175 (2016)

    The Board used the Nelson Mandela Rules to analyze the relationship between freedom of expression, human rights, and prolonged solitary confinement.

  • OSB, Nazi quote, 2020-005-FB-UA (2021)

    By referring to this case, the Board highlighted that it had recommended the company amend its Community Standard on Dangerous Individuals and Organizations to define “representation,” “praise,” and “support.”

  • OSB, Breast cancer symptoms and nudity, 2020-004-IG-UA (2021)

    The Board recalled that through its decision in this case it had recommended that Facebook clarify the relationship between Instagram’s Community Guidelines and the Facebook Community Standards. 

  • OSB, Punjabi concern over the RSS in India, 2021-003-FB-UA (2021)

    The Board stressed the need for Facebook to restore the appeals process in line with recommendations in this case. 

  • OSB, Pro-Navalny protests in Russia, 2021-004-FB-UA (2021)

    The Board cited this case to reiterate its concern regarding Facebook removing content on matters in the public interest in countries where national legal and institutional protections for human rights, particularly freedom of expression, are weak. 

  • OSB, “Two Buttons” Meme, 2021-005-FB-UA (2021)

    The Board cited this case to reiterate its concern regarding Facebook removing content on matters in the public interest in countries where national legal and institutional protections for human rights, particularly freedom of expression, are weak.

Case Significance


Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook [now Meta] will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook [now Meta], it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook [now Meta] will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook [now Meta], and transparently communicating about actions taken as a result.”
