Global Freedom of Expression

Oversight Board Case of Mention of the Taliban in News Reporting

Status: Closed
Decision Direction: Expands Expression

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    September 15, 2022
  • Outcome
    Overturned Meta’s initial decision
  • Case Number
    2022-005-FB-UA
  • Region & Country
    Afghanistan, International
  • Judicial Body
    Oversight Board
  • Type of Law
    Meta's content policies, International Human Rights Law
  • Themes
    Facebook Community Standards, Violence And Criminal Behavior, Dangerous Individuals and Organizations
  • Tags
    Oversight Board Content Policy Recommendation, Oversight Board Policy Advisory Statement, Meta Newsworthiness allowance, Meta Spirit of the Policy allowance, Glorification of terrorism, Terrorism


Case Analysis

Case Summary and Outcome

On September 15, 2022, the Oversight Board overturned Meta’s original decision to remove a Facebook post in which a news outlet reported a positive announcement from the Taliban regime in Afghanistan on women’s and girls’ education. The case originated in January 2022, when a popular Urdu-language newspaper in India reported on its Facebook page that Zabiullah Mujahid, a member of the Taliban regime in Afghanistan and its official central spokesperson, had announced that schools and colleges for women and girls would reopen in March 2022. Meta removed the post, imposed “strikes” against the page administrator, and limited their access to certain Facebook features, because the company determined the post violated the prohibition on praising a designated terrorist group under its Dangerous Individuals and Organizations Community Standard. However, after the Board selected the case for review, Meta determined that the removal was an enforcement error: the content fell within the Dangerous Individuals and Organizations Community Standard’s policy exception for reporting and, thus, should not have been removed.

According to the Board, Meta’s original decision to remove the post was inconsistent with Facebook’s Dangerous Individuals and Organizations Community Standard since the content fell under the policy’s allowance for “reporting on” designated entities. The Board also deemed that Meta’s decision was inconsistent with the company’s human rights responsibilities since it unjustifiably restricted freedom of expression, which encompasses the right to impart and receive information, including on terrorist groups. Although Meta reversed its decision after the Board selected the case, the Board concluded that the user had already experienced several days of feature limits that were not fully rectified.

*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.


Facts

In January 2022, an Urdu-language newspaper based in India posted on its Facebook page a report that Zabiullah Mujahid, a member of the Taliban regime in Afghanistan and its official central spokesperson, had announced that schools and colleges for women and girls would reopen in March 2022. The post linked to an article on the newspaper’s website and was viewed around 300 times.

On January 20, 2022, a Facebook user clicked “report post” on the content but did not complete their complaint. However, this triggered a classifier that assessed the content as potentially violating the Dangerous Individuals and Organizations policy, and thus, the post was sent for human review. An Urdu-speaking reviewer determined that the content violated the Dangerous Individuals and Organizations policy, which “prohibits ‘praise’ of entities deemed to ‘engage in serious offline harms,’ including terrorist organizations” [p.2]. Meta removed the content, imposed “strikes” against the page administrator, and limited their access to certain Facebook features.

Consequently, the user appealed Meta’s decision. After a second human reviewer agreed with the first assessment that the content was violating, the post was placed in a queue for the High-Impact False Positive Override (HIPO), “a system Meta uses to identify cases where it has acted incorrectly, for example, by wrongly removing content” [p.2]. However, since fewer than 50 Urdu-speaking reviewers were allocated to HIPO at the time, and the post was not deemed a high priority, it was never reviewed in the HIPO system.

After the Board selected the case, Meta determined that the post should not have been removed in light of its rules allowing “reporting on” terrorist organizations. On February 25, 2022, the company restored the content, reversed the strike, and removed the restrictions on the user’s account.


Decision Overview

The main issue for the Oversight Board to analyze was whether Meta’s decision to remove the post was in line with Facebook’s Dangerous Individuals and Organizations Community Standard, Meta’s values, and the company’s human rights responsibilities.

In their statement to the Board, the user noted they were a media organization representative and did not support extremism. They argued that their articles were based on national and international media sources and that the content in question aimed to provide information about women’s and girls’ education in Afghanistan.

In its submission to the Board, Meta explained that upon re-examining its original decision, it had decided that the content should not have been removed as “praise” of a designated organization under the Dangerous Individuals and Organizations policy. The company explained that “the underlying news context meant the content should have benefitted from the policy allowance for users to report on designated entities” [p.9]. Further, the company noted that “users may share content that includes references to designated dangerous organizations and individuals to report on, condemn, or neutrally discuss them or their activities” [p.9]. While Meta explained that its policies were designed to “allow room for these types of discussions while simultaneously limiting risks of potential offline harm” [p.9], it required “people to clearly indicate their intent when creating or sharing such content” [p. 9]. The company stated that if a user’s intention was ambiguous or unclear, it would remove the content by default.

Additionally, the company informed the Board that it could not “explain why two human reviewers incorrectly removed the content and did not properly apply the allowance for reporting” [p. 9]. In response to the Board’s question as to whether praise of dangerous organizations could be disseminated as part of news reporting, Meta remarked that its policy “allows news reporting where a person or persons may praise a designated dangerous individual or entity” [p.9]. The company also explained that the “strike system contains two tracks for Community Standards enforcement: one that applies to all violation types (standard), and one that applies to the most egregious violations (severe)” [p.10]. It noted that all Dangerous Individuals and Organizations violations were treated as severe. According to Meta, while the content was sent to the HIPO channel after its removal, it was not prioritized for human review “due to the capacity allocated to the market” and because Meta’s automated systems did not give the content a high priority score in the HIPO queue.

The Board stated that this case was of particular importance because it demonstrated that a lack of clarity in the definition of praise seemed to result in uncertainty among reviewers and users. The Board noted that the complexity of the case lay in the interest in ensuring that terrorist groups or their supporters do not use the platforms for propaganda and recruitment efforts. However, the Board explained that when applied too broadly, such an interest could lead to censorship of any content that reports on these groups.

Compliance with Meta’s content policies

The Board considered that the enforcement actions taken by Meta in this case should not have been imposed since there was no underlying violation of the Community Standards. The Board remarked that Meta’s cross-check system guaranteed secondary human review to users on the Early Response Secondary Review List. It noted that “being on an Early Response Secondary Review list guarantees that Meta employees, and not at-scale reviewers, review the content before it can be removed” [p.14]. Nevertheless, it highlighted that while some news outlets’ Facebook pages were on that list, this page was not. The Board considered it unlikely that the content at issue would have been removed if the page were on the Early Response Secondary Review list.

The Board commended Meta for introducing its HIPO system but expressed concern that it did not lead to a secondary review of a post that conformed with Meta’s Community Standards. It noted that in this case, the content did not receive an additional human review “due to the capacity allocated to the market and because it was not given a priority score by Meta’s automated systems as high as other content in the HIPO queue at that time” [p.14]. Yet, the Board considered that given the public interest nature of the reporting and the fact that a news outlet posted the content, it should have scored highly enough for additional review to have taken place. Likewise, the Board noted its concern that the Urdu-language queue had fewer than 50 reviewers at the time. The Board deemed that the size of the Indian market, the number of groups Meta had designated as dangerous in the region, and the heightened importance of independent voices warranted “greater investment from the company in correcting (and ideally preventing) errors from occurring on such important issues” [p.14].

Compliance with Meta’s values

The Board held that content containing praise of dangerous groups may threaten the value of “Safety” for Meta’s users and others because of its links to offline violence and its potential to “intimidate, exclude or silence others” [p.15]. However, in the instant case, the Board considered that there was no significant safety issue as the content only reported on an announcement by a designated organization. Likewise, the Board stressed that the value of “Voice” was important since media outlets not only provide the public with crucial information but also hold governments accountable. Thus, the Board deemed that Meta’s decision to remove the content did not materially contribute to “Safety” and constituted an unnecessary restriction of “Voice.”

Compliance with Meta’s human rights responsibilities

The Board recalled that the UN Human Rights Committee in General Comment No. 34 declared that “a free, uncensored and unhindered press or other media is essential in any society to ensure freedom of opinion and expression and the enjoyment of other Covenant rights” [p. 8]. Further, it noted that “social media platforms like Facebook have become a vehicle for transmitting journalists’ reporting around the world, and Meta has recognized its responsibilities to journalists and human rights defenders in its corporate human rights policy” [p. 15].

The Board stressed that the right to freedom of expression encompassed the ability of Meta’s users to access information about events of public interest in Afghanistan, particularly when a designated dangerous group had forcibly removed the recognized government. In the Board’s view, the information posted by the user was essential for people concerned about girls’ and women’s equal right to access education, regardless of whether the Taliban met those commitments.

By employing the three-part test in Article 19 of the International Covenant on Civil and Political Rights (ICCPR), the Board then proceeded to analyze whether Meta’s initial decision to remove the content was consistent with its human rights responsibilities as a business.

I. Legality (clarity and accessibility of the rules)

While the Board recognized that Meta’s Dangerous Individuals and Organizations policies contained more detail than when the Board issued its first recommendations, serious concerns remained. The Board highlighted that it was unclear whether the Taliban remained a designated dangerous entity when it forcibly removed the recognized government of Afghanistan. Referring to its previous decisions in the “Nazi quote” and “Öcalan’s isolation” cases, the Board stressed that, to bring users clarity, it had previously recommended that Meta disclose either a full list of designated entities or an illustrative one. The Board stated that it regretted the lack of progress on this recommendation.

The Board considered the definition of “praise” in the public-facing Community Standards, as “speaking positively about” a designated entity, too broad. It noted that “[f]or people engaged in news reporting, it is unclear how this rule relates to the reporting allowance built into the same policy” [p.17]. The Board recalled that, according to Meta, “this allowance permits news reporting even where a user praises the designated entity in the same post” [p. 17]. Yet, the Board deemed that the “relationship between the ‘reporting’ allowance in the Dangerous Individuals and Organizations policy and the overarching newsworthiness allowance remains unclear to users” [p. 17]. The Board recalled that in its decision in the “Shared Al Jazeera post” case, it had recommended that Meta provide criteria and illustrative examples in the Community Standards on what constitutes news reporting. However, Meta stated that it was in the process of consulting with several teams internally to develop such criteria and that it expected to conclude this process by Q4 2022. The Board also expressed concern that changes to the Community Standard were not being translated into all available languages and that there were inconsistencies across languages. It therefore determined that the policy was not equally accessible to all users, making it difficult for them to understand what is permitted.

Further, the Board commented that it was concerned that Meta had not done enough to clarify how the strikes system worked. The Board deemed that Meta’s Transparency Centre page on “Restricting Accounts” did not comprehensively list the feature limits the company could apply or their duration, nor did it set out periods for severe strikes as it did for standard strikes. The Board found it particularly concerning that, while Meta established that severe strikes carried more significant penalties, it failed to provide a mechanism for appealing account-level sanctions separately from the content decision. Moreover, the Board considered that feature limits could not always be fully reversed even when the content was restored. In the instant case, the Board found that the user experienced several days of feature limits which were not fully rectified when Meta reversed its decision.

The Board noted that the Known Questions document (Meta’s internal guidance for moderators) defined praise as content that “makes people think more positively about” a designated group. In the Board’s opinion, this was arguably broader than the public-facing definition in the Community Standards, making the meaning of “praise” less about the speaker’s intent than the effects on the audience. Also, the Board stressed that neither the Community Standards nor the Known Questions document constrained the reviewer’s discretion on restricting freedom of speech.

II. Legitimate aim

The Oversight Board cited its decision in the “Punjabi Concern over the RSS in India” case to emphasize that it had previously “recognized that the Dangerous Individuals and Organizations policy pursues the aim of protecting the rights of others, including the right to life, security of person, and equality and non-discrimination” [p. 18]. The Board acknowledged that propaganda from designated entities, including through proxies presenting themselves as independent media, could pose risks of harm to the rights of others, and thus determined that seeking to mitigate those harms through this policy was a legitimate aim.

III. Necessity and proportionality

The Board noted that under General Comment 34, the UN Human Rights Committee had underscored that “the media plays a crucial role in informing the public about acts of terrorism and its capacity to operate should not be unduly restricted. In this regard, journalists should not be penalized for carrying out their legitimate activities” [p. 19]. Accordingly, the Board stressed that Meta is responsible for preventing and mitigating its platforms’ negative human rights impact on news reporting. 

For the Board, the type of enforcement errors that occurred while reviewing the content in this case could indicate broader failures. The Board noted that “those engaged in regular commentary on the activities of Tier 1 dangerous individuals and organizations face heightened risks of enforcement errors leading to their accounts facing severe sanctions” [p. 19], which could “undermine their livelihoods and deny the public access to information at key moments” [p. 19].

Moreover, the Board pointed out that the policy of defaulting to removal under the Dangerous Individuals and Organizations policy when users did not make clear that they intended to “report” could lead to over-removal of non-violating content. The Board stated that in this case, the mistake prevention and correction system did not benefit the user as it should have, which indicated issues with how the HIPO system prioritized content decisions for additional review. For the Board, in this case, “the enforcement error and failure to correct it denied a number of Facebook users access to information on issues of global importance and hampered a news outlet in carrying out its journalistic function to inform the public” [p. 8]. The Board warned that “Journalists may report on events in an impartial manner that avoids the kind of overt condemnation that reviewers may be looking to see. To avoid content removals and account sanctions, journalists may engage in self-censorship, and may even be incentivized to depart from their ethical professional responsibilities” [p. 19].

Likewise, the Board noted that by issuing “spirit of the policy” exceptions related to the Taliban, Meta implicitly recognized that, at times, its approach under the Dangerous Individuals and Organizations policy could produce results inconsistent with the policy’s objectives, and thus fail to meet the necessity requirement.

According to the Board, internal company materials obtained by journalists revealed that in September 2021, the company created an exception “to allow content shared by the [Afghanistan] Ministry of Interior” on matters such as new traffic regulations, and to allow two specific posts from the Ministry of Health in relation to COVID-19 [p. 20]. Moreover, the Board noted that other exceptions had reportedly been more tailored and shorter-lived. It exemplified this point by noting that in August 2021, for 12 days, “government figures” could reportedly acknowledge the Taliban as the “official gov of Afghanistan” [sic] without risking account sanctions. Similarly, the Board noted that from late August 2021 to September 3, Facebook users could “post the Taliban’s public statements without having to ‘neutrally discuss, report on, or condemn’ these statements” [p. 20].

The Board noted that on January 25, 2022, during a Policy Forum on the Crisis Policy Protocol, Meta had stated that it would deploy “policy levers” in crisis situations and provided the example of allowing “praise of a specifically designated organization” [p. 20]. The Board considered that such exceptions to the general prohibition on praise could cause further uncertainty about when exceptions apply, not only for reviewers but also for users.

In light of the preceding, the Board concluded that Meta’s decision to remove the content from Facebook was an unnecessary and disproportionate measure.

Policy advisory statement

Regarding the Dangerous Individuals and Organizations policy, the Board recommended that Meta: i) investigate why the December 2021 changes to the policy were not translated within the target timeframe of six weeks and ensure such delays or omissions are not repeated; and ii) make the public explanation of its “strike” system more comprehensive and accessible.

Concerning the enforcement of the policy, the Board recommended that Meta: iii) narrow the definition of “praise” in the Known Questions (internal guidance for moderators) by removing the example of content that “seeks to make others think more positively about” dangerous organizations; iv) revise its Implementation Standards to clarify that the reporting allowance in the Dangerous Individuals and Organizations policy permits positive statements, and explain in the Known Questions the importance of protecting news reporting in conflict or crisis situations; v) assess the accuracy with which reviewers enforce the reporting allowance to the Dangerous Individuals and Organizations policy in order to identify the causes of errors; vi) conduct a review of the HIPO system to examine whether it can more effectively prioritize potential errors in enforcing exceptions to the Dangerous Individuals and Organizations policy; and vii) increase the capacity allocated to HIPO review across all languages.


Decision Direction


Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Expands Expression

The Oversight Board’s decision expands expression by recognizing that freedom of expression encompasses the right to impart and receive information, including on terrorist groups, which is particularly important in times of conflict and crisis, including where terrorist groups exercise control of a country.

 

Global Perspective


Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or regional laws

  • ICCPR, art. 19

The Board analyzed Meta’s human rights responsibilities through this provision on freedom of expression. It employed the three-part test established in this Article to assess whether Meta’s actions were a justifiable limitation on expression.

  • UNHR Comm., General Comment No. 34 (CCPR/C/GC/34)

While employing the three-part test to assess whether Meta’s restriction of expression was justified, the Board referred to this General Comment for guidance.

General Law Notes

Oversight Board Decisions:

  • Shared Al Jazeera post  (2021-009-FB-UA).
    • The Board noted that in its decision in this case, it had urged Meta to provide public criteria and examples in its Dangerous Individuals and Organizations Community Standard for the following allowances to the policy: “neutral discussion”; “reporting”; and “condemnation.”
  • Öcalan’s isolation (2021-006-IG-UA).
    • The Board referred to this case to emphasize that it had previously suggested Meta clarify, in its public-facing Dangerous Individuals and Organizations Community Standard, how users can make their intent clear when posting.
  • Punjabi Concern over the RSS in India (2021-003-FB-UA).
    • The Board noted that in its decision on this case it had recommended that Meta should aim to make its Community Standards accessible in all languages widely spoken by its users. 
  • Nazi quote (2020-005-FB-UA). 
    • The Board recalled that in its decision on this case it had recommended that Meta provide examples of “praise,” “support” and “representation” in the Community Standard on Dangerous Individuals and Organizations.

Case Significance


Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”
