Global Freedom of Expression

Oversight Board Case of Shared Al Jazeera Post

Closed · Expands Expression

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    September 14, 2021
  • Outcome
    Agreed with Meta’s revised decision to restore the content
  • Case Number
    2021-009-FB-UA
  • Region & Country
    International
  • Judicial Body
    Oversight Board
  • Type of Law
    Meta's content policies, International Human Rights Law
  • Themes
    Facebook Community Standards, Violence And Criminal Behavior, Dangerous Individuals and Organizations
  • Tags
    Oversight Board Policy Advisory Statement, Oversight Board Content Policy Recommendation, Oversight Board Transparency Recommendation, Terrorism

Content Attribution Policy

Global Freedom of Expression is an academic initiative; therefore, we encourage you to share and republish excerpts of our content, so long as they are not used for commercial purposes and you respect the following policy:

  • Attribute Columbia Global Freedom of Expression as the source.
  • Link to the original URL of the specific case analysis, publication, update, blog, or landing page of the downloadable content you are referencing.

Attribution, copyright, and license information for media used by Global Freedom of Expression is available on our Credits page.

Case Analysis

Case Summary and Outcome

On September 14, 2021, the Oversight Board agreed that Facebook (now Meta) was correct to reverse its original decision to remove a Facebook post that shared a news report about a threat of violence from the Izz al-Din al-Qassam Brigades, the military wing of the Palestinian group Hamas. The company had initially removed the content under the Dangerous Individuals and Organizations Community Standard and restored it after the Board selected the case for review. The Board concluded that removing the content did not reduce offline harm and restricted freedom of expression on an issue of public interest.

*The Oversight Board is a separate entity from Meta that provides its independent judgment on individual cases and policy questions. An independent trust funds both the Board and its administration. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. Its decisions are binding unless implementing them could violate the law. The Board can also issue content policy recommendations to the company.


Facts

The content in this case relates to the May 2021 armed conflict between Israeli forces and Palestinian militant groups in Israel and Gaza, a Palestinian territory governed by Hamas. The clash broke out after weeks of rising tensions and protests in Jerusalem tied to a dispute over the ownership of homes in the Sheikh Jarrah neighbourhood of East Jerusalem and an Israeli Supreme Court ruling concerning the planned eviction of four Palestinian families from the disputed properties. These tensions had escalated into a series of sectarian assaults by both Arab and Jewish mobs. On May 10, Israeli forces raided the Al-Aqsa Mosque, injuring hundreds of worshippers during Ramadan prayers. After this raid, the Al-Qassam Brigades issued an ultimatum, demanding that Israeli soldiers withdraw from the Mosque and Sheikh Jarrah by 6 pm. After the deadline expired, Al-Qassam and other Palestinian militant groups in Gaza launched rockets at the civilian center of Jerusalem, beginning 11 days of armed conflict.
On May 10, 2021, a Facebook user in Egypt with more than 15,000 followers shared a post from the verified Al Jazeera Arabic page consisting of text in Arabic and a photo. The photo portrayed two men in camouflage with their faces covered, wearing headbands bearing the symbol of the Al-Qassam Brigades, a Palestinian armed group and the military wing of Hamas. The group has been accused of committing war crimes by the UN Independent Commission of Inquiry on the 2014 Gaza Conflict (A/HRC/29/CRP.4). The text read: “The resistance leadership in the common room gives the occupation a respite until 18:00 to withdraw its soldiers from Al-Aqsa Mosque and Sheikh Jarrah neighborhood otherwise he who warns is excused. Abu Ubaida – Al-Qassam Brigades military spokesman” [p. 4]. Al Jazeera’s caption read: “‘He Who Warns is Excused’. Al-Qassam Brigades military spokesman threatens the occupation forces if they do not withdraw from Al-Aqsa Mosque” [p. 4]. The user shared Al Jazeera’s post and added “Ooh” in Arabic as a caption.

On the same day, a different user in Egypt reported the post, selecting “terrorism” from the fixed list of reasons Facebook gives people who report content. An Arabic-speaking moderator in North Africa evaluated the content and removed it for violating the Dangerous Individuals and Organizations policy. Both the Al-Qassam Brigades and their spokesperson Abu Ubaida are designated as dangerous under this Community Standard.

The user appealed. The content was then reviewed by a different moderator in Southeast Asia who did not speak Arabic but had access to an automated translation of the content; this reviewer also found a breach of the Dangerous Individuals and Organizations policy. The user received a notification explaining that Facebook upheld the initial decision, along with notice of a three-day read-only restriction on their account due to the violation. The company also restricted the user’s ability to broadcast live-streamed content and use advertising products on the platform for 30 days. The user then appealed to the Oversight Board. After the Board selected the case for review, Facebook restored the content, stating that it had been removed in error.


Decision Overview

The main issue before the Board was whether Facebook was correct in its original decision to remove the user’s republication of the Al Jazeera report from the platform. The Board also analyzed, through a three-part test, whether Facebook’s original decision to remove the post complied with international human rights standards on freedom of expression.

In their appeal to the Board, the user explained that they had shared the Al Jazeera post to update their followers on the developing crisis, an important issue that more people should be aware of, and stressed that they had added nothing but the caption “ooh.”

Facebook stated that it could not explain why the two human reviewers decided that the content infringed the Dangerous Individuals and Organizations policy, noting that moderators are not required to record their reasoning for individual content decisions.

Moreover, the company stated that because the Board selected the case for review, it had reexamined its decision and found that the content did not contain praise, substantive support, or representation of the Al-Qassam Brigades or Hamas, their activities, or their members. Thus, it reversed its decision, since Al Jazeera’s post was non-violating and the user had shared it with a neutral caption. Additionally, Facebook noted in its submission that it had not removed the original post from the Al Jazeera Arabic Facebook page. The company explained that the media outlet’s page is subject to a cross-check system, an additional layer of review that Facebook applies to some high-profile accounts to minimize the risk of enforcement errors. However, cross-checking is not performed on content shared by third parties unless the third party is itself a high-profile account subject to cross-checking. Therefore, in this case, although the root post by Al Jazeera was subject to cross-checking, the post by the user in Egypt was not.

Reacting to allegations that Facebook had censored Palestinian content due to the Israeli government’s demands, the Board asked Facebook, among other questions, if it had received official and unofficial requests from Israel to take down content related to the conflict. The company responded by stating that it had not received a valid legal request from a government authority related to the content the user posted in this case.

Compliance with Community Standards

The Board then examined whether Facebook’s actions aligned with the recently updated version of the Dangerous Individuals and Organizations policy. According to the Board, “individuals have no less right to repost news than news media organizations have to publish it in the first place. In some contexts, the republication of material from a news source might violate the policy. However, in this case, the user has explained that their purpose was to update their followers on a matter of current importance. Facebook’s conclusion (on reexamination) that the user’s addition of the expression ‘ooh’ was most likely neutral is confirmed by the Board’s language experts” [p. 11]. Therefore, the Board considered that the post did not violate the Community Standard, and thus Facebook did not err in restoring it.

Compliance with Facebook’s values

Regarding whether the decision to restore the content complied with Facebook’s values, the Board examined, on the one hand, the importance of “Voice,” which Facebook describes as “paramount” and which matters especially in contexts of conflict, where people’s ability to express themselves is highly restricted. The Board stressed that social media platforms, including Facebook, are the primary means by which Palestinians communicate news and opinions and express themselves freely, given the severe limitations on freedom of expression in territories governed by the Palestinian Authority and Hamas.

On the other hand, the Board examined the value of “Safety,” which was implicated because the user had shared a post from a media organization containing an explicit threat of violence from the Al-Qassam Brigades. In the Board’s reasoning, because the content was a news media report of the threat, it was already broadly available worldwide, both on and off Facebook. Hence, the Board did not find that the post posed an additional threat to “Safety.”

Compliance with Facebook’s human rights responsibilities

The Board recalled that Article 19 of the International Covenant on Civil and Political Rights (ICCPR) states that everyone has the right to freedom of expression, which includes the freedom to seek, receive, and impart information. Moreover, it underscored that, as indicated in the UN Human Rights Committee’s General Comment No. 34, the enjoyment of this right is intrinsically tied to access to a free, uncensored, and unhindered press or other media. The Board stated that the media “plays a crucial role in informing the public about acts of terrorism and its capacity to operate should not be unduly restricted” [p. 12]. Yet, the Board highlighted that terrorist groups might exploit the media’s duty and interest to report on their activities. Nevertheless, it held that Facebook should not use counterterrorism and counter-extremism efforts to repress media freedom. In the same vein, the Board stressed that social media platforms have an essential role to play in the first moments after a terrorist act by supporting the dissemination of information.

To assess whether the restriction on freedom of expression was justified, the Board employed the three-part test set out in Article 19 of the ICCPR.

I. Legality (clarity and accessibility of the rules)
The Board began by recalling its earlier criticism of the vagueness of the Dangerous Individuals and Organizations Community Standard, noting that it had previously called on the company to define the terms “praise”, “support”, and “representation”. The Board highlighted that Facebook had since revised the policy, releasing an update in which the company “defined or gave examples of some key terms in the policy. It organized its rules around three tiers of enforcement according to the connection between a designated entity and offline harm. It further stressed the importance of users’ clear intent when posting content related to dangerous individuals or organizations” [p. 13].

However, the Board stressed that the policy remained unclear on how users can make their intentions clear and failed to provide examples of the “news reporting”, “neutral discussion”, and “condemnation” exceptions. In the Board’s opinion, criteria for assessing these exceptions, including illustrative examples, would help users understand what posts are permissible.
In addition, the Board voiced concern that the updated version of the Community Standard was not translated into languages other than US English for almost two months, even though Facebook applied the policy changes globally while translations were still pending. The Board emphasized that such delays left the rules inaccessible to too many users for too long.

II. Legitimate aim
The Board then proceeded to analyze if the restriction set out by the Dangerous Individuals and Organizations policy had a legitimate aim in the immediate case. It highlighted that the mission of this content policy is to prevent and disrupt real-world harm with the legitimate aim of protecting the rights of others, which in this case included the right to life and the security of persons.

III. Necessity and proportionality

Finally, the Board examined whether the restriction was necessary and proportionate to achieve its legitimate aim of protecting the rights of others. In the Board’s view, removing the content in the present case was unnecessary. It recognized that journalists face a challenge in balancing the potential harm of reporting statements from terrorist organizations against keeping the public informed about evolving and dangerous situations. Some Board members expressed concern that the reporting in this instance provided little or no editorial context for Al-Qassam’s statements and thus could be seen as a conduit for Al-Qassam’s threat of violence. However, the content posted by Al Jazeera was widely available globally. As a result, the Board determined that removing the user’s republication of the Al Jazeera report did not materially reduce the terroristic impact the group presumably intended to induce, but instead affected the ability of the user, in a nearby country, to communicate the importance of these events to their readers and followers.

Moreover, the Board noted that the Israeli government, the Palestinian Authority, and Hamas all unduly restrict free speech, negatively impacting Palestinians and other voices. In the same vein, it held that discriminatory enforcement of the Community Standards would violate a fundamental aspect of freedom of expression. The Board further commented that it had received public comments alleging that Facebook has disproportionately removed or demoted content from Palestinian users and content in the Arabic language, compared with its treatment of posts threatening or inciting anti-Arab or anti-Palestinian violence within Israel. At the same time, the company has been criticized for not doing enough to remove content that incites violence against Israeli civilians.

In conclusion, the Board affirmed Facebook’s decision to restore the content, agreeing that the original decision to take down the post was in error.

Policy Advisory Statement

Regarding Meta’s Content Policy, the Board recommended the company clarify its rules to users by adding criteria and illustrative examples to its Dangerous Individuals and Organizations policy to increase understanding of the exceptions for neutral discussion, condemnation, and news reporting. The Board also urged Meta to ensure swift translation of updates to the Community Standards into all available languages.

Concerning transparency, the Board recommended that the company engage an independent entity to conduct a thorough examination of whether Facebook’s content moderation in Arabic and Hebrew, including its use of automation, is applied without bias. Moreover, the Board recommended that Meta formalize a transparent process for receiving and responding to all government requests for content removal, and ensure that such requests are included in its transparency reporting.


Decision Direction

Quick Info

Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Expands Expression

The Oversight Board’s decision expands expression by affirming Facebook’s decision to restore the content. It highlighted that the original removal did not materially reduce the offline terroristic impact the group presumably intended to induce, but instead affected the ability of the user, in a nearby country, to communicate the importance of these events to their readers and followers. Moreover, the Board stressed that individuals have as much right to repost news stories as media organizations have to publish them in the first place, which provides robust protection to the right to access information.

Global Perspective

Quick Info

Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or regional laws

  • ICCPR, art. 19

    The Board analyzed Facebook’s human rights responsibilities through this provision on freedom of expression, employing the three-part test established in this Article to assess whether the restriction on expression was permissible.

  • ICCPR, art. 6

    The Board interpreted the right to life contained in this Article in light of General Comment No. 36 of the Human Rights Committee (2018).

  • ICCPR, art. 9

    The Board interpreted the right to security of person contained in this Article in light of General Comment No. 35 of the Human Rights Committee (2014).

  • UNHR Comm., General Comment No. 34 (CCPR/C/GC/34)

    While employing the three-part test to assess whether the restriction on expression was permissible, the Board referred to this General Comment for guidance.

Case Significance

Quick Info

Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

Article 2 of the Oversight Board Charter states, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with similar context – which the board has already decided upon – remains on Facebook, it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or an advisory policy opinion, Facebook will take further action by analyzing the operational procedures required to implement the guidance, considering it in Facebook’s formal policy development process, and transparently communicating about actions taken as a result.”

