Global Freedom of Expression

Oversight Board Case of Al-Shifa Hospital

Closed · Expands Expression

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    December 19, 2023
  • Outcome
    Oversight Board Decision, Overturned Meta’s initial decision
  • Case Number
    2023-049-IG-UA
  • Region & Country
    State of Palestine, Middle East and North Africa
  • Judicial Body
    Oversight Board
  • Type of Law
    International Human Rights Law, Meta's content policies
  • Themes
    Facebook Community Standards, Violent and graphic content
  • Tags
    Public Interest, Social Media


Case Analysis

Case Summary and Outcome

The Oversight Board overturned Meta’s original decision to remove an Instagram video that depicted deceased and severely injured people after a strike on Al-Shifa Hospital in Gaza during Israel’s military operations. A user posted a video of the aftermath of the strike, claiming that the Israeli army had attacked the hospital. Meta’s automatic classification system removed the content for violating the Violent and Graphic Content policy. The user appealed this decision, but the appeal was closed automatically without human review. After the Board selected the case, Meta restored the content with a warning screen and limited the post’s visibility to people over 18. Meta also excluded the post from recommendations to adult Instagram users. The Board found that the warning screen and age restriction complied with Meta’s policies, values, and responsibilities under international human rights law. However, the Board held that the original decision to remove the content did not comply with Meta’s Violent and Graphic Content policy. The Board underscored the very high public interest in the post and advised Meta to create a policy exception for content raising awareness of and condemning potential human rights abuses and war crimes, especially in times of war or armed conflict. Moreover, the Board expressed its concern over the unintentional bias in Meta’s practices against Palestinian and Arabic-speaking users. It also stressed Meta’s responsibility to preserve potential evidence of human rights abuses and humanitarian law violations.

*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.


Facts

On October 7, 2023, Hamas, which Meta designated as a Tier 1 organization under its Dangerous Organizations and Individuals Community Standard, led an attack on Israel, killing 1,200 people and taking 240 hostages. In response to the attacks, Israel began a military campaign in Gaza that had killed more than 18,000 people as of mid-December 2023. The conflict became a subject of wide global interest and debate on and off social media, including Meta’s platforms. Meta immediately designated the events of October 7 as a terrorist attack and prohibited any content praising or supporting them.

Meta put several temporary measures in place, such as lowering the confidence thresholds of its Violent and Graphic Content automatic classification system (classifier) for identifying and removing violent and graphic content from Israel and Gaza across all languages. The temporary measures were implemented to prioritize the value of safety, and they led to increased content removal. While they reduced the possibility of failing to remove violating content, they also increased the possibility of removing non-violating content related to the conflict.
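
The tradeoff described in this paragraph can be illustrated with a minimal sketch. The example below is hypothetical: the function names, data structure, and threshold values are invented for illustration and are not Meta’s actual system. It only shows why a lower removal threshold catches more genuinely violating posts while also sweeping in borderline, non-violating ones.

```python
# Illustrative sketch only: hypothetical names and threshold values, not Meta's
# actual moderation pipeline. Lowering the removal threshold catches more
# violating posts but also removes more borderline, non-violating posts.

from dataclasses import dataclass

@dataclass
class Post:
    id: str
    violation_score: float  # classifier confidence that the post violates the policy

NORMAL_THRESHOLD = 0.90   # hypothetical ordinary removal bar
LOWERED_THRESHOLD = 0.70  # hypothetical temporary, crisis-time removal bar

def should_remove(post: Post, threshold: float) -> bool:
    """Automatically remove the post when classifier confidence meets the threshold."""
    return post.violation_score >= threshold

posts = [
    Post("a", 0.95),  # clearly violating: removed under either threshold
    Post("b", 0.75),  # borderline (e.g. graphic but awareness-raising): removed only under the lowered bar
    Post("c", 0.40),  # clearly non-violating: kept under either threshold
]

for threshold in (NORMAL_THRESHOLD, LOWERED_THRESHOLD):
    removed = [p.id for p in posts if should_remove(p, threshold)]
    print(f"threshold={threshold}: removed={removed}")
```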

Because repeated violations of Meta’s content policies can lead to account suspension or other restrictions, Meta also withheld the strikes that would ordinarily accompany these content removals, so as to mitigate the consequences of erroneous removals for users. The changes to the classifier confidence thresholds and the withholding of strikes were meant to be temporary and limited to the Israel-Gaza conflict. As of December 11, 2023, Meta had not restored the confidence thresholds to their previous levels.

In the second week of November, a video was posted on Instagram showing dead or injured people, including children, in or near Al-Shifa Hospital in Gaza after a strike. One of the children appeared to be dead and had a severe head injury. The user captioned the video stating that the hospital had been targeted by the “usurping occupation,” referring to the Israeli army, and tagged human rights and news organizations. The caption was in both Arabic and English.

Meta’s automated systems removed the content, finding that it violated the Violent and Graphic Content Community Standard. At the time the content was published, this standard allowed imagery depicting the “violent death of a person or people by accident or murder” to remain on the platform behind a warning screen, visible only to people over the age of 18. The user appealed the removal decision. The appeal was automatically rejected due to the classifier’s “high confidence level” that the content was violating. Following that, the user appealed Meta’s decision to the Oversight Board.

After the Board selected this case, Meta reversed its original decision, arguing that the post should have been kept up with a warning screen even though it was on the “borderline” of violating the company’s standards. Meta emphasized that “such content is permitted when shared to raise awareness ‘about important issues such as human-rights abuses, armed conflicts or acts of terrorism.’” [p. 4]

Therefore, Meta restored the content with a warning screen informing users that the content may be disturbing and limited the post’s visibility to people over 18. Meta also excluded the post from recommendations to adult Instagram users. In addition, Meta added a separate instance of the same video to a Media Matching Service bank so that matching videos would automatically receive a warning screen and appear only in the feeds of people over 18.
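
The general idea of a matching bank can be sketched roughly as follows. This is a minimal, hypothetical illustration: the hashing scheme, function names, and enforcement fields are invented and are not Meta’s actual Media Matching Service. It only shows how banking a known video’s fingerprint with an action lets later matching uploads receive the same enforcement automatically.

```python
# Illustrative sketch only: hypothetical hashing and enforcement logic, not
# Meta's actual Media Matching Service. A fingerprint of a known video is
# "banked" together with an action, so future uploads that match it receive
# the same treatment (warning screen, 18+ visibility, excluded from recommendations).

import hashlib

media_bank = {}  # maps video fingerprint -> enforcement action to apply

def fingerprint(video_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; a real matching service also catches near-duplicates."""
    return hashlib.sha256(video_bytes).hexdigest()

def bank_video(video_bytes: bytes, action: dict) -> None:
    """Store the video's fingerprint with the enforcement action it should trigger."""
    media_bank[fingerprint(video_bytes)] = action

def enforce_on_upload(video_bytes: bytes) -> dict:
    """Look up a newly uploaded video; unmatched uploads get no automatic action."""
    default = {"warning_screen": False, "min_age": None, "recommendable": True}
    return media_bank.get(fingerprint(video_bytes), default)

# Bank the video with the outcome described above (hypothetical values).
original_video = b"<bytes of the banked video>"
bank_video(original_video, {"warning_screen": True, "min_age": 18, "recommendable": False})

print(enforce_on_upload(original_video))        # matched: warning screen, 18+ only
print(enforce_on_upload(b"<unrelated video>"))  # no match: no automatic action
```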


Decision Overview

The main issue before the Oversight Board was whether Meta’s original decision to remove a video depicting the aftermath of a strike on Al-Shifa Hospital during the Israel-Gaza conflict, and its subsequent restoration under restricted conditions, was in line with Meta’s content policies and human rights responsibilities.

The user who appealed Meta’s decision to the Board stated that the post did not incite violence and was only shared to show the suffering of Palestinians, including children. The user also claimed that the removal was biased against the suffering of Palestinians.

In its submission to the Board, Meta stated that deciding how to treat content like the post in this case is challenging, and that it therefore valued the Board’s input on the matter.

1. Compliance with Meta’s content policies

The Board noted that even though its members hold different opinions about Israel’s military actions and their justification, they unanimously agreed that it was crucial for Meta to uphold the right to freedom of expression of all those affected by the events. It highlighted the exceptional public interest value in protecting content that raises awareness about the impact of the conflict; thus, content “on the borderline” of violating the Violent and Graphic Content policy should not be removed. The Board also considered that the warning screen and the restriction of the content to adults were in line with the Violent and Graphic Content policy, as the video showed a person’s violent death and a severe head injury.

The Board agreed with Meta’s subsequent determination that even if the video showed visible internal organs, it should have been kept on the platform behind a warning screen—and only available to adult users—as the caption’s language made it clear that the user was condemning and raising awareness about the violence. Moreover, the Board highlighted that the Violent and Graphic Content policy’s rationale allows graphic content with some limitations, in severe contexts such as human rights abuses, armed conflicts, and acts of terrorism, “to help people condemn and raise awareness about these situations.” [p. 6]

The Board underlined that the Violent and Graphic Content policy prohibited “’visible internal organs’ in a non-medical context, without providing reviewers the option of adding a warning screen where the policy rationale exception is engaged.” [p. 6] Thus, it expressed concern about Meta’s automated classifiers, which do not appear to be configured to apply warning screens when necessary. Moreover, it was not clear to the Board whether the classifiers could escalate content to human review when the context was ambiguous.

2. Compliance with Meta’s human rights responsibilities

The Board reiterated that Meta’s moderation of graphic content must comply with the right to freedom of expression under the International Covenant on Civil and Political Rights (ICCPR). Citing its decision in the Armenian Prisoners of War Video case, the Board stated that the protection of freedom of expression under Article 19 of the ICCPR remains engaged during armed conflicts, alongside the complementary rules of international humanitarian law, which together inform Meta’s human rights responsibilities during such conflicts. The Board also referred to the UN Guiding Principles on Business and Human Rights, which address the responsibility of businesses operating in conflict settings.

The Board emphasized the crucial role of Meta’s platforms in the dissemination of real-time information about violent events as highlighted by the Board in the Mention of the Taliban in News Reporting decision. It also stressed the great public interest value in content depicting human rights abuses and violent attacks as underlined by the Board in the case of Sudan Graphic Video.

Subsequently, the Board evaluated Meta’s decisions in light of the three-part test enshrined in Article 19(3) of the ICCPR to assess whether Meta’s original decision was compatible with the company’s obligations under international human rights law.

Legality (clarity and accessibility of the rules)

The principle of legality, the Board said, requires that rules restricting freedom of expression be accessible and sufficiently clear to indicate what is permitted and what is not. In this respect, the Board, following the Sudan Graphic Video and Video After Nigeria Church Attack cases, expressed concern about the mismatch between the Violent and Graphic Content policy’s rules and its rationale, which allows graphic content, with some limitations, in severe contexts such as human rights abuses, armed conflicts, and acts of terrorism, when shared to raise awareness about these situations. Considering this, the Board reiterated “the importance of recommendations no. 1 and no. 2 in the Sudan Graphic Video case, which called on Meta to amend its Violent and Graphic Content Community Standard to allow videos of people or dead bodies when shared for the purpose of raising awareness of or documenting human-rights abuses (that case concerned visible dismemberment).” [p. 7] The Board stated that this recommendation should also apply to the rules on the depiction of internal organs, specifically by including warning screens as an enforcement measure when the awareness or condemnation exceptions are engaged.

Legitimate aim

Article 19(3) of the ICCPR, the Board held, provides an exhaustive list of aims for which freedom of expression may be restricted. The Board reiterated that the Violent and Graphic Content policy aims to protect the rights of others, including the right to privacy of deceased individuals, as argued in the Sudan Graphic Video and Video After Nigeria Church Attack cases. In the present case, the Board also pointed to another legitimate aim: “restricting access to the content for people under 18 served the legitimate aim of protecting the right to health of minors (Convention on the Rights of the Child, Article 24).” [p. 8]

Necessity and proportionality

The principle of necessity and proportionality, as the Board explained, requires that restrictions on freedom of expression be the least intrusive measures available to achieve their protective function. In this regard, the Board reiterated its stance in the Sudan Graphic Video case, according to which warning screens “[do] not place an undue burden on those who wish to see the content while informing others about the nature of the content and allowing them to decide whether to see it or not.” [p. 8]

The Board drew a distinction between this case and the Russian Poem case. In the latter, the Board found that applying a warning screen was unnecessary and disproportionate because it was applied to a still image of a body viewed at a distance, in which the victim’s face was not visible and there were no visual indicators of violence. In this case, by contrast, the contested video showed dead and injured people at close range with clear indicators of violence. The depiction of deceased and injured children, the Board held, made the video especially distressing, which justified the warning screen in light of the necessity and proportionality requirements.

Nonetheless, the Board found that restrictions preventing adult users from accessing content that raises awareness about potential human rights abuses, violations of the laws of war, conflicts, or acts of terrorism were neither necessary nor proportionate. The Board explained that restrictions such as warning screens and the removal of a post from recommendations “serve separate functions, and should in some instances be decoupled,” [p. 8] for example in crisis situations. As the Board noted, removing content from recommendation systems reduces the reach the content would otherwise have. The Board held that this practice was disproportionate, especially where the content was already limited to adult users and concerned matters of public interest.

The Board recognized that responses to a crisis require exceptional temporary measures and that it is legitimate, in some cases, to limit freedom of expression to prioritize safety concerns. However, such measures, the Board opined, should comply with human rights. Hence, the Board stated that safety concerns did not justify removing graphic content that raises awareness about, or condemns, possible war crimes, crimes against humanity, and grave human rights violations. Such restrictions, the Board said, could even limit the dissemination of information essential to the safety of those on the ground during the conflict.

The Board noted that although not imposing strikes mitigated the disproportionate effects of enforcement errors, it did not protect users who shared content to raise awareness about potential human rights abuses and violations of humanitarian law. The Board reiterated the need to develop a content moderation framework for crises and conflict zones; in its view, such a framework should include the resources needed to ensure that freedom of expression is not unjustly restricted. The Board also considered that, since journalistic sources are subject to attack during armed conflicts, social media platforms are an essential tool for ordinary citizens to report the news during such times.

Furthermore, the Board highlighted that more graphic and violent content is likely to be shared in times of war to raise awareness of abuses or to document them; allowances for this type of content should therefore be built into the policy. The Board acknowledged that Meta was already modifying its Violent and Graphic Content Community Standard to implement recommendations in this regard, and it expected the company to be ready to implement temporary measures allowing this type of content behind warning screens without removing it from recommendations.

The Board also argued that the situation in Gaza at the time the content was posted did not present Meta with challenges comparable to those of the October 7 attacks: journalists’ access to the territory was limited, internet connectivity was disrupted, and terrorists were not using Meta’s platforms to disseminate their attacks. In light of this context, the Board held that Meta should ensure that its actions do not limit people’s ability to post content showing the struggles of civilians during armed conflicts, or content that might be used to establish violations of human rights or humanitarian law.

The Board concluded that, when reviewing content in similar situations, Meta should determine whether it was shared to raise awareness of or to condemn abuses. It also held that Meta’s classifiers should be designed to avoid removing content that benefits from policy exceptions. The Board considered that this case illustrated how insufficient human oversight of automated moderation systems in times of crisis can lead to the wrongful removal of speech, given both the removal decision and the subsequent rejection of the user’s appeal without any human review. This, the Board explained, was further exacerbated by the temporary lowering of the confidence thresholds enacted by Meta.

The Board, following the Colombian Police Cartoon case, recommended that Meta ensure “that content with high rates of appeal and high rates of successful appeal be reassessed for possible removal from its Media Matching Service banks.” [p. 10] Meta implemented this recommendation, previously outlined in that case, by designating a working group committed to governance improvements across its Media Matching Service banks. The Board argued that the working group should pay attention to the use of these banks during armed conflicts. Furthermore, it reiterated its recommendations from the Breast Cancer Symptoms and Nudity case, according to which “Meta [should] inform users when automation is used to take enforcement action against their content, and disclose data on the number of automated removal decisions per Community Standard and the proportion of those decisions subsequently reversed following human review.” [p. 10] The Board highlighted that these measures are especially important when confidence thresholds are lowered.

Non-Discrimination

The Board considered that any restriction on freedom of expression must be non-discriminatory, following Articles 2 and 26 of the ICCPR. It recalled the Shared Al Jazeera Post case to underscore its concerns about Meta’s errors when moderating content in Israel and the Occupied Palestinian Territories, in relation to which it had called for an independent investigation. Noting that Business for Social Responsibility (BSR) had been commissioned by Meta to identify the human rights impacts on Palestinian and Arabic-speaking users of the unintentional bias in Meta’s policies and practices, combined with other external dynamics, the Board encouraged the company to fulfill the commitments it made after the BSR report.

Obligation to preserve evidence

The Board highlighted Meta’s responsibilities regarding the preservation of evidence of human rights and international humanitarian law violations, as recommended by the BSR report and advocated by civil society groups. It stressed that even if content is removed, Meta should preserve evidence in the interest of future accountability. While Meta retains all policy-violating content for one year, the Board urged Meta to preserve content related to potential war crimes, crimes against humanity, and grave human rights violations for longer and in an accessible way. The Board repeated its call for Meta to develop a protocol to preserve and share evidence with competent authorities.

In light of these arguments, the Board overturned Meta’s original decision to remove the content from Instagram, considering the high public interest in the content and the context of the conflict. The Board found that Meta’s subsequent decision to restore the content behind a warning screen with age restrictions was consistent with Meta’s content policies, values, and human rights responsibilities. However, the Board decided that removing the content from the recommendations of adult users was unnecessary and disproportionate.


Decision Direction


Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Expands Expression

The decision issued by the Oversight Board in this case expanded freedom of expression. Through this decision, the Board stressed the importance of preserving content that documents potential human rights abuses and humanitarian law violations during armed conflicts where journalistic access is severely limited. Considering the importance of social media platforms in times of crisis, the Board’s decision provides robust protection for the right to disseminate information—which might be essential to guarantee the safety of civilians, whose access to media, in light of the context, could be heavily curtailed already. The decision also fosters a better online ecosystem for expression about matters of public interest.

Global Perspective


Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or regional laws

  • ICCPR, art. 19

    The Board referred to this article to assess Meta’s responsibilities towards human rights through the lens of freedom of expression. It also relied on this provision to analyze whether Meta’s measures met the requirements of the three-part test.

  • United Nations Guiding Principles on Business and Human Rights (2011)

    The Board referred to this instrument to highlight Meta’s human rights responsibilities.

  • UNHR Comm., General Comment No. 34 (CCPR/C/GC/34)

    The Board referred to this instrument for guidance on how to apply the three-part test.

  • OSB, Sudan Graphic Video, 2022-002-FB-MR (2022)

    The Board referenced this case to reiterate that content depicting violent attacks and human rights abuses is of great public interest and to stress its concerns about the mismatch between the Violent and Graphic Content policy and the policy rationale.

  • OSB, Armenian Prisoners of War Video, 2023-004-FB-MR (2023)

    The Board referred to this case to argue that warning screens are necessary and proportionate when applied to disturbing content and to recommend that Meta preserve evidence of potential human rights abuses and potential war crimes.

  • OSB, Mention of the Taliban in News Reporting, 2022-005-FB-UA (2022)

    The Board recalled this case to underline the crucial role of social media platforms in news reporting during violent events.

  • OSB, Video After Nigeria Church Attack, 2022-011-IG-UA (2023)

    The Board referred to this case to explain in which instances warning screens are proportionate restrictions on freedom of expression.

  • OSB, Russian Poem, 2022-008-FB-UA (2022)

    The Board mentioned this case to clarify that warning screens are not appropriate measures to restrict freedom of expression when the content does not include visual indicators of violence and the victim’s face is not visible.

  • OSB, Haitian Police Station Video, 2023-021-FB-MR (2023)

    The Board referenced this case to recommend that Meta develop a framework for content moderation during crises and conflicts.

  • OSB, Tigray Communication Affairs Bureau, 2022-006-FB-MR (2022)

    The Board referenced this case to recommend that Meta develop a framework for content moderation during crises and conflicts.

  • OSB, Colombian Police Cartoon, 2022-004-FB-UA (2022)

    The Board referred to this case to recommend that Meta reassess its practices regarding content with high successful-appeal rates and its inclusion in Media Matching Service banks.

  • OSB, Breast Cancer Symptoms and Nudity, 2020-004-IG-UA (2021)

    The Board recalled this case to recommend that Meta inform users when their content is removed by automated systems and disclose data about the volume of automated decisions reversed after human review.

  • OSB, Shared Al Jazeera Post, 2021-009-FB-UA (2021)

    The Board referred to this case to reiterate its concern about Meta’s differential treatment in its content moderation in Israel and Palestine.


Case Significance


Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” Article 4 of the Charter adds: “The board’s resolution of each case will be binding and Meta will implement it promptly, unless implementation of a resolution could violate the law. In instances where Meta identifies that identical content with parallel context — which the board has already decided upon — remains on Meta, it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision or a policy advisory opinion includes recommendations, Meta will take further action by analyzing the operational procedures required to implement the recommendations, considering those recommendations in the formal policy development process of Meta, and transparently communicating about actions taken as a result.”

