Oversight Board Case of Posts That Include “From the River to the Sea”

Case Status: Closed. Decision Direction: Expands Expression

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    September 4, 2024
  • Outcome
    Oversight Board Decision, Agreed with Meta’s initial decision
  • Case Number
    2024-004-FB-UA, 2024-005-FB-UA, 2024-006-FB-UA
  • Region & Country
    International
  • Judicial Body
    Oversight Board
  • Type of Law
    International/Regional Human Rights Law, Meta's content policies
  • Themes
    Facebook Community Standards, Objectionable Content, Hate Speech/Hateful Conduct, Violence and Criminal Behavior, Dangerous Individuals and Organizations, Violence and Incitement
  • Tags
    Facebook, Oversight Board Transparency Recommendation


Case Analysis

Case Summary and Outcome

On September 4, 2024, the Oversight Board upheld Meta’s decision to leave three pieces of content containing the phrase “From the River to the Sea” on Facebook. In the first case, a user’s comment included the hashtag “#FromTheRiverToTheSea”, along with hashtags like “#DefundIsrael” and heart emojis in the colors of the Palestinian flag. In the second case, a user posted a generated image of floating watermelon slices forming the words of the phrase, accompanied by the text “Palestine will be free.” In the third case, a Facebook page administrator reshared a post by a Canadian community organization. The post featured founding members expressing support for the Palestinian people, condemning their “senseless slaughter”, and referring to “Zionist Israeli occupiers.” The Board acknowledged that the phrase has been used by some to incite hatred. However, it emphasized that the phrase has multiple interpretations and predates the existence of Hamas. A blanket ban, the Board argued, would infringe on users’ freedom of expression. The Board also noted that the contextual clues in all three cases suggested the users were expressing support for Palestine and the Palestinian people, not inciting hatred or violence.

*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.


Facts

The Board reviewed three cases involving Facebook content posted in November 2023, following the October 7 Hamas attacks and Israel’s military campaign in Gaza. Each piece of content included the phrase “From the River to the Sea.”

In the first case, a user commented on another user’s video that encouraged viewers to “speak up” and included hashtags like “#ceasefire” and “#freepalestine.” The comment contained the hashtag “#FromTheRiverToTheSea,” along with “#DefundIsrael” and heart emojis in Palestinian flag colors. The user had fewer than 500 friends and no followers. The comment received about 3,000 views and was reported seven times by four users. However, Meta’s automated systems didn’t prioritize these reports for human review within 48 hours, so they were automatically closed. One reporting user later appealed directly to Meta.

In the second case, a user posted a generated image showing floating watermelon slices forming the words “From the River to the Sea” alongside the text “Palestine will be free.” The user had fewer than 500 friends and no followers, yet the post gained approximately 8 million views and was reported 951 times by 937 users. While the first report was automatically closed, some subsequent reports were reviewed by human moderators who found no policy violations. Several reporting users appealed Meta’s decision.

In the third case, a page administrator reshared a post from a Canadian community organization. The post, written by the organization’s founding members, condemned what they called the “senseless slaughter” of Palestinians by the “Zionist State of Israel” and “Zionist Israeli occupiers,” while expressing solidarity with Palestinian Muslims, Christians, and anti-Zionist Jews. The post concluded with the phrase “From The River To The Sea.” It received fewer than 1,000 views and was reported once, but the report was automatically closed. The reporting user later appealed to Meta.

After reviewing these cases, Meta upheld its original decision to keep the content on the platform. The users who had reported the content then appealed to the Board. Following the Board’s announcement that it would review these cases, the user in the third case deleted the post from Facebook.


Decision Overview

The Board examined whether Meta’s decision to retain content containing the phrase “From the River to the Sea” in reference to Palestine aligned with Meta’s content policies and its human rights responsibilities.

Appellants argued the phrase violated Meta’s Hate Speech, Violence and Incitement, or Dangerous Organizations and Individuals policies. The first reporting user claimed it promoted violence and terrorism, while others contended it constituted hate speech, antisemitism, and a call for genocide against Israel.

Meta’s Hate Speech policy prohibits attacks on people based on protected characteristics such as race, ethnicity, or religion, including dehumanizing language, claims of inferiority, expressions of contempt, and calls for exclusion. Tier 1 bans advocacy of harm, while Tier 2 prohibits calls for exclusion. The Violence and Incitement policy bans threats of serious violence, especially against protected groups, including veiled or coded threats. The Dangerous Organizations and Individuals policy prohibits glorifying designated groups like Hamas or justifying their violent acts.

Meta maintained the phrase alone doesn’t violate its policies. Therefore, Meta only removes content with the phrase if other parts of the content independently breach its Community Standards.

While acknowledging that it had not conducted a full stakeholder consultation, Meta reviewed the phrase’s usage after the October 7 attacks and recognized its complex history: some view it as antisemitic or as threatening to Israel, while others see it as an expression of Palestinian solidarity and consider labeling it antisemitic to be either inaccurate or rooted in Islamophobia. Given these divergent interpretations, Meta could not conclude that the phrase inherently incites violence or promotes exclusion without additional context.

Meta also noted that a Dangerous Organizations and Individuals assessment found the phrase isn’t exclusively linked to Hamas despite appearing in the group’s 2017 charter: it predates Hamas and is used by people neither affiliated with nor supportive of the group’s ideology. Meta concluded that none of the three reviewed posts suggested support for or glorification of Hamas.

Additionally, Meta’s Policy team reviewed how the phrase was being used on its platforms and assessed it in light of the company’s policies. The team also considered whether to block hashtags containing the phrase but found that only a small share of content using it violated Meta’s policies, and for reasons unrelated to the phrase itself.

Finally, Meta confirmed that it had received requests from the German government to restrict content using the phrase under local law and that, in response, it had restricted access to such content in Germany.

(1) Compliance with Meta’s content policies

The Board concluded none of the three posts violated Meta’s policies. Examining each in full context, the Board found the posts didn’t violate the Hate Speech policy as none attacked Jewish/Israeli people, incited violence, or called for exclusion. Instead, the posts showed solidarity with Palestinians through ceasefire hashtags, watermelon symbolism (a solidarity emblem), or support for Palestinians of all faiths.

The Board also noted the content didn’t violate the Violence and Incitement policy as it contained no threats of violence or physical harm. The Board highlighted that Meta’s policy requires both a “threat” and a “target” to be present for content to be considered in violation, and the Board found no such threats in these cases. Although the policy also prohibits “coded statements” with veiled or implicit threats, the Board determined that no “threat signals” or “contextual signals” suggesting imminent violence were present in the reviewed content. The Board acknowledged that the phrase “From the River to the Sea” could sometimes be used to incite violence. However, the Board stressed there was no indication that these specific posts could lead to imminent violence.

The Board further determined the posts didn’t violate the Dangerous Organizations and Individuals policy as none of them glorified Hamas. While some public comments suggested that using the phrase “From the River to the Sea” inherently supports Hamas, the Board rejected this view, noting that the phrase predates Hamas and does not have a single, universally understood meaning. Furthermore, the Board found that none of the content reviewed attempted to justify Hamas’ actions, including the attacks of October 7.

The Board emphasized that the use of a phrase by an extremist group doesn’t make it inherently hateful or violent, considering its broader usage by other actors. The Board noted that, in accordance with General Comment 37, symbols and emblems with multiple meanings should not be restricted unless they are directly associated with incitement to discrimination, hostility, or violence.

(2) Compliance with Meta’s values

The Board noted that these three cases highlighted the tension between Meta’s commitment to protecting its value of voice and freedom of expression, especially political speech during times of conflict, and its responsibility to ensure safety and dignity by preventing intimidation, exclusion, and violence. The Board underlined that this tension became particularly important during the conflict following Hamas’s attack in October 2023 and Israel’s military response, amid global protests and accusations of international law violations.

(3) Compliance with Meta’s human rights responsibilities

The Board emphasized that Article 19 of the International Covenant on Civil and Political Rights (ICCPR) broadly protects freedom of expression, including the right to “seek, receive and impart information and ideas of all kinds,” encompassing political discourse and public affairs commentary. This protection extends even to expression that may be considered deeply offensive. Such broad protections also apply to political assemblies, as outlined in General Comment No. 37, which states that “assemblies with a political message should enjoy a heightened level of accommodation and protection,” whether occurring online or offline.

The Board applied the three-part test, which is used to assess whether a limitation meets requirements of legality, legitimacy, and necessity and proportionality, to evaluate Meta’s fulfillment of its freedom of expression obligations under ICCPR Article 19.

I) Legality (clarity and accessibility of the rules)

The legality principle requires that expression-limiting rules be clear, accessible, and precise enough to enable individuals to regulate their conduct accordingly. Such rules cannot grant unlimited discretion to enforcers and must provide adequate guidance about which types of expression may be restricted. The Board determined that Meta’s policies met this clarity standard for the reviewed cases.

II) Legitimate aim

Any restriction on freedom of expression must pursue a legitimate aim under the ICCPR. The Board has consistently recognized that Meta’s Hate Speech policy serves the legitimate aim of protecting rights to life, equality, and non-discrimination, as hate speech often leads to intimidation, exclusion, and offline violence. However, the Board reaffirmed its position from the Depiction of Zwarte Piet decision that preventing mere offense does not constitute a legitimate aim for restriction.

The Board referenced its Alleged Crimes in Raya Kobo decision, where it found Meta’s Violence and Incitement policy legitimately aims to protect the right to life by preventing offline violence through removal of violent speech targeting protected groups. Similarly, the Board noted that Meta’s Dangerous Organizations and Individuals policy legitimately seeks to prevent real-world harm by addressing organizations promoting hate and violence, protecting rights to life, equality, and non-discrimination, as established in the Sudan’s Rapid Support Forces Video Captive and Greek 2023 Elections Campaign decisions.

III) Necessity and proportionality

Under Article 19(3) of the ICCPR, restrictions on expression must be necessary to achieve their protective purpose while using the least intrusive means available. The Board emphasized Meta’s responsibility to identify, prevent, and mitigate adverse human rights impacts, particularly during conflicts when risks to vulnerable minorities increase. The Board had previously called for a transparent hate speech moderation framework in crisis situations through its “Two Buttons” Meme, Haitian Police Station Video and Tigray Communication Affairs Bureau decisions.

In the present case, the Board majority found that retaining the content aligned with the necessity principle, stressing the importance of context-specific evaluation. While “From the River to the Sea” could potentially incite violence or support Hamas, the phrase itself, especially in political speech, couldn’t be universally interpreted as violent or exclusionary without additional context. The Board urged Meta to dedicate more resources to understanding online-offline harm connections.

The Board applied the Rabat Plan of Action’s six factors test (context, speaker identity, intent, content/form, dissemination extent, and harm likelihood) to assess whether the content and phrase posed serious risks of inciting discrimination, violence, or harm. These factors, designed to evaluate when advocacy constitutes incitement, were previously used in the Knin Cartoon decision.

Context

The Board noted that the content emerged during an ongoing conflict with significant global repercussions. All three posts appeared shortly after the October 7 attacks amid Israel’s Gaza offensive. The users appeared to highlight Palestinian suffering or critique Israeli military actions. The Board observed that debates about the legitimacy of Israel’s military operations in Gaza were already before the International Court of Justice and International Criminal Court.

The Board stressed that the phrase carries multiple interpretations influenced by the evolving conflict. While some use it to endorse Hamas and violence, others employ it to support Palestinian self-determination, equal rights, or ceasefire calls. The Board noted increased dangerous rhetoric targeting various groups (Arabs, Israelis, Jews, Muslims, Palestinians).

The Board highlighted that while context-based removal of violent content remains appropriate, a blanket ban would be improper given the phrase’s multiple meanings. The Board expressed concern about Meta’s aggressive automated tools potentially over-censoring political speech on public issues. It also recognized the phrase’s role in pro-Palestinian protests and stressed the importance of protecting peaceful assembly rights, especially during conflicts.

Identity of the Speaker

The Board found no evidence that the users or sharing pages supported designated organizations like Hamas or promoted discrimination and exclusion.

Intent, Content, and Form of Expression

The Board noted that the reviewed content showed no intent to incite discrimination or violence, advocate exclusion, or support designated entities. While the phrase’s usage surged after October 7 with varying meanings, Board-commissioned research indicated that most posts addressed Gaza’s humanitarian crisis or called for ceasefires. Experts noted that Meta typically removed content containing the phrase only when it was accompanied by explicit signals of violence or discrimination. Meta, however, did not provide comprehensive data on removed content or on the phrase’s prevalence. While acknowledging the phrase’s potential harmful uses, the Board emphasized that it needed more data to fully assess the nature and prevalence of such content.

Likelihood, Imminence, and Reach

The Board found that none of the three posts posed a likely or imminent risk of violence or discrimination. Given the phrase’s multiple meanings, the majority determined it could not be considered inherently harmful, violent, or discriminatory in all contexts. While the phrase is sometimes used alongside threats or celebrations of violence, Meta’s human rights responsibilities require balancing the voices of all affected communities.

The Board highlighted the risks of removing content that documents Palestinian suffering or the dehumanization of individuals during military campaigns. Meta’s platforms serve vital functions in documenting events in Gaza, mobilizing international support, combating antisemitism and Islamophobia, and education, all of which need protection in a safe and respectful environment.

The Board noted that the first and third posts had minimal visibility, and while the second post had about 8 million views, its scale didn’t warrant removal given unclear harm risks.

The majority of the Board declined to associate the phrase exclusively with Hamas, noting its pre-Hamas origins and multiple meanings. They echoed the UN Special Rapporteur’s warning that loosely labeling civil society actors as “terrorist” risks delegitimizing them and increasing their vulnerability to abuse by state and non-state actors.

The Board acknowledged that Meta’s policies address the risks of discriminatory content online. Evidence from the Holocaust Denial decision showed the harm caused by the rapid spread of antisemitic content, which calls for balanced moderation tools that enforce the rules effectively without unnecessarily restricting political expression. Properly enforced, Meta’s policies can prevent exploitation by terrorist groups while protecting political expression. Following a recommendation in the Mention of the Taliban in News Reporting case, Meta committed to providing more granular data on enforcement of the Dangerous Organizations and Individuals policy, an approach the Board recommended extending to the Hate Speech and Violence and Incitement policies, as seen in the Holocaust Denial and United States Posts Discussing Abortion decisions respectively.

Data Access

The Board emphasized that both the Board itself and external stakeholders would be better positioned to evaluate the necessity and proportionality of Meta’s content moderation during armed conflicts if the company maintained robust access to platform data.

In March 2024, Meta announced plans to discontinue CrowdTangle by August 2024 and to replace it with the Meta Content Library and Content Library API. CrowdTangle, a Meta-owned data analysis tool that tracks public content from major pages, groups, and accounts across all countries and languages (though it excludes certain platform content and removed material), has been widely used for research and monitoring. While the Board welcomed the development of new tools, it raised significant concerns about phasing out CrowdTangle before an effective replacement was fully established. These concerns were echoed by numerous organizations and the European Commission, particularly given the timing during a critical election year, and ultimately prompted formal proceedings under the EU Digital Services Act.

The Board further highlighted existing limitations in both CrowdTangle and Meta’s transparency reporting, especially regarding the tracking of sudden increases in antisemitic, anti-Muslim, and racist content. Reiterating its position from the Shared Al Jazeera Post decision, the Board urged Meta to implement Recommendation No. 16 of the BSR Human Rights Due Diligence report, which calls for establishing mechanisms to track content targeting protected groups, noting that a year after the report’s release, Meta was still assessing the recommendation’s feasibility.

Ultimately, the Board upheld Meta’s decision to leave up the three posts.

Policy advisory statement

The Board recommends that Meta ensure qualified researchers, civil society organizations, and journalists who previously had access to CrowdTangle are granted access to the company’s new Content Library within three weeks of submitting their application. Meta must guarantee that the Content Library provides functionality and data access that meets or exceeds what was available through CrowdTangle. Furthermore, the Board urges Meta to implement Recommendation No. 16 from the BSR Human Rights Due Diligence of Meta’s Impacts in Israel and Palestine report by developing a robust mechanism to systematically track and measure the prevalence of content attacking individuals based on protected characteristics, including but not limited to antisemitic, Islamophobic, and homophobic content.

Dissenting Opinions

A minority of Board members maintained that while the three content pieces in question didn’t violate Meta’s policies, the phrase “From the River to the Sea” should generally be presumed to glorify Hamas, a designated organization under Meta’s policies, and thus warrant removal unless clearly not endorsing Hamas or its objectives. These members argued that the phrase’s context fundamentally shifted after the October 7 attacks, and that ambiguous uses should be presumed to support Hamas. However, they concurred that in these specific cases, sufficient contextual indicators showed the content didn’t glorify Hamas or the October 7 events.

This minority recommended that Meta develop clearer guidance for content moderators on distinguishing permissible uses of the phrase. They viewed this approach as striking an appropriate balance between protecting free expression for those advocating Palestinian solidarity and addressing current risks of violence associated with the phrase’s usage.

Another minority of the Board strongly emphasized that acknowledging a phrase’s adoption by terrorist groups shouldn’t equate those using it with terrorism itself. These members stressed that content evaluation requires careful analysis of phrases’ origins and meanings, cautioning that such analysis must not be conflated with attempts to delegitimize legitimate civil society actors.


Decision Direction

Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Expands Expression

The Board’s decision expands expression by protecting the phrase “From the River to the Sea” from arbitrary restrictions, even when considered offensive by some, in accordance with established principles from the Human Rights Committee and European jurisprudence. The Board acknowledged both the phrase’s historical and linguistic complexity and its role in advocating for Palestinian rights. Applying the internationally recognized Rabat Plan of Action framework, the Board systematically evaluated the phrase and concluded that it does not meet the threshold for incitement.

Global Perspective

Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or regional laws

  • ICCPR, art. 19

    The Board analyzed Meta’s obligations towards freedom of expression as laid out in this article. It also relied on this norm to apply the three-part test for assessing whether a restriction on freedom of expression is valid.

  • ICCPR, art. 6

    The Board highlighted that the protection of the right to life, stipulated in this article, was among the legitimate aims of Meta’s Hate Speech, Violence and Incitement, and Dangerous Organizations and Individuals policies.

  • ICCPR, art. 2

    The Board highlighted that the protection of the right to equality and non-discrimination, stipulated in this article, was among the legitimate aims of Meta’s Hate Speech policy.

  • ICERD, art. 2

    The Board highlighted that the protection of the right to equality and non-discrimination, stipulated in this article, was among the legitimate aims of Meta’s Hate Speech policy and Dangerous Organizations and Individuals policy.

  • ICCPR, art. 26

    The Board highlighted that the protection of the right to equality and non-discrimination, stipulated in this article, was among the legitimate aims of Meta’s Dangerous Organizations and Individuals policy.

  • UNHR Comm., General Comment No. 34 (CCPR/C/GC/34)

    The Board used this General Comment as a guide to explain and apply the three-part test, and to establish that offensive speech is protected under Article 19 of the ICCPR.

  • OHCHR, Rabat Plan of Action on the prohibition of advocacy of national, racial or religious hatred that constitutes incitement to discrimination, hostility or violence (2012).

    The Board utilized this instrument to analyze whether the phrase “From the River to the Sea” incited violence.

  • OSB, Depiction of Zwarte Piet, 2021-002-FB-UA (2021)

    The Board highlighted this case to reiterate that protecting people from offensive content is not a legitimate aim.

  • OSB, Alleged Crimes in Raya Kobo, 2021-014-FB-UA (2021)

    The Board highlighted this case to reiterate that Meta’s Violence and Incitement policy aims to protect the right to life.

  • OSB, Sudan's Rapid Support Forces Video Captive, 2023-039-FB-UA (2024)

    The Board highlighted this case to reiterate that the Dangerous Organizations and Individuals policy aims to protect the right to life and the rights to equality and non-discrimination.

  • OSB, Greek 2023 Elections Campaign, 2023-30-FB-UA, 2023-31-FB-UA (2024)

    The Board highlighted this case to reiterate that the Dangerous Organizations and Individuals policy aims to protect the right to life and the rights to equality and non-discrimination.

  • OSB, “Two Buttons” Meme, 2021-005-FB-UA (2021)

    The Board cited this case to highlight the need to develop a principled and transparent framework for content moderation of hate speech during crises and in conflict settings.

  • OSB, Haitian Police Station Video, 2023-21-FB-MR (2023)

    The Board cited this case to highlight the need to develop a principled and transparent framework for content moderation of hate speech during crises and in conflict settings.

  • OSB, Tigray Communication Affairs Bureau, 2022-006-FB-MR (2022)

    The Board cited this case to highlight the need to develop a principled and transparent framework for content moderation of hate speech during crises and in conflict settings.

  • OSB, Knin Cartoon, 2022-001-FB-UA (2022)

    The Board referenced this case to highlight its past use of the Rabat Plan of Action to assess the severity of content.

  • OSB, Al-Shifa Hospital, 2023-049-IG-UA (2023)

    The Board cited this decision to highlight the importance of context analysis when analyzing content related to the situation in Gaza.

  • OSB, Dehumanizing comments about people in Gaza, 2024-026-FB-UA (2024)

    The Board cited this case to highlight the importance of protecting freedom of expression while fighting terrorism, in order to avoid delegitimizing civil society.

  • OSB, Holocaust Denial, 2023-022-IG-UA (2024)

    The Board highlighted this case to stress the importance of having adequate enforcement tools and measures to moderate harmful content without unduly curtailing political expression on issues of public interest and to recall recommendation #1.

  • OSB, Mention of the Taliban in News Reporting, 2022-005-FB-UA (2022)

    The Board referenced this case to highlight Meta’s commitment to developing new tools that would allow it to gather more granular details about enforcement of the Dangerous Organizations and Individuals news reporting policy allowance.

  • OSB, United States Posts Discussing Abortion, 2023-011-IG-UA, 2023-012-FB-UA, 2023-013-FB-UA (2023)

    The Board referenced this case to recall recommendation #1.

  • OSB, Shared Al Jazeera Post, 2021-009-FB-UA (2021)

    The Board referenced this case to highlight that BSR’s Human Rights Due Diligence report was commissioned based on one of the Board’s recommendations from this decision.

Case Significance

Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”

Official Case Documents


Amicus Briefs and Other Legal Authorities

  • BSR's Human Rights Due Diligence of Meta’s Impacts in Israel and Palestine Report

    https://www.bsr.org/en/reports/meta-human-rights-israel-palestine
