Oversight Board Case of Haitian Police Station Video

Case Status: Closed
Decision Direction: Expands Expression

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    December 5, 2023
  • Outcome
    Oversight Board Decision, Overturned Meta’s initial decision
  • Case Number
    2023-21-FB-MR
  • Region & Country
    Haiti, International
  • Judicial Body
    Oversight Board
  • Type of Law
    International/Regional Human Rights Law, Meta's content policies
  • Themes
    Facebook Community Standards, Violence and Criminal Behavior, Violence and Incitement
  • Tags
    Facebook, Incitement, Public Order, Oversight Board Content Policy Recommendation, Oversight Board Enforcement Recommendation

Case Analysis

Case Summary and Outcome

On December 5, 2023, the Oversight Board overturned Meta’s decision to remove a Facebook video post showing a crowd threatening violence against an alleged gang member in a Haitian police station. While the Board agreed that the content violated the Violence and Incitement policy, it noted that Meta should have applied its “newsworthiness allowance” and left the content up. The Board highlighted that because there was a nearly three-week delay in removing it, the immediate risk had diminished and the post held significant public interest value. The Board also urged Meta to improve its content moderation effectiveness and timeliness in Haiti. It specifically recommended that Meta assess and improve its response times to content flagged through its Trusted Partner program.

*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook, Instagram, and Threads should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.


Facts

In May 2023, a Facebook user identifying as a media page posted a video showing a group of people entering a police station in Haiti and approaching a cell holding an alleged member of the “5 Seconds Gang,” a well-armed and prominent criminal organization. The footage shows an individual attempting to break the cell’s lock while others shout threats and encouragement, including the phrase “bwa kale na boudaw.” Meta interpreted this phrase literally as “wooden stick up your ass” and contextually as a direct call to lynch the detainee in the style of the “Bwa Kale” vigilante movement, which is a civilian-led effort characterized by extrajudicial violence against suspected gang members. The post’s caption, which described the event and stated that the police could do nothing, was seen by linguistic experts as reflecting a loss of faith in authorities and a pessimistic outlook on the situation. The post gained widespread attention, amassing over 500,000 views.

This incident occurred within a context of extreme crisis in Haiti, where gangs control significant territory and terrorize the population amid a political vacuum following the 2021 assassination of President Jovenel Moïse. The police are widely perceived as ineffective or complicit, and the first quarter of 2023 saw more than double the criminal incidents compared to the same period in 2022. In response, the Bwa Kale movement emerged, culminating in a pivotal event on April 24, 2023, when a crowd burned alive 14 alleged gang members as police stood aside. Videos of this event circulated widely and galvanized further vigilante violence, leading to over 350 lynching deaths by mid-August and triggering brutal gang reprisals.

Meta runs a “Trusted Partner” program, a global network of NGOs, humanitarian agencies, human rights defenders, and researchers that can report content to Meta for policy violations and provide policy feedback. A Trusted Partner flagged the video 11 days after it was posted, warning that it could incite further violence. After an eight-day review, Meta removed the content for violating its Violence and Incitement policy and declined to apply its newsworthiness allowance, concluding that the post’s risk of harm outweighed its public interest value.

Meta referred the case to the Board for a ruling, seeking guidance on the difficult moderation questions posed by content related to the “Bwa Kale” movement.


Decision Overview

The main issue before the Board was whether Meta’s decision to remove the video post and not apply its newsworthiness allowance was compatible with Meta’s content policies and human rights obligations.

Following Meta’s referral of the case, the user was notified of the Board’s review and invited to submit a statement but did not do so. Meta, for its part, provided responses to all 18 of the Board’s written questions. These inquiries focused on Meta’s linguistic capacity for policy enforcement in Haiti, its processes for reviewing Trusted Partner reports, the integration of this program with other crisis mechanisms, and the application of its Crisis Policy Protocol in the region.

In its assessment, Meta applied the UN Rabat Plan of Action framework to conclude the video constituted both a statement of intent and an incitement to violence against the alleged gang member in the cell. This was based on explicit threats heard in the video, such as, “We’re going to break the lock…They’re already dead.” The company supported this decision with a broad analysis of Haiti’s political and humanitarian crisis, highlighting endemic gang violence and the rise of vigilantism as critical context for the imminent risk of harm.

In its policy analysis, Meta evaluated and ultimately rejected three separate exceptions that could have permitted the content to remain on the platform. First, it considered the awareness exception, which allows violating content if it condemns violence or raises awareness, provided the user makes this intent explicitly clear. Meta found no such clear intent in the post and determined that its publication on a self-described media page was insufficient to satisfy this criterion. Second, Meta reviewed the dangerous organizations exception, which may permit calls for violence against entities designated under its Dangerous Organizations and Individuals policy. Although the “5 Seconds Gang” itself holds this designation, the exception was deemed inapplicable because Meta could not confirm that the specific individual in the cell was a member. Third, Meta assessed the newsworthiness allowance, concluding that the significant risk of the video inciting further violence against either the gang or the “Bwa Kale” movement outweighed its public-interest value in documenting ongoing events.

Finally, Meta informed the Board that it had not designated the situation in Haiti under its Crisis Policy Protocol, as it considered existing mitigation measures sufficient when the protocol was launched in August 2022.

(1) Compliance with Meta’s Content Policies

The Board found the content in violation of the Violence and Incitement policy; however, a majority of the Board disagreed with Meta’s analysis of the newsworthiness allowance. The majority held that Meta should have applied the allowance given the nearly three-week delay in policy enforcement.

a. Violence and Incitement policy

The Board affirmed that the content violated Meta’s Violence and Incitement policy, as it included both statements of intent to commit high-severity violence and direct calls for violence. This determination was based on the high-risk context in which the video was shared, where the threat of offline harm to the detained individual and others was significant. Specific statements from the crowd, such as shouts that they would break into the cell and that the man was “already dead”, demonstrated clear intent to use lethal force. In addition, the phrase “bwa kale na boudaw” was found to constitute a call to extreme violence, particularly given its association with recent deadly vigilante attacks against suspected gang members in Haiti.

The Board also examined two internal policy exceptions. It agreed with Meta that the “awareness or condemnation” exception did not apply, as the user’s caption was merely descriptive and did not clearly indicate intent to condemn or raise awareness about the violence. This exception, the Board noted, remains absent from the public-facing Community Standards, limiting user understanding.

Furthermore, the Board addressed the exception permitting calls for violence against members of designated dangerous organizations or individuals, a provision referenced internally but not included in Meta’s published rules. Although the exception was not applicable in this instance, the Board expressed serious concerns about its lack of public transparency. The list of designated entities remains non-public, preventing users from knowing when the exception might apply. The Board reiterated previous recommendations from the “Mention of the Taliban in News Reporting,” “Shared Al Jazeera Post,” “Öcalan’s Isolation,” and “Nazi Quote” decisions that Meta improve clarity and transparency around its Dangerous Organizations and Individuals policy. Finally, the Board highlighted a troubling aspect of this exception: Meta does not assess the credibility of threats against designated entities, effectively permitting serious threats without evaluating their real-world risk.

b. Newsworthiness allowance

Although the Board found the content in violation of the Violence and Incitement policy, a majority disagreed with Meta’s decision not to apply the newsworthiness allowance. The Board emphasized that the balance of risk and public interest should be evaluated at the time of Meta’s review, not when the content was originally posted, and urged Meta to clarify this timing distinction in its public-facing policy.

In this case, nearly three weeks elapsed between the post’s publication and its removal. The majority concluded that, by the time Meta assessed the content, the immediate risk of harm had significantly diminished. Therefore, the public interest value in informing both Haitian and international audiences about the severity of violence and amplifying calls for aid and intervention outweighed the residual risk. The Board agreed that had Meta evaluated the content immediately after it was posted, the risk of harm would have outweighed its public interest value, a rationale similar to the Communal Violence in Indian State of Odisha decision, where swift removal was justified due to ongoing violence and a high imminent risk. In contrast, here, the post had already been viewed over 500,000 times, suggesting that any potential harm had likely already materialized.

The Board noted that Meta’s internal teams evaluate newsworthiness based on escalation; therefore, the company possesses the necessary resources and expertise to conduct more nuanced assessments that account for evolving contexts and circumstances.

(2) Compliance with Meta’s human rights responsibilities

The majority of the Board found that the removal of the video post after a three-week delay was unnecessary and disproportionate. It also emphasized that, to fulfill its human rights responsibilities, Meta must ensure content moderation in Haiti is both timely and effective, particularly given the heightened risks of harm in the region.

To assess the legitimacy of Meta’s action and its adherence to its human rights commitments, the Board invoked Article 19 of the International Covenant on Civil and Political Rights (ICCPR), which protects public affairs commentary, including offensive speech. The Board applied the three-part test from Article 19(3) of the ICCPR, which evaluates whether restrictions on freedom of expression are prescribed by law, pursue a legitimate aim, and are necessary and proportionate.

a. Legality (Clarity and accessibility of the rules)

The principle of legality requires that rules limiting expression be clear and accessible to both those subject to them and those responsible for their enforcement. These rules must not grant unfettered discretion to the individuals applying them. Consequently, Meta’s content reviewers must have access to well-defined guidelines to ensure consistent and transparent enforcement.

In this case, the Board found that Meta’s policies prohibiting statements of intent to commit violence and incitement to high-severity violence were clearly stated, thereby satisfying the legality requirement.

However, the Board identified a significant legality concern regarding the omission of the “raising awareness or condemning violence” exception from the public-facing version of the Violence and Incitement policy. This failure to publish the exception, and to clarify that the onus is on the user to state their intent, undermines the policy’s clarity and accessibility. Although Meta had previously committed to implementing this recommendation from the Russian Poem decision by adding these interpretive exceptions to the public policy, it has yet to do so. The Board has therefore reiterated this recommendation and urges immediate compliance.

b. Legitimate aim

The Board found the Violence and Incitement policy’s prohibition of statements of intent and incitement to commit violence pursued the legitimate aim of protecting public order and respecting the rights of others.

c. Necessity and proportionality

The principles of necessity and proportionality require that any restriction on expression must be appropriate to achieve a protective function and represent the least intrusive means available. As in previous cases, such as the Brazilian General’s Speech and Cambodian Prime Minister decisions, the Board applied the Rabat Plan of Action factors to evaluate Meta’s removal of the content under its Violence and Incitement policy. The Board also considered the implications of Meta’s delayed review for fulfilling its human rights obligations in content moderation, particularly in Haiti.

A majority of the Board found that removing the post three weeks after its publication was no longer necessary. This conclusion was based on the context of widespread violence in Haiti, the post’s significant reach, and the diminished likelihood of harm due to Meta’s delay. The majority argued that the high view count by the time of review indicated that any potential risks had likely already materialized, rendering removal disproportionate at that stage.

The Board also stressed the vital importance of information access in contexts of public breakdown and violence, noting that Haitians frequently rely on platforms like WhatsApp to stay informed about security risks. Preserving access to content documenting such events is therefore essential for public awareness and safety.

Citing the Claimed COVID Cure decision, the Board reiterated that Meta must transparently justify its selection of enforcement actions. This includes demonstrating that public interest objectives cannot be met through non-restrictive measures, that the chosen measure is the least intrusive among available options, and that it is effective in reducing harm. In this instance, Meta failed to explain why less intrusive tools, such as geo-blocking, limiting engagement, or demoting the content, were insufficient after a three-week delay, instead presenting the decision as a binary choice.

The Board raised serious concerns regarding Meta’s capacity to conduct timely and effective moderation in Haiti, linking delays to inadequate investment in non-English content moderation and under-resourced systems, a recurring issue previously highlighted in the “Mention of the Taliban in News Reporting,” “Shared Al Jazeera Post,” and “Öcalan’s Isolation” decisions. These concerns were compounded by inconsistent response times to reports from Trusted Partners, a key mechanism for flagging harmful content in the region. The Board noted that this is a recurring problem: an evaluation by one of Meta’s own Trusted Partners found significant irregularities in response times and concluded that the program is under-resourced, reinforcing the Board’s concern that internal teams lack the support needed to review these escalations promptly.

Additionally, the Board criticized Meta for not activating its Crisis Policy Protocol in Haiti, despite the ongoing emergency. This protocol was developed in direct response to the Board’s recommendation in the “Former President Trump’s Suspension” decision to govern responses to crises where standard processes are insufficient. Although Meta cited pre-existing mitigation measures, the Board emphasized that the protocol was designed for precisely such prolonged and volatile crises and that its underuse undermines both public safety and Meta’s human rights commitments.

The Board acknowledged the challenges Meta faces in prioritizing resource allocation across its different content-moderation systems, such as developing language-specific classifiers, hiring content moderators, deploying the Crisis Policy Protocol, or prioritizing operational measures such as the Trusted Partner program. Nevertheless, it concluded that Meta must ensure timely and effective content moderation in Haiti during this period of heightened risk to meet its human rights responsibilities.

Accordingly, the Board overturned Meta’s decision to remove the video post, ruling that the removal (occurring after a significant delay) was neither necessary nor proportionate and therefore constituted an unjustifiable limitation on freedom of expression under Article 19(3) of the ICCPR.

Policy Advisory Statement

On enforcement, the Board urged Meta to assess the timeliness and effectiveness of its responses to content escalated by its Trusted Partners in order to mitigate the risk of harm in regions with no or limited proactive moderation capabilities.

On policy, the Board reiterated a prior recommendation from the Russian Poem decision, calling on Meta to publish clear and accessible exceptions to its Violence and Incitement policy within its public-facing Community Standards.

Dissenting Opinions

A minority of the Board disagreed with the majority’s assessment that the newsworthiness allowance applied to this content, arguing that its removal was both necessary and proportionate.

These members agreed with Meta’s decision not to apply the allowance, contending that the risk of harm to the individuals depicted, though potentially decreased, remained significant within Haiti’s broader context of ongoing violence. They concluded that this risk outweighed the post’s public interest value. Specific concerns included the potential for the video to inspire others to join the vigilante movement or for members of the “5 Seconds Gang” or its affiliates to recognize individuals in the video and seek revenge.

The minority found no alternative measure to be sufficient; only complete removal could mitigate the imminent risk of violence against those depicted.


Decision Direction

Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Expands Expression

This decision expands freedom of expression. By invoking the newsworthiness allowance to permit content that otherwise violates Meta’s policies, the Board’s decision crucially upholds the principles of public interest and underscores the importance of disseminating information during a crisis.

Case Significance

Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”
