Oversight Board Case of Communal Violence in Indian State of Odisha

Closed · Mixed Outcome

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    November 28, 2023
  • Outcome
    Oversight Board Decision, Agreed with Meta’s initial decision
  • Case Number
    2023-018-FB-MR
  • Region & Country
    India, Asia and Asia Pacific
  • Judicial Body
    Oversight Board
  • Type of Law
    International/Regional Human Rights Law, Meta's content policies
  • Themes
Facebook Community Standards, Violence and Criminal Behavior, Violence and Incitement
  • Tags
    Facebook, Incitement, Oversight Board Policy Advisory Statement, Oversight Board Content Policy Recommendation, Oversight Board Enforcement Recommendation, Oversight Board Transparency Recommendation

Case Analysis

Case Summary and Outcome

The Oversight Board upheld Meta’s decision to remove a Facebook video depicting communal violence during the Hindu festival of Hanuman Jayanti in the Indian state of Odisha. The video showed a religious procession that escalated into violence between Hindu and Muslim communities. In response to the unrest, the Odisha State government imposed a curfew, suspended internet access, and restricted social media use. Law enforcement later requested that Meta remove the video, which the company determined violated its Violence and Incitement policy. Meta subsequently added the video to its Media Matching Service (MMS), a tool used to automatically detect and remove identical content across its platforms. The company also flagged related content, including media featuring the chant “Jai Shri Ram,” which has historically been associated with incitement against the Muslim community. The majority of the Board agreed with Meta’s assessment, concluding that the video constituted a credible call to violence in light of the broader context of communal tensions and the lack of indicators suggesting the applicability of policy exceptions (e.g., awareness-raising or newsworthiness). The majority also found Meta’s decision to remove all identical videos to be justified, necessary, and proportionate due to the heightened risk of further harm. However, a minority of the Board disagreed, arguing that the blanket removal of all identical videos on a global scale was neither proportionate nor necessary. They expressed concern that Meta’s enforcement approach could unduly restrict freedom of expression and highlighted the absence of a publicly accessible “awareness raising” exception in Meta’s policies, which may deter users from sharing content of public interest.

*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.


Facts

On April 13, 2023, a Facebook user posted a video of a religious parade during the Hindu festival of Hanuman Jayanti in Sambalpur, Odisha. The procession featured saffron-colored flags associated with Hindu nationalism and chants of “Jai Shri Ram” (“Hail Lord Ram”), a phrase which, beyond its religious significance, has been used to incite hostility against minorities, particularly Muslims. The video also showed a person standing on a balcony throwing a stone at the procession. In retaliation, members of the crowd threw stones toward the building while chanting “Jai Shri Ram,” “bhago” (“run”), and “maro maro” (“hit him, hit him”). The post was captioned simply “Sambalpur” and received approximately 2,000 views, fewer than 100 comments and reactions, and no shares or reports.

During the festival, communal violence broke out between Hindus and Muslims. Communal violence in India is a recurring phenomenon characterized by group conflicts based on religious, ethnic, or linguistic divisions. Indian Muslims have been disproportionately targeted in such violence, often with impunity. In recent years, religious festivals and parades have increasingly served as flashpoints for incitement and attacks on minority communities.

The violence in Sambalpur led to the burning of shops and the killing of one person. In response, the Odisha State government imposed a curfew in certain areas and suspended internet services until April 23. Police arrested 85 individuals in connection with the unrest. In July 2023, the State government introduced a year-long ban on religious processions in Sambalpur.

On April 16, Odisha law enforcement requested that Meta remove an identical video posted by another user. Meta complied, removing that video under the spirit of its Violence and Incitement policy. On April 17, the video was added to a Media Matching Service (MMS) bank, which automatically matches newly posted content against banked media and applies a configured action—here, removal. The video under review was later identified through this system and removed for violating the same policy.
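Meta has not published the internals of its MMS banks, but tools of this kind are typically built on media fingerprinting: a fingerprint of the banked video is stored, and any upload whose fingerprint matches triggers the configured action. The following Python sketch is a minimal illustration under that assumption only; the MMSBank class, its methods, and the use of a cryptographic hash are hypothetical and do not reflect Meta's actual implementation (production systems generally rely on perceptual hashes that survive re-encoding or cropping).

```python
import hashlib
from typing import Optional

class MMSBank:
    """Illustrative media-matching bank (hypothetical; not Meta's API).

    Stores fingerprints of banked media and matches new uploads against
    them, applying a configured action on a match. SHA-256 is used only
    to keep the sketch self-contained; real systems use perceptual
    hashes so near-identical copies still match.
    """

    def __init__(self, action: str = "remove"):
        self.action = action        # configured action, e.g. "remove"
        self.fingerprints = set()   # fingerprints of banked videos

    @staticmethod
    def fingerprint(media_bytes: bytes) -> str:
        return hashlib.sha256(media_bytes).hexdigest()

    def bank(self, media_bytes: bytes) -> None:
        """Add a violating video to the bank."""
        self.fingerprints.add(self.fingerprint(media_bytes))

    def check(self, media_bytes: bytes) -> Optional[str]:
        """Return the configured action if the upload matches banked media."""
        if self.fingerprint(media_bytes) in self.fingerprints:
            return self.action
        return None

# The bank is configured to remove matches regardless of caption,
# mirroring the enforcement described in this case.
bank = MMSBank(action="remove")
original_video = b"<bytes of the originally flagged video>"
bank.bank(original_video)
print(bank.check(original_video))  # -> "remove"
```

Because matching operates on the media itself, a bank configured this way removes every identical copy regardless of its caption, which is why caption-level policy exceptions could not be applied automatically in this case.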

Meta stated that the content did not meet any of the policy’s exceptions, such as content shared to condemn violence, raise awareness, or for academic or journalistic purposes. The company added that, given the high risk of further Hindu-Muslim violence, the content would have been removed even if it had included a condemnatory caption.

Social media platforms have frequently been used in India to incite violence against minorities, especially Muslims. In this instance, videos of the Sambalpur violence were posted at least 34 times within 72 hours, often with captions blaming Muslims for the attack.

Meta referred the case to the Oversight Board (OSB) due to the challenge of balancing its values of “Voice” and “Safety,” and the need for a more thorough contextual analysis to assess the risks of harm posed by the video.


Decision Overview

On November 28, 2023, the Oversight Board issued its decision on the matter. The main issue it analyzed was whether Meta’s decision to remove a video depicting communal violence during a religious festival in India was consistent with its content policies—particularly the Violence and Incitement policy—its core values, and human rights responsibilities.

The Board notified the user who posted the content and offered them an opportunity to submit a statement, but no response was received.

In its referral to the OSB, Meta highlighted the difficulty in balancing its values of “Voice” and “Safety,” particularly given the sensitive context of recurring clashes between Hindu and Muslim communities during the Hanuman Jayanti festival in Odisha.

The Board submitted 16 written questions to Meta, focusing on procedures for handling government content review requests, the use of Media Matching Service (MMS) banks for at-scale enforcement, and account-level enforcement practices. Meta responded to 15 questions but declined to provide a copy of the content review request from Odisha State law enforcement.

Meta explained that the original video, which was identical to the one under review but with a different caption, was removed under the spirit of the Violence and Incitement policy. The decision was based on three main factors: significant safety concerns flagged by law enforcement agencies (which Meta corroborated independently), the video’s virality, and the presence of violating comments. Meta then configured an MMS bank to automatically remove all copies of the video, regardless of caption, including the version currently under review.

To justify the removal, Meta cited the nature of the threat, India’s history of communal violence, and the heightened risk of renewed violence in Odisha. The video, according to the company, violated the Violence and Incitement policy as it depicted high-severity violence (such as stones thrown into a crowd and chants encouraging others to “hit” or “beat” someone in retaliation). While the target was not explicitly identified, Meta argued that the visual portrayal of violence against an implied target was sufficient to trigger enforcement measures.

Meta acknowledged that, under the policy, content that would otherwise violate the rules can be allowed if it is shared to condemn violence or raise awareness. However, in this case, the company concluded that the risk of offline harm outweighed any such considerations. On this basis, Meta removed the content under the spirit of the policy and extended this rationale to all identical videos, regardless of accompanying captions.

The company also determined that the content did not meet the criteria for a newsworthiness allowance, as the potential for harm outweighed any public interest. Meta cited ongoing religious and political tensions in India, the potential for violence to spread beyond localized incidents, and the risk of renewed unrest once the curfew and internet ban in Odisha were lifted. These concerns were confirmed by local law enforcement and echoed by Meta’s own public policy and safety teams.

While Meta acknowledged the importance of informing the public about ongoing violence, it argued that the video’s informational value had diminished, being more than four days old at the time of removal, with a neutral caption and wide media coverage already available. The company concluded that content removal was the only adequate response to mitigate the risks.

Regarding the enforcement measures, Meta stated that while strikes are typically applied for all Violence and Incitement policy violations, exceptions can be made in extraordinary cases. In this instance, Meta chose not to apply strikes to content removed via the MMS bank, in order to reach a balance between safety and expression and avoid penalizing users whose content may have only violated the spirit—but not the letter—of the policy.

On the topic of government requests, the company clarified that formal reports from government or law enforcement agencies are first reviewed under Meta’s Community Standards, regardless of whether the content is alleged to violate local law. If the content does not breach Meta’s policies, a legal review is conducted to determine the validity of the request, followed by a human rights due diligence assessment in line with Meta’s Corporate Human Rights Policy.
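Meta’s description amounts to a fixed sequence of reviews. As a rough model only, the sketch below encodes that ordering in Python; the RemovalRequest fields and the post-due-diligence outcomes are illustrative assumptions, since the decision does not detail what follows the human rights assessment.

```python
from dataclasses import dataclass

@dataclass
class RemovalRequest:
    """Hypothetical stand-in for a formal government content report."""
    violates_standards: bool   # outcome of the Community Standards review
    legally_valid: bool        # outcome of the legal review
    human_rights_risk: str     # outcome of due diligence, e.g. "low" or "high"

def triage(request: RemovalRequest) -> str:
    # Step 1: review against the Community Standards, regardless of
    # whether the content is alleged to violate local law.
    if request.violates_standards:
        return "remove under Community Standards"
    # Step 2: if the content does not breach Meta's policies, assess
    # the legal validity of the request.
    if not request.legally_valid:
        return "no action"
    # Step 3: a human rights due diligence assessment under Meta's
    # Corporate Human Rights Policy informs the final outcome; these
    # outcome labels are illustrative assumptions, not Meta's terms.
    return ("restrict locally" if request.human_rights_risk == "low"
            else "escalate for further review")

# Example: a legally valid request concerning non-violating content.
print(triage(RemovalRequest(False, True, "low")))  # -> "restrict locally"
```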

I. Compliance with Meta’s content policies

a. Content Rules

The Board agreed with Meta that the content violated the Violence and Incitement policy. In reaching this conclusion, the majority of the OSB weighed the context of ongoing violence in Odisha, the nature of the religious procession, the calls for high-severity violence in the video, and the virality of similar content, and found that the content constituted incitement to violence.

The OSB said that the content depicted a scene of violence in which a crowd called for high-severity violence against a target. Meta’s definition of a “target” for content reviewers includes any “person,” including anonymous individuals who are real but not identified by name or image. It also defines “high-severity violence” as violence likely to be lethal, and the company said it instructs reviewers to classify threats as high-severity when unsure about their severity level. The Board found that the video fulfilled all of these requirements and violated the relevant policy line of the Violence and Incitement policy.

The OSB also stressed the importance of contextual analysis in this case: stone pelting has been widespread in India and has repeatedly triggered Hindu-Muslim violence, and the procession displayed symbols associated with Hindu nationalism alongside implicit calls for violence against Muslims. The Board pointed to the use of social media platforms to incite further violence through video posts and noted a heightened risk of high-severity violence given the reported fatalities, injuries, and property damage. Accordingly, it concluded that the content was likely to incite further high-severity violence.

Furthermore, the OSB considered that many users had posted similar content and that the originally flagged video had gone viral and attracted a significant number of violating comments, consistent with reports of coordinated online campaigns spreading anti-Muslim hate speech.

The Board noted that although the user shared the video shortly after the violence broke out, the post did not qualify for the awareness-raising exception under the policy: the caption was neutral, and nothing else in the post signaled an intent to raise awareness. Because the risk of harm outweighed the post’s public interest value, the newsworthiness allowance was likewise inapplicable.

The majority of the Board concluded that the post constituted a credible call for religious violence in Sambalpur in light of the online and offline context, the heightened violence ongoing in Sambalpur, and the lack of indication of applicability of policy exceptions. Thus, it found the removal consistent with Meta’s Violence and Incitement policy.

b. Enforcement Action

Meta informed the Board that the MMS bank was set up to remove all identical videos, even those falling within policy exceptions. The OSB underlined the significant impact of this enforcement on users posting identical content to raise awareness or to condemn the violence, and took the view that use of the MMS bank should be time-bound. Meta responded that its MMS bank practices were neither time-limited nor restricted to particular geographic locations, and that it had no plans to roll back this type of enforcement.

II. Compliance with Meta’s human rights responsibilities

The Board applied the three-part test stipulated in Article 19(3) of the International Covenant on Civil and Political Rights (ICCPR) to assess whether Meta’s decision in this case—to automatically remove all identical videos—was compatible with Meta’s human rights responsibilities regarding the right to freedom of expression.

a. Legality (clarity and accessibility of the rules)

The OSB held that the prohibition of content inciting high-severity violence was sufficiently clear. It noted, however, that the awareness-raising exception was not included in the public-facing Violence and Incitement policy, leaving users unaware of it and potentially deterring them from engaging in public interest discussions. The OSB urged Meta to include the exception in its public-facing policy.

The Board also argued that the “spirit of the policy” allowance was not clear or accessible to the users, which raised serious concerns under the legality test. It urged Meta to include a Transparency Center page providing information on the spirit of the policy allowance.

b. Legitimate aim

The OSB considered that prohibiting incitement to violence on the basis of religious affiliation pursued the aim of protecting the rights of others to life and freedom of religion or belief, which is a legitimate aim recognized in Article 19(3) of the ICCPR.

c. Necessity and proportionality

On this prong of the test, the OSB concluded that the removal of the post was consistent with Meta’s human rights responsibilities because the content posed an imminent risk of harm. The Board applied the six-factor test stipulated in the Rabat Plan of Action to analyze the risks associated with the content, focusing on the factors of context, content, and likelihood of harm.

The OSB observed that the video depicted a violent exchange between a person and the crowd during a religious parade. It noted that the “Jai Shri Ram” chant has previously been used to implicitly incite violence against minority groups, especially Muslims. The Board further highlighted that the religious parade led to violence and a fatality, and underlined the link between religious processions and communal violence.

Consequently, the OSB concluded that the removal of the post was necessary and proportionate given the online and offline context of heightened communal violence in Odisha and the absence of any indication that a policy exception applied. For the Board, the video posed a serious risk of furthering violence in Odisha’s volatile context. It therefore upheld Meta’s removal decision.

The OSB noted that the MMS bank was set up to remove all identical videos regardless of the accompanying caption, thereby overriding all exceptions to the Violence and Incitement policy. The majority recognized that the challenges of content moderation at scale were relevant to assessing this broader enforcement decision, and that the timeliness of Meta’s enforcement actions was crucial in mitigating risks of violence during periods of heightened conflict. On this point, the OSB recalled its previous observation that enforcement mistakes at scale are inevitable: false positives negatively impact expression, while false negatives endanger safety and the expression of those targeted (United States Posts Discussing Abortion decision). For the majority, the decision to remove all identical videos without applying strikes was necessary and proportionate to address the identified risks and to limit the content’s dissemination.

Moreover, the majority considered reports of coordinated online campaigns spreading hate speech against Muslims and the related communal violence, and acknowledged the challenges Meta faces when removing threats of violence at scale (Protest in India Against France decision). The Board has previously noted that dehumanizing speech, including discriminatory actions or words, can contribute to atrocities, and that Meta may legitimately take down posts on its platforms that promote violence (Knin Cartoon decision). Furthermore, the OSB recognized that, in certain circumstances, removing content that does not appear to directly incite violence when viewed in isolation, in order to address cumulative harms at scale, may be consistent with Meta’s human rights responsibilities (Depiction of Zwarte Piet decision).

Nonetheless, the majority of the Board noted that broad enforcement actions, including Meta’s use of the MMS bank, should be time-bound: once the risk of violence in Odisha decreases, the company should reassess its MMS bank enforcement to allow for policy exceptions. The OSB urged Meta to limit broad enforcement actions to a specific period and to heightened-risk geographic areas to avoid disproportionately impacting freedom of expression.

Considering this, the Oversight Board upheld Meta’s decision to remove the content.

Policy advisory statement

The Oversight Board did not issue any new recommendations but reiterated previous ones relevant to the case. It reiterated its recommendation from the Russian Poem decision that Meta should explicitly allow neutral content or content shared to condemn or raise awareness of violent threats under its Violence and Incitement policy.

Additionally, the Board recalled its recommendation from the Sri Lanka Pharmaceuticals decision, urging Meta to provide greater clarity for users regarding the “spirit of the policy” allowance. Specifically, it recommended that Meta explain on the landing page of the Community Standards that this allowance applies when Meta’s values and underlying rationale call for a different outcome than what is required by the letter of the policy.

Dissenting or Concurring Opinions

A minority of the Board disagreed with the majority’s conclusion that the video constituted a credible call for violence. It noted the absence of contextual indicators or evidence suggesting the user intended to issue or endorse such a call. The minority expressed concern that interpreting the video as a credible call for violence could set a precedent prohibiting any post depicting incitement, regardless of its purpose.

Nonetheless, the minority considered that the content could still be removed under the Violence and Incitement policy on a different basis. While the policy is silent on “depicted incitement,” the minority argued that it could apply where: (1) the post clearly intends to incite violence; (2) the post contains no indicators suggesting a policy exception applies; or (3) there is evidence that similar content was shared to incite violence. In this case, the second condition was met, making the removal permissible in its view. The minority emphasized the need for Meta to clarify whether the Violence and Incitement policy covers depicted incitement.

Under the legality test, the minority argued that Meta’s justification for mass removals—based on the “spirit” of the policy—amounted to an admission that the removals were not authorized by the letter of the policy. The fact that Meta also declined to apply strikes to this content reinforced, in their view, that the policy had not been violated. These factors led the minority to conclude that Meta failed the legality requirement in relation to its broader enforcement action.

While the minority acknowledged that the policy’s intent could be inferred to include depicted incitement, it recommended that Meta explicitly prohibit such content and define the conditions that would trigger removal. It nonetheless agreed that the policy and its objectives were sufficiently clear to meet the legality requirement in this specific case.

The minority found Meta’s blanket removal of all identical videos, regardless of accompanying caption or context, to be disproportionate. It argued that the existence of communal violence did not justify sweeping content bans in the name of harm prevention. The minority emphasized the importance of raising awareness and sharing information during violent conflicts, warning that overly aggressive enforcement could harm vulnerable communities.

The dissenting group also challenged the framing of “voice” and “safety” as conflicting values, asserting instead that they are interconnected: suppressing certain content might silence those in danger, while misinformation can also endanger vulnerable groups.

The minority noted that blanket removals could disproportionately restrict freedom of expression, silencing individuals seeking help or sharing critical information. It warned that justifying such sweeping restrictions could embolden authoritarian governments to censor expression until their goals are met.

The minority further emphasized the negative impact of blanket removals on news reporting and accountability, noting that such actions could hinder efforts to identify those responsible for offline incitement. They rejected the idea that even time-limited blanket bans are consistent with Meta’s values and human rights obligations, especially during the viral period immediately following a post. In their view, removing content without considering context or intent fails to uphold Meta’s commitment to voice and freedom of expression. They expressed concern that the majority’s reasoning could be used to legitimize internet shutdowns by repressive regimes.

The minority also criticized Meta’s invocation of content moderation at “scale” as a blanket justification for mass removals. The dissenters pointed out that the Odisha State government had already taken restrictive actions—shutting down the internet, requesting removals, and banning processions for a year. In this context, they argued that Meta had alternative, less intrusive options and should have explained why removal was the least restrictive choice, as the Board previously requested in the Claimed COVID Cure decision.

Finally, the minority recommended that Meta offer a more comprehensive public explanation of how its artificial intelligence tools are being developed to better detect policy exceptions, ensuring more nuanced and proportionate enforcement.


Decision Direction

Quick Info

Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Mixed Outcome

In this decision, the Board restricted expression by upholding both the removal of the post and the blanket removal of all identical videos. However, it did so within the narrow and specific context of ongoing communal violence and discrimination in India. The Board found that the post constituted a call to violence under the Rabat Plan of Action, which provides a structured test for assessing whether speech amounts to incitement. Nonetheless, it is worth highlighting the minority’s criticism that accepting blanket bans on content, regardless of intent, opens the door to dangerous over-enforcement that can silence already affected or marginalized communities.

Global Perspective

Case Significance

Quick Info

Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”
