Global Freedom of Expression

Oversight Board Case of Punjabi concern over the RSS in India

Case Status: Closed
Decision Direction: Expands Expression

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    April 29, 2021
  • Outcome
    Oversight Board Decision, Overturned Meta’s initial decision
  • Case Number
    2021-003-FB-UA
  • Region & Country
    India, Asia and Asia Pacific
  • Judicial Body
    Oversight Board
  • Type of Law
    International Human Rights Law, Meta's content policies
  • Themes
    Facebook Community Standards, Violence And Criminal Behavior, Dangerous Individuals and Organizations
  • Tags
    Oversight Board Transparency Recommendation, Oversight Board Policy Advisory Statement, Political speech, Discrimination against Minorities


Case Analysis

Case Summary and Outcome

On April 29, 2021, the Oversight Board overturned Facebook’s (now Meta) decision to remove a user’s Facebook post containing a 17-minute video interview with Professor Manjit Singh, a social activist and supporter of Punjabi culture. The post also criticized a Hindu nationalist organization, India’s Prime Minister, and his party. Upon review, Facebook removed the content and restricted the user’s account, considering that the post breached the platform’s Dangerous Individuals and Organizations Community Standard. However, after the Board identified the case for review, Facebook acknowledged that the content had been removed in error and restored it. The Board found that Facebook’s original decision was inconsistent with the company’s Community Standards and its human rights responsibilities. It noted that the post highlighted the concerns of minority and opposition voices in India that were allegedly discriminated against by the government. Additionally, the Oversight Board expressed concerns about the vagueness of rules prohibiting the praise of dangerous individuals and organizations, the impact that restrictive measures on freedom of expression have on the political speech of minorities, and the lack of a Punjabi translation of Facebook’s Community Standards.

*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.


Facts

In November 2020, a Facebook user shared a 17-minute video interview, from the Punjabi-language online media outlet Global Punjab TV, with “Professor Manjit Singh, described as ‘a social activist and supporter of the Punjabi culture’” [p. 4]. In the post, Global Punjab TV “included the caption ‘RSS is the new threat. Ram Naam Satya Hai. The BJP moved towards extremism’” [p. 4]. The Rashtriya Swayamsevak Sangh (RSS) is a Hindu nationalist organization allegedly involved in violence against religious minorities in India. The BJP is India’s current ruling party, to which the current Indian Prime Minister, Narendra Modi, belongs; Modi “has close ties with the RSS” [p. 4].

The content, initially uploaded during mass farmer protests, touched on the reasons for the demonstrations and praised them.

The Facebook user added accompanying text to the post saying that “the CIA designated the RSS a ‘fanatic Hindu terrorist organization’ and that Indian Prime Minister Narendra Modi was once its president. The user wrote that the RSS was threatening to kill Sikhs, a minority religious group in India, and to repeat the ‘deadly saga’ of 1984 when Hindu mobs attacked Sikhs” [p. 4]. Additionally, the user alleged that the Prime Minister was formulating the threat of “Genocide of the Sikhs” on the advice of the RSS president. The text ended “with a claim that Sikhs in India should be on high alert and that Sikh regiments in the army have warned Prime Minister Modi of their willingness to die to protect the Sikh farmers and their land in Punjab” [p. 4].

The user’s post “was up for 14 days and viewed fewer than 500 times” [p. 4]. The content was reported by another user for “terrorism.” A Facebook human reviewer considered that the post violated the Community Standard on Dangerous Individuals and Organizations and removed the content. This triggered an automatic restriction “on the use of the account for a fixed period of time” [p. 5]. Facebook notified the user of its decision, noting it was final “and could not be reviewed due to a temporary reduction in its review capacity due to COVID-19” [p. 5].

The user appealed Facebook’s decision to remove the content, and restrict their account, to the Oversight Board.

After the Case Selection Committee identified the case for review, but before it was assigned to a panel, Facebook determined that it removed the content in error and decided to restore it. Nevertheless, the Board decided it had the authority to review Facebook’s decision since “[c]oncerns over why the error occurred, the harm stemming from it, and the need to ensure it is not repeated remain pertinent” [p. 5]. 


Decision Overview

The Oversight Board analyzed whether Facebook’s decision to remove the user’s post, which criticized the BJP, the RSS, and several of their leaders, complied with its Dangerous Individuals and Organizations Community Standard. The Board also assessed, applying a three-part test, whether the measures taken by Facebook complied with the company’s human rights responsibilities regarding freedom of expression.

The affected user argued before the Board “that the post was not threatening or criminal but simply repeated the video’s substance and reflected its tone” [p. 7]. 

Facebook, after having restored the content, accepted that the user’s post made no reference “to individuals or organizations designated as dangerous. It followed that the post contained no violating praise” [p. 7]. The company explained that the error in removing the content was due to the complexity of the content and its claims about various political groups, its length (17 minutes), and the number of speakers (two). Since human reviewers look at thousands of pieces of content daily, mistakes happen, Facebook noted. Facebook was also unable to specify which part of the content the reviewer found to violate the company’s rules.

Compliance with Community Standards

The Board began by analyzing whether Facebook’s decision to remove the user’s content and restrict their account complied with the company’s Dangerous Individuals and Organizations Community Standard. It explained that the policy seeks to prevent real-world harm by not allowing “any organizations or individuals that proclaim a violent mission or are engaged in violence to have a presence on Facebook.” Moreover, it noted that the Standard states that Facebook removes “content that expresses support or praise for groups, leaders, or individuals involved in these activities” [p. 6].

In the Board’s view, Facebook’s original decision to remove the content did not comply with its Dangerous Individuals and Organizations Community Standard. It noted that the suppressed content referred to the BJP, the RSS, and several of its leaders; however, none of them were designated as ‘dangerous’ under the company’s Community Standards. The Board argued that even if these organizations were to be considered dangerous, the removed content was critical of them. Hence, the Board believed there was no violation of Facebook’s Community Standards.

Having found no violation in the user’s removed post, the Board stated its concern that the account restriction was wrongly imposed on the user, noting that “the consequences of enforcement mistakes can be severe.”

Compliance with Facebook’s Values

Facebook’s Community Standards establish “Voice,” “Safety,” and “Dignity” as fundamental values for the platform. “Voice” aims to “create a place for expression and give people a voice” [p. 6]; “Safety” seeks to make Facebook a safe place by forbidding expression that threatens people or “has the potential to intimidate, exclude or silence others” [p. 6]; “Dignity” states “that all people are equal in dignity and rights” [p. 6].

The Board found that Facebook’s original decision was inconsistent with the company’s values: Voice, Dignity and Safety. Considering that the removed content linked to a media report, which discussed relevant political matters, “including commentary on the alleged violation of minority rights and the silencing of opposition by senior BJP politicians and the RSS” [p. 9], the Board argued that the “incorrect removal of the post undermined the values of ‘Voice’ and ‘Dignity’” [p. 9].

Compliance with Human Rights Standards on Freedom of Expression

The Board then analyzed Facebook’s actions concerning human rights standards on freedom of expression. It concluded that the company’s application of the Dangerous Individuals and Organizations Standard was inconsistent with Principles 11 and 13 of the United Nations Guiding Principles on Business and Human Rights (UNGPs). These call “on businesses to avoid causing or contributing to adverse human rights impacts that may arise from their own activities or their relationships with other parties, including state actors, and to mitigate them” [p. 10].

The Board also cited Article 19 of the ICCPR to highlight the international protection of expression and uninhibited debate “concerning political figures and the discussion on human rights” [p. 10], the right to seek information, and “the importance of independent and diverse media, especially for ethnic and linguistic minorities” [p. 10].

With this in mind, the Board, through a three-part test, analyzed whether the measures enacted by Facebook satisfied the legality requirement, pursued a legitimate aim, and were necessary and proportional.

  • Legality

Regarding this requirement, the Board reiterated a concern expressed in a previous decision (2020-005-FB-UA) about “Facebook’s interpretation of ‘praise’, and the process for designating dangerous individuals and organizations” [p. 10]. Furthermore, the Board expressed its concern “with the accessibility of the Community Standard on Dangerous Individuals and Organizations” [p. 10].

To underscore its first concern, the Board cited the UN Special Rapporteur on Freedom of Expression, who expressed similar worries regarding “social media companies adopting vague rules that broadly prohibit ‘praise’ and ‘support’ [of] leaders of dangerous organizations” [p. 10].

The Board also stated that the information regarding account restrictions was “spread across many locations, and not all set out in the Community Standards as one would expect” [p. 10]. The Board pointed out that in a previous decision (2020-006-FB-FBR), Facebook had been advised not to expect users “to synthesize rules from across multiple sources, and for rules to be consolidated in the Community Standards” [p. 11].

Finally, the Board noted the lack of translation of Facebook’s Community Standards into Punjabi, a language “widely spoken globally with 30 million speakers in India” [p. 11]. It further remarked that Facebook’s Internal Implementation Standards were also not available in Punjabi for moderators working in this language. The Board argued that this situation makes it harder for users to understand the rules and increases the likelihood of enforcement errors in the moderation process. The Board said that the absence of a Punjabi translation raises human rights concerns in relation to the “possible specific impacts on a minority population” [p. 11] and referenced a UN report by the Independent Expert on minority issues (A/HRC/22/49).

  • Legitimate aim

Regarding the legitimate aim of the Dangerous Individuals and Organizations Community Standard, the Board considered that the policy pursues a legitimate aim by protecting “the right to life, security of a person, and equality and non-discrimination” [p. 11]. 

  • Necessity and proportionality 

Upon analyzing the necessity and proportionality of the measures issued by Facebook, the Board stated that the company admitted that removing the content was a mistake. Such mistakes, the Board opined, were especially worrying when they impact “minority language speakers or religious minorities who may already be politically marginalized” [pp. 11 & 12]. In the Board’s view, given the political context in which the post was uploaded, that of anti-government farmer protests in India, an error like the one Facebook made could “silence minority voices that seek to counter hateful and discriminatory narratives” [p. 12].

For the Board, the account restrictions, which excluded the user from the platform, were disproportionate in light of the critical context in India. Moreover, it suggested that the company, in line with the Human Rights Committee’s General Comment No. 34, “should avoid undermining the expression of minorities who are protesting their government and uphold media pluralism and diversity” [p. 12].

The Board stated with concern that Facebook “could not carry out an appeal on the user’s content due to reduced capacity during the COVID-19 pandemic” [p. 12]. It deemed that this failure directly affected the user’s right of access to a remedy and thus asked Facebook to prioritize restoring its review capacity as soon as possible.

Finally, the Board noted that Facebook had declined to “provide specific answers to the Board’s questions regarding possible communications from Indian authorities to restrict content around the farmer’s protests, content critical of the government over its treatment of farmers, or content concerning the protests” [p. 12]. The Board criticized the company’s lack of transparency and said that this “makes it difficult for the Board or other actors to assess, for example, if enforcement of the Dangerous Individuals and Organizations policy has particular impacts on users, and particularly minority language speakers, in India. To inform the debate, Facebook should make more data public, and provide analysis of what it means” [p. 13]. Due to these concerns regarding the legality, necessity, and proportionality of the measures issued by Facebook, the Board decided to overturn the company’s decision, while acknowledging that Facebook had already restored the removed content.

Policy Advisory Statement 

The Board made several Policy Advisory Statements. It urged Facebook to “translate its Community Standards and Internal Implementation Standards into Punjabi” [p. 14]. Likewise, the Board recommended that the company restore its human review and access to human appeals to pre-pandemic levels “as soon as possible while fully protecting the health of Facebook’s staff and contractors” [p. 14]. Finally, the Oversight Board said that Facebook should enhance its transparency “to increase public information on error rates by making this information viewable by country and language for each Community Standard” [p. 14].


Decision Direction


Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Expands Expression

In its assessment of Facebook’s Dangerous Individuals and Organizations Community Standard, and of the company’s application of it in the case at hand, the Oversight Board expands freedom of expression, in line with human rights standards, by arguing that measures restricting expression should not be overly broad or vague and that rules should be consolidated in the Community Standards and made available in diverse languages. Likewise, the Board widens the scope of freedom of expression in its assessment of the importance of understanding the impact that certain restrictions impose upon minorities and of the relevance of protecting these groups’ political speech.

Global Perspective


Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or regional laws

  • ICCPR, art. 19

    The Board analyzed Facebook’s human rights responsibilities through this provision on freedom of expression.

  • UNHR Comm., General Comment No. 34 (CCPR/C/GC/34)

    The Board referred to this general comment to underscore media pluralism.

  • United Nations Guiding Principles on Business and Human Rights (2011)

    The Board referred to this instrument to highlight Facebook’s human rights responsibilities as a business.

  • UN Special Rapporteur on Freedom of Opinion and Expression, A/HRC/38/35 (2018)

    The Board referenced the report to underscore that the Rapporteur on Freedom of Expression has expressed concerns at social media companies adopting vague rules.

  • UN Independent Expert on minority issues, Report of the Independent Expert on minority issues, UN Doc. A/HRC/22/49 (2013)

    The Board referenced the report to underscore concerns regarding the possible impacts of restrictions on freedom of expression on minority populations.

  • OSB, Nazi quote, 2020-005-FB-UA (2021)

    The Board referred to this decision to reiterate its concern for the process for designating dangerous individuals and organizations.

  • OSB, Claimed COVID-19 Cure, 2020-006-FB-FBR (2021)

    The Board referred to this decision to reiterate that Facebook should consolidate its rules, currently spread across multiple sources, in the Community Standards.

Case Significance


Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”


