Global Freedom of Expression

Oversight Board Case of Colombian Protests


Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    September 27, 2021
  • Outcome
    Overturned Meta’s initial decision
  • Case Number
    2021-010-FB-UA
  • Region
    International
  • Judicial Body
    Oversight Board
  • Type of Law
    International Human Rights Law, Meta's content policies
  • Themes
    Freedom of Association and Assembly / Protests, Political Expression, Facebook Community Standards, Objectionable Content, Hate Speech
  • Tags
    Oversight Board Policy Advisory Statement, Oversight Board Content Policy Recommendation, Oversight Board Enforcement Recommendation, Meta Newsworthiness allowance, LGBTI


Case Analysis

Case Summary and Outcome

On September 27, 2021, the Oversight Board overturned Facebook’s (now Meta) decision to remove a Facebook post that included a video of a protest in Colombia in which people could be heard criticizing the then-president, Ivan Duque, and calling him “marica”. Facebook had designated the term as a slur because of its inherent offensiveness and its use as an insulting and discriminatory label, primarily against gay men. In its decision, the Board concluded that although the removal of the content was, on its face, consistent with the Hate Speech Community Standard, the newsworthiness allowance should have been applied since the video was posted during widespread protests against the government at a significant moment in the country’s political history.

 *The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.


Facts

In May 2021, the Facebook page of a regional news outlet in Colombia shared a post by another Facebook page without any additional caption. The shared content contained a short video from TikTok that showed a protest in Colombia, with people marching behind a banner that said “SOS COLOMBIA”. The protesters sang in Spanish and called the president “hijo de puta” and “marica.” Facebook translated these words as “son of a bitch” and “fag”, respectively.

According to the decision, “fewer than five users reported the content” [p. 4]. Following a human review, Facebook removed the shared post under its Hate Speech policy because the content described or negatively targeted people with slurs. The policy defined slurs as inherently offensive words used as insulting labels based on protected characteristics, including sexual orientation, and the word “marica” was on Facebook’s list of prohibited slur words. The user who posted the shared post appealed Facebook’s decision. Following another human review, Facebook upheld its original decision to remove the content. Facebook also removed the original root post from the platform.


Decision Overview

The main issue for the Board was whether Facebook’s decision to remove a post in which the then-president of Colombia was called “marica” amid protests complied with the company’s content policies, values, and human rights responsibilities.

In their submission to the Board, the user stated that they were a journalist reporting on local news from their province. They claimed that the content had been posted by another person who took their phone, but nonetheless maintained that it was not intended to cause harm and showed protests during a crisis. The user further asserted that the content showed young people protesting within the framework of freedom of expression and peaceful protest.

Facebook contended that it had removed the content because it contained the word “m**ica,” thus violating Facebook’s Hate Speech Community Standard. It further noted that the word “m**ica” was on Facebook’s list of prohibited slur words because it targets people based on their sexual orientation. Finally, the company explained that the newsworthiness allowance could only have been applied if the content moderators who initially reviewed the post had decided to escalate it for additional review by Facebook’s content policy team.

Compliance with Community Standards

The Board concluded that even though Facebook’s removal of the content appeared to follow its Hate Speech Community Standard on a surface level, the newsworthiness allowance should have been used to permit the content to stay on the site.

According to the public comments and expert advice presented to the Board, the word “m**ica” has several connotations and can be used without discriminatory intent. Experts explained that the term has attained widespread usage in Colombia to refer to a person as a “friend” or “dude,” or as an insult akin to “stupid,” “dumb,” or “idiot”; however, there was consensus that its origins were homophobic and that it was used particularly against gay men. The Board pointed out that this evolution did not necessarily make the term’s use less damaging for gay men. While the Board agreed with Facebook that none of the exceptions outlined in the Hate Speech Community Standard explicitly applied to allow its use on the site, it deemed that the newsworthiness allowance should have been used to permit the content to stay on the platform.

The Board then explained that the newsworthiness allowance required the company to weigh the public interest value of the expression against the risk of harm from allowing violating content on the platform. It highlighted that contextual elements were vital when determining the level of public interest. In the instant case, the Board remarked that the content was posted while the Colombian government was targeting widespread protests. Even though the participants seemed to use the slur deliberately, in the Board’s view the protest’s goal was not discriminatory. Moreover, it noted that the slur was used once, among many other statements. In light of this, the Board believed that the newsworthiness allowance was particularly relevant since the video sought to spread awareness of the protesters and show support for their cause rather than disparage people based on protected characteristics or encourage discrimination or violence.

The Board emphasized that the application of the newsworthiness allowance in this case should not be understood as an endorsement of the protesters’ language, which was not inherently of public interest value. Instead, it observed that the public interest rested in allowing expression on the platform, given the significance of the political moment in Colombia’s history.

Compliance with Facebook’s values

The Board observed that reinstating the content aligned with Facebook’s values. On the one hand, the Board shared Facebook’s concern that allowing vile slurs to spread on the platform could harm the “Dignity” of members of the communities those slurs target. On the other hand, the Board considered that the content, which depicted protests against a political figure, represented the value of “Voice” at its apex. It noted that by applying the newsworthiness allowance to the slur policy, Facebook could uphold its core commitment to “Voice” without jeopardizing its legitimate commitment to “Dignity”.

Compliance with Facebook’s human rights responsibilities

The Board found that restoring the content was consistent with Facebook’s human rights responsibilities as a business. After conducting a three-part test, the Board concluded that the removal was not necessary or proportionate.

I. Legality (clarity and accessibility of the rules)

The Board found that although Facebook’s Hate Speech Community Standard specified that slurs related to protected characteristics are prohibited, the specific list of words that Facebook designated as slurs in different contexts was not publicly available. Given that the term “m**ica” could be used differently, the Board considered it may not have been clear to users that this word contravened Facebook’s prohibition against slurs.

II. Legitimate aim

The Board determined that the hate speech policy pursued the legitimate aim of protecting the rights of others to equality, and protection against violence and discrimination based on sexual orientation and gender identity.

III. Necessity and proportionality

The Board found that removing the content was unnecessary and disproportionate. Although the Board recognized the potential harm to the rights of the LGBT community from allowing homophobic slurs to remain on the platform, it also considered it essential for the company to examine the context around the post. Thus, considering the political situation in Colombia and the fact that the protest addressed a political figure, the Board remarked that the removal of the content in this case was not proportionate to achieving the aim of protecting the rights to non-discrimination and equality of LGBT people.

Policy advisory statement:

The Board recommended that Facebook publish illustrative examples from the list of slurs it has designated as violating under its Hate Speech Community Standard. In addition, it stated that Facebook must link the short explanation of the newsworthiness allowance provided in the introduction to the Community Standards to the more detailed Transparency Center explanation of how the allowance applies. Moreover, it noted that Facebook should develop and publicize clear criteria for content reviewers to escalate, for additional review, public interest content that potentially violates the Community Standards but may be eligible for the newsworthiness allowance. Finally, the Board considered that the company must notify all users who reported content that was assessed as violating but left on the platform for public interest reasons that the newsworthiness allowance had been applied to the post.

Dissenting or Concurring Opinions:

A minority of the Board considered it essential to assess the content restriction in this case and its impact on the right to freedom of peaceful assembly. They believed that “restrictions on the right to freedom of peaceful assembly should be narrowly drawn, meeting the requirements of legality, legitimate aim, and necessity and proportionality” [p. 13]. Moreover, they underscored that journalists and other observers play an essential role in amplifying protests’ collective expression and associative power by disseminating footage online. The minority concluded that Facebook’s removal of the content impaired the right to freedom of peaceful assembly and that the restriction was not justified.


Decision Direction

Quick Info

Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Expands Expression

The decision expands expression because the Board protected the right to freedom of peaceful assembly and its relationship with freedom of expression. The Board held that protection for public discourse in democratic contexts is exceptionally high and instructed the platform to provide space for a political message. The decision also expanded expression by ensuring that journalists, human rights advocates, election observers, and other observers or reporters of assemblies can use online communications to report on these types of protests.

Global Perspective

Quick Info

Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or regional laws

  • United Nations Guiding Principles on Business and Human Rights (2011)

    The Board used the UNGPs as the legal basis of Meta’s commitment to respect human rights.

  • ICCPR, art. 19

    The Board used Article 19 of the ICCPR as a legal basis that provides broad protection for freedom of expression. It also used it to apply the three-part test of legality (clarity), legitimacy, and necessity and proportionality.

  • UNHR Comm., General Comment No. 34 (CCPR/C/GC/34)

    The Board used General Comment No. 34 as the legal basis to apply the three-part test.

  • Rapporteur on freedom of opinion and expression, A/HRC/38/35 (2018)

    The Board used Report A/HRC/38/35 to argue that social media companies must respect the requirements of Article 19, para. 3, ICCPR.

  • UN Special Rapporteur on freedom of opinion and expression, A/74/486 (2019)

    The Board cited this report to analyze the importance of context when there is possible use of hate speech.

  • ICCPR, art. 2

    The Board referred to Article 2 of the ICCPR as the legal basis for the right to non-discrimination.

  • ICCPR, art. 21

    The Board cited Article 21 of ICCPR as the legal basis for the right to peaceful assembly.

  • ICCPR, art. 26

    The Board referred to Article 26 of the ICCPR as the legal basis for the right to non-discrimination.

  • UNHR Comm., General Comment No. 18 (1989)

    The Board referred to General Comment No. 18 as the legal basis for the right to non-discrimination.

  • UNHRC, Protection against violence and discrimination based on sexual orientation and gender identity, A/HRC/RES/32/2 (15 July 2016)

    The Board referred to UN Human Rights Council Resolution 32/2 as the legal basis for the protection against violence and discrimination based on sexual orientation and gender identity.

  • UNHR Comm., General Comment No. 37 (2020)

    The Board referred to General Comment No. 37 as the legal basis for the right to peaceful assembly.

  • UN Special Rapporteur on freedom of peaceful assembly and association, A/HRC/41/41 (2019)

    The Board used the report as the legal basis for the right to peaceful assembly.

General Law Notes

Oversight Board Decisions:

  • Armenians in Azerbaijan (2020-003-FB-UA)
    • By referring to this case, the Board recalled that it had urged Facebook to give users more detail on the specific parts of the Hate Speech policy that their content violated so that users can regulate their behavior accordingly.
  • Former President Trump’s suspension (2021-001-FB-FBR)
    • By referring to this case, the Board noted that it had recommended Facebook produce more information to help users understand and evaluate the process and criteria for applying the newsworthiness allowance.

Case Significance

Quick Info

Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.” 
