Global Freedom of Expression

Oversight Board Case of Colombian Police Cartoon

Closed | Expands Expression

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    September 15, 2022
  • Outcome
    Oversight Board Decision, Overturned Meta’s initial decision
  • Case Number
    2022-004-FB-UA
  • Region & Country
    Colombia, Latin America and the Caribbean
  • Judicial Body
    Oversight Board
  • Type of Law
    International Human Rights Law, Meta’s content policies
  • Themes
    Political Expression, Facebook Community Standards, Violence and Criminal Behavior, Dangerous Individuals and Organizations
  • Tags
    Oversight Board Transparency Recommendation, Oversight Board Policy Advisory Statement, Cartoons, Policing of Protests

Content Attribution Policy

Global Freedom of Expression is an academic initiative; therefore, we encourage you to share and republish excerpts of our content so long as they are not used for commercial purposes and you respect the following policy:

  • Attribute Columbia Global Freedom of Expression as the source.
  • Link to the original URL of the specific case analysis, publication, update, blog, or landing page of the downloadable content you are referencing.

Attribution, copyright, and license information for media used by Global Freedom of Expression is available on our Credits page.

Case Analysis

Case Summary and Outcome

On September 15, 2022, the Oversight Board overturned Meta’s decision to remove a Facebook post of a cartoon depicting police violence in Colombia. Meta had originally removed the content under its Dangerous Individuals and Organizations Community Standard because the post matched an image in a Media Matching Service bank of content that breached that standard. The Board held that removing the content was inconsistent with Meta’s content policies and values, and that the removal was unnecessary and disproportionate. The Board highlighted the heightened protection given to expression on political and social issues and urged Meta to change its processes around Media Matching Service banks – one of its automated content moderation systems.

*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.


Facts

In September 2020, a Facebook user in Colombia posted a cartoon “resembling the official crest of the National Police of Colombia, depicting three figures in police uniform holding batons over their heads. They appear to be kicking and beating another figure who is lying on the ground with blood beneath their head” [p. 3]. The crest’s text read, in Spanish, “República de Colombia – Policía Nacional – Bolillo y Pata”, which Meta translated as “National Police – Republic of Colombia – Baton and Kick”. The post was uploaded “during a time of widespread protest in the country following a police killing” [p. 4].

In January 2022, 16 months after the post was originally published, Meta removed the content. The company explained that the post matched “with an image in a Media Matching Service bank of content that violates Facebook’s Dangerous Individuals and Organizations Community Standard” [p. 4]. After the user appealed, Meta upheld the removal “but based its removal decision on the Violence and Incitement Community Standard instead” [p. 4]. When the content was removed, it had been “viewed three times and received no reactions or user reports” [p. 4].

As a result of the Oversight Board selecting this case, Meta reviewed the post again and concluded that “it did not violate the Dangerous Individuals and Organizations Community Standard or the Violence and Incitement Community Standard” [p. 4]. Subsequently, the content was restored to the platform one month after its removal. 

Meta informed the Board “that because the image was in a Media Matching Service bank, identical content, including this case, had been removed from the platform” [p. 4]. Users appealed 215 of those removals; “210 of those appeals were successful, meaning that the reviewers in this set of cases decided the content was not violating” [p. 4] – a success rate of roughly 98%.


Decision Overview

The Oversight Board analyzed whether Meta was correct to remove, under its Dangerous Individuals and Organizations Community Standard or its Violence and Incitement Community Standard, a Facebook post in which a cartoon version of the official crest of the National Police of Colombia is associated with police violence. The Board also assessed whether this measure complied with Meta’s values and its human rights responsibilities. Additionally, the Board expressed particular concern about Media Matching Service banks – a system that can automatically delete images that violate Meta’s rules – and the way they can “amplify the impact of incorrect decisions to bank content” [p. 1].

In its submission to the Board, the user “explained that the content reflected reality in Colombia which was important for those who were interested in or affected by the situation” [p. 6].

For its part, Meta explained that it deleted “the content because it matched with an image that had been mistakenly entered by a human reviewer into a Dangerous Individuals and Organizations Media Matching Service bank” [p. 6]. On appeal, Meta determined that the post violated the Violence and Incitement policy. The company later acknowledged to the Board that both decisions were wrong and that the content did not violate any Community Standard. 

Meta then explained how its Media Matching Service banks operate. According to the company “the Media Matching Service banks identify and act on media, in this case, images, posted on its platforms. Once content is identified for banking, it is converted into a string of data, or ‘hash.’ The hash is then associated with a particular bank. Meta’s Media Matching Service banks align with particular content policies such as Dangerous Individuals and Organizations, Hate Speech, and Child Sexual Exploitation, Abuse and Nudity, or specific sections within a policy” [p. 6-7].

In the case at hand, the content was in a Media Matching Service bank for criminal organizations, which are prohibited under the Dangerous Individuals and Organizations policy. Media Matching Service banks, as Meta explained, can be programmed to take different actions “once it identifies banked content. For example, Meta might delete content that is violating, add a warning screen, or ignore it if it has been banked as non-violating content” [p. 7]. The system can also act at different moments in time: “Meta can scan images at the point of upload to prevent some violating content from being posted. Banks can also be configured to only detect and take action on newly uploaded content, or they can be used to scan existing content on the platform” [p. 7].
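To make this mechanism concrete, below is a minimal sketch of a Media Matching Service bank in Python, based only on Meta’s description quoted above. All class and method names are hypothetical, and the exact SHA-256 hash stands in for whatever matching technology Meta actually uses; production matchers typically rely on perceptual hashes that tolerate re-encoding and resizing, which an exact hash does not.

    import hashlib
    from enum import Enum
    from typing import Dict, Optional

    class BankAction(Enum):
        DELETE = "delete"   # remove matching content
        WARN = "warn"       # add a warning screen
        IGNORE = "ignore"   # banked as non-violating; take no action

    class MediaMatchingBank:
        """One bank, aligned with a single content policy (e.g. Dangerous
        Individuals and Organizations), mapping image hashes to an action."""

        def __init__(self, policy: str) -> None:
            self.policy = policy
            self.actions: Dict[str, BankAction] = {}

        @staticmethod
        def hash_image(image_bytes: bytes) -> str:
            # "Once content is identified for banking, it is converted
            # into a string of data, or 'hash.'"
            return hashlib.sha256(image_bytes).hexdigest()

        def bank(self, image_bytes: bytes, action: BankAction) -> None:
            # A single (possibly mistaken) reviewer decision adds the hash.
            self.actions[self.hash_image(image_bytes)] = action

        def check(self, image_bytes: bytes) -> Optional[BankAction]:
            # Run at upload time or against existing content; every
            # identical image inherits the banked decision.
            return self.actions.get(self.hash_image(image_bytes))

    bank = MediaMatchingBank(policy="Dangerous Individuals and Organizations")
    bank.bank(b"<cartoon image bytes>", BankAction.DELETE)  # the mistaken entry
    print(bank.check(b"<cartoon image bytes>"))             # BankAction.DELETE

The sketch makes the scaling dynamic visible: once a hash is banked, check() applies the same decision to every identical upload with no further human review, which is why a single erroneous banking decision can be amplified across the platform.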

Compliance with Community Standards

The Board noted that Meta’s actions did not comply with Facebook’s policies; Meta itself recognized that the removed content did not violate any Community Standard. Thus, Meta’s decision to remove the post, its addition of the image “to a Media Matching Service bank and the failure to overturn the automated removal on appeal were wrong” [p. 9].

Compliance with Facebook’s values

The Board held that Meta’s decisions were inconsistent with its value of “Voice”, and that the post’s removal “was not supported by any other Meta values” [p. 9].

Compliance with Human Rights Standards

Upon analyzing Meta’s measures and their compliance with human rights standards, the Board noted that Meta has “committed itself to respect human rights under the UN Guiding Principles on Business and Human Rights (UNGPs)” [p. 10]. The Board also said that Article 19 of the ICCPR – as understood by the Human Rights Committee in General Comment 34, paras. 11 and 20 – “provides for broad protection of expression, and expression about social or political concerns receives heightened protection” [p. 10].

The Board analyzed, through a three-part test, whether Meta’s decisions complied with its human rights responsibilities.

I. Legality

The legality requirement demands that restrictions on freedom of expression be “accessible and clear enough to provide guidance as to what is permitted and what is not” [p. 10]. In this case, the Board considered that the content removal “was not attributable to the lack of clarity or accessibility of relevant policies” [p. 10].

II. Legitimate aim

To be valid, restrictions on freedom of expression “should pursue one of the legitimate aims listed in the ICCPR, which include the ‘rights of others’” [p. 10]. This requirement was fulfilled here: the Board has consistently found that the Dangerous Individuals and Organizations and Violence and Incitement policies seek to prevent offline violence and therefore comply “with Meta’s human rights responsibilities” [p. 10].

III. Necessity and proportionality

The necessity and proportionality principles provide “that any restrictions on freedom of expression ‘must be appropriate to achieve their protective function; they must be the least intrusive instrument amongst those which might achieve their protective function; [and] they must be proportionate to the interest to be protected’” [p. 10-11]. Since the content did not violate any of Meta’s policies, its removal served no protective function; the Board therefore deemed the measure unnecessary.

The Board also considered that the design of Meta’s Media Matching Service banks “enabled reviewers to mistakenly add content to a bank that resulted in the automatic removal of identical content, despite it being non-violating” [p. 11], which it found disproportionate. Similarly, the Board expressed concern regarding the removal of content about human rights violations at the hands of state agents: “Police violence is an issue of major and pressing public concern. The Board has also previously noted the importance of social media in sharing information about protests in Colombia in case 2021-010-FB-UA” [p. 11].

Additionally, the Board observed that the implementation of controls “on the addition, auditing, and removal of content in [media] banks” [p. 11] is essential, as are appeal opportunities. The Board argued that these automated systems may have disproportionate consequences, considering the scale at which they operate. “The stakes of mistaken additions to Media Matching Service banks are especially high when, as in this case, the content consists of political speech criticizing state actors or actions. Media Matching Service banks automate and amplify the impacts of individual incorrect decisions, and it is important for Meta to continually consider what error mitigation measures best help its human reviewers” [p. 11].

The Board expressed concern about the lack of performance metrics for “Media Matching Service banks for particular content policies” [p. 12]. According to the Board, Meta should publish “information on accuracy for each content policy where it uses Media Matching Service technology. This should include data on error rates for non-violating content mistakenly added to Media Matching Service banks of violating content. It should also include the volume of content impacted by incorrect banking and key examples of errors” [p. 12].

For these reasons, the Oversight Board overturned “Meta’s original decision to take down the content” [p. 12].

Policy Advisory Statement

The Oversight Board urged Meta to “ensure that content with high rates of appeal and high rates of successful appeal is re-assessed for possible removal from its Media Matching Service banks” [p. 13]. Likewise, it asked the company to limit the time between when banked content is identified for additional review and when, if deemed non-violating, it is removed from the bank. Finally, the Board requested that Meta publish “the error rates for content mistakenly included in Media Matching Service banks of violating content, broken down by each content policy, in its transparency reporting” [p. 13].
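As a loose illustration of the first recommendation, a re-assessment trigger could compare each banked item’s appeal statistics against thresholds. The function below is a hypothetical sketch only; the Board does not prescribe any implementation, and the thresholds and figures used here are invented for illustration.

    def needs_reassessment(removals: int, appeals: int, successful_appeals: int,
                           min_appeal_rate: float = 0.5,
                           min_success_rate: float = 0.8) -> bool:
        # Flag banked content whose removals are frequently appealed and
        # frequently overturned, so a human can re-assess whether it
        # belongs in the bank at all. Thresholds are hypothetical.
        if removals == 0 or appeals == 0:
            return False
        appeal_rate = appeals / removals
        success_rate = successful_appeals / appeals
        return appeal_rate >= min_appeal_rate and success_rate >= min_success_rate

    # In this case, 210 of 215 appeals succeeded (~98%); the decision does not
    # state the total number of removals, so the 300 below is invented.
    print(needs_reassessment(removals=300, appeals=215, successful_appeals=210))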


Decision Direction


Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Expands Expression

In this decision, the Oversight Board expanded freedom of expression by protecting speech on pressing political and social matters related to police violence during protests. Likewise, the Board’s assessment of the risks of automated moderation systems – such as Media Matching Service banks – and of the importance of implementing controls over them provides robust protection for expression on social media against overly broad online moderation.

Global Perspective


Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or Regional Laws

  • ICCPR, art. 19

    The Board referred to this provision to highlight the international protection of freedom of expression.

  • UNHR Comm., General Comment No. 34 (CCPR/C/GC/34)

    The Board referred to this general comment to underscore the heightened protection of expression about political and social issues.

  • OSB, Colombia Protests, 2021-010-FB-UA (2021)

    By referring to this case, the Board highlighted the public interest in allowing content criticizing the government during protests, in particular in contexts where states are accused of violating human rights.

Case Significance


Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.” 

