Global Freedom of Expression

Oversight Board Case of UK Drill Music

Case Status: Closed
Decision Direction: Expands Expression

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    November 22, 2022
  • Outcome
    Overturned Meta’s initial decision
  • Case Number
    2022-007-IG-MR
  • Region & Country
    United Kingdom, International
  • Judicial Body
    Oversight Board
  • Type of Law
    Meta's content policies, International Human Rights Law
  • Themes
    Instagram Community Guidelines, Referral to Facebook Community Standards, Facebook Community Standards, Violence and Criminal Behavior, Violence and Incitement, Artistic Expression
  • Tags
    Meta Spirit of the Policy allowance, Oversight Board Content Policy Recommendation, Oversight Board Enforcement Recommendation, Oversight Board Transparency Recommendation


Case Analysis

Case Summary and Outcome

On November 22, 2022, the Oversight Board overturned Meta’s decision to remove a UK drill music video clip shared on Instagram. Meta had removed the post following a request from the UK Metropolitan Police. Relying on information provided by the Metropolitan Police, Meta had found that the track contained a “veiled threat” by referencing a 2017 shooting. Meta manually removed 52 other posts containing the track, and its automated systems removed matching content a further 112 times. In its decision, the Board found that Meta lacked sufficient evidence to determine that the content contained a credible threat and noted that the company should have given more weight to the artistic nature of the post.

The Board found that the channels through which law enforcement makes requests to Meta are haphazard and opaque. It recognized that, while state actors can provide context and expertise, not every piece of content they report should be taken down. Likewise, the Board stressed that Meta should assess such requests independently, particularly when they relate to artistic expression from marginalized groups, who face an acute risk of cultural bias against their content. The Board also highlighted that users’ inability to appeal content moderation decisions Meta takes at escalation raised serious concerns regarding users’ right of access to a remedy.

*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.


Facts

In January 2022, a public Instagram account, self-described as a promoter of British music, posted a 21-second clip from the music video “Secrets Not Safe” by drill rapper Chinx (OS). The lyrics of the clip are quoted below; the Board’s interpretations of non-standard English terms are given in square brackets, and the names of individuals have been redacted:

“Ay, broski [a close friend], wait there one sec (wait). You know the same mash [gun] that I showed [name redacted] was the same mash that [name redacted] got bun [shot] with. Hold up, I’m gonna leave somebody upset (Ah, fuck). I’m gonna have man fuming. He was with me putting loud on a Blue Slim [smoking cannabis] after he heard that [name redacted] got wounded. [Name redacted] got bun, he was loosing (bow, bow) [he was beaten]. Reverse that whip [car], confused him. They ain’t ever wheeled up a booting [a drive-by shooting] (Boom). Don’t hit man clean, he was moving. Beat [shoot] at the crowd, I ain’t picking and choosing (No, no). Leave man red [bleeding], but you know [track fades out].” [pp. 3-4]

The video ended by fading to a black screen that said: “OUT NOW”. Additionally, in the caption to the video, the user stated that the track had just been released and tagged Chinx (OS) and another artist.

Drill is a subgenre of rap music, popular in the UK, particularly among young Black people. Drill artists are known for speaking in detail about ongoing violent street conflicts, using a first-person narrative with imagery and lyrics that depict or describe violent acts.

After the video was posted, the UK Metropolitan Police sent Meta an email requesting a review of all content that included the track “Secrets Not Safe”. As a result, Meta escalated the content for review to its internal Global Operations team and then to its Content Policy team.

Based on the additional information provided by the Metropolitan Police, Meta decided to remove the post under the Violence and Incitement policy, finding that the track contained a “veiled threat” referencing a 2017 shooting that could potentially lead to further violence.

On the same day, the content creator appealed the decision to Meta. While users usually cannot appeal content decisions the company takes through its escalation process, in this case, due to human error, the user was able to appeal the decision to Meta’s at-scale reviewers. Following an assessment by an at-scale reviewer, the content was found non-violating and was restored to Instagram. 

Eight days later, after receiving a second request from the Metropolitan Police, Meta removed the content through its escalation process again. Additionally, the company removed 52 pieces of content containing the track “Secrets Not Safe” from other accounts, including Chinx (OS)’s. Further, Meta added the content to the Violence and Incitement Media Matching Service bank, resulting in 112 automated removals of matching content from other users. 

Meta referred the case to the Board. The Board requested that Meta also refer the removal of content from Chinx (OS)’s account for review so that it could be examined alongside the content in this case. However, Meta said that this was impossible, as removing the “Secrets Not Safe” video from Chinx (OS)’s account had ultimately led to the account being deleted, and the content was not preserved.

 


Decision Overview

The main issue before the Oversight Board was whether Meta’s decision to remove the post was in line with Facebook’s Violence and Incitement Community Standard, Meta’s values, and the company’s human rights responsibilities.

The user did not provide a statement to the Board.

In its submission to the Board, Meta explained that it had removed the content since it contained a veiled threat of violence which violated the company’s Violence and Incitement policy. 

Compliance with Meta’s content policies

I. Content rules

The Board explained that Meta’s Violence and Incitement Community Standard seeks to prevent potential offline harm and that it removes language “that incites or facilitates serious violence” or content that “poses a genuine risk of physical harm or direct threats to public safety.” The Board highlighted that, as it had previously stated in the decisions of Protest in India against France and Knin cartoon, detecting and assessing threats at scale can be particularly challenging when they are veiled. Moreover, the Board noted that veiled threats conveyed in art could present additional challenges since they can be obscure in their intent and deliberately subject to interpretation.

The Board also explained that, through its internal Implementation Guidance, Meta establishes a “veiled threats analysis” to assess whether a veiled threat is present. This analysis requires the identification of both a primary and a secondary signal for content to qualify as a veiled threat. Applying these criteria to the present case, the Board found that Meta’s removal of the content did not comply with the Violence and Incitement Community Standard. It agreed with Meta that a primary signal was present, since the excerpt referenced a 2017 shooting, a historical incident of violence. However, the Board found that Meta had not demonstrated that a secondary signal was present. It explained that identifying a veiled or implicit threat also requires a secondary signal showing that the reference “could be threatening and/or could lead to imminent violence or physical harm” [p.9]. Such a signal, the Board emphasized, relies on local context or subject-matter experts indicating that the content could potentially be threatening, or on confirmation by the targeted individual that they view the content as threatening. In the present case, the UK Metropolitan Police supplied that contextual assessment: Meta determined that Chinx (OS)’s reference to the 2017 shooting was potentially threatening, or likely to contribute to imminent violence or physical harm, and so qualified as a veiled threat. Meta’s assessment of the lyrics took into account the specific rivalry between the gangs associated with the 2017 shooting, as well as the broader context of inter-gang violence and murders in London. However, the Board considered that establishing that a mere mention of a shooting that occurred years ago presents a risk of harm today would require probative evidence beyond the reference itself. Moreover, the Board highlighted that artistic references to violent rivalry between gangs do not necessarily constitute a threat. In the Board’s view, “[I]n the absence of either sufficient detail to make that causal relationship clearer, such as evidence of past lyrics materializing into violence or a report from the target of the purported threat that they were endangered, greater weight should have been afforded to the artistic nature of the alleged threat when evaluating its credibility” [p.9].
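The two-signal structure of this analysis lends itself to a schematic illustration. The sketch below is purely illustrative: Meta’s Implementation Guidance is internal, so the signal names and the simple boolean logic here are assumptions drawn only from the Board’s description, not Meta’s actual rules.

```python
from dataclasses import dataclass

@dataclass
class VeiledThreatSignals:
    """Hypothetical inputs to the two-signal test described above.

    Meta's Implementation Guidance is internal; these field names are
    illustrative assumptions, not Meta's terminology.
    """
    references_past_violence: bool  # primary signal (e.g., the 2017 shooting)
    context_flags_threat: bool      # secondary signal: local context or experts
    target_reports_threat: bool     # secondary signal: report from the target

def is_veiled_threat(signals: VeiledThreatSignals) -> bool:
    """Content qualifies only if BOTH a primary and a secondary signal exist."""
    primary = signals.references_past_violence
    secondary = signals.context_flags_threat or signals.target_reports_threat
    return primary and secondary

# As the Board saw this case: the primary signal was present, but the
# secondary signal was not sufficiently evidenced.
print(is_veiled_threat(VeiledThreatSignals(True, False, False)))  # False
```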

When analyzing whether the track’s reference to past violence constituted a credible threat, the Board found that the fact that performative bravado is typical of the drill genre was relevant context that Meta should have considered. The Board noted that its own review had not uncovered evidence to support the finding that the lyrics in the track represented a credible threat.

Additionally, the Board emphasized that the present case raised concerns regarding Meta’s relationships with governments, particularly where law enforcement requests lead to lawful content being reviewed against the Community Standards and removed. The Board stated that “while law enforcement can sometimes provide context and expertise, not every piece of content that law enforcement would prefer to have taken down should be taken down” [p.9]. Thus, the Board highlighted the importance of Meta analyzing such requests independently, especially “when they relate to artistic expression from individuals in minority or marginalized groups for whom the risk of cultural bias against their content is acute” [p.9].

II. Enforcement action and transparency 

The Board highlighted that, in response to the request from the Metropolitan Police, 174 matching pieces of content were removed (52 manual removals and 112 automated removals). In the Board’s view, the scale of these removals evidenced the importance of due process and transparency surrounding Meta’s relationship with law enforcement and the consequences of actions carried out pursuant to that relationship. To address these concerns, the Board stated that there “needs to be a clear and uniform process with safeguards against abuse, including auditing; adequate notice to users of government involvement in the action taken against them; and transparency reporting on these interactions to the public” [p.10].

The Board explained that Meta publishes reports on government requests to remove content based on local law, governmental requests for user data, and enforcement against the Community Standards. However, it stressed that none of these reports separately identifies content removed for violating content policies following a government request for review.

In the Board’s view, the present case showed the privileged access law enforcement has to Meta’s internal enforcement teams. For the Board, such a relationship brings into question Meta’s ability to independently assess conclusions of government actors that lack detailed evidence. The Board recognized that, while Meta has made progress on transparency reporting since the Board first addressed this issue, further transparency efforts could be valuable to the public discussion regarding the implications of interactions between governments and social media companies.

The Board noted that this case demonstrated significant flaws in Meta’s system for handling law enforcement requests that are not based on local law and are made outside its in-product reporting tools. The Board highlighted that the channels through which governments can request review for violations of Meta’s content policies remain opaque. Law enforcement agencies make requests through various communication channels, which makes it challenging to standardize and centralize requests and to collect data about them. The current system does not adequately ensure that third-party requests meet minimum standards, nor does it allow for the accurate collection of data needed to properly monitor and audit the effects of this system.

Compliance with Meta’s values

The Board remarked that the present case demonstrated the difficulties Meta faces in balancing the values of “Voice” and “Safety” when seeking to address a high number of potential veiled threats in art. The Board referred to its decision in the Wampum belt case to emphasize that “art is a particularly important and powerful expression of ‘Voice,’ especially for people from marginalized groups creating art informed by their experiences” [p.11]. While the Board recognized the importance of keeping communities safe from violence, it noted that a presumption against the value of “Voice” could disproportionately impact marginalized people’s voices. In the Board’s view, Meta did not have sufficient information in the present case to conclude that the content posed a risk to the value of “Safety” that justified displacing the value of “Voice”.

Compliance with Meta’s human rights responsibilities

By referring to Articles 2 and 19 of the ICCPR, the Board noted that the right to freedom of expression is guaranteed to all people without discrimination. Specifically, the Board underscored that Article 19 protects expression “in the form of art.”

Turning to the present issue, the Board highlighted that “Drill music relies on boastful claims to violence to drive the commercial success of artists on social media” [p.12], so Meta’s decision to remove Chinx (OS) from Instagram permanently could significantly impact his ability to reach his audience and find commercial success.

By employing the three-part test set out in Article 19 ICCPR, the Board proceeded to analyze whether Meta’s decision to remove the content was consistent with its human rights responsibilities as a business.

I. Legality (clarity and accessibility of the rules)

The Board stated that the principle of legality requires laws limiting expression to be clear and accessible so individuals understand what is permitted and what is not. Also, it reiterated its concerns about the unclear relationship between the Instagram Community Guidelines and Facebook Community Standards.

Additionally, the Board remarked that it was concerned about the discrepancies between the publicly facing Violence and Incitement Community Standard and Meta’s internal Implementation Standards. Notably, it pointed out that although the company’s public-facing Community Standards establish that “signals” are used to determine whether specific content contains a veiled threat, there is no mention that Meta divides these into primary and secondary signals or that both are required to find a policy violation.

II. Legitimate aim 

The Board considered that, since one of the primary purposes of the Violence and Incitement Community Standard is to prevent offline harm, the policy serves the legitimate aim of protecting the rights of others.

III. Necessity and proportionality

The Board noted that through General Comment No. 34, the Human Rights Committee has clarified that any restrictions on expression must respect the principle of non-discrimination.

The Board stated that, through a freedom of information request, it learned that all 286 requests made by the Metropolitan Police to social media companies and streaming services between June 2021 and May 2022 to review or remove musical content involved drill music. Of these, 255 resulted in platforms removing content; 21 related to Meta platforms, resulting in 14 removals. In the Board’s opinion, this intensive focus on drill music raised serious concerns of potential over-policing of certain communities.

Likewise, the Board stressed that it was paramount that Meta provide users with adequate access to a remedy for content decisions that impact their rights. The Board noted that, in the present case, the content under review was posted by an Instagram account not belonging to Chinx (OS); the artist had posted the same video to his own account, which was then removed, resulting in his account being first disabled and later deleted. In the Board’s view, this demonstrated how collaboration between law enforcement and Meta can significantly limit artists’ expression and deny their audience access to art on the platform. As the freedom of information request had confirmed, such collaboration specifically and exclusively targeted drill artists, mostly young Black men. The Board expressed concern that users cannot appeal to the Oversight Board against decisions taken “at escalation,” a category that covers removals resulting from all government requests made outside the in-product reporting tools, including requests concerning lawful content. The Board emphasized that this was especially concerning for individuals belonging to groups that face discrimination, who are likely to experience further barriers to accessing justice as a result of Meta’s product design choices. In the Board’s view, “the company cannot allow its cooperation with law enforcement to be opaque to the point that it creates a barrier to users accessing remedies for potential human rights violations” [p.14].

In conclusion, the Oversight Board overturned Meta’s decision to remove the content, requiring the post to be restored.

Identical content with parallel context

The Board stressed that adding the content at issue to the Violence and Incitement Media Matching Service bank resulted in automated removals of matching content and potentially additional account-level actions against other accounts. The Board stated that, following this decision, “Meta should ensure the content is removed from this bank, restore identical content it has wrongly removed where possible, and reverse any strikes or account-level penalties” [p. 14]. Moreover, the Board noted that the company should remove any restriction on Chinx (OS)’s ability to re-establish an account on Instagram or Facebook.
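To make concrete how a Media Matching Service bank produces automated removals at this scale, a minimal sketch follows. Meta’s actual system is proprietary and presumably uses perceptual hashing robust to re-encoding; the cryptographic hash below is an assumption made only to keep the example self-contained.

```python
import hashlib

# Illustrative media-matching bank: banked items are fingerprinted, and new
# uploads whose fingerprints match are flagged for automated removal.
bank: set[str] = set()

def fingerprint(media: bytes) -> str:
    # Placeholder for a perceptual hash; SHA-256 only matches exact bytes.
    return hashlib.sha256(media).hexdigest()

def add_to_bank(media: bytes) -> None:
    bank.add(fingerprint(media))

def matches_bank(upload: bytes) -> bool:
    return fingerprint(upload) in bank

add_to_bank(b"<banked clip bytes>")
print(matches_bank(b"<banked clip bytes>"))   # True -> automated removal
bank.discard(fingerprint(b"<banked clip bytes>"))  # de-banking, as the Board required
print(matches_bank(b"<banked clip bytes>"))   # False -> no further removals
```

The design point the Board makes follows directly from this mechanic: a single erroneous banking decision propagates automatically, here 112 times, which is why de-banking and restoring wrongly removed matches must travel together.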

Policy advisory statement:   

The Board shared seven specific policy recommendations related to content policy, enforcement, and transparency.

Regarding content policy, the Board recommended that Meta update its description of the value of “Voice” to reflect the importance of artistic and creative expression. Additionally, the Board urged the company to clarify that for content to be removed as a “veiled threat” under the Violence and Incitement Community Standard, both a primary and a secondary signal are required.

Concerning enforcement, the Board recommended that Meta allow users to appeal to the Oversight Board against any decision made through Meta’s internal escalation process. The Board also urged Meta to implement a globally consistent approach to receiving content removal requests from state actors, by creating a standardized intake form that asks for minimum criteria (see the illustrative sketch below). The Board further encouraged the company to mark and preserve any accounts and content penalized or disabled for posting content subject to an open investigation by the Board.
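One way to picture the standardized intake form the Board recommends is as a minimal structured record. The Board did not publish a schema, so every field below is a hypothetical illustration of what “minimum criteria” might include.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class StateActorRequest:
    """Hypothetical intake record for a state actor's removal request.

    The Board recommended a standardized form with minimum criteria but did
    not specify fields; everything here is an illustrative assumption.
    """
    requesting_agency: str       # e.g., a named police force
    requester_identity: str      # named official, for auditability
    content_urls: list[str]      # the specific items to be reviewed
    alleged_violation: str       # which Community Standard is implicated
    evidence_summary: str        # why the content is said to violate it
    legal_basis: str | None      # cited only if the request rests on local law
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
```

A record like this would also give Meta the structured data the Board says is needed to audit such requests and to report on them publicly.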

Finally, regarding transparency, the Board advised Meta to create a section in its Transparency Center, alongside its “Community Standards Enforcement Report” and “Legal Requests for Content Restrictions Report”, to report on state actor requests to review content for Community Standards violations. Additionally, the Board urged Meta to regularly review data on content moderation decisions prompted by state actor requests to assess for systemic biases.


Decision Direction

Quick Info

Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Expands Expression

Through this decision, the Board expands freedom of expression by finding that Meta lacked sufficient evidence to conclude that the content contained a credible threat. Notably, the Board expands expression in four distinct ways. First, it stressed the importance of considering the artistic nature of content when addressing veiled threats. Second, it highlighted the lack of transparency around Meta’s relationship with law enforcement, noting that such a relationship risks the company amplifying bias. Third, by recommending that Meta provide users with the opportunity to appeal to the Oversight Board against any decision made through the company’s internal escalation process, the Board safeguards users’ right to access a remedy when they consider their right to freedom of expression has been infringed. Finally, the Board protects expression by recommending that Meta preserve accounts and content penalized or disabled for posting content subject to an open investigation by the Board. By urging the company to implement the latter, the Board seeks to ensure that users can regain access to their accounts if it finds that Meta wrongfully removed their content.

Global Perspective

Quick Info

Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or regional laws

  • United Nations Guiding Principles on Business and Human Rights (2011)
  • ICCPR, art. 2

    The Board referred to this Article to remark that the right to freedom of expression, as established in article 19 of the ICCPR, is guaranteed to all people without discrimination.

  • ICCPR, art. 6

    The Board considered the right to life contained in this Article.

  • ICCPR, art. 19

    The Board analyzed Meta’s human rights responsibilities through this precept on freedom of expression. It employed the three-part test established in this Article to assess whether Meta’s restriction of expression was justified.

  • ICCPR, art. 26

    The Board referenced this standard on equality and non-discrimination.

  • ICCPR, art. 27

    The Board referred to this Article to emphasize the rights of persons belonging to ethnic minorities to enjoy their own culture in community with other members of their group.

  • ICESCR, art. 15

    The Board referred to this Article to highlight the right to participate in cultural life.

  • ICERD, art. 2

    The Board stated that this standard informed its analysis of this case regarding its findings on equality and non-discrimination.

  • ICERD, art. 5

    The Board referenced this standard which protects the exercise of the right to freedom of expression without discrimination based on race.

  • UNHR Comm., General Comment No. 34 (CCPR/C/GC/34)

    While employing the three-part test to assess whether Meta’s restriction of expression was justified, the Board referred to this General Comment for guidance.

  • UNHR Comm., General Comment No. 31 (2004)

    The Board noted that through this standard, the UN Human Rights Committee has established that the right of access to remedy is a key component of international human rights law.

  • UN Special Rapporteur on freedom of opinion and expression, A/HRC/38/35 (2018)

    In referring to this report, the Board remarked that the UN Special Rapporteur on freedom of expression has noted that social media companies should be mindful that regulating expression at scale can give rise to context-specific concerns.

  • UN Special Rapporteur on freedom of opinion and expression, A/74/486 (2019)

    The Board noted that the UN Special Rapporteur has stressed that the remediation process for social media companies “should include a transparent and accessible process for appealing platform decisions, with companies providing a reasoned response that should also be publicly accessible.”

  • UN Special Rapporteur on freedom of opinion and expression, report A/HRC/44/49/Add.2 (2020)

    The Board relied heavily on this report to underscore the difficulties artists face in their use of social media, as such expression tends to have complex characteristics and can easily fall foul of platforms’ rules, with inadequate remedial mechanisms. It also referenced the report’s comments on the responsibilities of social media companies in relation to artistic expression.

  • UN Special Rapporteur in the field of cultural rights, report on artistic freedom and creativity, A/HRC/23/34, 2013

    By referring to this report, the Board underscored the complexity of artistic expression, the fact that art is often political, and the recognition in international standards of art’s unique and powerful role in challenging the status quo.

General Law Notes

Oversight Board Decisions:

  • Colombian police cartoon (2022-004-FB-UA).
    •  The Board stated that, as it had previously held in this decision, it was concerned that Media Matching Service banks could amplify the impact of incorrect decisions. 
  • Knin cartoon (2022-001-FB-UA).
    • The Board referred to this decision to highlight that detecting and assessing veiled threats at scale is challenging, particularly when specific cultural or linguistic expertise may be required to assess context.
  • Wampum belt (2021-012-FB-UA).
    • By referring to this decision, the Board stated that art is a particularly important and powerful expression of the value of “Voice,” especially for people from marginalized groups creating art informed by their experiences.
  • Shared Al Jazeera post (2021-009-FB-UA).
    • The Board reiterated, as it had underscored in Shared Al Jazeera post and Öcalan’s isolation, that the scale of law enforcement requests resulting in content removals evidenced the importance of due process and transparency around Meta’s relationship with law enforcement and the consequences of actions taken under that relationship.
  • Öcalan’s isolation (2021-006-IG-UA).
    • The Board noted that, in implementing the Board’s recommendation in this case, Meta had disclosed that it is improving notifications to users by specifically indicating when content is removed for violating the Community Standards after being reported by a government entity.
  • Protest in India against France (2020-007-FB-FBR).
    • The Board referred to this decision to highlight that detecting and assessing veiled threats at scale is challenging, particularly when specific cultural or linguistic expertise may be required to assess context.

Case Significance

Quick Info

Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”

