Case Summary and Outcome
On November 22, 2022, the Oversight Board overturned Meta’s decision to remove a UK drill music video clip shared on Instagram. Meta had removed the post following a request from the UK Metropolitan Police. Relying on information provided by the Metropolitan Police, Meta had found that the track contained a “veiled threat” because it referenced a 2017 shooting. Meta manually removed another 52 posts containing the track, and its automated systems removed matching content a further 112 times. In its decision, the Board found that Meta lacked sufficient evidence to determine that the content contained a credible threat and noted that the company should have given more weight to the artistic nature of the post.
The Board found that the channels through which law enforcement makes requests to Meta are haphazard and opaque. It recognized that, while state actors can provide context and expertise, not every piece of content they report should be taken down. Likewise, the Board stressed that Meta should assess such requests independently, particularly when they relate to artistic expression from marginalized groups, for whom the risk of cultural bias against their content is acute. The Board also highlighted that users’ inability to appeal content moderation decisions Meta takes at escalation raised serious concerns regarding their right of access to remedy.
*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.
Facts
In January 2022, a public Instagram account, self-described as a promoter of British music, posted a 21-second clip from the music video “Secrets Not Safe” by drill rapper Chinx (OS). The lyrics of the clip are quoted below; meanings of non-standard English terms, as interpreted by the Board, appear in square brackets, and the names of individuals have been redacted:
“Ay, broski [a close friend], wait there one sec (wait). You know the same mash [gun] that I showed [name redacted] was the same mash that [name redacted] got bun [shot] with. Hold up, I’m gonna leave somebody upset (Ah, fuck). I’m gonna have man fuming. He was with me putting loud on a Blue Slim [smoking cannabis] after he heard that [name redacted] got wounded. [Name redacted] got bun, he was loosing (bow, bow) [he was beaten]. Reverse that whip [car], confused him. They ain’t ever wheeled up a booting [a drive-by shooting] (Boom). Don’t hit man clean, he was moving. Beat [shoot] at the crowd, I ain’t picking and choosing (No, no). Leave man red [bleeding], but you know [track fades out].” [pp. 3-4]
The video ended by fading to a black screen that said: “OUT NOW”. Additionally, in the caption to the video, the user stated that the track had just been released and tagged Chinx (OS) and another artist.
Drill is a subgenre of rap music, popular in the UK, particularly among young Black people. Drill artists are known for speaking in detail about ongoing violent street conflicts, using a first-person narrative with imagery and lyrics that depict or describe violent acts.
After the video was posted, the UK Metropolitan Police sent Meta an email requesting that it review all content that included the track “Secrets Not Safe”. As a result, Meta escalated the content for review, first to its internal Global Operations team and then to its Content Policy team.
Based on the additional information provided by the Metropolitan Police, Meta decided to remove the post under the Violence and Incitement policy, finding that the track contained a “veiled threat”, a reference to a 2017 shooting, which could potentially lead to further violence.
On the same day, the content creator appealed the decision to Meta. While users usually cannot appeal content decisions the company takes through its escalation process, in this case, due to human error, the user was able to appeal the decision to Meta’s at-scale reviewers. Following an assessment by an at-scale reviewer, the content was found non-violating and was restored to Instagram.
Eight days later, after receiving a second request from the Metropolitan Police, Meta removed the content through its escalation process again. Additionally, the company removed 52 pieces of content containing the track “Secrets Not Safe” from other accounts, including Chinx (OS)’s. Further, Meta added the content to the Violence and Incitement Media Matching Service bank, resulting in 112 automated removals of matching content from other users.
Meta referred the case to the Board. The Board requested that Meta also refer the removal of the content from Chinx (OS)’s account for review so that it could be examined alongside the content in this case. However, Meta said this was impossible: removing the “Secrets Not Safe” video from Chinx (OS)’s account had ultimately led to the account being deleted, and the content had not been preserved.
Decision Overview
The main issue before the Oversight Board was whether Meta’s decision to remove the post was in line with Facebook’s Violence and Incitement Community Standard, Meta’s values, and the company’s human rights responsibilities.
The user did not provide a statement to the Board.
In its submission to the Board, Meta explained that it had removed the content because, in its view, it contained a veiled threat of violence that violated the company’s Violence and Incitement policy.
Compliance with Meta’s content policies
I. Content rules
The Board explained that Meta’s Violence and Incitement Community Standard seeks to prevent potential offline harm and that, under it, Meta removes language “that incites or facilitates serious violence” or content that “poses a genuine risk of physical harm or direct threats to public safety.” The Board highlighted that, as it had previously stated in its Protest in India against France and Knin cartoon decisions, detecting and assessing threats at scale can be particularly challenging when they are veiled. Moreover, the Board noted that veiled threats conveyed in art can present additional challenges, since such threats may be obscure in their intent and deliberately open to interpretation.
The Board also explained that, through its internal Implementation Guidance, Meta establishes a “veiled threats analysis” to assess whether a veiled threat is present. This analysis requires the identification of both a primary and a secondary signal for content to qualify as a veiled threat. Applying this framework to the present case, the Board found that Meta’s removal of the content did not comply with the Violence and Incitement Community Standard. It agreed with Meta that a primary signal was present, since the excerpt referenced a 2017 shooting, a historical incident of violence. However, the Board found that Meta had not demonstrated that a secondary signal was present. It explained that identifying a veiled or implicit threat also requires a secondary signal showing that the reference “could be threatening and/or could lead to imminent violence or physical harm” [p.9]. Such a signal, the Board emphasized, relies on local context or subject-matter experts indicating that the content could potentially be threatening, or on confirmation from the targeted individual that they view the content as threatening. In the present case, Meta treated the UK Metropolitan Police’s report as providing this signal: it determined that Chinx (OS)’s reference to the 2017 shooting was potentially threatening, or likely to contribute to imminent violence or physical harm, and qualified as a veiled threat. Its assessment of the lyrics took into account the specific rivalry between the gangs associated with the 2017 shooting, as well as the broader context of inter-gang violence and murders in London.

However, the Board considered that establishing that a mere mention of a shooting that occurred years ago presents a risk of harm today would require additional probative evidence beyond the reference itself. Moreover, the Board highlighted that artistic references to violent rivalry between gangs do not necessarily constitute a threat. In the Board’s view, “[I]n the absence of either sufficient detail to make that causal relationship clearer, such as evidence of past lyrics materializing into violence or a report from the target of the purported threat that they were endangered, greater weight should have been afforded to the artistic nature of the alleged threat when evaluating its credibility” [p.9].
When analyzing whether the track’s reference to past violence constituted a credible threat, the Board found that the fact that performative bravado is typical of the drill genre was relevant context Meta should have considered. The Board noted that its own review had not uncovered evidence to support the finding that the lyrics in the track represented a credible threat.
Additionally, the Board emphasized that the present case raised concerns regarding Meta’s relationships with governments, particularly where law enforcement requests lead to lawful content being reviewed against the Community Standards and removed. The Board stated that “while law enforcement can sometimes provide context and expertise, not every piece of content that law enforcement would prefer to have taken down should be taken down” [p.9]. Thus, the Board highlighted the importance of Meta analyzing such requests independently, especially “when they relate to artistic expression from individuals in minority or marginalized groups for whom the risk of cultural bias against their content is acute” [p.9].
II. Enforcement action and transparency
The Board highlighted that, in response to the Metropolitan Police’s request, 174 matching pieces of content were removed (52 manually and 112 through automation). In the Board’s view, the scale of these removals evidenced the importance of due process and transparency surrounding Meta’s relationship with law enforcement and the consequences of actions carried out pursuant to that relationship. To address these concerns, the Board stated that there “needs to be a clear and uniform process with safeguards against abuse, including auditing; adequate notice to users of government involvement in the action taken against them; and transparency reporting on these interactions to the public” [p.10].
The Board explained that Meta publishes reports on government requests to remove content based on local law, on government requests for user data, and on enforcement against the Community Standards. However, it stressed that none of these reports separately identifies content removed for violating content policies following a government request for review.
In the Board’s view, the present case showed the privileged access law enforcement has to Meta’s internal enforcement teams. For the Board, such a relationship brings into question Meta’s ability to independently assess conclusions by government actors that are not supported by detailed evidence. The Board recognized that, while Meta has made progress on transparency reporting since the Board first addressed this issue, further transparency efforts could be valuable to the public discussion regarding the implications of interactions between governments and social media companies.
The Board noted that this case demonstrated significant flaws in Meta’s system for handling law enforcement requests that are not based on local law and are made outside its in-product reporting tools. The Board highlighted that the channels through which governments can request review of content for violations of Meta’s content policies remain opaque. Law enforcement agencies make requests through various communication channels, which makes it challenging to standardize and centralize requests and to collect data about them. The current system neither adequately ensures that third-party requests meet minimum standards nor allows for the accurate collection of data needed to properly monitor and audit the system’s effects.
Compliance with Meta’s values
The Board remarked that the present case demonstrated the difficulties Meta faces in balancing the values of “Voice” and “Safety” when seeking to address a high number of potential veiled threats in art. The Board referred to its decision in the Wampum belt case to emphasize that “art is a particularly important and powerful expression of ‘Voice,’ especially for people from marginalized groups creating art informed by their experiences” [p.11]. While the Board recognized the importance of keeping communities safe from violence, it noted that a presumption against the value of “Voice” could disproportionately impact the voices of marginalized people. In the Board’s view, Meta did not have sufficient information in the present case to conclude that the content posed a risk to the value of “Safety” that justified displacing the value of “Voice”.
Compliance with Meta’s human rights responsibilities
Referring to Articles 2 and 19 of the ICCPR, the Board noted that the right to freedom of expression is guaranteed to all people without discrimination. Specifically, the Board underscored that Article 19 protects expression “in the form of art.”
Turning to the present case, the Board highlighted that “Drill music relies on boastful claims to violence to drive the commercial success of artists on social media” [p.12]; Meta’s decision to remove Chinx (OS) from Instagram permanently could therefore significantly impact his ability to reach his audience and achieve commercial success.
Employing the three-part test set out in Article 19 of the ICCPR, the Board proceeded to analyze whether Meta’s decision to remove the content was consistent with its human rights responsibilities as a business.
I. Legality (clarity and accessibility of the rules)
The Board stated that the principle of legality requires rules limiting expression to be clear and accessible, so that individuals understand what is permitted and what is not. It also reiterated its concerns about the unclear relationship between the Instagram Community Guidelines and the Facebook Community Standards.
Additionally, the Board remarked that it was concerned about the discrepancies between the publicly facing Violence and Incitement Community Standard and Meta’s internal Implementation Standards. Notably, it pointed out that although the company’s public-facing Community Standards establish that “signals” are used to determine whether specific content contains a veiled threat, there is no mention that Meta divides these into primary and secondary signals or that both are required to find a policy violation.
II. Legitimate aim
The Board considered that since one of the primary purposes of the Violence and Incitement Community Standard is to prevent offline harm, the policy serves the legitimate aim of protecting the rights of others.
III. Necessity and proportionality
The Board noted that, in General Comment No. 34, the Human Rights Committee clarified that any restrictions on expression must respect the principle of non-discrimination.
The Board stated that, through a freedom of information request, it learned that all 286 requests the Metropolitan Police made to social media companies and streaming services between June 2021 and May 2022 to review or remove musical content involved drill music. Additionally, the Board underlined that 255 of these requests resulted in platforms removing content. Of these, 21 related to Meta’s platforms, resulting in 14 content removals. In the Board’s opinion, this intensive focus on drill music raised serious concerns of potential over-policing of certain communities.
Likewise, the Board stressed that it was paramount that Meta provide users adequate access to remedy for content decisions that impact their rights. The Board noted that in the present case the content under review was posted by an Instagram account not belonging to Chinx (OS); yet the artist had posted the same video to his own account, which was then removed, resulting in his account being first disabled and later deleted. In the Board’s view, this demonstrated how collaboration between law enforcement and Meta could significantly limit artists’ expression and deny their audience access to art on the platform. As the freedom of information request confirmed, such collaboration specifically and exclusively targeted drill artists, mostly young Black men.

The Board expressed concern about users’ access to remedy, noting that users cannot appeal to the Oversight Board decisions Meta takes “at escalation”, which include decisions on all government removal requests made outside Meta’s in-product reporting tools, even those concerning lawful content. The Board emphasized that this was especially concerning for individuals belonging to groups that face discrimination, who are likely to experience further barriers to accessing justice as a result of Meta’s product design choices. In the Board’s view, “the company cannot allow its cooperation with law enforcement to be opaque to the point that it creates a barrier to users accessing remedies for potential human rights violations” [p.14].
In conclusion, the Oversight Board overturned Meta’s decision to remove the content, requiring the post to be restored.
Identical content with parallel context
The Board stressed that adding the content at issue to the Violence and Incitement Media Matching Service bank resulted in the automated removal of matching content and potentially additional account-level actions against other accounts. The Board stated that, following this decision, “Meta should ensure the content is removed from this bank, restore identical content it has wrongly removed where possible, and reverse any strikes or account-level penalties” [p. 14]. Moreover, the Board noted that the company should remove any restriction on Chinx (OS)’s ability to re-establish an account on Instagram or Facebook.
Policy advisory statement
The Board shared seven specific policy recommendations related to content policy, enforcement, and transparency.
Regarding content policy, the Board recommended that Meta update its description of the value of “Voice” to reflect the importance of artistic and creative expression. Additionally, the Board urged the company to clarify that, for content to be removed as a “veiled threat” under the Violence and Incitement Community Standard, both a primary and a secondary signal are required.
Concerning enforcement, the Board recommended that Meta allow users to appeal to the Oversight Board any decision made through Meta’s internal escalation process. The Board also urged Meta to implement a globally consistent approach for receiving content removal requests from state actors, by creating a standardized intake form that asks for minimum criteria. The Board further encouraged the company to mark and preserve any accounts or content penalized or disabled for posting content subject to an open investigation by the Board.
Finally, regarding transparency, the Board advised Meta to create a section in its Transparency Center, alongside its “Community Standards Enforcement Report” and “Legal Requests for Content Restrictions Report”, to report on state actor requests to review content for Community Standards violations. Additionally, the Board urged Meta to regularly review data on content moderation decisions prompted by state actor requests to assess them for systemic biases.