Facebook Community Standards, Violence and Criminal Behavior, Violence and Incitement, Instagram Community Guidelines, Referral to Facebook Community Standards
Oversight Board Case of Iranian Woman Confronted on Street
Iran
Closed. Expands Expression.
On March 7, 2024, the Oversight Board overturned Meta’s original decision to remove an Instagram video of a man confronting a woman for not wearing a hijab in Iran. Meta had originally removed the post on the grounds that a phrase in the caption constituted a credible threat against the man in the video, in violation of the Violence and Incitement policy. According to the Board, the post did not violate the policy because the phrase was figurative speech used to express anger at the regime. Moreover, the Board found the removal unnecessary and recommended that Meta add a policy lever to its Crisis Policy Protocol as applied in Iran (policy levers are temporary policy changes that help Meta address the situation in designated at-risk countries such as Iran).
*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.*
In July 2023, an Instagram user posted a video depicting an unidentifiable man confronting a fully visible woman in public for not wearing a hijab. The content was a repost of a video initially shared by someone supporting the Iranian regime. The subjects of the video spoke in Persian and the video had English subtitles. The caption, also in Persian, expressed solidarity with Iranian women standing up to the regime; it included a phrase Meta translated as “it is not far to make you into pieces” and stated that the woman had already been arrested by the time the content was posted. The post received around 47,000 views, 2,000 likes, 100 comments and 50 shares.
An automated classifier configured to identify potential violations of Meta’s policies first flagged the content and sent it for human review. Multiple human reviewers assessed the post but did not agree on whether it violated the Violence and Incitement policy; this disagreement, combined with a technical error, meant the post was not initially removed. A user subsequently reported the content, and the automated classifier routed it for additional review. Meta’s regional team with language expertise then removed the post, considering the phrase “it is not far to make you into pieces” to be a threat against the man. The author of the post appealed the decision and a reviewer upheld the removal.
People in Iran have been protesting the government and demanding civil and political rights and gender equality. Those protests intensified after the death in police custody of 22-year-old Jina Mahsa Amini, who had been arrested for not wearing a “proper hijab.” By the end of 2022, an estimated 14,000 people had been arrested and 500 killed in the government’s violent crackdown on the protests. Social media has been crucial for the women’s protest movement in Iran, but it has also exposed women to increased repression by the regime. Several public comments highlighted the regime’s tactic of mass reporting protest content to pressure social media companies into removing it.
The author of the post appealed the removal to the Board. When the Board identified the case for review, Meta considered whether the content also breached the Coordinating Harm and Promoting Crime policy by outing an unveiled woman and putting her at risk of harm. After the Board selected the case, Meta reversed its decision and restored the content based on additional input from its regional team and the rationale used in the Call for Women’s Protest in Cuba decision.
The main issue before the Oversight Board was whether Meta’s original decision to remove an Instagram post depicting an Iranian man confronting a woman for not wearing a hijab, accompanied by a caption with possibly threatening language and posted amid rising persecution of women and opposition to the regime in Iran, was consistent with Meta’s content policies, values, and human rights responsibilities.
In their appeal to the Board, the user explained that the video showed the bravery of an Iranian woman standing up for her rights when confronted by a representative of the Iranian government for not wearing a hijab. The user noted that other users had shared similar videos and that the content did not violate any Instagram policy.
Meta informed the Board that its original removal decision was based on the Violence and Incitement policy, as its regional team interpreted the phrase “make you into pieces” as a threat of physical harm prohibited under the policy. Meta later reversed this decision after a subsequent review concluded that the content sought to raise awareness about the abuse Iranian women endured. Meta explained that the possibly threatening language had to be considered in light of the context and that the phrase was likely a reference to taking down the Iranian regime. Meta acknowledged the importance of this type of awareness raising, especially in Iran, and cited the Iran Protest Slogan decision.
Furthermore, Meta clarified that another factor in its decision to restore the content was the Board’s recent decision in the Call for Women’s Protest in Cuba case, in which the Board emphasized the importance of considering state repression and the public interest in historic protests. Additionally, Meta referred to the Iran Protest Slogan decision, in which the Board analyzed the women’s rights movement in Iran and highlighted the importance of protecting voices within the protest movement given the limited outlets for freedom of expression available to Iranians. Meta also considered the Metaphorical Statement Against the President of Peru case, which underlined the importance of implementing context-sensitive moderation systems, aware of irony, satire, and rhetorical expression, to protect political speech.
Meta explained to the Board that it had reconsidered its stance regarding a possible violation of the Coordinating Harm and Promoting Crime policy for outing the woman in the video. The policy prohibits content that exposes a person’s identity without their permission when doing so is likely to put them at risk. Meta decided that the content should not be removed under this policy because the identity of the woman was widely known and she had already been arrested, which reduced the risk of harm associated with the content.
Compliance with Meta’s content policies and values
Violence and Incitement policy
The Board noted that according to the linguistic experts it consulted, the contested parts of the caption translated to “we will tear you to pieces sometime soon!” or “it is not far away, we will rip you into shreds.” The experts noted that this phrase was used in Iran to express anger and resentment towards the oppressive regime and that such figurative speech reflected the anger shared by both the user and their audience. As such, the Board concluded the caption did not entail credible threats of violence.
Moreover, the Board found that the statement didn’t constitute a credible threat considering the context of escalating repression and violence against protesters in Iran. It considered that the phrase was figurative and not an invitation to commit high-severity violence. This interpretation, the Board said, was consistent with Meta’s value of Voice and the importance of protecting political expressions.
Coordinating Harm and Promoting Crime policy
The Board argued that the content did not violate the Coordinating Harm and Promoting Crime policy either. It agreed with Meta’s argument that the content did not out the woman, since her identity was already known, and that the risk of harm was non-existent because she had already been arrested by the time the content was posted. The Board noted that deciding whether a post outed a woman and put her at risk was context-dependent and that enforcing the policy on an escalation-only basis gave moderators the time and resources to analyze the relevant context.
Compliance with Meta’s human rights responsibilities
The Board highlighted the crucial role of social media in closed societies, especially in Iran where the protest movement relied on digital spaces for its survival. The Board referenced the United Nations Human Rights Committee case of Yaker v. France to highlight that laws dictating women’s attire affect their freedom and dignity, whether these laws mandate the wearing of a veil or forbid appearing in public without one. The Board utilized the three-part test stipulated in Article 19(3) of the International Covenant on Civil and Political Rights (ICCPR) to assess whether Meta’s decision to remove the post was compatible with its obligations towards the right to freedom of expression.
1. Legality (clarity and accessibility of the rules)
The legality part of the test, as the Board held, requires restrictions on freedom of expression to be based on rules that are clear and accessible to users, and clear and precise for those tasked with enforcing them.
Violence and Incitement policy
The Board noted that while the policy rationale indicated that context was considered when assessing the credibility of threats, the internal guidance did not reflect the same principle. The Board highlighted that at-scale content moderators were instructed to look for specific criteria and, once those criteria were met, to remove the post. In the Board’s view, content moderators were not empowered to assess the credibility of threats but only to implement a formulaic approach, which made distinguishing rhetorical speech from a credible threat challenging.
The Board recalled the Iran Protest Slogan case to highlight that Meta’s policy rationale accommodated rhetorical speech in the context of protests while the written rules and internal guidance did not. The Board expressed concern that this misalignment between the policy rationale and actual enforcement practices failed to satisfy the principle of legality. It reiterated its findings in that decision asking Meta to provide detailed guidance directing its moderators to conduct contextual analysis and to refrain from removing rhetorical content expressing dissent by default, especially when it is of a political nature.
Coordinating Harm and Promoting Crime policy
The Board found that Meta’s prohibition on content that puts unveiled women at risk by posting their images was sufficiently clear. However, the Board expressed concern that the Instagram Community Guidelines did not link to the Coordinating Harm and Promoting Crime policy, which undermined the accessibility of the rules for Instagram users. The Board had previously recommended that Meta clarify to its users how Facebook policies apply to Instagram [Breast Cancer Symptoms and Nudity decision; Öcalan’s Isolation decision]. According to Meta, the company had undertaken a process to unify the policies; while this effort remained a priority, its timeline had been delayed by legal and regulatory considerations. The Board stressed the importance of completing this process quickly to ensure the clarity of the rules.
2. Legitimate aim
The Board noted that the Violence and Incitement policy served the legitimate aim of protecting the rights to life and to physical security by prohibiting content that poses a genuine risk of physical and offline harm. In turn, the Board found that the Coordinating Harm and Promoting Crime policy pursued the legitimate aim of protecting Iranian women’s rights to non-discrimination, freedom of expression, assembly, and participation in public life, as well as their rights to privacy, life, liberty, and security.
3. Necessity and proportionality
Violence and Incitement policy
The Board found Meta’s original removal decision unnecessary as it was not required to protect the safety of any person. The Board expressed concern that even after its decision in the Iran Protest Slogan case, Meta’s policies and internal guidance left room for inconsistent enforcement of figurative threats in Iran.
The Board utilized the six-part test laid out by the Rabat Plan of Action to evaluate whether the content at hand incited discrimination, violence, or other lawless action. This test entails an analysis of the context, the identity of the speaker, their intent, the content and form of the expression, its extent and reach, and the likelihood and imminence of harm.
Context: The Board highlighted that the content was posted during a wave of escalating repression and violence against protesters in Iran. Furthermore, the Board underlined the importance of Instagram, one of the very few platforms not banned in the country, and its role in the “Woman, Life, Freedom” movement against discriminatory laws.
Identity of the speaker: Meta informed the Board that it didn’t consider the author of the post to be a public figure. The Board noted that the user appeared to be a supporter of the “Woman, Life, Freedom” movement; as such, they had no authority and were likely risking their own safety by posting the content.
Intent: While the Board recognized the challenges of assessing intent when moderating content at scale, it concluded that an objective reading of the post showed support for the woman who appeared in it and raised awareness about her arrest. The Board noted that, according to experts, Iranian protesters often circulate images of women after their arrest to pressure the authorities to keep them safe.
Content and form of expression: Linguistic experts explained to the Board that “make you into pieces,” in this context, could be understood by Iranians as an expression of anger and disappointment rather than a literal threat of violence. The Board recalled that, in the Iran Protest Slogan case, it found the slogan “Death to Khamenei” to be a rhetorical threat, and noted that Meta had issued a “spirit of the policy” allowance for the phrase “I will kill whoever kills my sister/brother.” The phrase in the case at hand appeared in the middle of a caption that praised the depicted woman and supported Iranian women in their struggle against the abusive practices of the regime.
Extent and reach: The post received 47,000 views, 2,000 likes, 100 comments and 50 shares. The Board found that the post’s high reach did not, on its own, make removal necessary, as the other factors in the incitement analysis had not been fulfilled.
Likelihood and imminence of harm: The Board noted that the most likely form of harm that could result from this content would be retaliatory violence from the regime against the user themselves or the woman in the video. Experts highlighted that, considering the danger protesters face, circulating images of unveiled women serves to raise awareness about their arrest and to pressure the authorities to keep them safe.
Hence, the Board concluded that the content did not constitute a credible threat or incite offline harm. It noted that Meta should allow its reviewers to analyze language within its local context, thus aligning the internal guidance with the policy rationale. Moreover, the Board opined that ensuring accurate assessments of the credibility of threats would improve moderation more broadly, since automated systems are trained on data derived from human moderators’ decisions and their accuracy would improve accordingly.
Furthermore, the Board highlighted the variety of mechanisms available to Meta to adjust its policies and enforcement practices during crises, including the “at risk” country tiering system and the Crisis Policy Protocol. The “at risk” country tiering system identifies countries at risk of offline harm and violence so that Meta can prioritize its product development accordingly; Iran had been designated an “at risk” country for the second half of 2023. Additionally, Iran had been designated under the Crisis Policy Protocol on September 21, 2022. The Crisis Policy Protocol enables Meta to modify its policies temporarily to address specific situations, such as permitting the slogan “I will kill whoever kills my sister/brother” in Iran, a statement that could be violating in countries where the protocol does not apply.
The Board also found that removing the content under the Coordinating Harm and Promoting Crime policy would be unnecessary in this case, since the depicted woman’s identity was known and the post was meant to raise awareness about her arrest and pressure the authorities to release her. The Board noted that navigating the protection of vulnerable users’ identities while ensuring that those seeking visibility were not censored required careful consideration, context-specific analysis, prompt evaluation, and swift decision-making.
Policy advisory statement:
The Board recommended that Meta introduce a policy lever into its Crisis Policy Protocol permitting figurative statements that are not likely to incite violence under the Violence and Incitement policy. Moreover, the Board emphasized the importance of developing criteria for at-scale moderators to identify these statements in the relevant context.
Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.
Through this decision, the Board expanded expression by stressing the importance of contextual analysis when determining whether a statement constitutes incitement or a threat of violence. Moreover, the Board noted the crucial role of digital spaces in the struggle for women’s rights in Iran. The Board also underlined the use of rhetorical statements to convey strong feelings and the broader protection such statements enjoy as a form of political speech.
Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.
The Board analyzed Meta’s obligations towards freedom of expression as laid out in Article 19 of the ICCPR. It also referred to this norm to apply the three-part test to assess whether a restriction on freedom of expression is valid.
The Board referred to this article to highlight the protection of women’s right to assembly.
The Board referenced this provision to underscore the international protection of the right to life.
The Board referred to this article to highlight the international protection of women’s right to liberty and security.
The Board cited this provision to underline the international protection of women’s right to non-discrimination.
The Board referred to this norm to underscore the international protection of women’s right to non-discrimination.
The Board cited this provision to highlight the international protection of women’s right to non-discrimination.
Within the framework of this document, the Board analyzed Meta’s human rights obligations.
The Board used this General Comment as a guide to explain and apply the three-part test.
The Board utilized the Rabat Plan of Action to analyze whether the contested content incited violence.
The Board referenced Yaker v. France to highlight the impact that laws regulating women’s clothing have on women’s rights.
The Board relied on the Iran Protest Slogan decision throughout its analysis to highlight the importance of contextual analysis in content moderation given the ongoing protests in Iran and government repression.
The Board recalled its stance in the Breast Cancer Symptoms and Nudity decision to express concern about the non-alignment between the Facebook Community Standards and the Instagram Community Guidelines.
The Board recalled its stance in the Öcalan’s Isolation decision to express concern about the non-alignment between the Facebook Community Standards and the Instagram Community Guidelines.
Case significance refers to how influential the case is and how its significance changes over time.
According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”