Facebook Community Standards, Child Sexual Exploitation, Abuse and Nudity
Oversight Board Case of Swedish Journalist Reporting Sexual Violence Against Minors
Sweden
Case Status: Closed. Decision Direction: Expands Expression.
The Oversight Board overturned Meta’s decision to remove a Facebook post in which a user reported the rape of two minors and described in graphic detail the impact on one of the survivors. The content was posted in August 2019; Meta removed it two years later under the Community Standard on Child Sexual Exploitation, Abuse and Nudity. For the Board, the broader context of the post showed that the user was reporting on an issue related to the Swedish criminal justice system and condemning the sexual exploitation of minors; the content neither referred to the minors in a “sexualized context” nor sexually exploited them. The Board determined that the post did not violate the policy against depictions of sexual exploitation of minors and required it to be restored. The Board also found that the Community Standard on Child Sexual Exploitation failed the legality test because it does not clearly define some of its own key terms, such as “depiction” and “sexualization.” Finally, the Board held that the removal was unnecessary, since deleting content on matters of public interest, such as sex crimes against minors, is not the least intrusive way to protect the rights of children.
*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.
In August 2019, a Facebook user from Sweden posted a stock photograph of a young girl sitting with her head in her hands, her face obscured, accompanied by a caption describing incidents of sexual violence against two minors. The caption included graphic language detailing the rapes of the two unnamed minors, specifying their ages and the municipality in which the first crime had occurred. The user also described the convictions the two perpetrators received: one was a minor when he committed the offense and reportedly received a non-custodial sentence, while the other was reported as having recently completed a custodial sentence for a violent crime against another woman.
In the post, “the user argue[d] that the Swedish criminal justice system is too lenient and incentivizes crimes” and advocated for the establishment of a sex offender register in the country. The user provided sources in the comments section of the post, identifying the criminal cases by court reference numbers and linking to local media coverage of the crimes. At the time of publication, Sweden was in the midst of a public debate about its criminal justice system, including penalties for child sexual assault. The user was already engaged in denouncing child sex abusers on their Facebook page and advocated for reforming the existing penalties for sex crimes in their country.
“The post received about two million views, 2,000 comments and 20,000 reactions. (…) the post was shared on a page with privacy settings set to public, which means that anyone could view the content posted. The page has about 100,000 followers, 95% of whom are located in Sweden.” (p. 4)
Between August 2019 and September 1, 2021, Meta received several complaints claiming that the post did not comply with Facebook’s Community Standards. In 2021, the post was removed from the platform. Meta deemed it to violate the standard on Child Sexual Exploitation, Abuse and Nudity because it described the physical aftermath of the rape in detail and mentioned that the offender allegedly bragged to his friend that “the girl was ‘tight’ and proudly showed off his bloody hands” (p. 4). Meta also restricted some of the functions the user could perform on the platform; for example, they were not allowed to go live on Facebook. The user appealed, and Meta reversed the strike “because the company determined that the purpose of the post was to raise awareness” (p. 4). However, the post was not restored.
The main issue the Oversight Board analyzed was whether the user’s content should be restored in light of the Community Standard on Child Sexual Exploitation, Abuse and Nudity, its rationale, and Meta’s publicly stated values. Applying a three-part test, the Board also analyzed whether Meta’s decision complied with its human rights obligations. The Board decided that the post did not violate any of the standards mentioned above and should be restored. It also recommended that Meta “provide a clear definition of sexualization, graphic depiction, and reporting” (p. 10) in its content policies.
Although the Board provided the user with the opportunity to submit a statement, the user did not submit one.
In its submission to the Board, Meta argued that it removed the content based on several academic articles and guidelines that assert that “allowing depictions of rape can harm victims through re-traumatization, invasion of privacy and by facilitating harassment” (p. 8). Following this, the company stated that “it is the risk of revictimization that led it to determine that removal was necessary” (p. 8) while acknowledging that in some cases a newsworthiness exception could be applied if the public interest of a given content outweighs the risk of harm.
Compliance with Community Standards
The Board concluded that the removed post did not violate the Community Standard on Child Sexual Exploitation, Abuse and Nudity, which prohibits “content that sexually exploits or endangers children” (p. 10). According to the Board, the clinical description of the aftermath of the crimes did not amount to language that sexually exploited the children or depicted a minor in a “sexualized context”: the broader context of the post made it clear that the user was reporting on an issue of public interest and condemning the sexual exploitation of a minor. The Board also noted that the user replicated language used by Swedish news media outlets in their reporting on the crimes.
Compliance with Meta’s values
The Board concluded that Meta’s decision to remove this content was inconsistent with its value of “Voice”. It found that the public interest in bringing attention to this issue, informing the public, and advocating for legal and policy reforms lies at the core of that value. The Board acknowledged that “Privacy”, “Safety” and “Dignity” are of great importance when content graphically describes the sexual exploitation of a minor. However, it concluded that the content at issue did not rise to the level of sexually exploiting children and that it was important to protect the “Voice” of survivors and of advocates against child sexual exploitation.
Compliance with Meta’s human rights responsibilities
Based on Article 19 of the ICCPR, the Board found that restoring the content in this case was consistent with Meta’s human rights responsibilities. Although the ICCPR does not create the same obligations for Meta as it does for states, Meta has committed to respecting human rights as set out in the UN Guiding Principles on Business and Human Rights. Accordingly, the Board applied a three-part test assessing the legality, legitimacy, and necessity and proportionality of the measure imposed by Meta.
On legality, the Board found that the terms “depiction” and “sexualization” in the Child Sexual Exploitation, Abuse and Nudity policy are not defined in the public-facing Community Standards and that the policy would benefit from clear definitions of key terms and examples of borderline cases. The Board said that when Meta fails to define key terms or disclose exceptions, users cannot understand how to comply with the rules.
The Board noted that Meta’s Known Questions and Internal Implementation Standards (IIS), the guidelines provided to content reviewers to help them assess whether content violates one of Facebook’s Community Standards, provide more specificity but do not address “the difference between prohibited graphic depiction or sexualization of a minor and non-violating reporting on the rape and sexual exploitation of a minor” (p. 11).
Finally, the Board found it concerning that the content was removed without explanation two years after it was initially posted, noting that no substantive change to the policies during this period explained the removal.
The Board explained that restrictions on freedom of expression should pursue a legitimate aim, which includes the protection of the rights of others. In this case, the Facebook Community Standard on Child Sexual Exploitation, Abuse and Nudity “aims to prevent offline harm to the rights of minors that may be related to content on Facebook” (p. 12). The Board opined that the restrictions in this policy pursue the legitimate aim of protecting the rights of children to physical and mental health (Article 12 ICESCR, Article 19 CRC), consistent with the best interests of the child (Article 3 CRC) (p. 12).
The Board concluded that “removing this content discussing sex crimes against minors, an issue of public interest and a subject of public debate did not constitute the least intrusive instrument of promoting the rights of the child” (p. 14). For the Board, since General Comment No. 34 highlighted the importance of political expression in Article 19 of the ICCPR, including the right to freedom of expression in “political discourse,” “commentary on one’s own and on public affairs,” and “discussion of human rights”, users should be able to discuss a country’s criminal justice system and report on its operations in specific cases without being censored.
The Board said it was aware of the off-platform harm caused to survivors of child sexual exploitation when depictions of that exploitation are available on the platform. Yet the Board made clear that there is a significant difference between a perpetrator’s language sexualizing a child and a user’s post quoting the perpetrator to raise awareness of an issue of public interest. For the Board, not all explicit language constitutes graphic depiction or sexualization.
Finally, the Board considered the potential for offline harm when reporting includes information sufficient to identify a child. “Content that may lead to functional identification of a minor who has been the victim of child sexual exploitation implicates children’s rights to freedom of expression (ICCPR, Art. 19), privacy (CRC, Art. 16) and safety (CRC, Art. 19). Functional identification may occur when content provides or compiles enough discrete pieces of information to identify an individual without naming them.” (p. 15) The Board, however, was unable to determine whether the victims were more likely to be identified in this particular case.
Policy advisory statement:
The Board recommended that Meta define graphic depiction and sexualization in the Child Sexual Exploitation, Abuse and Nudity Community Standard. It suggested that legal, clinical or medical terms should be differentiated from graphic content, and that it should be clear that “not all explicit language constitutes graphic depiction or sexualization” (p. 16). Further, Meta should clarify the differences between child sexual exploitation and reporting on child sexual exploitation.
The Board also recommended that Meta “undergo a policy development process to determine whether and how to incorporate a prohibition on functional identification of child victims of sexual violence in its Community Standards” (p. 16). This process should include stakeholder and expert engagement on functional identification and the rights of the child. For the Board to consider the recommendation implemented, Meta must publish the minutes of the Product Policy Forum where this policy is discussed.
Dissenting or Concurring Opinions:
Some Board Members emphasized that “when there is doubt about whether a specific piece of content may lead to functional identification of a child victim, Meta should err on the side of protecting the privacy and physical and mental health of the child in accordance with international human rights principles. For these Board Members, the platform’s power to amplify is a key factor in assessing whether the minor can be identified and therefore the protections afforded to children who are victims of sexual abuse.” (p. 15)
Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.
The decision expanded expression because the Board recognized “the importance of not silencing advocates for and survivors of child sexual exploitation” (p. 10). The Board established that users of Meta’s platforms should be able to report child sexual exploitation without their posts being treated as graphic depiction or sexualization under the Child Sexual Exploitation, Abuse and Nudity Community Standard. For the Board, not all explicit language necessarily constitutes graphic depiction or sexualization: in some cases, users employ such language to report on an issue of public interest and to condemn the sexual exploitation of a minor, which is why it should be protected. Additionally, the Board identified deficiencies in the reasons Meta offered for removing the content, as they were not consistent with Meta’s human rights responsibilities.
Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.
Although the ICCPR does not create the same obligations for Meta as it does for states, the Board relied on the UNGPs as the basis for Meta’s commitment to respect human rights.
The Board used Article 19 of the ICCPR as a legal basis providing broad protection for freedom of expression through any media and regardless of frontiers. It also relied on Article 19 to apply the three-part test of legality (clarity), legitimacy, and necessity and proportionality.
The Board used General Comment No. 34 as the legal basis to apply the three-part test.
The Board used the report A/74/486 (2019) as the legal basis to apply the principle of necessity and proportionality.
The Board used the report A/HRC/17/27 (2011) as the legal basis to decide if the right to freedom of expression was being respected.
The Board used Article 3 of the CRC as the legal basis to decide if the best interest of the child was being respected.
The Board used General Comment No. 25 as the legal basis to decide if the best interest of the child was being respected.
The Board used Article 12 of the ICESCR as the legal basis to decide if the children’s right to physical and mental health was being respected.
The Board used Articles 17 and 19 of the CRC as the legal basis to decide if the rights of children, to access information for the promotion of their physical and mental health, and to be protected from all forms of physical or mental violence, were being respected.
The Board used Article 17 of the ICCPR as the legal basis to decide if the right to privacy was being respected.
The Board used Article 16 of the CRC as the legal basis to decide if the right to privacy was being respected.
The Board used the Concluding Observations CRC/C/15/Add.261 as the legal basis to decide if the right to privacy was being respected.
Case significance refers to how influential the case is and how its significance changes over time.
According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”