Facebook Community Standards, Safety, Child Sexual Exploitation, Abuse and Nudity
Oversight Board Case of Swedish journalist reporting sexual violence against minors
Sweden
Closed
Expands Expression
The Oversight Board (OSB) overturned Meta’s decision to remove a Facebook video that revealed the names and faces of the child victims of sexual exploitation and murder in Pakistan. Meta considered that the content in this case violated the Child Sexual Exploitation, Abuse and Nudity Community Standard. Although the content was of public interest, Meta decided that the potential harm from revealing the victims’ identities was significant. The Board held that the post did, in fact, violate the aforementioned policy. However, the majority of the OSB determined that Meta should restore the content and apply a newsworthiness allowance since the crimes occurred nearly 25 years ago, there were no surviving victims, and the documentary made an important contribution to public discussions.
The Oversight Board is a separate entity from Meta and provides its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.
On January 28, 2022, the broadcaster Voice of America (VOA) Urdu shared a documentary video on its Facebook page. The footage shed light on Javed Iqbal, a man convicted of the sexual abuse and murder of around 100 children in Pakistan during the 1990s. The documentary, presented in Urdu, provided comprehensive information about the crimes, including the faces and names of the child victims. Additionally, it delved into the perpetrator’s subsequent arrest and conviction.
The post contained a caption that alerted readers that the documentary featured content related to sexual abuse and violence. It also indicated that another movie focusing on Javed Iqbal’s horrendous crimes had recently made headlines as the Pakistani government banned its release. Media freedoms have often been suppressed by the Pakistani State, which has also shown a tendency to target individuals who dare to speak out against the authorities. Moreover, while child sexual abuse continues to be a significant issue in the country, authorities have failed to conduct a thorough investigation into such heinous crimes in general and the ones committed by Javed Iqbal in particular.
The post went viral, garnering 21.8 million views, and was reported multiple times by Facebook users.
Initially, Meta determined that the content did not violate any policies after conducting automated and outsourced human reviews. Meta’s High Risk Early Review Operations (HERO) system, “a tool used to address and flag problematic viral content” [p. 11], flagged the post and prioritized it for review by Meta’s internal team with language, market, and policy expertise. The internal team escalated the content and requested an assessment under the newsworthiness allowance. In August 2023, Meta’s policy team decided to remove the content for violating the Child Sexual Exploitation, Abuse and Nudity policy. Additionally, Meta did not grant the content a newsworthiness allowance.
Given the video’s public interest value and its aim of raising awareness, Meta decided not to impose any restrictions on the news organization’s account.
Meta referred the case to the Oversight Board (OSB) due to its significance and complexity. The company held that it had tried to carefully balance the safety, privacy, and dignity of the child victims whose identities were revealed with the fact that the video aimed at “raising awareness around a serial killer’s crimes and discuss issues that have high public interest value.” [p. 6]
On 14 May 2024, the Oversight Board issued a decision on the matter. The OSB analyzed whether Meta’s decision to remove a Facebook video revealing the identities of child victims of sexual abuse and murder was consistent with the company’s policies and international human rights responsibilities.
The user who posted the content on Facebook did not provide a statement to the Board.
In its submission to the OSB, Meta explained that by revealing the identities of child victims of sexual abuse, the content violated the Child Sexual Exploitation, Abuse and Nudity community standard. The policy states that “Meta does not permit content that sexually exploits or endangers children.” [p. 8] According to this policy, Meta should remove “content that identifies or mocks alleged victims of sexual exploitation by mentioning the individual’s first, middle, last or full name or by sharing imagery depicting the individual’s face.” [p. 10]
The company clarified that it allows no exceptions under the Child Sexual Exploitation, Abuse and Nudity community standard for content that discloses the names or images of victims of sexual exploitation, regardless of the intention behind sharing such content. Meta further noted that child victims of sexual abuse lack the ability to consent to being identified. As a result, children face significant “risks of revictimization, community discrimination, and further violence.” [p. 10] The company argued that in this case, the potential harm from revealing the child victims’ identities was considerable, even though the crimes took place in the 1990s.
Additionally, Meta highlighted that it decided not to impose any restrictions on the news organization’s account, considering the video’s objective of promoting awareness. The fact that the video had been available for 18 months before its removal also influenced Meta’s decision.
1. Compliance with Meta’s Content Policies
The Board considered that the contested post violated the Child Sexual Exploitation, Abuse and Nudity community standard since it revealed the identities of child victims of sexual abuse. However, the majority of the OSB argued that Meta should have applied a newsworthiness exception to allow the content to stay on Facebook. It opined that the public interest in shedding light on child abuse in this case outweighed the potential harm to the victims and their families—especially since the crimes occurred nearly 25 years ago, there were no surviving victims, and the documentary was aimed at raising awareness.
The majority of the Board noted that according to the expert reports, Pakistan has a history of suppressing independent media and failing to stop and investigate heinous crimes against children. Hence, since the content was “broadly accurate and factual in nature, and was specifically contextualized against recent government decisions to censor a film on the topic, the post in this case made an important contribution to public discussions.” [p. 13]
2. Compliance with Meta’s Human Rights Responsibilities
The Board recalled that Article 19 of the International Covenant on Civil and Political Rights (ICCPR) provides broad protection to political discourse and journalism. To analyze whether Meta’s decision to remove the content complied with its human rights responsibilities, the OSB applied the three-part test set out in Article 19(3) of the ICCPR.
a. Legality (clarity and accessibility of the rules)
In line with Human Rights Committee General Comment 34, the legality requirement demands that restrictions on freedom of expression be clear and accessible, so that people have enough information about the limitations imposed on their right to freedom of expression. The OSB held that the Child Sexual Exploitation, Abuse and Nudity policy in this case was sufficiently clear. However, the Board said that Meta should provide journalists, along with all users, with clear information and guidance about the rules on addressing and sharing sensitive topics such as child abuse. The company should clarify what constitutes identifying victims by name or image, for example, by stating whether partial names or blurred faces fall within the criteria. The OSB also urged Meta to state clearly that the policy strictly prohibits revealing the identities of child victims of sexual abuse, even in cases where the aim is to report, raise awareness, or condemn such abuse.
The Board recognized that Meta stated clearly in its policy rationale that the company removes nude images of children, even when the intent of parents, for example, is innocuous, because of the potential for others to misuse such content. Hence, the OSB determined that Meta should be able to provide similar clarity in its rules when addressing challenging topics such as child sexual abuse. Furthermore, these updates should state that Meta could grant newsworthiness allowances in exceptional circumstances, as when the company permitted the photograph of the “napalm girl” for reasons of public interest and historical significance. In addition, the Board held that the company should create “a new section within each Community Standard describing what policy exceptions and general allowances apply, and also this section should note that general allowances such as the newsworthiness allowance apply to all Community Standards.” [p. 16]
b. Legitimate aim
The OSB highlighted that the protection of the rights of others and the protection of public order and national security are legitimate aims that allow restrictions on the right to freedom of expression. It explained that, in line with its previous decision in the Swedish Journalist Reporting Sexual Violence Against Minors case, the objective of the Child Sexual Exploitation, Abuse and Nudity policy is to prevent offline harm to the rights of minors by removing content that presents a real risk of harm. Thus, the Board concluded that restrictions based on this policy serve the legitimate aim of protecting “the rights of child victims of sexual abuse to physical and mental health (Article 17 UNCRC), and their right to privacy (Article 17 ICCPR, Article 16 UNCRC), consistent with respecting the best interests of the child (Article 3 UNCRC).” [p. 17]
c. Necessity and proportionality
Following General Comment 34, the OSB mentioned that the necessity and proportionality principles provide “that any restrictions on freedom of expression must be appropriate to achieve their protective function; they must be the least intrusive instrument amongst those which might achieve their protective function; and they must be proportionate to the interest to be protected.” [p. 17]
Considering this, the Board said that according to Article 3 of the Convention on the Rights of the Child (UNCRC), “in all actions concerning children… the best interests of the child shall be a primary consideration.” Moreover, citing General Comment 25 of the Committee on the Rights of the Child and UNICEF’s Guidelines for Journalists Reporting on Children, the Board noted that it is crucial to protect the rights and dignity of every child in all situations, ensuring that their best interests are safeguarded above all else.
The OSB said that Meta’s decision to remove a post that revealed the names and faces of victims of child sexual exploitation was, a priori, necessary and proportionate. The Board recalled its decision in the Armenian Prisoners of War Video case to highlight that permitting content that identifies persons in vulnerable situations is exceptional.
Nonetheless, the OSB concluded that keeping the content up under a newsworthiness allowance was in the best interests of the children in this case, considering the crimes occurred nearly 25 years ago, and there were no surviving victims—which minimized the likelihood of direct harm to them. Also, the documentary was aimed at raising awareness about a prevalent and underreported issue in Pakistan, the Board said.
The OSB expressed concerns about the effectiveness of Meta’s review systems for Urdu-language videos, considering it took the company 18 months to reach a decision on a post that had been flagged and reported multiple times. The Board recalled its recommendations in the Armenians in Azerbaijan and the Breast Cancer Symptoms and Nudity cases to highlight the importance of providing users with more detailed notifications and instructions regarding the company’s technical tools. To the OSB, Meta “could have easily reposted the content in an edited form e.g., after removing the segments with offending images or by blurring the faces of the victims.” [p. 20] Furthermore, in line with the Board’s recommendation in the Sharing Private Residential Information policy advisory opinion, Meta should explore the possibility of temporarily suspending such content to give the posting users an opportunity to edit it adequately before it is permanently deleted.
In conclusion, the Oversight Board overturned Meta’s decision to take down the content, requiring Meta to apply the newsworthiness allowance and restore the post.
3. Recommendations
The Board recommended that Meta “better inform users when policy exceptions could be granted…and to create a new section within each Community Standard detailing what exceptions and allowances apply.” [p. 21]
Dissenting or Concurring Opinions:
A minority of the Board considered that Meta’s decision to remove the contested content without applying the newsworthiness allowance was consistent with Meta’s human rights responsibilities and the best interests of the child in this case, given that it was feasible to address the issue of child sexual abuse without revealing the identities of the victims. The minority underscored that “the dignity of child victims of abuse and their privacy rights should be respected regardless of the passage of time and the assumed public debate value of such content.” [p. 19]
It noted that in line with human rights standards (General Comment no. 25 and Guidelines of the Optional Protocol to the Convention on the Rights of the Child), Meta should adopt strict content policies to help journalists and users report on children’s issues in a manner that enables them to promote the public interest without compromising the rights of children.
Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.
This decision expands freedom of expression as the Board recognized the importance of reporting and raising awareness about issues of public interest, such as the sexual exploitation of children. The OSB expanded expression by striking a balance between the dignity and privacy of victims of sexual abuse and their families, the value of raising awareness, and the protection of civic space, thus fostering a safe space for journalists covering this topic in sensitive contexts where it has been underreported or outright censored.
Case significance refers to how influential the case is and how its significance changes over time.
According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”