The Oversight Board (OSB) upheld Meta’s decision to leave up a Facebook video that revealed the identity of Armenian soldiers captured and mistreated by Azerbaijani armed forces. Although Meta considered that the post violated its Coordinating Harm and Promoting Crime policy, the company’s teams escalated the post for additional review, and Meta decided to apply the newsworthiness allowance to allow the content to remain on its platforms. The Board upheld Meta’s decision to leave up the content under the newsworthiness allowance, since the content was aimed at raising awareness and there was a compelling public interest in keeping it available. The Board stressed that Meta should update its policies and protocols and provide more clarity on the criteria and process for assessing content related to war and conflict situations.
*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.
In October 2022, a Facebook user posted a video related to the September 2022 Armenia–Azerbaijan clashes which stemmed from the escalation of the unresolved conflict over the Nagorno-Karabakh region. The video featured Azerbaijani soldiers finding people in the rubble who appeared to be Armenian soldiers. Some were dead and others were injured. The faces of the Armenian soldiers were not digitally obscured. At the end of the video, the Azerbaijani soldiers pulled one of the injured Armenian soldiers and started to insult and shout curse words at him.
In such high-risk events, Meta’s Global Operations team and Meta’s security team conduct a risk-monitoring effort. The monitoring includes “external signals such as news and social media trends related to the issue.” [p. 7] Hence, when Meta teams found the aforementioned video on Facebook, they escalated the post to Meta’s policy teams for additional review.
The company found that the post violated its Coordinating Harm and Promoting Crime policy, which prohibits content revealing information or the identity of prisoners of war. After further review, Meta decided to issue a newsworthiness allowance, “which permits content on Meta’s platforms that might otherwise violate its policies if the public interest in the content outweighs the risk of harm.” [p. 7] Additionally, Meta applied a “mark as disturbing” warning screen to the video under the Violent and Graphic Content policy.
Meta referred the case to the Oversight Board considering the significance of the case—since it concerned an ongoing military conflict—and the difficulty in striking a balance between “the value of raising awareness of such issues against the potential harm caused by revealing the identity of prisoners of war.” [p. 12]
The Oversight Board analyzed whether Meta’s decision to leave up a Facebook video revealing the identity of prisoners of war, while adding a warning screen to the content, was consistent with Meta’s values and international human rights responsibilities. The Board also assessed Meta’s issuance of a newsworthiness allowance in conflict situations.
The user who posted the content on Facebook did not provide a statement to the Board.
In its submission to the Board, Meta explained that the content in this case violated the company’s Coordinating Harm and Promoting Crime policy. This policy includes a rule that prohibits “content that reveals the identity or location of a prisoner of war in the context of an armed conflict by sharing their name, identification number, and/or imagery.” [p. 9] This rule seeks to protect the safety and dignity of prisoners of war, given the scale and speed at which such imagery or information can be circulated via Meta platforms. However, Meta mentioned that to apply this rule, escalation teams require “additional context”. The company considered that the uniforms of the prisoners that appeared in the video confirmed that they were Armenian soldiers detained by Azerbaijani armed forces.
Nevertheless, Meta applied a newsworthiness allowance to permit the content on its platform. The company explained that by issuing this allowance it raised “awareness of the violence against prisoners of war.” [p. 11] To weigh the public interest value of the video against the risk of harm in this case, Meta conducted a balancing test. This test required the company to consider several factors, such as the imminence of harm, the nature of the content and the speech, the political structure of the country, and country-specific circumstances (e.g., elections or war).
Meta recognized that revealing the identities of prisoners of war may result in exposing them and their families to ostracism and violence. Also, this kind of content, Meta explained, could intensify hatred, prejudice, and antagonism towards the other side’s civilians. However, the company stated that it “did not have evidence that videos of this kind were producing these negative effects.” [p. 13] Meta further explained that several international organizations were using these types of videos to pressure Azerbaijan “to end mistreatment of prisoners of war,” [p. 13] which highlighted the video’s public interest value to raise awareness and serve as evidence of war crimes.
Additionally, Meta considered “both the dignity and safety of the victims of violence and the fact that people may not want to see this content.” [p. 14] Hence, the company applied a “mark as disturbing” warning screen to the video under the Violent and Graphic Content policy.
Compliance with Meta’s content policies
The Board found that the content in this case violated the Coordinating Harm and Promoting Crime Community Standard, which prohibits revealing information about the identity of prisoners of war. The Board agreed with Meta that the prisoners’ uniforms and faces, visible in the video, confirmed that they were Armenian soldiers. The context of war and violence evident in the video indicated that they were detained by Azerbaijani armed forces. Therefore, the Board found that this was enough information to consider that the video violated Meta’s policy.
However, the Board found that “the public interest in the video outweighed the potential risks of harm.” [p. 17] Thus, Meta’s decision to issue a newsworthiness allowance to allow the content to remain on its platform was correct.
The Board noted that the balancing test that Meta conducted was adequate to assess the public interest of the content against the risk of harm it posed. According to the OSB, the role that this video and similar content can play in proceedings to hold accountable those who committed these acts, for example, justified keeping the content on the platform. The Board emphasized the importance of conducting a “rapid case-by-case contextual assessment” [p. 18] in such complex and fast-moving conflict situations, to prevent or mitigate risks and preserve the public’s right to access important information.
Additionally, the Board found that Meta’s decision to apply a “mark as disturbing” warning screen to the video was consistent with the company’s Violent and Graphic Content policy. This policy contains two rules that apply to the content of this case. The rules state that “imagery that shows the violent death of a person or people by accident or murder” and “imagery that shows acts of torture committed against a person or people”[p. 9] should be placed behind a warning screen. The Board explained that there were enough indicators in the video that the soldiers were detained and mistreated. Therefore, the graphic nature of the video justified a “mark as disturbing” warning screen to alert users and limit the ability to view the video.
Compliance with Meta’s human rights responsibilities
The Board recalled that Article 19 of the International Covenant on Civil and Political Rights (ICCPR) provides broad protection to expression, including the right to access information during crisis and conflict situations. Further, the OSB stated that in times of war and conflict, “the Board’s freedom of expression analysis is informed by the more precise rules in international humanitarian law.” [p. 20] The Board also recalled that Article 13 of the Geneva Convention (III) prohibits “acts of violence or intimidation against prisoners of war as well as exposing them to insults and public curiosity.” [p. 20]
To analyze whether Meta’s decision to leave up the content complied with its human rights responsibilities, the Board employed the three-part test set out in Article 19 of the ICCPR.
The legality requirement demands that restrictions on freedom of expression be accessible and clear, according to General Comment No. 34, para. 25, and the report of the UN Special Rapporteur on freedom of expression, A/HRC/38/35, para. 46.
The Board found that Meta’s rule prohibiting revealing the identity of prisoners of war, as well as the rule concerning the application of a “mark as disturbing” screen to violent content, were sufficiently clear. The Board mentioned that it was also sufficiently clear, as an exception, that Meta may leave up violating content when it serves the public interest. However, the Board considered that Meta should provide more information to the public regarding the application of the newsworthiness allowance to content revealing the identity of prisoners of war.
The OSB noted that it had previously issued recommendations to this effect. In the Sudan graphic video and Colombia protests cases, it highlighted the importance of providing more information and examples related to the newsworthiness allowance. Although Meta responded to those recommendations and provided more clarity about the application of the allowance, the Board in this case underscored the need for “enhanced transparency and guidance to users, especially in crisis and conflict situations.” [p. 20]
The Board recalled that “respecting the rights of others, including the right to life, privacy, and protection from torture or cruel, inhuman or degrading treatment, is a legitimate aim for restrictions on the right to freedom of expression.” [p. 23] It noted that sharing or posting images that reveal the identity of prisoners of war can be humiliating and lead to offline violence. Such imagery, the Board opined, can also show “how social media can be abused to directly violate the laws of war,” [p. 23] especially the rules of international humanitarian law that call for the protection of the dignity of prisoners of war (Article 13, para. 2 of the Geneva Convention (III)).
Hence, the Board found that Meta’s Coordinating Harm and Promoting Crime policy—which prohibits revealing information about or the identity of prisoners of war—was legitimate and in line with the company’s values of privacy, safety, and dignity.
Likewise, the Board found that Meta’s rules on violent and graphic content pursued legitimate aims. The Board recalled its decision in the Sudan graphic video case to argue that the rules to apply a “mark as disturbing” warning screen are legitimate and “seek to empower users with more choices over what they see online.” [p. 23]
The necessity and proportionality principles, the Board said, provide “that any restrictions on freedom of expression must be appropriate to achieve their protective function; they must be the least intrusive instrument amongst those which might achieve their protective function; [and] they must be proportionate to the interest to be protected.” [p. 24]
The Board mentioned that its analysis of the necessity requirement is “informed by the more specific rules in international humanitarian law.” [p. 24] According to the Geneva Convention (III), when sharing content that reveals the identity of, or information about, prisoners of war, a “reasonable balance” should be achieved between the potential benefits, such as the value of raising awareness, and the potential harm and humiliation to the prisoners and their families. The Board referred to the International Committee of the Red Cross (ICRC) Commentary on Article 13 of the Geneva Convention (III) to highlight that “such materials may be exceptionally disclosed, if there is a ‘compelling public interest’ in revealing the identity of the prisoner or if it is in the prisoner’s ‘vital interest.’” [p. 24]
Turning to the present issue, the Board mentioned that the Coordinating Harm and Promoting Crime policy and its requirement of “additional context to enforce,” along with the case’s escalation to expert teams, were necessary and consistent with the goals of international humanitarian law. The Board also argued that removing such content would generally be proportionate, given the harm it can cause to prisoners of war and the risk it entails when the content is disseminated by the detaining power for propagandistic purposes.
Nevertheless, the Board found that the issuance of the newsworthiness allowance in this case was proportionate because the post aimed to raise awareness. Moreover, the post documented human rights violations, which can be helpful in proceedings to hold accountable those who committed atrocious crimes, and promoted “the public’s right to information around the fact of the detainees’ capture, proof of them being alive.” [p. 25]
Additionally, the Board found that applying a “mark as disturbing” screen to the video was necessary. The Board referred to its decisions in the Video after Nigeria church attack and Russian poem cases to emphasize that when content depicts identifiable deceased bodies and injured people, applying a warning screen helps show respect for the rights of the prisoners and their families. The Board further explained that “while the warning screen would likely have reduced the reach of the content and therefore its impact on public discourse, providing users with the choice of whether to see disturbing content is a proportionate measure.” [p. 27]
Thus, the Board upheld Meta’s decision to leave up the content with a “mark as disturbing” warning screen.
Recommendations:
As it previously stated in its recommendations in the former President Trump’s suspension case, the Board recommended that Meta preserve and share with competent authorities any content that documents human rights violations or war crimes. Hence, according to the Board, Meta should update its internal policies and provide clear guidance about the protocols to safeguard, preserve, or share such data. The Board noted that “civil society, academia, and other experts in the field should be part of developing such protocols.” [p. 27]
Further, the OSB recommended that Meta provide users with more specific guidance regarding the application of the newsworthiness allowance to content that reveals the identity of prisoners of war.
Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.
In this decision, the Oversight Board expanded freedom of expression by establishing that the content in this case sought to raise awareness and document human rights violations; hence, leaving up the content served a significant public interest. The Board noted that the risks posed by the post in the context of an armed conflict distinguished this case. Thus, the issuance of the newsworthiness allowance was necessary, not only to inform the public but also to increase pressure on the detaining power to protect the rights of the detainees. The Board expanded expression by recognizing that it encompasses the right to impart and receive information, which is particularly important in times of conflict and crisis. Furthermore, the Board concluded that the company’s decision to place a warning screen was a justified restriction on freedom of expression, as it did not impose an undue burden on users but rather informed them about the nature of the content and gave them the choice of whether to see it.
Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.
The Board used the UNGPs to highlight Meta’s commitment to respect human rights.
The Board used Article 19 of the ICCPR as a legal basis that provides broad protection for freedom of expression.
The Board used General Comment No. 34 as the legal basis to apply the three-part test.
The Board referred to Article 6 of the ICCPR as the legal basis for the right to life.
The Board referred to Article 7 of the ICCPR as the legal basis for the right to be free from torture, inhuman or degrading treatment.
The Board referred to Article 17 of the ICCPR as the legal basis for the right to privacy.
The Board referred to Article 13 of the Geneva Convention (III) as the legal basis for the protection of prisoners of war from insults and public curiosity.
The Board referred to ICRC Geneva Convention (III) Commentary for guidance to analyze the scope of protection of prisoners of war from insults and public curiosity.
The Board referred to this decision as an example of cases related to the scalability of the newsworthiness allowance.
The Board referred to this decision to recall the importance of applying a warning screen to content of a graphic nature.
The Board referred to this decision as an example of cases related to the scalability of the newsworthiness allowance.
The Board referred to this decision to highlight the importance of providing more clarity and transparency to users about Meta’s policies.
The Board mentioned this case to highlight that it had previously recommended Meta to preserve and share information with competent authorities, about grave human rights violations, to assist them in the investigations.
The Board referred to this decision as an example of cases concerned with conflict situations where the content did not have clear indicators of violence to justify adding a warning screen.
Case significance refers to how influential the case is and how its significance changes over time.
According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”