Facebook Community Standards, Objectionable Content, Violent and Graphic Content
Oversight Board Case of Sudan Graphic Video
Case Status: Closed. Decision Direction: Expands Expression.
On June 13, 2022, the Oversight Board upheld Meta’s decision to restore, with a warning screen, a Facebook post depicting violence against a civilian in Sudan. The original post included a video showing a person lying beside a car with a significant head wound and a visibly detached eye. A caption in Arabic called on people to stand together and not to trust the military, with hashtags referencing military abuses and civil disobedience. After the post was flagged by Meta’s systems, the company removed it for violating Facebook’s Violent and Graphic Content Community Standard.
In the Board’s view, the content sought to raise awareness of or document human rights abuses and thus was of significant public interest. The Board also found that, while the initial removal of the content was in line with the rules in the Violent and Graphic Content Community Standard, Meta’s decision to restore the content with a sensitivity screen was consistent with its policies, values, and human rights responsibilities. Yet the Board noted a lack of clarity in Meta’s content policies and the absence of an effective means of implementing this response to similar content at scale.
The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.
A Facebook user posted the content analyzed by the Board during the October 2021 military takeover of Sudan’s civilian government, amid civilian protests in which security forces fired live ammunition, used tear gas, and arbitrarily arrested and detained protesters.
On October 25, 2021, the user posted a video showing a person lying beside a car with a significant head wound and a visibly detached eye. Voices could be heard in the background saying in Arabic that someone had been beaten and left in the street. A caption in Arabic called on people to stand together and not to trust the military, with hashtags referencing military abuses and civil disobedience.
After the post was flagged by Meta’s automated systems and reviewed by a human moderator, the company removed it for violating Facebook’s Violent and Graphic Content Community Standard. The user appealed, and on October 29, 2021, Meta issued a newsworthiness allowance exempting the post from removal. However, due to an internal miscommunication, the company did not restore the content until nearly five weeks later. When it finally did so, it placed a warning screen on the video.
On December 21, 2021, Meta referred the case to the Board.
The main issue for the Board to analyze was whether Meta’s decision to restore the post with a warning screen complied with the company’s Violent and Graphic Content Community Standard, its values, and its human rights responsibilities.
After deciding to accept the case, the Board sent the user a message offering them an opportunity to submit a statement. The user did not respond.
In its referral to the Board, Meta stated that the decision on this case was challenging because it highlighted the tension between the public interest in documenting human rights violations and the risk of harm associated with sharing such graphic content. Meta advised the Board that after the military coup in Sudan, it created a crisis response cross-functional team to monitor the situation and communicate emerging trends and risks. Concerning the immediate case, the company claimed that the user took the video in the context of widespread protests and threats regarding press freedom in Sudan.
Furthermore, Meta recognized that its initial decision to remove the content was inconsistent with Article 19 of the International Covenant on Civil and Political Rights (ICCPR), specifically with the principle of necessity. Consequently, it restored the content under the newsworthiness allowance. To mitigate the potential harm of allowing the graphic content, Meta restricted access to the restored video to people over 18 and applied a warning screen. The company explained that limiting the visibility of the content to adults served the legitimate aim of protecting the safety of minors and was proportionate to that aim.
Compliance with Meta’s content policies
The Board agreed with Meta’s decision to restore this content to the platform with a warning screen and age restriction. Still, it noted a lack of clarity in the company’s content policies and the absence of an effective means of implementing this response to similar content at scale. The Board considered that Meta’s initial decision to remove the content was consistent with the rules within its Violent and Graphic Content Community Standard. It recalled that the policy rationale of this standard held that “[Meta] allow[s] graphic content (with some limitations) to help people raise awareness about issues. [Meta] know[s] that people value the ability to discuss important issues such as human rights abuses or acts of terrorism” [p. 10]. However, the Board explained that the specific rules within the Community Standard did not include a “raising awareness” exception. Thus, absent such an exception within the Community Standard, the Board agreed with Meta’s decision to restore the content using the newsworthiness allowance.
Compliance with Meta’s values
After explaining the rationale stated by the company in its submission, the Board proceeded to analyze whether the decision to keep the content on the platform with a warning screen was consistent with Meta’s values of “Voice,” “Safety,” “Dignity,” and “Privacy.” In the Board’s view, “Dignity” and “Privacy” were paramount to effectively protecting victims of human rights abuses. It also noted the relevance of “Safety” in this case, since that value aims to protect users from content that poses a “risk of harm to the physical security of persons” [p. 12]. According to the Board, the publication could have raised awareness of the coup and contributed to improving safety in the region; at the same time, it acknowledged that the content could have created risks for the person shown in the video and/or their family.
The Board concluded that in a context where the state curtails civic space and media freedom, the value of “Voice” becomes even more critical. “Voice” also reinforced “Safety” by ensuring that people could access information exposing state violence.
Compliance with Meta’s human rights responsibilities
The Board then proceeded to analyze whether keeping the content on the platform with a warning screen was consistent with Meta’s human rights responsibilities. To do so, it employed the three-part test established in Article 19 of the ICCPR.
I. Legality (clarity and accessibility of the rules)
The Board started by analyzing whether the Violent and Graphic Content policy was clear about how Meta permits users to share graphic content to raise awareness of or document abuses. It highlighted a discrepancy in the Community Standard: on the one hand, its rationale allows users to post graphic content “to help people raise awareness about” human rights abuses; on the other, its rules prohibit all videos of dead bodies in non-medical settings that depict dismemberment, regardless of whether they are shared to raise awareness. In the Board’s opinion, Meta correctly relied on the broader newsworthiness allowance to restore this content; however, the Violent and Graphic Content Community Standard was not clear on whether such content was allowed on the platform. The Board also concluded that the newsworthiness allowance did not clarify when content documenting human rights abuses or atrocities benefited from the allowance.
While the Board agreed with the company that determining newsworthiness is a “highly subjective” task, it noted that the rule in question did not even define the term. Furthermore, it highlighted that the newsworthiness allowance did not refer to the use of warning screens (or interstitials) on content that violates Meta’s policies. The Board therefore determined that this lack of clarity about when and how the newsworthiness allowance applies was likely to invite arbitrary applications of the policy.
II. Legitimate aim
The Board agreed that the Violent and Graphic Content policy pursued several legitimate aims since Meta noted in the rationale for the policy that “content that glorifies violence or celebrates the suffering or humiliation of others […] may create an environment that discourages participation” [p. 14].
III. Necessity and proportionality
Lastly, the Board concluded that the company’s decision to place a warning screen on the content was a necessary and proportionate restriction on freedom of expression. The warning screen did not impose an undue burden on those who wished to see the content, while it informed others about the nature of the content and allowed them to decide whether to view it. Moreover, the Board considered that the warning screen also adequately protected the dignity of the individual depicted and their family.
Concerning Meta’s nearly five-week delay in restoring the post, the Board noted that it disproportionately impacted freedom of expression in the context of ongoing violence and the restricted media environment in Sudan. A delay of this length undermined the benefits of warning civilians and raising awareness. Furthermore, the Board remarked that the newsworthiness allowance did not provide an adequate mechanism for preserving content on the platform. To avoid censoring protected expression, it advised Meta to amend the Violent and Graphic Content policy to allow such content to remain on the platform.
In light of the above, the Oversight Board upheld Meta’s decision to leave the content with a screen restricting access to those over 18.
Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.
The Oversight Board’s decision expands expression by establishing that the content in question sought to raise awareness of or document human rights abuses and thus was of significant public interest. Further, the Board concluded that the company’s decision to place a warning screen was a justified restriction on freedom of expression, reasoning that the warning screen did not impose an undue burden on those who wished to see the content while informing others about its nature and allowing them to decide whether to view it.
Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.
The Board analyzed Facebook’s human rights responsibilities through Article 19 of the ICCPR, the central precept on freedom of expression. It employed the three-part test established in that Article to assess whether the limitation on expression was permissible.
While employing the three-part test, the Board referred to the Human Rights Committee’s General Comment No. 34 on Article 19 for guidance.
Case significance refers to how influential the case is and how its significance changes over time.
Article 2 of the Oversight Board Charter states, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The Board’s resolution of each case will be binding, and Facebook (now Meta) will implement it promptly unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with similar context – which the Board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the Board’s decision to that content as well. When a decision includes policy guidance or an advisory policy opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance. It will consider it in the formal policy development process of Facebook (now Meta) and transparently communicate about actions taken.”