On October 4, 2022, the Oversight Board upheld Meta’s decision to remove a Facebook post that threatened violence during the conflict in Ethiopia. The content was posted on the official page of the Tigray Regional State’s Communication Affairs Bureau and was viewed more than 300,000 times. The post discussed the losses suffered by federal forces, encouraged the national army to “turn its gun” toward Prime Minister Abiy Ahmed’s group, and warned government forces that they would die if they refused to surrender. After being reported by users and identified by Meta’s automated systems, the content was assessed by two Amharic-speaking reviewers who initially determined that the post did not contravene Meta’s policies. However, through the Integrity Product Operations Center for Ethiopia, the company found the content violated Meta’s Violence and Incitement policy and removed it two days later. Subsequently, Meta referred the case to the Board.
In its decision, the Board held that by removing this post, Meta complied with Facebook’s Violence and Incitement Community Standard, Meta’s values, and the company’s human rights responsibilities. Moreover, the Board considered that the “context in Ethiopia, the status and intent of the speaker; the content of the speech as well as its reach; and the likelihood of offline harm all contributed to a heightened risk of offline violence” [p. 14].
While the Board recognized Meta had taken positive steps to improve content moderation in some conflict zones, it highlighted that Meta should do more to meet its human rights responsibility to establish a principled, transparent system for moderating content in such contexts to reduce the risk of its platforms being used to incite violence or violations of international law. In particular, the Board deemed that Meta provided insufficient information on how it implements its Violence and Incitement policy in armed conflict situations. Further, the Board considered that Meta’s current approach to content moderation in conflict zones suggested inconsistency, noting that observers had accused the company of responding differently to different conflicts, especially the Russia-Ukraine conflict vis-à-vis others.
*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.
The instant case concerns content posted during the ongoing civil war in Ethiopia, which erupted in 2020 between the Tigray region’s forces on one side and the Ethiopian Federal Government forces and their allies on the other.
On November 2, 2021, Prime Minister Abiy Ahmed imposed a nationwide state of emergency after the TPLF (Tigray People’s Liberation Front) took over certain parts of the Amhara and Afar regions beyond Tigray. The Federal Government also called on citizens to take up arms as the TPLF made its way toward the capital.
On November 4, 2021, Meta convened an Integrity Product Operations Center (IPOC), a group of subject matter experts within the company that was brought together for a short period to monitor and respond in real-time to the rapidly unfolding situation in Ethiopia.
On November 5, 2021, the Tigray Communication Affairs Bureau Facebook page, which stated that it was the official page of the Tigray Regional State Communication Affairs Bureau (TCAB), posted content in Amharic that discussed “the losses suffered by the Federal National Defense Forces under the leadership of Prime Minister Abiy Ahmed in the armed conflict with the TPLF” [p. 5]. The post encouraged “the national army to ‘turn its gun towards the fascist Abiy Ahmed group’ to make amends to the people it has harmed” [p. 5]. Further, the content urged the armed forces to “surrender to the TPLF if they hope to save their lives,” adding: “If it refuses, everyone should know that, eventually, the fate of the armed forces will be death” [p. 5].
When the content was published, the TCAB public page had about 260,000 followers and was verified by a blue checkmark badge, which confirmed the page’s authenticity. The content was viewed more than 300,000 times and shared fewer than 1,000 times.
Ten users reported the content for violating the Violence and Incitement, Dangerous Individuals and Organizations, and Hate Speech policies. After Meta’s automated systems identified the content as potentially violating, it was reviewed by two human Amharic-speaking reviewers, who found that it did not infringe the company’s policies. Nevertheless, through the IPOC, Meta determined that the content violated the Violence and Incitement policy and proceeded to remove it from the platform.
On February 4, 2022, Meta referred the case to the Board.
The main issue before the Oversight Board was whether Meta’s decision to remove the post was in line with Facebook’s Violence and Incitement Community Standard, Meta’s values, and the company’s human rights responsibilities.
The user was notified of the Board’s review of the case and provided the opportunity to submit a statement; the user did not submit a statement.
In its referral of the case to the Board, Meta noted that its decision to remove the post in question had been difficult since, on the one hand, it involved removing “official government speech that could be considered newsworthy”, while on the other hand, the content posed a risk of inciting violence during an ongoing conflict. The company stated that it did not consider granting the newsworthiness allowance because such allowance “does not apply to content that presents a risk of contributing to physical harm” [p. 9].
Meta explained that since late 2020, it had treated Ethiopia as a Tier 1 at-risk country, the highest risk level. The company also noted that on November 4, 2021, in response to the escalation of the country’s conflict, an IPOC was created for Ethiopia. It remarked that the IPOC was convened as Level 3, which “involves the greatest level of coordination and communication within Meta” [p. 10]. The company highlighted that IPOCs are a “short-term solution” meant to “understand a large set of issues and how to address them across a crisis or high-risk situation. It is not intended to be a sustainable, long-term solution to dealing with a years-long conflict” [p. 10].
Further, Meta referred to the Board’s decision in the case of Alleged crimes in Raya Kobo to support the proposition that “resolving the tension between protecting freedom of expression and reducing the threat of sectarian conflict requires careful consideration of the specifics of the conflict” [p. 10]. Given the nature of the threat, the influential status of the speaker, and the rapidly escalating situation in Ethiopia at the time the content was posted, the company deemed that the value of “Safety” would best be served by removing the post rather than leaving it on the platform, despite the potential value of the content to warn individuals in the country of future violence.
Compliance with Meta’s content policies
The Board determined that Meta’s decision to remove the content from the platform was consistent with the Violence and Incitement Community Standard. The Board explained that the policy prohibits “threats that could lead to death (and other forms of high-severity violence) … targeting people or places,” including “statements of intent to commit high-severity violence” [p. 11]. The Board deemed that the content in question could be “reasonably interpreted by others as a call that could incite or encourage acts of actual violence in the already violent context of an armed conflict” [p. 11].
Compliance with Meta’s values
Regarding whether Meta’s decision to remove the content was in line with the company’s values, the Board highlighted that the value of “Voice” was of particular “importance in a country with a poor record of press and civic freedoms and where social media platforms serve as a key means of imparting information about the ongoing armed conflict” [p. 12]. However, it noted that in this case, in the context of an armed conflict marked by a history of sectarian violence and violations of international law, the values of “Safety” and “Dignity” prevailed to protect users from content that posed a heightened risk of violence.
The Board determined that the content in question could be interpreted as a call to kill “Abiy Ahmed’s group.” Moreover, the Board deemed the post could have been “interpreted as a warning of punishment to those who will not surrender to the TPLF, and as such pose[d] a risk to the life and physical integrity of Ethiopian federal forces and political leaders” [p. 12]. While the Board recognized that a regional governing body shared the content, the post did not include information with sufficiently strong public interest value to outweigh the risk of harm. Thus, the Board concluded that by removing the content from the platform, Meta acted consistently with its values of “Safety” and “Dignity.”
Compliance with Meta’s human rights responsibilities
The Board stated that Article 19 of the International Covenant on Civil and Political Rights (ICCPR) provides broad protection for freedom of expression, including the right to seek and receive information about possible violence. By employing the three-part test set out in Article 19 ICCPR, the Board proceeded to analyze whether Meta’s decision to remove the content was consistent with its human rights responsibilities as a business.
I. Legality (clarity and accessibility of the rules)
The Board highlighted that Article 19 of the ICCPR requires that “any restriction on freedom of expression should be accessible and clear enough to provide guidance to users and content reviewers as to what content is permitted on the platform and what is not. Lack of clarity or precision can lead to inconsistent and arbitrary enforcement of the rules” [p. 13].
In this case, the Board found that the Violence and Incitement policy was clear in establishing the prohibition of “threats that could lead to death” and, in particular, “statements of intent to commit high-severity violence” [p. 13]. Nevertheless, it noted that Meta provided insufficient information on how it “implements the Violence and Incitement policy in situations of armed conflict, what policy exceptions are available and how they are used, or any specialized enforcement processes the company uses for this kind of situation” [p. 13].
II. Legitimate aim
The Board remarked that “restrictions on freedom of expression should pursue a legitimate aim, which includes the respect of the rights of others, and the protection of national security or public order” [p. 13]. Regarding the policy employed by the company to remove the content, it highlighted that the Community Standard on Violence and Incitement seeks to prevent offline harm that may be related to content on Facebook. The Board noted that as it had previously concluded in the Alleged crimes in Raya Kobo case decision, restrictions based on this policy serve the legitimate aim of protecting the rights to life and bodily integrity.
III. Necessity and proportionality
The Board highlighted that the requirement of necessity and proportionality obliged Meta to show that the restriction on speech was necessary to address the threat to the rights of others and was not overly broad. In making this assessment, the Board employed the Rabat Plan of Action’s six-part test to inform its analysis.
i. Context: The Board identified that the content was posted in the context of an ongoing and escalating civil war “marked by violations of international human rights and humanitarian law committed by all parties to the conflict” [p. 14].
ii. Speaker: The Board noted that the speaker was a regional government ministry affiliated with one of the parties to the conflict, with significant reach and influence.
iii. Intent: In light of the language and context of the content, the Board found an explicit call to kill soldiers who did not surrender, and considered that a further intent to commit harm could reasonably be inferred.
iv. Content: For the Board, the post could be “read to advocate targeting combatants and political leaders, regardless of their participation in the hostilities” [p. 14].
v. Extent of dissemination: The Board highlighted that the content was posted “on the public page of a body connected to one of the parties to the conflict with about 260,000 followers and remained on the platform for two days before being removed” [p. 14].
vi. Likelihood and Imminence: The Board noted the content was posted not only around the time that TPLF forces advanced toward other parts of Ethiopia beyond Tigray but also as the Prime Minister declared a nationwide state of emergency and called on civilians to take up arms and fight.
In light of the above, the Board determined that the “context in Ethiopia, the status and intent of the speaker; the content of the speech as well as its reach; and the likelihood of offline harm all contributed to a heightened risk of offline violence” [p. 14]. Thus, the Board concluded that Meta’s decision to remove the post from the platform was a necessary and proportionate restriction on freedom of expression under international human rights law.
The Board recognized that Meta had long been aware that its platforms have been used to spread hate speech and fuel ethnic violence and that the company has taken positive steps to improve its moderation system in some conflicts. However, it considered that the company had not done enough to evaluate its existing policies and processes and to develop a transparent framework for content moderation in conflict zones. Overall, the Board considered that Meta must do more to meet its human rights responsibility to establish a principled, transparent system for moderating content in conflict zones to reduce the risk of its platforms being used to incite violence or violations of international law.
Notably, the Board remarked that, in Ethiopia, Meta had outlined the steps it had taken to remove content that incites violence through two general processes: the “at-risk countries” tiering system and IPOCs. The Board pointed out that the company had designated Ethiopia as a Tier 1 at-risk country since late 2020 and that a Level 3 IPOC was in place at the time the content was posted. Nevertheless, the Board considered that, in the context of an armed conflict, the fact that the post was removed only two days after it had been published suggested the inadequacy of the at-risk tiering system and IPOCs in dealing with events that pose heightened human rights risks. In the same vein, the Board highlighted that Meta did not provide enough public information on the general criteria used to designate “at-risk countries.” Without such information, the Board believed that neither it nor the public could “evaluate the effectiveness and fairness of these processes, whether the company’s product investments are equitable or whether they are implemented with similar speed and diligence across regions and conflict situations” [p. 15]. Moreover, the Board remarked that, given that IPOCs are “short-term solutions” convened on an ad hoc basis, the company needed to invest greater resources in a sustained internal mechanism providing the expertise, capacity, and coordination necessary to review and respond to content effectively for the entirety of a conflict.
Similarly, the Board considered that Meta provided insufficient information on how it implements the Violence and Incitement policy in armed conflict situations and on the applicable policy exceptions. Moreover, the Board noted that the company’s current approach to content moderation in conflict zones could create an appearance of inconsistency, observing that the company had been accused of treating the Russia-Ukraine conflict differently from others. The Board referred to a public comment (PC-10433) submitted by Dr. Samson Esayas, which pointed to Meta’s “swift measures” in moderating content in the context of the Russia-Ukraine conflict and highlighted the “differential treatment between this conflict and conflicts in other regions, particularly Ethiopia and Myanmar” [p. 15].
Policy advisory statement:
The Board presented two specific recommendations regarding transparency and enforcement in its policy advisory statement. Concerning transparency and in line with the Board’s recommendations in the case decisions of Former President Trump’s Suspension and Sudan Graphic Video, the Board recommended Meta publish information on its Crisis Policy Protocol. Regarding the enforcement of the company’s policies during periods of armed conflict, the Board recommended that Meta assess the feasibility of establishing a sustained internal mechanism that provides it with the expertise, capacity, and coordination required to review and respond to content effectively for the duration of a conflict.
Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.
The Board contracted freedom of expression by upholding Meta’s decision to remove the content. However, it did so under a justified and recognized limitation on freedom of expression, since the context in Ethiopia, the status and intent of the speaker, the content of the speech as well as its reach, and the likelihood of offline harm all contributed to a heightened risk of offline violence.
Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.
The Board interpreted the right to life, contained in Article 6 of the ICCPR, in light of General Comment No. 36 of the Human Rights Committee (2018).
The Board noted that its analysis of this case was informed by Article 7 of the ICCPR, on the right not to be subjected to torture or cruel, inhuman, or degrading punishment.
The Board noted that Article 9 of the ICCPR, on the right to security of person, informed its analysis of this case.
The Board analyzed Facebook’s human rights responsibilities through Article 19 of the ICCPR, on freedom of expression. It employed the three-part test established in this Article to assess whether Facebook’s actions allowed expression to be limited.
The Board noted that the UNGPs impose a heightened responsibility on businesses operating in a conflict setting.
The Board referenced the report to underscore that the Rapporteur on Freedom of Expression had raised concerns about the vagueness of Facebook’s Dangerous Individuals and Organizations Community Standard.
While employing the three-part test to assess whether Facebook’s actions allowed expression to be limited, the Board referred to General Comment No. 34 of the Human Rights Committee for guidance.
As part of its analysis, the Board drew upon the six factors from the Plan of Action to assess the capacity of speech to create a severe risk of inciting violence.
The Board noted that in its Alleged crimes in Raya Kobo decision, it had concluded that restrictions based on the Violence and Incitement Community Standard served the legitimate aim of protecting the rights to life and bodily integrity.
The Board referred to this case decision to underscore that it had previously recommended Meta develop and publish a policy for crises.
The Board referred to this case decision to reiterate that it had previously recommended Meta develop and publish a policy for crises “where its regular processes would not prevent or avoid imminent harm.”
Case significance refers to how influential the case is and how its significance changes over time.
According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”