Facebook Community Standards, Violence And Criminal Behavior, Violence and Incitement, Instagram Community Guidelines, Referral to Facebook Community Standards
Closed Mixed Outcome
The Oversight Board issued a summary decision overturning Meta’s original decision to leave up a Facebook post that called for the burning down of a hotel in Ethiopia’s Amhara region. Responding to a user’s appeal, the Board reasoned that such calls for violence pose a risk of “near-term violence” that can worsen the situation on the ground, especially in a country like Ethiopia, which suffers from armed conflict and civil unrest. When the Board notified Meta of the case, the company found that the post violated its Violence and Incitement policy and removed it. The Board acknowledged that Meta corrected its initial error once the case was brought to its attention, and recommended that Meta assess the feasibility of establishing a mechanism to properly review and handle content in times of conflict, and that it commission an assessment of how Facebook and Instagram are used to spread hate speech and unverified rumors in Ethiopia.
* The Oversight Board is a separate entity from Meta that provides independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. Its decisions, except summary decisions, are binding unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.
On 6 April 2023, a Facebook user posted an image of a hotel in Ethiopia’s Amhara region, with a caption that called for its incineration, along with the hotel’s address and the name of a general in the Ethiopian National Defense Force, whom the user claimed was the owner of the hotel. The content was uploaded during a period of heightened political tension in the Amhara region, as people were protesting against the government’s plan to disband a regional paramilitary force.
One user reported the content, and when Meta did not take the post down, the user appealed to the Board, arguing that the post called for violence. When the Board brought the matter to Meta’s attention, the company reversed its original decision, finding that the content violated its Violence and Incitement policy, and removed the post.
Nonetheless, the Board issued a summary decision on the matter. Summary decisions “examine cases where Meta reversed its original decision on a piece of content after the Board brought it to the company’s attention […] [to] provide transparency on Meta’s corrections and highlight areas where the company could improve its policy enforcement.” [p. 1]
The main issue before the Oversight Board was whether Meta misapplied its Violence and Incitement policy when it initially declined to remove a Facebook post calling for the incineration of a hotel in Ethiopia, amid a context of armed conflict and civil unrest.
The user who appealed to the Board submitted that “the post call[ed] for violence and violate[d] Meta’s Community Standards”. [p. 1]
Meta’s initial reaction was to leave the post on Facebook. However, when the Board brought the case to Meta’s attention, the company determined that its original decision to leave up the content was incorrect and removed it for violating the Violence and Incitement Community Standard.
Meta’s Violence and Incitement policy provides that “the company removes content that calls for high-severity violence.” The Board stated that a call for violence like the one at issue, made in a country suffering from armed conflict and civil unrest, “poses a heightened risk of near-term violence and can exacerbate the situation on the ground.” [p. 2]
The Board reiterated its recommendation from the Tigray Communication Affairs Bureau case (recommendation no.2) that Meta “assess the feasibility of establishing a sustained internal mechanism that provides the expertise, capacity and coordination required to review and respond to content effectively for the duration of a conflict.” [p. 2]
The Board highlighted that Meta was set to launch a “crisis coordination team” aimed at providing “dedicated operations oversight throughout imminent and emerging crises.” [p. 3] The Board said it would monitor the implementation of this new mechanism, along with existing policies, “to ensure that Meta treats users more fairly in affected regions.” [p. 2]
In addition, the Board recommended that Meta carry out “an independent human rights due diligence assessment” of how Facebook and Instagram have been used to “spread hate speech and unverified rumors that heighten the risk of violence in Ethiopia,” [p. 3] and that it publish the full report, as previously recommended in the case of Alleged Crimes in Raya Kobo (recommendation no. 3). Although Meta responded that this is something the company “already does,” the Board concluded that Meta has not published information demonstrating that claim.
Finally, the Board overturned Meta’s original decision to leave the content on the platform and decided that it should be removed.
Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.
In this decision, the Board determined that a Facebook post calling for violence in Ethiopia had to be removed from the platform. While this limits freedom of expression online, it takes into account that the content, in a context of civil unrest and armed conflict, entailed offline dangers harmful to the rights and integrity of others. Thus, the decision restricts freedom of expression in line with international human rights standards.
Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.
The Board referred to this case to reiterate its recommendation that Meta establish a mechanism for content moderation during times of conflict.
The Board referred to this case in recommending that Meta commission an assessment of how its platforms have been used to spread hate speech and unverified rumors in Ethiopia.
Case significance refers to how influential the case is and how its significance changes over time.