Oversight Board Case of Sudan’s Rapid Support Forces Video Captive

Closed. Mixed Outcome.

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    April 11, 2024
  • Outcome
    Oversight Board Decision, Overturned Meta’s initial decision
  • Case Number
    2023-039-FB-UA
  • Region & Country
    Sudan, Africa
  • Judicial Body
    Oversight Board
  • Type of Law
    International/Regional Human Rights Law, Meta's content policies
  • Themes
    Facebook Community Standards, Violence and Criminal Behavior, Coordinating Harm and Promoting Crime, Dangerous Individuals and Organizations
  • Tags
    Oversight Board Enforcement Recommendation, Facebook


Case Analysis

Case Summary and Outcome

On April 11, 2024, the Oversight Board overturned Meta’s original decision to leave up a video showing armed men, identified as members of the Rapid Support Forces (RSF), detaining a person in the back of a military vehicle. The Board found the content in violation of the Dangerous Organizations and Individuals policy, as it supported the RSF, a group designated as dangerous under that policy. The Board also found that the content violated the Coordinating Harm and Promoting Crime policy by exposing the identity of a prisoner of war. It emphasized that the failure to remove the content revealed systemic flaws in enforcing policies during conflicts and in reviewing content that “outs” prisoners of war. The Board expressed concern about Meta’s slow response, suggesting broader problems with enforcing policies effectively during armed conflicts, and urged Meta to create a scalable solution to proactively identify such content. After being notified of the case by the Board, Meta removed the video for supporting a designated entity and revealing the identity of a prisoner of war.

*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.


Facts

On August 27, 2023, a Facebook user posted a video of members of the Rapid Support Forces (RSF) detaining a person in the back of a military vehicle. In the video, an RSF member claimed in Arabic that they had captured a foreign national, probably associated with the Sudanese Armed Forces (SAF). The man further stated that the RSF would capture everybody working against it, including the SAF leaders and their foreign combatants in Sudan. The video included derogatory remarks about foreign nationals and foreign leaders supporting the SAF.

The user included an Arabic caption which translated to: “we know that there are foreigners from our evil neighbor fighting side by side with the devilish Brotherhood brigades.” Shortly after, the user edited the caption to: “we know that there are foreigners fighting side by side with the devilish Brotherhood brigades.” Despite the user having 4,000 friends and 32,000 followers, the post received fewer than 100 reactions, 50 comments, and 50 shares.

Since April 2023, a non-international armed conflict has been ongoing in Sudan between SAF, the military forces of the internationally recognized Sudanese government, and the paramilitary group, the RSF. As of November 2023, the RSF controlled most of West Darfur, the area around the capital Khartoum, and parts of North and West Kordofan, while the SAF controlled most of the Nile Valley and the eastern provinces and ports.

Because of the conflict, 7.3 million people had been displaced, and over 25 million people, including 14 million children, were facing severe food insecurity. Gender-based violence, sexual violence, harassment, sexual exploitation, and trafficking were all escalating. Sudanese human rights organizations reported that the RSF had detained over 5,000 people in Khartoum in inhumane conditions, without access to necessities essential for human dignity.

The International Criminal Court and the U.S. Department of State, among others, reported crimes against humanity, genocide, and war crimes by both parties to the conflict. Additionally, the RSF had ethnically targeted Masalit communities in Sudan and on the Chad border. Experts informed the Board that both sides were responsible for widespread abuses against detainees, including inhumane conditions, illegal detentions, ethnic targeting, sexual violence, and the use of hostages as human shields.

On August 11, 2023, Meta designated the RSF as a Tier 1 terrorist organization under its Dangerous Organizations and Individuals (DOI) policy. Experts noted that the designation limited the group’s dissemination of information but also encouraged the RSF to adopt other tactics, such as using non-official personal pages and accounts. Experts further noted that the designation hampered civilian access to information about the RSF, which civilians needed for security reasons.

Sudanese civilians relied on Facebook to find crucial information about humanitarian developments in Sudan, to establish routes to safety within the country or to flee, to find crucial information on military operations or violent outbreaks, to seek humanitarian help, and to learn about hostages and prisoners of war.

Shortly after the content was posted, Facebook users reported it; however, the reports were not prioritized for human review, purportedly because of their low likelihood of violating Meta’s policies and their low virality. One of the users appealed Meta’s decision, but the appeal was automatically closed without human review.

The same user subsequently appealed to the Oversight Board.


Decision Overview

The main issue before the Board was whether Meta’s original decision to leave up a video post supportive of the RSF, which depicted members of the group detaining an alleged foreign combatant, was compatible with Meta’s content policies and human rights obligations.

On the one hand, the reporting user requested the post’s removal, arguing that it endangered Sudanese people by disseminating disinformation and threats of violence by the RSF in Khartoum.

On the other hand, Meta explained that it initially left the content up because its automated systems did not prioritize it for human review. These systems rank content based on the severity of a potential violation, the content’s virality, and the likelihood of a violation; reports with lower priority ranks are typically closed after 48 hours. Users reported this content four times for “terrorism,” “hate speech,” and “violence,” but the reports were not prioritized due to their low severity and low virality. One user appealed the decision, but the appeal was automatically closed under Meta’s COVID-19 automation policies, introduced in 2020 to reduce the number of reports sent to human reviewers while prioritizing “high-risk” reports. Meta informed the Board that, upon the Board’s selection of the case, it removed the content, concluding that a video depicting a member of the designated Tier 1 RSF speaking about the organization’s activities, without a condemning, neutral, or news-reporting caption, violated the DOI policy.
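To make the triage mechanics concrete, here is a minimal sketch in Python, assuming a composite priority score built from the three factors described above (severity, virality, and likelihood of violation) and a 48-hour auto-close window. The scoring formula, field names, and thresholds are illustrative assumptions; the actual design of Meta’s ranking systems is not public.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

AUTO_CLOSE_WINDOW = timedelta(hours=48)  # low-priority reports lapse after 48 hours

@dataclass
class Report:
    content_id: str
    reported_at: datetime
    severity: float     # 0..1: how harmful the suspected violation would be
    virality: float     # 0..1: how widely the content is spreading
    likelihood: float   # 0..1: estimated probability the content violates policy
    status: str = "open"

    def priority(self) -> float:
        # Hypothetical composite score; a real system would weight and
        # combine these signals differently.
        return self.severity * self.virality * self.likelihood

def triage(reports: list[Report], review_capacity: int, now: datetime) -> list[Report]:
    """Send the highest-priority open reports to human review and
    auto-close anything still unreviewed after the 48-hour window."""
    queue = sorted((r for r in reports if r.status == "open"),
                   key=lambda r: r.priority(), reverse=True)
    for_review = queue[:review_capacity]
    for report in queue[review_capacity:]:
        if now - report.reported_at > AUTO_CLOSE_WINDOW:
            report.status = "closed_without_review"
    return for_review
```

Under a scheme like this, reports on content with low virality and a low classifier score never reach a human reviewer, which mirrors how the four reports in this case were closed despite the severity of the underlying violation.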

The Board selected this case to explore how social media platforms should handle content in conflict zones like Sudan, where they must balance civilians’ vital need for access to information against preventing dangerous organizations from exploiting these platforms to incite real-world harm and further their violent missions. Additionally, the case provided an opportunity to assess Meta’s specific measures for protecting detainees in accordance with international humanitarian law.

(1) Compliance with Meta’s content policies

The Board first found that the content violated the DOI policy by supporting a designated Tier 1 organization. This policy prohibits content that praises, substantively supports, or represents designated organizations. Meta’s internal guidelines provide a non-exhaustive list of examples indicating substantive support, including posts featuring self-proclaimed members of designated entities speaking about the organizations.

The Board agreed with Meta’s conclusion that the content contained “substantive support,” as the video depicted an RSF member speaking about the RSF’s activities without a caption that condemned, neutrally discussed, or reported on those activities.

The Board further found the content in violation of the Coordinating Harm and Promoting Crime policy. This policy prohibits outing the identity of a prisoner of war during an armed conflict, with no exceptions, not even for content raising awareness. The rule is enforced only upon escalation. Meta defines a prisoner of war as a member of the military who is captured by an enemy during or immediately after a conflict.

The Board noted that this prohibition applies equally to both international and non-international armed conflicts. It ruled that the video showed an identifiable individual whom the RSF member described as a “foreign captive associated with the SAF.” Accordingly, the Board found the content to be in violation of the policy and that it should be removed.

(2) Compliance with Meta’s human rights responsibilities

In its reasoning, the Board relied on Article 19 of the International Covenant on Civil and Political Rights (ICCPR), which protects a broad range of expressions, including political speech, and encompasses the right to seek, receive, and impart information. However, the Board noted that under Article 19(3), freedom of expression may be subject to restrictions if they are provided by law, pursue a legitimate aim, and are necessary and proportionate in a democratic society.

The Board emphasized that during armed conflicts, the protection of freedom of expression and access to information becomes critically important. This principle must inform the responsibilities of companies like Meta, even during wars. This is especially vital because, as noted by the UN Special Rapporteur, conflict is a time when civilians are most vulnerable and in need of trustworthy information for their safety, yet it is also when this freedom is most threatened by actors seeking to manipulate information.

The Board recognized the importance of ensuring people can freely share crucial information on conflicts, particularly when social media is a primary source of information. However, the Board also highlighted the equal importance of removing content that is likely to incite further violence during these armed conflicts.

a. Legality (clarity and accessibility of the rules)

The principle of legality requires that rules restricting expression be sufficiently precise, clear, and accessible. In the context of social media, this means users must have access to clear guidelines on what is permitted on a platform. These rules must not grant enforcement moderators unfettered discretion but must provide them with clear guidance.

Dangerous Organizations and Individuals

The Board emphasized that the terms “praise” and “support” are vague and reiterated its criticism that the list of designated organizations is inaccessible to users because it has not been published. Meta explained that it designates entities that the United States government has designated as terrorists. The Board recognized that while most of the Tier 1 terrorist designations are based on U.S. designations, the full list extends beyond them. The Board recommended that Meta hyperlink to the U.S. government’s lists of Foreign Terrorist Organizations and Specially Designated Global Terrorists in its policy to increase clarity for users.

However, the Board highlighted that in this case the RSF had not been designated by the U.S. government, yet Meta designated it as a Tier 1 organization. The Board emphasized that this demonstrates a transparency problem, as users cannot know whether their content might violate Meta’s DOI policy. The Board underlined its recommendation from the Nazi Quote decision that Meta publish its list of designated groups and individuals, a recommendation Meta has declined to implement.

The Board further expressed concern about the disproportionate impact on access to information in Sudan, given the RSF’s Tier 1 designation combined with the lack of transparency surrounding it. This leaves Sudanese users unaware of the designation. The Board urged Meta to be more transparent, especially when making decisions affecting regions in armed conflict with restrictive civic space, where social media is a main source of information. The lack of transparency endangers the local population’s physical security, particularly since the RSF controls large areas and Sudanese civilians depend on Facebook for vital safety information, including from RSF channels. The Board recalled its recommendation from the Referring to Designated Individuals as “Shaheed” policy advisory opinion for Meta to explain its designation procedures in more detail. Additionally, the Board urged Meta to publish information on the total number of individuals and groups in each tier and how many were added or removed in the past year.

Finally, the Board noted that, in the context of Sudan, Meta’s rule under the DOI policy prohibiting “substantive support,” defined as “channeling information or resources, including official communications, on behalf of a designated entity or event,” was clearly defined and accessible, thus passing the legality test.

Coordinating Harm and Promoting Crime

The Board found the rule prohibiting the exposure of a prisoner of war’s identity to be clear and accessible to users, which satisfied the principle of legality.

b. Legitimate aim

The Board has previously recognized, in the Punjabi Concern Over the RSS in India and Nazi Quote decisions, that the DOI policy pursues the legitimate aim of protecting the rights of others to life, security of person, equality, and non-discrimination. Additionally, in the Öcalan’s Isolation decision, the Board recognized the purpose of the DOI policy to prevent offline harm as a legitimate aim.

Moreover, the Board found that the Coordinating Harm and Promoting Crime policy pursues the legitimate aim of protecting the rights of others to life, privacy, and protection from torture or cruel, inhuman, or degrading treatment. The Board also recalled the Armenian Prisoners of War Video decision, noting that the legitimacy of the prohibition on depicting identifiable prisoners of war is informed by rules of international humanitarian law that call for the protection of the life, privacy, and dignity of prisoners of war, and by the fact that the situation in Sudan is designated as an armed conflict.

c. Necessity and proportionality

The Board noted that any limits on freedom of expression must be necessary, the least restrictive option available, and proportionate to the threat. For social media companies, this means considering a range of responses to problematic content beyond just deletion to ensure their actions are narrowly tailored.

In the Tigray Communication Affairs Bureau decision, the Board recognized that businesses operating in conflict settings have a heightened responsibility. Moreover, the Board found in the Armenian Prisoners of War Video decision that in situations of armed conflict, freedom of expression analysis must be informed by international humanitarian law.

Dangerous Organizations and Individuals

The Board found that removing the content under the DOI policy was necessary and proportionate. It noted that disseminating a video of an RSF member describing their activities and threatening anyone who opposed them could lead to a heightened risk of real-world harm. Given that the RSF is implicated in severe atrocities, including war crimes and crimes against humanity, the Board found that spreading information on behalf of such a designated organization on Facebook creates a significant risk of real-world harm. Therefore, removal was the least restrictive measure to protect the rights of others, as no lesser measure would address the high risk of harm.

The Board expressed concern over Meta’s failure to remove the content until the Board notified it of the case two months later. Meta’s detection errors here pointed to broader problems with its automated classifiers’ ability to identify content supporting the RSF. The classifiers gave the content a low priority score and did not queue it for review; when the Board asked about the cause of this score, Meta stated it could not identify the reason. The Board concluded that Meta must improve its automated systems by auditing the training data for its video classifiers, ensuring the data includes diverse examples of content from designated organizations across various languages, dialects, regions, and conflicts. To handle the expected increase in content flagged for review, Meta must also expand its human review capacity.
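The recommended audit can be pictured with a simple coverage check. The sketch below is an illustration under stated assumptions, not Meta’s tooling: it buckets labeled training examples by language, region, and conflict, and flags buckets that fall below a minimum count, one rudimentary way to surface the kinds of gaps the Board was concerned about.

```python
from collections import Counter

# Hypothetical labeled training example: (example_id, language, region, conflict)
Example = tuple[str, str, str, str]

def audit_coverage(dataset: list[Example], min_per_bucket: int = 50) -> dict:
    """Count examples per (language, region, conflict) bucket and flag
    under-represented buckets as candidates for additional training data."""
    buckets = Counter((lang, region, conflict)
                      for _, lang, region, conflict in dataset)
    gaps = {bucket: count for bucket, count in buckets.items()
            if count < min_per_bucket}
    return {"buckets": dict(buckets), "under_represented": gaps}

# A corpus dominated by one language or conflict would expose its gaps here,
# e.g. too few Sudanese Arabic examples tied to the Sudan conflict:
report = audit_coverage([
    ("vid_001", "arabic_sudanese", "sudan", "sudan_2023"),
    ("vid_002", "english", "us", "none"),
])
print(report["under_represented"])
```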

Additionally, the Board noted that Meta had failed to establish sustainable mechanisms to adequately enforce content policies during the war in Sudan. In the Weapons Post Linked to Sudan’s Conflict decision, Meta explained that it had not set up an Integrity Product Operations Center for Sudan, which is used to respond to threats in real time, claiming it could handle the content risks through normal procedures. Meta reiterated a similar position in this case. The Board had previously recommended in the Tigray Communication Affairs Bureau decision that Meta improve enforcement during armed conflict by establishing a sustained internal procedure capable of reviewing content effectively.

In August 2023, Meta reported to the Board that it had established a dedicated team to manage crisis coordination and oversee operations during various stages of imminent and emerging crises, including high-risk events and elections. The company stated it had completed staffing for this team and was preparing it for operational duties, with plans to make the team fully operational across all regions in the coming months. Meta also said it would continue to refine the team’s execution framework based on experience with conflict incidents and evaluations of the structure’s effectiveness, considering this setup a complete response to the Board’s recommendation. However, in response to the Board’s question, Meta acknowledged that it had not implemented such a mechanism specifically for the conflict in Sudan, despite considering the recommendation fulfilled.

Coordinating Harm and Promoting Crime

The Board highlighted that, as indicated in the Armenian Prisoners of War Video decision, the necessity and proportionality of removing content that exposes the identities of prisoners of war are informed by international humanitarian law. Common Article 3 of the Geneva Conventions and Article 13 of the Third Geneva Convention prohibit acts compromising the personal dignity of detainees and prisoners of war during conflicts, such as humiliating or degrading treatment, violence, intimidation, and undue exposure to public curiosity. International humanitarian law allows the release of images of prisoners of war only under very limited circumstances, such as a compelling public interest or the vital interests of the prisoner, and requires that the prisoner’s dignity be preserved. The Board noted that although online tools are available for anonymizing such content, Meta does not provide users with capabilities to blur or obscure faces in videos of prisoners of war on its platform.

As the Board determined in the Armenian Prisoners of War Video decision, prohibiting the sharing of images of prisoners of war aligns with the objectives of international humanitarian law. Furthermore, content that discloses the identity or location of prisoners of war generally warrants removal, as the potential harm from such exposure is severe and the action of removal is considered proportionate to these risks. The Board found that the removal of the post was necessary due to the risks associated with the conflict in Sudan and to ensure the dignity and safety of the prisoner.

The Board expressed concerns over Meta’s failure to identify and remove content that violated its policy against outing prisoners of war, particularly given the increased risks of harm in armed conflicts. The policy on outing prisoners of war is only enforceable upon escalation, meaning it is beyond the remit of at-scale content moderators. In the Armenian Prisoners of War Video decision, the Board acknowledged the necessity of this rule requiring escalation to expert internal teams for enforcement, as it involves complex judgments about whether individuals in content are identifiable prisoners of war within a conflict context. Consequently, enforcement of such rules depends solely on external prompts, such as notifications from Trusted Partners or significant media attention.

The Board’s evaluation indicated that, due to this escalation-only enforcement policy, a considerable amount of content identifying prisoners of war likely remains on Meta’s platform. This situation is compounded by the limitations of Meta’s automated detection systems, which may not be adequately trained due to insufficient human-reviewed cases under such policies.

As a result, while the Board acknowledged the necessity of the rule prohibiting the outing of prisoners of war in armed conflicts, it deemed Meta’s current enforcement measures inadequate for upholding the company’s responsibilities to protect the rights of prisoners of war. To genuinely safeguard these rights under international humanitarian law, the Board recommended that Meta develop a scalable enforcement solution.

Access to remedy

Meta disclosed to the Board that the appeal in this case was automatically closed due to the company’s COVID-19 automation policies. The Board had previously, in the Holocaust Denial decision, urged Meta to confirm publicly whether all automation policies established during the COVID-19 pandemic had been discontinued. The Board remained concerned that these policies, initially justified by a temporary decrease in human review capacity during the pandemic, were still active. It reiterated its recommendation, emphasizing that Meta should publicly clarify when it expects to restore full human review capacity.

Accordingly, the Board overturned Meta’s decision to leave the content on Facebook, as it violated the DOI and Coordinating Harm and Promoting Crime policies. The Board ruled that the removal is a justifiable limitation on expression under Article 19(3) of the ICCPR.

Policy advisory statement

The Board recommended that Meta create a scalable solution to enforce its Coordinating Harm and Promoting Crime policy prohibition against outing prisoners of war, as required by international humanitarian law. This involves setting up a specialized team to proactively monitor and identify such content during armed conflicts, to ensure prompt and effective enforcement of the policy.

Moreover, the Board suggested that Meta audit the training data of its video content classifier to ensure it includes diverse examples of content related to designated organizations in various languages, dialects, regions, and conflicts. This recommendation aims to improve the automated detection and prioritization of potential policy violations under the DOI policy for human review.

Finally, to provide more clarity to users, the Board urged Meta to hyperlink to the U.S. Foreign Terrorist Organizations and Specially Designated Global Terrorists lists in its Community Standards where these lists are mentioned.

Dissenting Opinions

A minority of Board members noted that although Meta did not publicly announce the designation of the RSF, the group’s alleged involvement in war crimes and crimes against humanity was well-documented and widely reported as the conflict escalated. Given this public knowledge, users could reasonably expect that sharing RSF-related content might breach Meta’s policies.


Decision Direction

Quick Info

Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Mixed Outcome

This decision has a mixed outcome. While the Board constrained expression by overturning Meta’s decision to keep the content up, the restriction was justified: it satisfied the three-part test in the context of the non-international armed conflict in Sudan. Moreover, the Board’s decision was guided by the rules of the Geneva Conventions on the treatment of prisoners of war.

Global Perspective

Quick Info

Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or regional laws

  • ICCPR, art. 19

    The Board analyzed Meta’s obligations towards freedom of expression stipulated by this article. It also applied the three-part test in its analysis.

  • ICCPR, art. 6

    The protection of the right to life stipulated in this article was among the legitimate aims of the Dangerous Organizations and Individuals and Coordinating Harm and Promoting Crime policies.

  • ICCPR, art. 9

    The protection of the right to liberty and security stipulated in this article was among the legitimate aims of the Dangerous Organizations and Individuals and Coordinating Harm and Promoting Crime policies.

  • ICCPR, art. 7

    The protection of the right to be free from torture, inhuman and degrading treatment stipulated in this article was among the legitimate aims of the Coordinating Harm and Promoting Crime policy.

  • ICCPR, art. 17

    The protection of the right to privacy stipulated in this article was among the legitimate aims of the Coordinating Harm and Promoting Crime policy.

  • ICCPR, art. 10

    The protection of the right to be treated with humanity in detention stipulated in this article was among the legitimate aims of the Coordinating Harm and Promoting Crime policy.

  • Geneva Convention (III), art. 13

    This article of the Third Geneva Convention guided the Board’s analysis of the Coordinating Harm and Promoting Crime policy’s prohibition on outing the identities of prisoners of war.

  • Article 3 common to the Geneva Conventions

    This article, common to the four Geneva Conventions, guided the Board’s analysis of the Coordinating Harm and Promoting Crime policy’s prohibition on outing the identities of prisoners of war.

  • UNHR Comm., General Comment No. 34 (CCPR/C/GC/34)

    The Board used this General Comment as a guide to how the three-part test applies to Meta’s restrictions on freedom of expression and to explain the elements of that test.

  • United Nations Guiding Principles on Business and Human Rights (2011)

    The Board analyzed Meta’s human rights obligations within the framework of the Guiding Principles on Business and Human Rights.

  • OSB, Nazi quote, 2020-005-FB-UA (2021)

    The Board highlighted its recommendation for Meta to publish a list of designated groups and individuals and its conclusion of the legitimate aims of the Dangerous Organizations and Individuals policy from this decision.

  • OSB, Tigray Communication Affairs Bureau, 2022-006-FB-MR (2022)

    The Board cited this case to underline its recommendation that Meta establish an internal mechanism to review and respond to content effectively for the duration of a conflict, and to highlight the heightened responsibility of businesses operating in conflict settings.

  • OSB, Case of Armenian Prisoners of War Video, 2023-004-FB-MR (2023)

    This decision guided the Board’s overall analysis due to the similarities between the content in both cases.

  • OSB, Punjabi concern over the RSS in India, 2021-003-FB-UA (2021)

    The Board referred to this case to highlight the legitimate aims of the Dangerous Organizations and Individuals policy.

  • OSB, Öcalan's Isolation, 2021-006-IG-UA (2021)

    The Board referred to this case to highlight the legitimate aims of the Dangerous Organizations and Individuals policy.

  • OSB, Holocaust Denial, 2023-022-IG-UA (2024)

    The Board referenced this case to explain the COVID-19 automation policies and to reiterate its request for Meta to publicly confirm whether it has fully ended all COVID-19 automation policies.

  • OSB, Weapons post linked to Sudan's conflict, 2023-028-FB-UA (2024)

    The Board highlighted this case to underline Meta’s failure to implement an Integrity Product Operations Center for Sudan during the armed conflict, which led to multiple enforcement errors.

Case Significance

Quick Info

Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”
