Oversight Board Symbols Adopted by Dangerous Organizations

Closed Mixed Outcome

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    June 12, 2025
  • Outcome
    Agreed with Meta’s initial decision
  • Case Number
    2025-015-IG-MR, 2025-016-IG-MR, 2025-017-IG-MR
  • Region
    International
  • Judicial Body
    Oversight Board
  • Type of Law
    International Human Rights Law, Meta's content policies
  • Themes
    Facebook Community Standards, Violence And Criminal Behavior, Dangerous Individuals and Organizations, Referral to Facebook Community Standards
  • Tags
    Incitement, Oversight Board Policy Advisory Statement, Oversight Board Transparency Recommendation, National Symbols

Content Attribution Policy

Global Freedom of Expression is an academic initiative; therefore, we encourage you to share and republish excerpts of our content, so long as they are not used for commercial purposes and you respect the following policy:

  • Attribute Columbia Global Freedom of Expression as the source.
  • Link to the original URL of the specific case analysis, publication, update, blog, or landing page of the downloadable content you are referencing.

Attribution, copyright, and license information for media used by Global Freedom of Expression is available on our Credits page.

Case Analysis

Case Summary and Outcome

The Oversight Board upheld Meta’s decisions to remove two posts on the grounds that they both constituted glorification of white nationalist and supremacist ideologies in violation of the Dangerous Organizations and Individuals policy. The first post showed a woman with “Slavic Army” text and a Kolovrat symbol over her face covering, with a caption expressing Slavic pride and urging their “people to wake up.” The second depicted a woman wearing Nazi-associated jewelry (an iron cross with a swastika) and a T-shirt showing an AK-47 and “Defend Europe,” captioned with an Odal rune. The Board also agreed with Meta’s decision to leave up a third post, featuring a quotation and artwork involving the Odal rune, finding it did not violate the policy given its neutral context and lack of references to hateful ideologies. While a minority of Board Members disagreed with the removal of the Kolovrat post, the majority found it necessary and proportionate to prevent harm. The Board also expressed concerns about overenforcement and a lack of transparency in Meta’s policies and enforcement practices, and issued four recommendations to improve clarity, accuracy, and public accountability, including publishing clearer definitions, reviewing symbol classifications, developing safeguards against false positives, and enhancing user transparency.

*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.


Facts

In November 2024, Meta referred three Instagram posts to the Board for review. All three involved symbols frequently used by hate groups, even though the symbols can carry other meanings.

The first post, dating back to April 2016, featured an image of a woman with blonde hair partially covering her face with a scarf, overlaid with the phrase “Slavic Army” and the Kolovrat symbol, a swastika-like emblem used by neo-Nazi and neo-pagan groups. In the caption, the user expressed pride in their Slavic identity and described the symbol as representing a mix of contradictory concepts such as faith, war, hate, and love. They concluded with a call for their people to “wake up” and pledged to pursue “their dreams to the death.” The post received fewer than 100 views, fewer than 500 reactions, and fewer than 50 comments. Meta’s policy experts deemed the post in violation of the Dangerous Organizations and Individuals (DOI) policy and removed it.

The second post, from October 2024, showed multiple selfies of a woman wearing symbols widely associated with far-right extremism: a swastika-bearing iron cross necklace and a T-shirt with an AK-47 and the phrase “Defend Europe” in a Fraktur font. The caption included the Odal rune, the hashtag #DefendEurope, and additional symbols and emojis suggesting strength and militarism, including an M8 rifle. Although the Odal rune originates from pre-Latin alphabets, it has been co-opted by neo-Nazi and white supremacist movements to signify Aryan identity. This post reached about 3,000 views, fewer than 500 reactions, and fewer than 50 comments. Meta also removed it for violating the DOI policy.

The third post, from February 2024, featured stylized artwork of the Odal rune entwined with a sword, accompanied by a quote on fate and blood by Ernst Jünger, a German nationalist known for his complex views on war and ideology. The caption emphasized the rune’s ancient heritage, omitting its Nazi associations, and presented the image as a symbol of tradition and family values, with prints available for purchase. This post gained wider reach, with nearly 25,000 views, fewer than 1,000 reactions, and fewer than 50 comments, but Meta’s experts ultimately concluded it did not violate the platform’s content policies.


Decision Overview

The Board examined whether Meta’s decision to remove the first two posts and retain the third post aligned with Meta’s content policies and human rights obligations.

The three posts were reviewed under Meta’s Dangerous Organizations and Individuals (DOI) policy. This policy is designed to prevent real-world harm by removing content that promotes or glorifies hateful ideologies, particularly those categorized as Tier 1, such as Nazism and white supremacy. These ideologies are viewed as inherently violent and exclusionary, and Meta enforces strict rules against any support, celebration, or representation of them. The policy also requires the removal of unclear references, such as humor, ambiguous captions, or contextless posts, unless the user clearly signals an intent to criticize or neutrally report on these ideologies. If intent is ambiguous, the platform defaults to removal.

The users who authored the posts were notified of the Board’s review and given a chance to respond, but none submitted statements.

Meta explained that, in applying its DOI policy, it categorizes symbols associated with these ideologies into three groups: (1) an extremely short list of well-known symbols that are closely associated with designated entities and heavily used by them, and are therefore treated as inherently violating when they appear as a post’s primary focus without context; (2) a much larger list of symbols mostly used in hateful contexts connected to a designated entity or ideology, which is treated as violating at scale; and (3) a very short list of symbols often used in both benign and harmful contexts, which are assessed individually when escalated.

Meta highlighted that the Kolovrat symbol, featured in the first post, belongs to the second group and was found to glorify white nationalist ideology, leading to removal. The Odal rune, seen in the second and third posts, falls in the third group and was evaluated for context. In the second post, Meta found sufficient signals, such as the use of “#DefendEurope,” Nazi imagery, and weapon references, to conclude that the post glorified white supremacy. In contrast, the third post was considered neutral in tone, focusing on historical and cultural explanations without glorifying hateful ideology. Meta confirmed that it removed the first two posts for violating the policy, while the third was left up.

Meta also noted that it had not conducted a systematic audit of the first or third groups of symbols but was updating the mid-tier list regularly and considering further policy revisions. In response to the Board’s inquiries, Meta provided full clarification on how symbols were reviewed and how enforcement decisions were made, including how user intent and contextual cues play a central role in determining whether a post violates the policy.

Compliance with Meta’s Content Policies

In the Board’s view, Meta’s decision to remove the first and second posts while keeping the third post online adhered to its DOI policy.


The Board agreed with Meta’s decision to remove the first post (featuring the Kolovrat symbol), but its reasoning differed. While Meta considered the post violative as a mere “reference” to a hateful ideology like white nationalism, the Board majority found that it actively glorified white nationalism, a designated hateful ideology under Tier 1 of the DOI policy. The post’s rhetoric, including references to “Slavic pride” and the “Slavic Army,” and calls for their “people to wake up” and follow “their dreams to the death,” demonstrated an intent to legitimize or defend violent or hateful acts by portraying them as morally or politically justified, as prohibited by the policy. Consequently, the post did not qualify for the “social and political discourse” exception, which explicitly bars glorification.

The Board also concurred with Meta’s decision to remove the second post, finding it celebrated white supremacist violence and thus glorified a designated hateful ideology. The post combined multiple extremist symbols: a swastika-embedded iron cross, the white supremacist slogan “Defend Europe” in Nazi-associated Fraktur font, and an Odal rune paired with an M8 rifle. While these elements might not be violative individually, their collective presence clearly constituted glorification under the DOI policy, rendering the post ineligible for exceptions.

The Board accepted Meta’s finding that the third post did not violate the DOI policy, as it presented the Odal rune neutrally in an artistic context without promoting Nazism or other hateful ideologies. While it included a quote by German nationalist Ernst Jünger referencing “fate and blood,” this alone did not glorify or support any designated hateful ideology. Similarly, while the sword imagery could symbolize violence, this single element was insufficient to constitute a policy violation.

Compliance with Meta’s Human Rights Responsibilities

The Board determined that Meta’s decision to take down the first and second posts while keeping the third one up complies with its human rights duties. However, a minority of Members disagreed that the removal of the first post was permissible under international human rights law.

In its reasoning, the Board relied on Article 19 of the ICCPR, which protects a broad range of expressions, including offensive and controversial speech. The Board noted that, under Article 19(3), any restriction on expression must be prescribed by law, in pursuit of a legitimate aim, necessary in a democratic society, and proportionate.

The Board emphasized that it applied this human rights framework to assess Meta’s compliance with its duties under the UN Guiding Principles on Business and Human Rights, to which Meta has committed, evaluating both Meta’s individual decisions and its broader content governance approach.

Legality (Clarity and Accessibility of the Rules)

The Board emphasized the importance of the principle of legality, which requires rules restricting expression to be clear, accessible, and precise so users can understand and comply with them. Equally important, these rules must not grant unlimited discretion to reviewers and must provide them with clear enforcement guidance.

The Board reiterated concerns about the lack of transparency in Meta’s designation processes under its DOI policy, particularly regarding Tier 1 entities and associated symbols. As noted in the Greek 2023 Elections Campaign decision, the absence of a public hate entities list or clear symbol classification criteria limits users’ ability to understand prohibited content, reducing enforcement accountability and predictability. The Board also noted Meta’s refusal to publish the Tier 1 list, as recommended in the Nazi Quote decision, citing risks of bad actors circumventing enforcement and endangering employee safety. While Meta partially addressed Recommendation No. 3 from the Sudan’s Rapid Support Forces Video Captive decision by committing to hyperlink U.S. terror lists in its Community Standards, no equivalent exists for hate-based designations.

To address this, the Board urged Meta to increase transparency in its symbol designation process. It recommended implementing a global, evidence-based system for evaluating and categorizing symbols, with regular updates. This process should ensure proper contextual review of symbols with multiple meanings, requiring “escalated review from Meta’s internal subject matter teams or specific guidance is issued to human reviewers.” [p. 13] The Board also emphasized the need for regionally inclusive research on symbol usage trends across languages and cultures. Additionally, it stressed periodic list reviews to ensure regional balance and non-discriminatory enforcement, particularly by including ideologies relevant to both Global Minority and Majority regions, as highlighted in the Posts Displaying South Africa’s Apartheid-Era Flag decision.

Lastly, the Board criticized the ambiguity around Meta’s use of “reference” in its DOI policy enforcement. While the public DOI policy mentions removing unclear or contextless references, internal definitions appear broader, covering positive references, incidental depictions, and ambiguous satire. The Board also urged Meta to publish a detailed public explanation of how it develops and enforces its designated symbols list, including its application of account-level penalties such as strikes, and to ensure this information is readily accessible through the Transparency Center.

Legitimate Aim

The Board found that Meta’s DOI policy served a legitimate aim under the ICCPR by seeking to prevent real-world harm and protect the rights of others, including the rights to life, non-discrimination, and equality. This aligns with Article 19(3) of the ICCPR, which permits restrictions on expression to protect such fundamental rights. The Board based this conclusion on the policy’s application to organizations promoting hate, violence, and discrimination, as well as to violent events motivated by hatred, as demonstrated in the Sudan’s Rapid Support Forces Video Captive and Greek 2023 Elections Campaign decisions.

Necessity and Proportionality

The Board noted that under Article 19(3) of the ICCPR, restrictions on expression must meet necessity and proportionality requirements, meaning they should be the least intrusive means to achieve their protective purpose and proportionate to the interest being safeguarded. Citing the UN Human Rights Committee, the Board acknowledged that while displaying flags, uniforms, or symbols generally constitutes protected expression, even when associated with painful history, such displays may be legitimately restricted only in exceptional cases where they are directly and predominantly linked to incitement of discrimination, hostility, or violence.

The majority of the Board concluded that removing the Kolovrat symbol post was necessary and proportionate, given its clear references to Slavic nationalism and militaristic rhetoric that could be read as promoting violent action. They found the post contributed to normalizing or promoting white nationalist ideologies online that “further violence or exclusion.” [p. 16]

The Board unanimously agreed that removing the “Defend Europe” post met the necessity and proportionality requirement “to prevent likely and imminent discrimination and violence.” [p. 16] The post included multiple elements, including hate symbols, Nazi associations, and violent rhetoric, that together glorified white supremacist violence and risked being used as a rallying point for extremist networks “seeking to build connections and recruit likeminded individuals while evading content moderation.” [p. 16]

Regarding the third post, the Board supported Meta’s decision to leave it up, as it lacked any clear reference to designated hateful ideologies and focused on the user’s artwork.

Finally, the Board expressed concern about possible overenforcement in cases involving designated symbols, particularly due to insufficient granular enforcement data. It urged Meta to develop a system to detect spikes in removals of non-violating content and adjust enforcement practices accordingly, following Recommendation No. 2 in the Colombian Police Cartoon decision. Such a system would help prevent the wrongful removal of legitimate expression and ensure accurate and rights-respecting moderation. The Board further noted that it expects Meta to develop this monitoring system and report on corrective actions taken, and that it plans to review implementation progress in a future case.

Ultimately, the Board upheld Meta’s decisions to take down the first two posts while maintaining the third post.

Policy Advisory Statement

The Board issued four recommendations to improve Meta’s Dangerous Organizations and Individuals policy, focusing on clarity, enforcement, and transparency. First, it called on Meta to publish the internal definition of “references” and its subcategories, such as “positive references” and “incidental depictions,” so users clearly understand what content may be removed. This would bring the public-facing Community Standard in line with internal enforcement practices.

To strengthen enforcement, the Board recommended that Meta establish a clear, evidence-based process for reviewing and categorizing designated symbols. This includes regular audits to ensure that only symbols meeting the established criteria remain on the list and that it reflects global use and relevance. Additionally, the Board urged Meta to develop a system to detect “spikes” in removals of non-violating content involving designated symbols, to prevent overenforcement and improve the precision of its content moderation.

On transparency, the Board asked Meta to publish detailed information on how it creates and enforces the designated symbols list. This should detail the criteria, designation process, enforcement approach (including strikes), and actions taken against prohibited symbols. The Board specified that this information should be available in Meta’s Transparency Center and linked within the Community Standard to ensure users have accessible and accurate information about the platform’s policies.

Dissenting Opinion

A minority of Board Members dissented from the majority’s assessment of the first post, contending it neither glorified nor constituted an unclear reference to a hateful ideology. They stressed that Slavic pride, with its distinct cultural and historical meaning, cannot be automatically linked to white nationalism. The minority also suggested the post could qualify for the “social and political discourse” exception, cautioning that Meta’s reliance on “unclear references” in policy enforcement risks overenforcement. They further called for regular reviews to assess this line of policy’s accuracy and impact on removals, and whether it needs to be narrowed. Additionally, the dissenting Board Members emphasized that removing the post failed to meet the necessity and proportionality requirements, contending the content presented no imminent threat. They argued that less restrictive measures, such as de-amplification through algorithmic demotion, could have adequately addressed any potential harms while preserving expression.


Decision Direction


Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Mixed Outcome

This decision has a mixed outcome. While the decision restricts expression by upholding the removal of the first two posts, which were found to glorify white nationalist and supremacist ideologies under Meta’s Dangerous Organizations and Individuals policy, it allows the third post to remain. The retained post neutrally describes the Odal rune’s historical and linguistic origins, without clear evidence linking it to Nazism or white supremacy.

In this decision, the majority of the Board deemed the removals necessary and proportionate to prevent harm, applying a lower threshold that focuses on the broader potential for harm and the symbolic role of hate imagery in online radicalization. Nevertheless, a minority of Members raised important concerns regarding freedom of expression. They argued that the first post did not present a direct or imminent risk of inciting discrimination or violence that would justify removal, and that less intrusive measures, such as de-amplification, could have been more appropriate. This minority approach aligns more closely with international human rights standards on hate speech, which permit restrictions only when the expression constitutes advocacy of hatred that amounts to incitement to discrimination, hostility, or violence, as set out in Article 20(2) of the ICCPR and interpreted by the Rabat Plan of Action. Their position also reflects a more speech-protective interpretation of Article 19 of the ICCPR, which requires that restrictions be strictly necessary, the least intrusive means available, and proportionate to the legitimate aim pursued.

The minority, finally, cautioned that Meta’s current enforcement, particularly in the absence of clear public criteria and transparency, risks overreach and undermines users’ rights to engage in political or cultural expression, even when controversial.


Case Significance


Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”
