Global Freedom of Expression

Oversight Board Case of Russian Poem

Case Status: Closed
Decision Direction: Expands Expression

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    November 16, 2022
  • Outcome
    Overturned Meta’s initial decision
  • Case Number
    2022-008-FB-UA
  • Region & Country
    International
  • Judicial Body
    Oversight Board
  • Type of Law
    Meta's content policies, International Human Rights Law
  • Themes
    Facebook Community Standards, Objectionable Content, Violent and Graphic Content, Hate Speech, Violence and Criminal Behavior, Violence and Incitement, Artistic Expression
  • Tags
    Oversight Board Policy Advisory Statement, Oversight Board Enforcement Recommendation, Oversight Board Content Policy Recommendation


Case Analysis

Case Summary and Outcome

On November 16, 2022, the Oversight Board overturned Meta’s original decision to remove a Facebook user’s post containing an image of what appeared to be a dead body and text comparing the Russian army in Ukraine to Nazis and quoting a poem that calls for the killing of fascists. The Board also found that Meta’s subsequent decision to reinstate the content with a warning screen over the image did not align with the company’s Violent and Graphic Content policy. In its decision, the Board determined that rather than making general accusations, the post drew parallels between Russian soldiers’ actions in Bucha and those of the Nazi army in World War II. According to the Board, Meta’s human rights responsibilities and its Hate Speech Community Standard allow users to claim that soldiers have committed egregious wrongdoing and to draw provocative comparisons with past events. The Board stressed that the quotes from the poem included in the user’s post were an artistic and cultural reference used as a rhetorical device and, when analyzed in the context of the whole post, served to warn of cycles of violence and the possibility of history repeating itself in Ukraine. Regarding the image, which appeared to depict a dead body, a majority of the Board deemed that it did not contain the clear indicators of violence which, according to Meta’s internal guidance for moderators, would justify a warning screen.

In conclusion, the Board found that removing the post and later applying the warning screen were inconsistent with Facebook’s Community Standards, Meta’s values, and the company’s human rights responsibilities. Moreover, the Board urged Meta to revise its policies to take into account circumstances of unlawful military intervention.

*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.

 


Facts

In April 2022, a Facebook user in Latvia posted a photo with accompanying text on their news feed. The picture showed what appeared to be a dead body lying on the street next to a fallen bicycle; no wounds were visible. The accompanying text, in Russian, stated that the alleged crimes committed by Soviet soldiers in Germany during the Second World War were excused because they avenged the atrocities the Nazis had inflicted on the USSR. The text also highlighted similarities between the Nazi army and the Russian army in Ukraine, claiming that the Russian soldiers had become fascist.

The post stated that the Russian army in Ukraine “rape[s] girls, wound[s] their fathers, torture[s] and kill[s] peaceful people.” It said that “after Bucha, Ukrainians will also want to repeat… and will be able to repeat” such actions and quoted excerpts from the poem “Kill him!” by Soviet poet Konstantin Simonov: “kill the fascist so he will lie on the ground’s backbone, not you”; “kill at least one of them as soon as you can”; “Kill him! Kill him! Kill!” [p. 5].

The post was viewed approximately 20,000 times, shared about 100 times, and received almost 600 reactions and over 100 comments. 

On the same day the content was posted, a user reported it as “violent and graphic content.” Based on a human reviewer’s assessment, Meta decided to remove the content for violating its Hate Speech Community Standard. After unsuccessfully requesting Meta to reconsider, the user appealed to the Board.

As a result of the Board selecting the appeal for review on May 31, 2022, Meta determined that its original decision to remove the content was in error and restored it. Around three weeks later, the company, under its Violent and Graphic Content policy, applied a warning screen to the photograph that read “sensitive content – this photo may show violent or graphic content” and offered users two options: “learn more” and “see photo.”


Decision Overview

The main issue before the Oversight Board was whether Meta’s decision to remove the post, and its subsequent decision to apply a warning screen, were in line with Meta’s Community Standards, Meta’s values, and the company’s human rights responsibilities.

In their appeal to the Board, the user stated that the photo was the “most innocuous” of the pictures documenting the crimes the Russian army had committed in Bucha. Additionally, the user claimed that by citing the extract from Simonov’s poem, they strived to show how the Russian army had become an analog of the fascist army.

In its submission to the Board, Meta focused on why the company had reversed its original decision, explaining that it had acted in line with its Hate Speech, Violence and Incitement, and Violent and Graphic Content policies.

Regarding the Hate Speech policy, the company indicated that the content’s reference to Russian soldiers who committed crimes in the context of the Russia-Ukraine conflict did not constitute an attack under that policy because “qualified behavioral statements” were allowed on the platform.

Concerning the Violence and Incitement policy, Meta explained that the user’s statement that Ukrainians would avenge the actions of the Russian army after the events in Bucha did not advocate violence since it was a “neutral reference to a potential outcome” permitted under the policy. Similarly, the company argued that the quote from Simonov’s poem sought to raise awareness of the possibility of history repeating itself in Ukraine. The company explained that advocating violence against individuals covered by the Dangerous Individuals and Organizations policy, such as the Nazis, was allowed under the Violence and Incitement Community Standard.

Finally, regarding the Violent and Graphic Content policy, Meta noted that because the image in question showed the violent death of a person, adding a warning screen and appropriate age restrictions was in line with that policy.

Compliance with Meta’s content policies 

To determine whether the content in question complied with Meta’s content policies, the Board focused its analysis on Meta’s Hate Speech, Violence and Incitement, and Violent and Graphic Content Community Standards.

 I. Hate Speech

The Board highlighted that, under Meta’s Hate Speech policy, attacks against people based on protected characteristics, such as nationality, are prohibited. The Board explained that the company’s internal guidelines for moderators establish that professions are generally entitled to protection against Hate Speech Tier 1 attacks when referenced along with a protected characteristic. Moreover, it noted that according to those guidelines, reporting human rights violations in a particular context does not violate the Hate Speech policy, regardless of whether the targeted individuals are referred to by their national origin.

Focusing on the content in question, the Board deemed that the user’s comparison of Russian soldiers to German fascists during World War II, and the assertion that those soldiers raped, killed, and tortured innocent people, did not violate Meta’s Hate Speech policy. The Board considered that the user’s accusation that Russian soldiers committed crimes analogous to those of the Nazis during the invasion of Ukraine was permitted. For the Board, the comparison of Russian soldiers’ actions in Ukraine with those of the Nazis, who are known to have committed war crimes, constituted “qualified statements” related to behavior observed during a specific conflict. Moreover, the Board considered that the user’s post targeted Russian soldiers because of their role as combatants rather than their nationality. Therefore, given that the content was not an attack directed at a group based on protected characteristics, the Board concluded it could not be considered hate speech.

 II. Violence and Incitement 

The Board noted that under the Violence and Incitement policy, Meta removes “calls for high-severity violence,” “statements advocating for high-severity violence,” and “aspirational or conditional statements to commit high-severity violence” [p. 12]. It further stated that the company’s internal guidelines for moderators clarify that this policy permits statements with a “neutral reference to a potential outcome of an action or an advisory warning” [p. 13] and “content that condemns or raises awareness of violent threats” [p. 13].

Turning to the content in question, the Board deemed that the phrase “Ukrainians will also want to repeat… and will be able to repeat” was a “neutral reference to a potential outcome” since it neither called for nor advocated violence. Instead, the Board considered that it stated that Ukrainians could respond as violently to the Russian army’s actions as the Soviets did to the Nazis, and that the phrase was thus permitted under Meta’s internal guidelines for content moderators on the Violence and Incitement policy.

Similarly, the Board found that the quotations from Simonov’s poem, read together with the entire post and image, sought to warn of the possibility of history repeating itself in Ukraine. They were artistic and cultural references used as rhetorical devices by the user to transmit their message. Given that the content’s primary meaning, in context, was a warning against a cycle of violence, the Board found that it did not violate Meta’s Violence and Incitement Standard.

III. Violent and Graphic Content

The Board noted that under Meta’s Violent and Graphic Content policy, the company places warning labels on content, such as images showing the violent death of a person, so that users are aware of its graphic or violent nature before they click to see it. The Board highlighted that Meta’s internal guidelines for content moderators establish indicators for determining when imagery should be considered to depict a violent death. Additionally, the Board indicated that the internal guidelines clarify that bodies without “any visible indicator of violent death” or without “at least one indicator of violence” should not be considered a depiction of a “violent death” [p. 14].

The Board remarked that while Meta had confirmed that the person portrayed in the picture was shot in Bucha, Ukraine, content moderators working at scale did not necessarily have access to such information.

The majority of the Board considered that, in light of Meta’s internal guidelines for content moderators, the use of a warning screen was not justified: the image showed a person lying still on the street with no visible wounds and thus lacked the clear visual indicators of violence described in that guidance.

Compliance with Meta’s values

The Board found that the company’s decision to remove the content and place a warning screen over the image was inconsistent with Meta’s values. The Board noted that, while it was concerned about the situation of Russian civilians and the possible effects of violent speech targeting Russians in general, the user’s post did not pose a risk to their “Dignity” and “Safety” that would justify displacing the value of “Voice.”

Compliance with Meta’s human rights responsibilities

The Board noted that Article 19, paragraph 2 of the International Covenant on Civil and Political Rights (ICCPR) provides a “heightened protection to expression, including artistic expression, on political issues, and commentary on public affairs, as well as to discussions of human rights and of historical claims” [p.15]. By employing the three-part test set out in Article 19 ICCPR, the Board analyzed whether Meta’s decision to remove the content and place a warning screen over the image was consistent with its human rights responsibilities as a business.

I. Legality (clarity and accessibility of the rules) 

The Board explained that the principle of legality requires rules limiting freedom of expression to be clear and accessible; applied to Meta’s content rules, this means users must be able to understand what content is allowed and what is prohibited on the platform. The Board noted that, through its internal guidelines, the company interprets the Violence and Incitement Community Standard to allow content that seeks to warn of the possibility of violence by third parties when it consists of statements with a “neutral reference to a potential outcome of an action or an advisory warning.” Likewise, the Board remarked that the internal guidelines allow otherwise violating content when it “condemns or raises awareness of violent threats.”

The Board highlighted that since these internal guidelines are not included in the public-facing language of the Violence and Incitement Community Standard, users cannot easily discern when content is violating and when it is not. The Board also deemed that, in the present case, Meta’s determination to add a warning screen was inconsistent with its internal guidelines for content moderators.

Moreover, the Board noted that by failing to inform the user who initially reported the content that the post was later restored, the company’s actions raised legality concerns since they interfered with users’ ability to challenge or follow up on content-related complaints.

II. Legitimate aim

The Board stressed that the Hate Speech Community Standard pursued the legitimate aim of protecting the rights of others, including the rights to equality and non-discrimination based on ethnicity and national origin. However, it considered that shielding soldiers from claims of wrongdoing in their role as combatants during a war was not a legitimate aim.

The Board explained that, in this case, the Violence and Incitement Standard sought to prevent the escalation of violence that could harm the physical security and lives of people affected by the Russia-Ukraine conflict. Notably, the Board emphasized that the United Nations General Assembly had recognized the Russian invasion of Ukraine as unlawful and that using force in self-defense against acts of aggression was permitted under Article 51 of the UN Charter. While the Board noted it had found the content in question to be non-violating, it urged Meta to revise its policies so that they consider circumstances of unlawful military intervention.

Concerning the company’s decision to apply a warning screen to the image, the Board agreed with Meta’s assertion that the Violent and Graphic Content policy seeks to foster an environment conducive to diverse participation by limiting content that glorifies violence or celebrates the suffering or humiliation of others.

III. Necessity and proportionality

The Board noted that General Comment No. 34 establishes that the principle of necessity and proportionality requires that any restrictions on freedom of expression “must be appropriate to achieve their protective function; they must be the least intrusive instrument amongst those which might achieve their protective function; [and] they must be proportionate to the interest to be protected” [p. 18].

The Board explained that it typically employs the six-factor test described in the Rabat Plan of Action to assess the risks posed by violent or hateful content. In the present case, it determined that “despite the context of ongoing armed conflict and the charged cultural references employed by the user, it [was] unlikely that the post – a warning against a cycle of violence – would lead to harm” [p. 18].

A majority of the Board deemed that the use of a warning screen interfered with the right to freedom of expression and was not a necessary response, since the image lacked the clear visual indicators of violence that Meta’s internal guidelines for content moderators describe as justifying a warning screen. The Board urged Meta to develop customization tools that allow users to decide whether to see sensitive graphic content with or without warnings on its platforms.

In conclusion, the Board determined that “despite the context of Russia’s unlawful invasion of Ukraine where potentially inflammatory speech could increase tensions, the evident intention of the user (raising awareness around the war and its consequences), the reflective tone adopted in quoting a war poem, and the proliferation of other communications regarding the horrific events in Ukraine, mean the content is not likely to contribute significantly to the exacerbation of violence” [p. 19].

In light of the above, the Board found that Meta’s initial decision to remove the content and its subsequent decision to apply a warning screen were inconsistent with Facebook’s Community Standards, Meta’s values, and the company’s human rights responsibilities.

Policy advisory statement:

In its policy advisory statement, the Board presented two specific recommendations regarding Meta’s Community Standards and enforcement.

Concerning the Violence and Incitement Community Standard, the Board recommended that Meta explain in its public-facing standard that the company interprets the policy to allow content containing statements with a “neutral reference to a potential outcome of an action or an advisory warning” and content that “condemns or raises awareness of violent threats” [p. 20]. Additionally, the Board recommended that Meta add to the public Violent and Graphic Content Community Standard detail from its internal guidelines on how it determines whether an image depicts “the violent death of a person.”

The Board also recommended that Meta assess the possibility of incorporating tools that allow adult users to choose whether or not to see graphic content, with or without a warning screen.

Dissenting or Concurring Opinions: 

A minority of the Board considered the warning screen a necessary and proportionate measure tailored to encourage participation and freedom of expression. In particular, the minority deemed that, considering the dignity of deceased persons in the context of armed conflicts, Meta may err on the side of prudence by adding warning screens to such content, given the possible effects of images depicting death and violence on a great number of users.


Decision Direction

Quick Info

Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Expands Expression

The Board expanded expression by overturning Meta’s initial decision to remove the content and holding that adding a warning screen was neither necessary nor proportionate and thus interfered with the user’s right to freedom of expression. 

Through this decision, the Board emphasized the importance of examining the context in which an alleged call for violence is made. In particular, the Board cautioned against a literal interpretation of words, since they may not always convey the essence of the message a user seeks to transmit when the content is read as a whole. The Board stressed that artistic and cultural references must be carefully analyzed since, as in this case, the user, by quoting an extract from a poem, strived to warn of cycles of violence and the potential for history to repeat itself in Ukraine.

 

Global Perspective

Quick Info

Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or regional laws

  • United Nations Guiding Principles on Business and Human Rights (2011)

    The Board noted that the UNGPs impose a heightened responsibility on businesses operating in a conflict setting.

  • ICCPR, art. 2

    The Board stated that Facebook’s Hate Speech Community Standard pursues the legitimate aim of protecting the rights of others, including the rights to equality and non-discrimination based on ethnicity and national origin, as contained in this Article.

  • ICCPR, art. 6

    The Board interpreted the right to life as established in this Article.

  • ICCPR, art. 9

    The Board referenced this standard in its analysis of this case on the right to security of persons.

  • ICCPR, art. 19

    The Board analyzed Facebook’s human rights responsibilities through this precept on freedom of expression. It employed the three-part test established in this Article to assess if Facebook’s actions allowed expression to be limited.

  • UNHR Comm., General Comment No. 34 (CCPR/C/GC/34)

    While employing the three-part test to assess if Facebook’s actions allowed expression to be limited, the Board referred to the General Comment for guidance.

  • UN Special Rapporteur on freedom of opinion and expression, A/HRC/38/35 (2018)

    The Board noted that since Meta had failed to notify the user who initially reported the content that the post had been restored, it was concerned that a lack of relevant information may interfere with users’ ability to challenge content actions or follow up on content-related complaints. The Board cited this standard in that analysis.

  • UN Special Rapporteur on freedom of opinion and expression, report A/HRC/44/49/Add.2 (2020)

    The Board cited this report to underscore that artistic expression includes fictional and nonfictional stories that educate, divert, or provoke.

  • UN Special Rapporteur on freedom of opinion and expression, A/74/486 (2019)

    The Board referred to this standard to highlight that social media companies should consider a range of possible responses to problematic content to ensure narrowly tailored restrictions.

Case Significance

Quick Info

Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”

Official Case Documents
