
Oversight Board case of a Nazi quote

Case Status: Closed
Decision Direction: Expands Expression

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    January 28, 2021
  • Outcome
    Oversight Board Decision, Overturned Meta’s initial decision
  • Case Number
    2020-005-FB-UA
  • Region & Country
    United States, North America
  • Judicial Body
    Oversight Board
  • Type of Law
    International Human Rights Law, Meta's content policies
  • Themes
    Facebook Community Standards, Violence And Criminal Behavior, Dangerous Individuals and Organizations
  • Tags
    Oversight Board Policy Advisory Statement, Political speech

Content Attribution Policy

Global Freedom of Expression is an academic initiative and, therefore, we encourage you to share and republish excerpts of our content so long as they are not used for commercial purposes and you respect the following policy:

  • Attribute Columbia Global Freedom of Expression as the source.
  • Link to the original URL of the specific case analysis, publication, update, blog or landing page of the downloadable content you are referencing.

Attribution, copyright, and license information for media used by Global Freedom of Expression is available on our Credits page.

Case Analysis

Case Summary and Outcome

The Oversight Board overturned Facebook’s (now Meta) decision to remove content in which a user posted a quote incorrectly attributed to Joseph Goebbels, Minister of Propaganda in Nazi Germany. Facebook removed the content because it considered the post to breach the Community Standard on Dangerous Individuals and Organizations. The Board found that the post did not intend to praise or support the Nazi party or Goebbels, and that comments on the post supported the user’s claim that it sought to draw comparisons between the presidency of Donald Trump and the Nazi regime. The Board also held that Facebook’s Community Standard on Dangerous Individuals and Organizations lacks clarity, in part because it fails to explain key terms.

*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.


Facts

In October 2020, a user posted on Facebook a quote incorrectly attributed to Joseph Goebbels, Minister of Propaganda in Nazi Germany. The quote, in English, “claimed that there is no point in appealing to intellectuals, as they will not be converted and, in any case, yield to the stronger man in the street. As such, the quote stated that arguments should appeal to emotions and instincts. It ended by claiming that truth does not matter and is subordinate to tactics and psychology” [p. 3-4]. No pictures of Goebbels or Nazi symbols accompanied the post, nor was there “additional commentary within the post indicating the user’s intent in sharing the content” [p. 4].

The content was first published two years prior and was shared again by the user when prompted “by Facebook’s ‘memory’ function, which allows users to see what they posted on a specific day in a previous year, with the option of resharing the post” [p. 4]. Although no users reported the content, Facebook (now Meta) removed the post since the company considered it violated its Community Standard on Dangerous Individuals and Organizations. 


Decision Overview

The Oversight Board analyzed whether Facebook’s decision to remove a user’s post, which included a quote incorrectly attributed to Joseph Goebbels, Minister of Propaganda in Nazi Germany, complied with the company’s Dangerous Individuals and Organizations Community Standard. Likewise, the Board inquired, through a three-part test, whether Facebook’s removal of the content complied with human rights standards on freedom of expression.

The affected user argued before the Board that the intention of posting the aforementioned content was “to draw a comparison between the sentiment in the quote and the presidency of Donald Trump” [p. 4]. For the user, the quote was a comment on important social issues related to “a leader whose presidency is following a fascist model” [p. 4]. Comments on the post suggest that the user’s friends understood their intention.

For its part, Facebook argued “that it treats content that quotes, or attributes quotes (regardless of their accuracy) to a designated dangerous individual as an expression of support for that individual, unless the user provides additional context to make their intent explicit” [p. 6]. According to Facebook, since the user did not provide additional context or commentary regarding the intention of the quote, the removal of the content was justified. Facebook also noted that although the comments on the post suggested that the content did not intend to express support or praise for Goebbels, the company only reviewed the post itself when making its content moderation decision. Facebook also “confirmed that the Nazi party (the National Socialist German Workers’ Party, active between 1920 and 1945) had been designated as a hate organization since 2009 by Facebook internally. Joseph Goebbels, as one of the party’s leaders, is designated as a dangerous individual” [p. 7].

The Board noted that when Facebook notified the user that their post was removed, “the company did not tell them which Community Standard their post had violated” [p. 6].

Compliance with Community Standards

The Board first analyzed whether Facebook’s decision to remove the user’s content complied with the company’s Dangerous Individuals and Organizations Community Standard. This standard seeks “to prevent and disrupt real-world harm”, by not allowing “any organisations or individuals that proclaim a violent mission or are engaged in violence” to have a presence on Facebook [p. 5]. Thus, Facebook will remove any content that supports or praises “groups, leaders or individuals involved in these activities” [p. 5]. 

The Board considered that Facebook’s explanation of its decision disclosed rules not included in the public Community Standards: for example, that the Nazi party and Goebbels are internally designated as a hate organisation and a dangerous individual, respectively, and that “Facebook treats all content that supposedly quotes a designated dangerous individual as an expression of praise or support for that individual, unless the user provides additional context to make their intent explicit” [p. 7].

For the Board it was clear that the post “did not promote the ideology of the Nazi party and did not endorse the regime’s acts of hate and violence. Comments on the post from the user’s friends appear to support the user’s claim that the post sought to draw comparisons between the presidency of Donald Trump and the Nazi regime” [p. 8].

The Oversight Board also underscored “an information gap between the publicly available text of the dangerous individuals and organisations policy and the additional internal rules applied by Facebook’s content moderators” [p. 8]. The Board considered that the public text did not clearly state that users posting quotes attributed to dangerous individuals must also provide additional context making clear that the content neither supports nor praises individuals or organizations involved in organized hate.

Taking into consideration these reasons, the Oversight Board found that “the removal of the post clearly falls outside the spirit of the [Dangerous Individuals and Organizations] policy” [p. 8].

Compliance with Facebook’s values

Facebook’s Community Standards establish “Voice” and “Safety” as fundamental values for the platform. “Voice” seeks “to create a place for expression and give people a voice” so they can talk about issues openly [p. 5]. “Safety” aims to make Facebook a safe place: “Expression that threatens people has the potential to intimidate, exclude or silence others and isn’t allowed on Facebook” [p. 5].

The Board explained that when analyzing “content removed under the Dangerous Individuals and Organizations policy, the value of ‘Safety’ is balanced against the ‘paramount’ value of ‘Voice’” [p. 8]. Considering the content of the post, the Board found minimal benefit to “Safety” in removing the post. Its removal “unnecessarily undermined the value of ‘Voice’” [p. 8], the Board argued. In light of this, for the Oversight Board, the removal of the post didn’t comply with Facebook’s values.

Compliance with Human Rights Standards

Upon analyzing Facebook’s decision to remove the user’s content in light of human rights standards on freedom of expression, the Board cited the Human Rights Committee’s General Comment No. 34 on Article 19, para. 2, of the ICCPR to highlight that “[t]he value placed on the right to freedom of expression is particularly high in public debate about political figures, which was the subject of this post” [p. 9].

The Board considered that restrictions on freedom of expression, such as Facebook’s decision to remove content, must meet the requirements of a three-part test that analyzes the legality, legitimate aim, and necessity and proportionality of the restrictive measure.

  • Legality

According to the Board, citing the Human Rights Committee General Comment 34, the legality requirement states that “[a]ny rules restricting expression must be clear, precise and publicly accessible to allow individuals to change their conduct accordingly” [p. 9]. For the Board, Facebook’s Community Standard on Dangerous Individuals and Organizations fails to meet these conditions because the policy lacks clear examples explaining the application of “support”, “praise”, and “representation”, making it difficult for users to understand this Community Standard [p. 9]. Similarly, the Board argued that Facebook also failed to provide a list of individuals or organizations deemed dangerous, or at the very least examples of groups or individuals designated as such. The Board considered that the policy also lacked an explanation as to how “it ascertains a user’s intent, making it hard for users to foresee how and when the policy will apply and conduct themselves accordingly” [p. 9].

The Board also expressed concern that the user in this case was not notified of which Community Standard they had violated when the content was removed.

  • Legitimate aim 

Regarding this requirement, the Board noted that Article 19, para. 3, of the ICCPR states that “legitimate aims [to restrict freedom of expression] include respect for the rights or reputations of others, as well as the protection of national security, public order, or public health or morals” [p. 10]. The Oversight Board considered that the rationale behind the Dangerous Individuals and Organizations policy is the protection of the rights of others: it is aimed at protecting individuals from discrimination, and from “attacks on life or foreseeable intentional acts resulting in physical or mental injury” [p. 10]. This is, according to the Board, a legitimate aim that satisfies this requirement.

  • Necessity and proportionality

For the necessity and proportionality requirements to be satisfied, the Board argued, “[a]ny restriction [to freedom of expression] must be appropriate to achieve their protective function; they must be the least intrusive instrument amongst those which might achieve their protective function; they must be proportionate to the interest to be protected” [p. 10]. 

Based on a report from the UN Special Rapporteur on freedom of opinion and expression, the Board noted that there has been “a global rise in support and acceptance of neo-Nazi ideology” [p. 10] and that posts about dangerous organizations may need to be removed where there is insufficient context. In this case, however, “the content of the quote and other users’ responses to it, the user’s location and the timing of the post during an election campaign are all relevant” [p. 10]. The Board stated that Facebook’s lack of review or analysis regarding “these contextual cues resulted in an unnecessary and disproportionate restriction on expression” [p. 10].

The Board noted the importance of Facebook’s task in taking action against Nazi ideologies on its platform. However, the user’s post in this case sought to criticize a politician by comparing his governance to that of Nazis. Removing this content, the Board opined, does not promote equality and nondiscrimination. The Board noted that any restriction on freedom of expression must respect these principles, citing the Human Rights Committee General Comment 34.

Having found that Facebook’s removal of the user’s content did not satisfy the legality or the necessity and proportionality requirements of the three-part test, the Board overturned “Facebook’s decision to take down the content, requiring the post to be restored” [p. 11].

Policy Advisory Statement

Additionally, the Board made several recommendations to Facebook. The Board recommended that users always be notified “of the reasons for any enforcement of the Community Standards against them, including the specific rule Facebook is enforcing” [p. 11].

The Board also urged the company to “[e]xplain and provide examples of the application of key terms used in the Dangerous Individuals and Organisations Policy, including the meanings of ‘praise’, ‘support’ and ‘representation’ [and] provide clearer guidance to users on how to make their intent apparent when discussing individuals or organisations designated as dangerous” [p. 11]. 

Finally, the Oversight Board requested that Facebook provide a public list naming which organizations and individuals were deemed or designated as dangerous by the company “under the Dangerous Individuals and Organisations Community Standard” [p. 11]. At the very least, the list should provide illustrative examples.


Decision Direction


Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Expands Expression

In this decision, the Oversight Board expands freedom of expression by highlighting the importance of context in the process of moderating content related to political speech or matters of public debate. Likewise, the Board’s assessment regarding the lack of clarity of the Dangerous Individuals and Organizations Community Standard fosters transparency in content moderation and underscores the need to articulate more precise rules for social platforms. 

Global Perspective


Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or regional laws

  • ICCPR, art. 19

    The Board analyzed Facebook’s human rights responsibilities through this precept on freedom of expression.

  • UNHR Comm., General Comment No. 34 (CCPR/C/GC/34)

    The Board used General Comment No. 34 as the legal basis to apply the three-part test and to underscore that the value of expression is particularly high when discussing matters of public debate or concern.

  • United Nations Guiding Principles on Business and Human Rights (2011)

Case Significance


Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”

