Oversight Board Case of Holocaust Denial
On January 23, 2024, the Oversight Board overturned Meta’s original decision to leave up an Instagram post that contained misinformation about the Holocaust. The post questioned the number of Holocaust victims and the existence of crematoria at Auschwitz, presenting these false claims as “real history.” Meta initially left the post up despite user reports. The Board found that the content violated Meta’s Hate Speech Community Standard, which prohibits Holocaust denial. It concluded that the prohibition is consistent with Meta’s human rights responsibilities and expressed concern over the company’s failure to remove the post in a timely manner. The Board also highlighted broader deficiencies in Meta’s enforcement of its policy against Holocaust denial.
*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.
On September 8, 2020, a user posted a meme on Instagram featuring Squidward, a cartoon character from the television series SpongeBob SquarePants, with a speech bubble titled “Fun Facts About the Holocaust.” The post contained false and distorted information about the Holocaust. It questioned the number of Holocaust victims as well as the existence of crematoria at Auschwitz. Moreover, the content claimed that world leaders of the historical period did not acknowledge the Holocaust in their memoirs.
The caption of the post included meme-related hashtags, some of which targeted geographical audiences. Meta did not consider the user a public figure; the account had around 10,000 followers. The post received fewer than 500 views and fewer than 100 likes. The comments criticized the post, and the user replied that their false claims were “real history.”
A few weeks after the content was posted, Meta announced updates to its Hate Speech policy to prohibit Holocaust denial or distortion due to a rise in this type of content and antisemitism globally. On November 23, 2022, Holocaust denial was classified as Tier 1 hate speech prohibited on Meta’s platforms.
The post received a total of six user reports. Four were filed before Meta’s policy update, while the COVID-19 automation policies were in operation, and the post was classified as “non-violating.” Meta received two further reports in May 2023, after the Hate Speech policy had introduced the ban on Holocaust denial content, but these were also closed automatically under the COVID-19 automation policies, which were still in force at the time. After a human review on May 25, 2023 once again left the content up, one of the reporting users appealed to the Oversight Board (OSB).
On January 23, 2024, the Oversight Board issued a decision on the matter. The main issue before the Board was whether Meta’s decision to keep a post with misinformation about the Holocaust was compatible with Meta’s Hate Speech Community Standard—which prohibits Holocaust denial on Instagram and Facebook—and with its human rights obligations.
In their appeal to the OSB, the reporting user expressed shock that the content remained on Instagram, stating it reflected blatant neo-Nazi beliefs. They emphasized that millions of Jewish, Roma, disabled, and LGBTQ+ individuals were murdered by the Nazi regime and argued that the content constituted hate speech.
The Board invited the posting user to submit a statement. The user, who identified as an “LGBT comedian,” claimed they intended to parody far-right beliefs and uplift marginalized groups by mocking the alt-right.
After the Board selected the case, Meta reversed its original decision and removed the post for violating the Hate Speech policy. However, it did not apply a strike to the user’s account, as the content had been posted more than 90 days prior. Meta explained that the post violated the Hate Speech policy by denying the Holocaust, suggesting the number of victims was fabricated, and questioning the existence of crematoria at Auschwitz. Meta acknowledged that the second human review, conducted on May 25, 2023, after the policy change, was incorrect.
The Board submitted 13 questions to Meta, covering topics such as automated policies leading to closed reports, the development of the Holocaust denial policy, enforcement mechanisms, and Meta’s efforts to promote credible information about the Holocaust and address antisemitism. Meta responded to all questions.
In response to one question, Meta confirmed that before the policy update, Holocaust denial would not have led to removal unless it was combined with other forms of prohibited hate speech. The company clarified that the content in question did not include such additional elements.
At the outset of its analysis, the Board noted that UN General Assembly Resolution 76/250, adopted by consensus in 2022, reaffirmed the importance of remembering the Holocaust’s six million victims and expressed concern over the spread of Holocaust denial online. It also referenced the UN Special Rapporteur on contemporary forms of racism (A/74/253), who noted the rising number of antisemitic incidents, particularly in North America and Europe. International standards emphasize that Holocaust denial constitutes a form of antisemitism and a conspiracy theory that reinforces harmful stereotypes, especially the false claim that Jewish people invented the Holocaust for political gain.
Organizations such as the Anti-Defamation League (ADL) and the American Jewish Committee have reported a sharp increase in antisemitic incidents. Investigations revealed that Meta’s platforms, especially Instagram, continued to host hate groups previously banned by Meta as “dangerous organizations.” One study found that Instagram had even recommended accounts spreading virulent antisemitism to a 14-year-old test persona.
1. Compliance With Meta’s Content Policies
Content Rules
The Board found that the post violated the Hate Speech policy. To reach this conclusion, it consulted external experts on forms of Holocaust denial and distortion on Meta’s platforms and beyond. The experts confirmed that all claims in the post constituted Holocaust denial or distortion. The OSB also highlighted the Brandeis Center’s submission that the facts of the Holocaust have been proven beyond a reasonable doubt. Additionally, the Board commissioned an assessment of Holocaust denial content on Meta’s platforms, which revealed a correlation between the Squidward meme format and the spread of antisemitic content.
As underscored by the Board, the Hate Speech policy allowed certain exceptions for content that condemned hate speech or raised awareness of it, or that was self-referential or empowering, provided the user’s intent was clear. The Board found that none of these exceptions applied in this case.
Another exception for satire also existed. Despite the user’s claim that the post was parodying alt-right talking points, the OSB found no evidence of satire in the content. The meme lacked satirical exaggeration, the Board said, and cartoon characters were used to evade content moderation and target younger audiences. The use of hashtags aimed to amplify reach rather than signal satire. The user’s response to criticism—stating the post reflected “real history”—further confirmed the intent was not satirical.
The Board stressed that the post should have been removed by the human reviewer on May 25, 2023, as it questioned the number of Holocaust victims and the existence of crematoria at Auschwitz. It also claimed world leaders omitted the Holocaust in their memoirs—another denialist statement.
The OSB also concluded that the content violated the Hate Speech policy even before the update, as it mocked the victims of hate crimes. The cartoon meme format inherently ridiculed the Holocaust and its victims, the Board underscored.
Enforcement Action
The assessment commissioned by the Board found that some users circumvent enforcement by altering spellings or referencing the Holocaust indirectly. Downplaying the number of Jewish victims was widespread. Holocaust denial content was more accessible and garnered more engagement on Instagram than on Facebook, the OSB stated.
While use of terms like “Holohoax” and references to neo-Nazi propaganda declined after the policy update, enforcement gaps persisted. The content in this case was reported twice after the update, yet it was not removed. As the OSB noted, Meta did not require reviewers moderating content at scale to explain non-removal decisions, leaving it unclear why the reviewer erred on May 25, 2023.
The Board stressed that Meta must ensure that content reported after a policy update is reviewed according to the latest standards, regardless of when it was posted. This requires updating reviewer training and automated tools, as well as regularly evaluating enforcement effectiveness.
Meta could not provide data on how effective its systems are at removing Holocaust denial content. The company cited technical limitations, but the Board emphasized that these were solvable.
The OSB further expressed concern over Meta’s continued use of COVID-19-era automation policies as of May 2023, which led to the automatic closure of both the reports and the appeal in this case. These outdated measures, justified at the time by pandemic-related staffing shortages, had a negative impact on content moderation. The Board urged Meta to reinstate human review processes.
2. Compliance With Meta’s Human-Rights Responsibilities
Legality (Clarity and Accessibility of the Rules)
The Board considered that the current prohibition on Holocaust denial met the legality standard, though the rule is now less clear than when it was introduced, owing to the removal of explicit references to the “distortion” element. The earlier policy also prohibited mocking victims of hate crimes, which effectively covered most Holocaust denialism. However, the lack of alignment between Facebook’s Community Standards and Instagram’s Community Guidelines remained a concern. Meta said it was committed to harmonizing them but had deprioritized this work. The OSB reiterated its call for short-term solutions to align the two rule sets, given the rise of Holocaust denial content on Instagram.
It held that applying updated policies to content posted before the update does not violate the legality requirement, since content that remains accessible on the platform stays subject to the rules in force. However, it discouraged applying strikes retroactively.
Legitimate Aim
As established in previous cases, the Board recognized that the Hate Speech policy serves the legitimate aim of protecting the rights of others. The prohibition of Holocaust denial upholds Jewish people’s rights to equality, non-discrimination, and expression. Allowing such hate speech fosters an exclusionary and hostile environment. Denial of the Holocaust also undermines the dignity and memory of victims and their relatives.
Necessity and Proportionality
The Board found that Meta’s decision to prohibit Holocaust denial aligns with its human rights responsibilities. The majority agreed that removing such content met the necessity and proportionality test due to the increasing spread of antisemitic content online and offline and the harm it causes. The ban, the OSB highlighted, fulfills Article 4(a) of the International Convention on the Elimination of All Forms of Racial Discrimination (ICERD), which mandates restrictions on hate speech.
Some Board members from the majority referenced the Faurisson v. France decision, where the UN Human Rights Committee upheld a ban on Holocaust denial. They argued that misinformation about proven historical facts can infringe on others’ rights to protection from violence and discrimination.
The Board encouraged alternative measures in content moderation, such as Holocaust education, referrals to credible information, and partnerships with anti-hate groups, but found them insufficient on their own to prevent harm. For the Board, removal therefore remained the least intrusive measure capable of effectively addressing the harm.
A minority of the OSB condemned Holocaust denial but disagreed with the legal reasoning behind the removal. They argued that Faurisson had been superseded by the UN Human Rights Committee’s General Comment 34 on Article 19 of the International Covenant on Civil and Political Rights (ICCPR), which opposes general prohibitions on expressions of opinion about historical facts unless they incite hatred or violence. In their view, Meta did not meet the burden of showing that removal was the least intrusive option, and they called for public transparency regarding any deviation from international norms.
Ultimately, the Oversight Board overturned Meta’s original decision to leave up the content.
3. Policy Advisory Statement
Enforcement
The Board recommended that Meta take technical steps to systematically assess the accuracy of its enforcement against Holocaust denial content. This should include collecting granular data, building on the reporting recommendations made in the Mention of the Taliban in News Reporting case.
Transparency
The Board recommended that Meta publicly confirm whether the COVID-19 automation policies remain active. It urged Meta to retire these outdated measures, to be transparent about any delays in doing so, and to implement interim solutions in the meantime.
Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.
In this decision, the Oversight Board restricted expression by upholding Meta’s blanket prohibition on Holocaust denial content. However, the restriction is aimed at protecting the rights of Jewish people to safety, equality, non-discrimination, and freedom of expression. The Board found that prohibiting this specific type of content was consistent with Meta’s human rights responsibilities and necessary in light of the rising threat of antisemitism. While the decision limits certain forms of expression, it is compatible with international human rights standards, including the approaches of the UN Human Rights Committee and the European Court of Human Rights, both of which recognize that Holocaust denial and related misinformation can constitute hate speech and incitement to discrimination.
Case significance refers to how influential the case is and how its significance changes over time.
According to Article 2 of the Oversight Board Charter, “For each decision, any prior Board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”