Case Summary and Outcome
The Oversight Board upheld Meta’s decision to leave a video featuring French politician Éric Zemmour discussing demographic changes on his official Facebook page. In the video, Zemmour claimed that population growth in Africa compared to Europe had shifted the “power balance,” asserting that Africa was now “colonizing” Europe. The Board’s majority concluded that the content did not violate Meta’s Hate Speech policy because it did not amount to a “direct attack” on a protected group, nor did it violate the Dangerous Organizations and Individuals policy, as it lacked the elements required for a Violence-Inducing Conspiracy Network. The Board also held that removing the content would disproportionately burden freedom of expression on a salient political topic. However, the Board recommended that Meta provide clearer guidance on how it distinguishes between political speech on immigration and harmful conspiracy theories.
*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.
Facts
On July 7, 2023, the administrator of Éric Zemmour’s verified Facebook page posted a 50-second video in French in which Zemmour discussed demographic changes in Europe and Africa. He stated that Africa had experienced a population explosion since the early 20th century, while Europe’s population had remained roughly stable at around 400 million. He claimed that the African population had reached 1.5 billion, concluding that “the power balance has shifted.” The post’s caption reiterated that in 1900, “when there were four Europeans for one African, [Europe] colonized Africa,” but now “there are four Africans for one European and Africa colonizes Europe.” The page had about 300,000 followers; as of January 2024, the video had 40,000 views and fewer than 1,000 reactions.
Zemmour, a candidate in the 2022 French presidential election, previously worked as a columnist and TV commentator known for provocative views on Islam, immigration, and women. He has faced multiple legal proceedings and several convictions for inciting racial hatred. In 2011, he was convicted for stating that “most dealers are blacks and Arabs. That’s a fact.” In 2020, he was convicted of inciting racial hatred and fined 10,000 euros for claiming that child migrants are “thieves, killers, they’re rapists. That’s all they are. We should send them back.” He was also convicted of inciting discrimination and religious hatred against the French Muslim community for saying that Muslims should be given “the choice between Islam and France” and that “for thirty years we have been experiencing an invasion, a colonization (…) it is also the fight to Islamize a territory which is not, which is normally a non-Islamized land.” In December 2022, the European Court of Human Rights ruled that this conviction did not violate Zemmour’s right to freedom of expression.
The content was posted ten days after the fatal point-blank shooting of Nahel Merzouk, a 17-year-old French citizen of North African descent, by police on June 27, 2023. His death ignited widespread riots and protests against police brutality and systemic racism. The unrest coincided with a heated national debate over a new immigration bill, which proposed migration quotas and restrictions on family reunification and social benefits. In this fraught context, Zemmour and his party actively advocated for stricter immigration policies, a salient issue in a country hosting approximately 700,000 refugees and asylum seekers.
Although the content did not explicitly mention the Great Replacement Theory, the concept is central to Zemmour’s political ideology and featured heavily in his presidential campaign. During the campaign, he promised to create a “Ministry of Remigration” and stated he would “send back a million” foreigners in five years.
According to independent research commissioned by the Board, proponents of the Great Replacement Theory argue that white European populations are being deliberately replaced, ethnically and culturally, through migration and the growth of minority communities, and assert that contemporary migration of non-white (predominantly Muslim) people from Africa and Asia to Europe constitutes a form of demographic warfare. The Board’s experts emphasized that what marks the theory as conspiratorial is its insistence that there is an actual plot to bring non-white people into Europe to replace or reduce the proportion of white populations. Linguistic experts consulted by the Board explained that the Great Replacement Theory and associated terms “incite racism, hatred and violence targeting immigrants, non-white Europeans, and target Muslims specifically.” The theory has been linked to several violent attacks around the world, including the 2019 mass shooting in Christchurch, New Zealand, in which 51 Muslims were killed.
The Board also acknowledged the rise of violent far-right protests in France, including incidents following the Crépol stabbing in November 2023, in which protestors blamed immigrants even though, of the nine people arrested in connection with the stabbing, eight were French and one was Italian.
Two users separately reported the content as violating Meta’s Hate Speech policy. The company automatically closed both reports because they were not prioritized for review within a 48-hour period. Meta explained that reports are dynamically prioritized based on factors such as the severity of the predicted violation, the content’s virality (number of views), and the likelihood that the content violates company policies. Subsequently, the first person who reported the content appealed Meta’s decision. The appeal was assessed by a human reviewer who upheld Meta’s original decision to keep the content up. Although Meta did not consider the page administrator who posted the video to be a public figure, the company considered Zemmour himself a public figure.
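Meta did not disclose how these prioritization signals are combined. Purely as a rough illustration of the mechanism described above, the sketch below scores reports on the three factors the company named and auto-closes anything left unreviewed past the 48-hour window; every field name, weight, and threshold is an invented assumption, not a disclosed Meta value.

```python
# Illustrative sketch only -- NOT Meta's actual system. It models the
# triage described in the decision: reports are scored on predicted
# violation severity, virality (views), and violation likelihood, and
# any report not selected for review within 48 hours is auto-closed.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Report:
    content_id: str
    created_at: datetime
    predicted_severity: float    # 0.0-1.0, severity of the predicted violation
    views: int                   # proxy for virality
    violation_likelihood: float  # 0.0-1.0, estimated probability of a violation

def priority_score(report: Report) -> float:
    """Combine the three factors into a single score (weights are made up)."""
    virality = min(report.views / 1_000_000, 1.0)  # normalize views to 0.0-1.0
    return (0.5 * report.predicted_severity
            + 0.2 * virality
            + 0.3 * report.violation_likelihood)

def triage(reports: list[Report], now: datetime,
           review_capacity: int) -> tuple[list[Report], list[Report]]:
    """Route the top-scoring reports to human review; auto-close the rest
    once they have aged past the 48-hour window without being prioritized."""
    ranked = sorted(reports, key=priority_score, reverse=True)
    to_review = ranked[:review_capacity]
    auto_closed = [r for r in ranked[review_capacity:]
                   if now - r.created_at > timedelta(hours=48)]
    return to_review, auto_closed
```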
The reporting user then appealed to the Oversight Board.
Decision Overview
The main issue before the Oversight Board was whether Meta’s decision to keep the video on Facebook was compatible with the company’s content policies, values, and human rights obligations.
The appealing user argued that Zemmour’s claims amounted to “fake news” about colonization and migration.
Meta reviewed the post with subject-matter experts and maintained that the content did not violate its Hate Speech policy, which requires both a protected characteristic and a direct attack. Meta clarified that commentary on immigration policy is generally allowed and that it does not consider the allegation that one group is “colonizing” a place to be an attack in itself unless it amounts to a call for exclusion. Meta also argued that references to “Africa” do not identify a protected group because countries and continents are not covered by the policy. The company emphasized that it “want[s] to allow citizens to discuss the laws and policies of their nations so long as this discussion does not constitute attacks against vulnerable groups who may be the subject of those laws.”
The Board submitted eight written questions to Meta, focusing on its policy development related to the Great Replacement Theory, the applicability of various standards under both its Hate Speech and Dangerous Organizations and Individuals (DOI) policies, and the violation history of the page. Meta answered six questions fully but provided incomplete responses to two, prompting the Board to issue a follow-up question. Meta provided limited additional information and declined to disclose internal deliberations on conspiracy-theory-related policy options, citing concerns about over-removal of political speech.
(1) Compliance with Meta’s content policies and values
The Board’s analysis began with a detailed examination of the content under Meta’s policies. The Board concluded that the content did not violate the Hate Speech policy, characterizing the video as a protected, though controversial, expression of opinion on immigration. Applying the policy’s two-pronged test, which requires a “direct attack” on a group defined by a “protected characteristic,” the Board found that Zemmour’s comments, while implying a shift in power and using the term “colonizes,” did not include an explicit call for the exclusion or segregation of any specific group, nor did they employ dehumanizing stereotypes, slurs, or other forms of direct attack. The Board noted that Meta’s policy rationale explicitly allows “commentary on and criticism of immigration policies.”
A point of concern for the Board was Meta’s position that “Africans” do not constitute a group defined by a protected characteristic. The Board argued that, in the context of Zemmour’s past statements and French political discourse, the term “Africans” functioned as a proxy for non-white Africans, particularly Black and Muslim individuals, who are protected on the basis of race and religion. Nonetheless, it held that the absence of a direct attack remained the decisive factor.
The Board also found no violation of the DOI policy. For content to fall under the “Violence-Inducing Conspiracy Network” provision, it must be tied to an identifiable network with a defined mission, promote unfounded theories involving secret plots, and be directly connected to offline harm. The Board determined that the post itself did not explicitly mention the Great Replacement Theory or a clear conspiratorial claim of a “secret plot,” and therefore did not violate the current policy. However, the Board strongly criticized Meta for its lack of transparency regarding internal policy development on harmful conspiracy theories, noting that the company provided insufficient information about its research and decision-making process.
Throughout its assessment, the Board weighed Meta’s values of “voice,” “safety,” and “dignity.” It concluded that political expression on immigration, an issue of significant public concern, should be protected in the absence of explicit policy violations, as the potential for offense did not outweigh the imperative to safeguard public discourse.
(2) Compliance with Meta’s human rights responsibilities
The Board first noted that Article 19 of the International Covenant on Civil and Political Rights (ICCPR) affords broad protection to freedom of expression, guaranteeing the right to seek, receive, and impart information and ideas of all kinds, including political discourse and commentary on public affairs. It referred to the Human Rights Committee’s clarification that this protection extends even to expression that may be considered deeply offensive, though such speech may be restricted under Articles 19(3) and 20 to safeguard the rights or reputations of others or to prevent incitement to discrimination, hostility, or violence. The Board also highlighted the UN General Assembly’s commitment to protecting freedom of expression in accordance with international law, recognizing that open and free debate, particularly on issues such as migration, is essential to fostering a comprehensive understanding of public matters.
The Board then assessed Meta’s decision under the three-part test for permissible restrictions on freedom of expression derived from Article 19(3) of the ICCPR. This test requires any restriction to be provided by law, pursue a legitimate aim, and be necessary and proportionate.
I. Legality (clarity and accessibility of the rules)
The Board reiterated the UN Special Rapporteur on freedom of expression’s view that, in the context of social media, the principle of legality requires that any rules restricting speech be accessible, clear, and formulated with sufficient precision to allow individuals to understand what conduct is prohibited and to guide reviewers in enforcing those rules consistently.
The Board acknowledged that none of Meta’s current policies “specifically and clearly” prohibited the content at issue. It emphasized that an ordinary user reading the Hate Speech policy would likely understand that only the most severe attacks against immigrants and migrants would be removed, as Meta explicitly states that it wants to allow commentary and criticism of immigration policies on its platforms. The Board found this commitment consistent with Meta’s human rights responsibilities.
The Board further noted that Meta’s current DOI policy contains no provisions that would prohibit the content in this case. It observed that even if Meta did specifically and clearly prohibit content engaging with the Great Replacement Theory, this post did not go so far as to name the theory or elaborate on its elements in ways that could be considered conspiratorial or harmful. The post did not allege that migratory flows to Europe involving specific groups were part of a secret plot orchestrated by actors with hidden agendas.
II. Legitimate aim
The Board affirmed that the aims underpinning Meta’s Hate Speech policy—protecting the rights to equality and non-discrimination—and its Dangerous Organizations and Individuals policy—protecting the rights of others—constitute legitimate aims under international human rights law, as previously held in its Knin Cartoon and Shared Al Jazeera Post decisions. It also reiterated its findings in the Depiction of Zwarte Piet and Former President Trump’s Suspension decisions that restrictions imposed solely to prevent offense do not qualify as a legitimate aim, given the high value international human rights law places on open and uninhibited expression.
III. Necessity and proportionality
The Board noted that the principle of necessity and proportionality requires that any restriction on freedom of expression be appropriate to achieve its protective purpose, represent the least intrusive means available, and remain proportionate to the interest being protected. In line with their human rights responsibilities, social media companies should consider a range of responses to problematic content beyond deletion, ensuring that any restrictions are narrowly tailored.
To assess the potential risk of the video, the Board applied the six-part Rabat Plan of Action test, evaluating the video’s context, the speaker, intent, content and form, extent of dissemination, and likelihood of harm. The test places particular emphasis on content and form as a critical element in determining incitement.
The Board highlighted that in the 50-second clip, Zemmour’s comments did not directly reference the conspiratorial elements of the Great Replacement Theory, nor did the video contain inflammatory features such as violent or inciting imagery. The statements and accompanying caption included no direct calls for violence or exclusion. The Board held that removing politically controversial content on the basis of statements the speaker had made elsewhere would violate freedom of expression. It also noted that the figures Zemmour cited were only slightly exaggerated and that the video’s primary focus was immigration, a highly salient political issue. The Board therefore concluded that removing the content was neither necessary nor proportionate.
The Board further expressed doubt that any policy could target this content, which contained no explicit phrases such as “Great Replacement,” while meeting legal standards of necessity and proportionality. It warned that removing content based on coded references would suppress protected political expression, emphasizing that content that is protected on its face must not be penalized because of the speaker’s identity or its similarity to hateful ideologies.
Ultimately, the Board upheld Meta’s decision to leave the content on Facebook, finding it consistent with Meta’s content policies, values, and human rights obligations.
Policy advisory statement
The Board urged Meta to clarify its Hate Speech policy by more clearly distinguishing legitimate discussions about immigration from harmful speech that targets individuals based on their migratory status. The Board also recommended that Meta explain how it evaluates and moderates content that amplifies hateful conspiracy theories. Such clarification is essential to help users understand how Meta protects political expression on immigration while also addressing the potential offline harms associated with conspiracy-driven rhetoric.
The Board will consider this recommendation implemented once Meta publishes an update detailing its approach to immigration-related debates in the context of the Great Replacement Theory and prominently links this update in its Transparency Center.
Dissenting Opinions
A minority of the Board agreed that the content did not violate Meta’s current rules, yet viewed this outcome as a sign that the policies themselves are too narrow. In their view, Meta must draw a clearer line between permissible criticism of immigration policy and content that promotes harmful conspiracy theories targeting protected groups.
They noted that the post relied on themes associated with the Great Replacement Theory, a narrative historically used to stigmatize Black people, Arabs, and Muslims. Even though the post confined itself to a subtler version of the theory, the minority argued that its underlying message would be obvious to many users. They drew on the Board’s Former President Trump’s Suspension decision to emphasize that Meta should assess content from influential users in light of how audiences are likely to interpret it, not merely the literal language used. For the minority, this kind of contextual analysis is especially important when content is crafted to avoid explicit violations while still conveying harmful ideas.
What concerned the minority most was the disconnect between Meta’s stated policy goals and how the company handles conspiracy-based content. Meta’s policies are meant to prevent environments that intimidate or exclude protected groups. Yet conspiracy theories like the Great Replacement have repeatedly contributed to real-world harm, including violence against immigrants and non-white communities. The minority stressed that these theories are not abstract political arguments but established drivers of racism and marginalization. They pointed to the Board’s Holocaust Denial decision, which recognized the cumulative harm of conspiratorial content spread at scale, arguing that the same reasoning applies here.
The minority also highlighted the broader social context. International bodies, such as the Committee on the Elimination of Racial Discrimination, have warned about rising racist discourse online in France. French security officials have similarly noted that groups motivated by the Great Replacement Theory pose significant threats. For the minority, Meta cannot ignore this environment: when a harmful conspiracy theory circulates widely and at high speed, it helps normalize discrimination and increases the likelihood of violence.
Given this context, the minority questioned why Meta moderates antisemitic and white supremacist conspiracies but treats Great Replacement Theory content differently, even though the harms are comparable and often directed at similarly vulnerable groups. They also challenged Meta’s claim that restricting such content would unduly limit political speech, noting that the company offered no evidence to support this concern.
In light of these gaps, the minority argued that Meta must reassess its policies. They contended that the burden of proof lies with Meta to demonstrate either that the Great Replacement Theory poses no material risk or that moderating it would disproportionately impact political expression. They suggested an “escalation-only” policy under which specialized human reviewers, rather than automated systems that would sweep in all related content, remove only content openly supporting the Great Replacement Theory. They also proposed designating the theory’s organized propagators as a Violence-Inducing Conspiracy Network. These measures, in their view, would protect vulnerable groups without shutting down legitimate public debate.