Oversight Board Case of Greek 2023 Elections Campaign

Status: Closed. Decision Direction: Contracts Expression.

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    March 28, 2024
  • Outcome
    Oversight Board Decision, Agreed with Meta’s initial decision
  • Case Number
    2023-30-FB-UA, 2023-31-FB-UA
  • Region & Country
    Greece, Europe and Central Asia
  • Judicial Body
    Oversight Board
  • Type of Law
    International/Regional Human Rights Law, Meta's content policies
  • Themes
    Hate Speech, Facebook Community Standards, Violence And Criminal Behavior, Dangerous Individuals and Organizations
  • Tags
    Facebook, Oversight Board Content Policy Recommendation, Meta Newsworthiness allowance

Case Analysis

Case Summary and Outcome

On March 28, 2024, the Oversight Board upheld Meta’s decision to remove two Facebook posts related to Greece’s June 2023 General Election for violating the Dangerous Organizations and Individuals (DOI) policy. The first case involved an electoral candidate sharing a campaign leaflet featuring an endorsement by Ilias Kasidiaris, a designated hate figure. The second concerned a post displaying the logo of the National Party – Greeks, a political party designated as a hate entity. The Board found both removals consistent with Meta’s policies and human rights obligations, though it raised concerns about the clarity of the “social and political discourse” exception. The Board recommended that Meta clarify how this exception applies to election-related content, particularly when designated entities endorse or participate in electoral processes.

*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.


Facts

The Oversight Board reviewed two Facebook posts made during the June 2023 Greek parliamentary elections, which followed the May 2023 elections where no party won a majority.

In the first case, a Spartans party candidate shared an image of his electoral leaflet with a Greek-language caption about his campaign. The leaflet included a statement that Ilias Kasidiaris supported the Spartans. Kasidiaris, a former Golden Dawn politician sentenced to 13 years for directing its criminal activities, was banned from Facebook in 2013 for hate speech. Golden Dawn, declared a criminal organization in 2020, was responsible for hate crimes, including the murders of a Greek rapper and a Pakistani migrant worker. During a 2012 rally, Kasidiaris called the Roma community “human trash” and urged supporters to “fight” if they wanted their area “to become clean.” Before his conviction, Kasidiaris founded the National Party – Greeks, later disqualified from the 2023 elections under constitutional amendments banning parties led by convicted criminals. From prison, Kasidiaris publicly endorsed the Spartans via other platforms.

In the second case, another user posted an image of the National Party – Greeks’ logo, featuring the Greek word for “Spartans.” The Spartans party, founded in 2017 by Vasilis Stigkas, promotes far-right ideology as Golden Dawn’s successor. They did not participate in the May 2023 election but were approved by Greece’s Supreme Court for the June 2023 vote, ultimately winning 12 seats (4.65%). Stigkas stated that Kasidiaris’s support “drove their success.”

Greece’s civic space has seen growing threats and attacks from extremist groups targeting refugees, migrants, and minorities, with far-right organizations using social media to spread hate speech and misinformation.

Both posts were reported to Meta. After human review, Meta found they violated the Dangerous Organizations and Individuals (DOI) policy, removed the posts, and imposed a severe strike and 30-day restriction on both accounts. After internal appeals were denied, both users appealed to the Oversight Board.


Decision Overview

The main issue before the Oversight Board was whether removing posts containing an electoral candidate’s leaflet endorsed by a hate figure and a party logo of a designated entity was consistent with Meta’s policies, values, and human rights obligations.

The first user argued they were a legitimate political candidate and that the account restriction hindered campaign management. The second said they merely shared the Spartans’ logo, expressing surprise about the removal.

Meta explained that Golden Dawn, National Party – Greeks, and Kasidiaris were designated Tier 1 hate entities and figures. It stated that promoting an endorsement from a designated figure constituted “ideological alignment,” which the policy explicitly prohibits. Sharing a logo without clear critical context also amounted to praise. Meta explained that it designates entities through an independent process based on specific signals.

Meta added that neither post qualified for the exception allowing users to “report on, neutrally discuss, or condemn” designated entities, even under the later “social and political discourse” update. This exception was designed to allow electoral discussion of officially registered entities, not to enable indirect promotion of hate groups.

(1) Compliance with Meta’s content policies and values

The Board emphasized Meta’s core value of voice, especially during elections, stressing that permitting public discourse among the electorate, candidates, and parties about designated entities was crucial for enhancing voters’ access to information. It found, however, that the first post represented “praise” by aligning ideologically with a designated hate figure. Even under the December 2023 policy update, the content would still violate the rule against “positive references.”

For the second post, the majority held that sharing the logo of a designated entity without explanatory context violated the policy. Unlike in the Nazi Quote case, where contextual cues showed neutral discussion, the post here lacked indicators of neutral or critical intent.

(2) Compliance with Meta’s human rights responsibilities

In assessing Meta’s human rights responsibilities, the Board applied the three-part test set out in Article 19(3) of the International Covenant on Civil and Political Rights (ICCPR). Under this test, any restriction on freedom of expression, such as the removal of posts, must be grounded in law, pursue a legitimate aim, and be both necessary and proportionate.

I. Legality (clarity and accessibility of the rules)

The Board noted that, in the social media context, the legality principle requires rules limiting freedom of expression to be clear and accessible to users, and that reviewers must have clear enforcement guidance.

In the first case, the Board noted that the explicit example in the DOI policy prohibiting “ideological alignment” was sufficiently clear for users and reviewers. Though this example was later removed in the December 2023 update, it satisfied the standard at the time.

In the second case, the Board found the rule against sharing symbols of designated entities, absent clear reporting or critical intent, was also sufficiently clear.

However, the Board expressed concern about Meta’s lack of transparency regarding which entities are designated as Tier 1 “hate entities.” Without a public list, users cannot easily know which groups are prohibited from being promoted.

The Board further noted that the examples Meta added to the DOI policy, as recommended in the “Shared Al Jazeera Post” decision, still fail to explain how the exception applies in electoral contexts. Given social media’s central role in elections, users may remain uncertain about how to discuss candidates endorsed by designated individuals.

The Board concluded that while Meta’s rules met the legality standard in June 2023, clearer guidance is needed on how the “social and political discourse” exception applies during elections.

II. Legitimate aim

The Board recognized that the DOI policy serves the legitimate aim of protecting others’ rights, such as equality, non-discrimination, and the right to life, as well as the right to vote.

III. Necessity and proportionality

The Board acknowledged that Meta’s platforms have become virtually indispensable for political discourse, especially during elections.

For the first post, the Board found the removal necessary and proportionate. While expression during elections is crucial, the post went beyond sharing information; it publicly associated the candidate with a designated hate figure. Since multiple media outlets had already reported the endorsement, the removal did not unduly restrict public access to information. These media reports would have qualified for the policy exception, allowing lawful discussion in electoral contexts without furthering real-world harm.

The Board reiterated Meta’s responsibility to mitigate human rights risks in high-stakes contexts such as elections, as recognized in prior cases like Brazilian General’s Speech, given the risk of its platforms being used to incite violence during elections.

For the second post, the Board likewise found the removal necessary and proportionate, as there was no indication the content was shared for reporting or discussion.

The Oversight Board therefore upheld Meta’s removal of both posts as consistent with its policies, values, and human rights commitments.

Policy advisory statement

The Board recommended that Meta clarify the scope of the DOI policy exception allowing users to “report on, neutrally discuss, or condemn” dangerous organizations and individuals within the “social and political discourse” context. In particular, Meta should specify how this exception applies to election-related content.

The Board will consider this implemented once Meta updates its Community Standard accordingly.

Dissenting Opinion

A minority of members disagreed with aspects of the majority’s reasoning.

In the first case, they argued that ideological alignment was not clearly established and that Meta should have applied its newsworthiness allowance because the endorsement was valuable information for voters. They maintained that lawful candidates should be permitted to reference endorsements in neutral terms without promoting hate, as removing such content disproportionately restricted electoral speech.

In the second case, they believed that simply sharing a logo, without intent to praise or incite harm, should not be considered a violation. They argued that context matters and that removal was a disproportionate and excessive restriction.

Finally, they emphasized Meta’s commitment to expression, arguing that the company mistakenly prioritized safety over voice and asserting that a social media platform should not decide what voters are allowed to know about candidates or parties.


Decision Direction

Quick Info

Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Contracts Expression

This decision contracts expression by restricting political speech during an election without conducting a robust contextual analysis, such as applying the Rabat Plan of Action. The content did not contain explicit praise or support for designated entities, yet its removal limited voters’ access to direct information from candidates. The decision diverges from international standards protecting political speech, which enjoys heightened protection during elections. The minority argued that the removals were disproportionate and not the least intrusive means of achieving Meta’s objectives.

As the UN Special Rapporteur on the rights to freedom of peaceful assembly and association has emphasized, political parties’ ability to express opinions during campaigns is essential to electoral integrity. By removing a lawful candidate’s post about an endorsement, without applying the newsworthiness exception, Meta risked unduly restricting voters’ access to relevant electoral information beyond what international standards permit.

Global Perspective

Quick Info

Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or regional laws

  • ICCPR, art. 19

    The Board analyzed Meta’s obligations towards freedom of expression stipulated by this article. It also applied the three-part test in its analysis.

  • ICCPR, art. 6

    The protection of the right to life stipulated in this article was among the legitimate aims of the Dangerous Organizations and Individuals policy.

  • ICCPR, art. 2

    The protection of the right to non-discrimination and equality stipulated in this article was among the legitimate aims of the Dangerous Organizations and Individuals policy.

  • ICCPR, art. 26

    The protection of the right to non-discrimination and equality stipulated in this article was among the legitimate aims of the Dangerous Organizations and Individuals policy.

  • ICCPR, art. 7

    The protection from torture and inhuman or degrading treatment stipulated in this article was among the legitimate aims of the Dangerous Organizations and Individuals policy.

  • ICCPR, art. 25

    The protection of the right to participate in public affairs and the right to vote was among the legitimate aims of the Dangerous Organizations and Individuals policy.

  • UNHR Comm., General Comment No. 34 (CCPR/C/GC/34)

    The Board used this General Comment as a guide on how the three-part test applies to Meta’s restrictions on freedom of expression, and to explain the elements of the three-part test.

  • United Nations Guiding Principles on Business and Human Rights (2011)

    The Board analyzed Meta’s human rights obligations within the framework of the Guiding Principles on Business and Human Rights.

  • OSB, Brazilian General's Speech, 2023-001-FB-UA (2023)

    The Board cited this case as an example of platforms being used to incite violence in the context of elections.

  • OSB, Mention of the Taliban in News Reporting, 2022-005-FB-UA (2022)

    The Board referenced this case in its analysis of the legitimate aim of the Dangerous Organizations and Individuals policy.

  • OSB, Punjabi concern over the RSS in India, 2021-003-FB-UA (2021)

    The Board referenced this case in its analysis of the legitimate aim of the Dangerous Organizations and Individuals policy.

  • OSB, Shared Al Jazeera Post, 2021-009-FB-UA (2021)

    The Board highlighted one of its recommendations in this decision, which led to the August 2023 update to the Dangerous Organizations and Individuals policy.

  • OSB, Nazi quote, 2020-005-FB-UA (2021)

    The Board distinguished between the case at hand and this case throughout the analysis.

Case Significance

Quick Info

Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”
