Facebook Community Standards, Violence and Criminal Behavior, Coordinating Harm and Promoting Crime
Oversight Board Case of Australian Electoral Commission Voting Rules
Australia
Closed, Mixed Outcome
The Oversight Board upheld Meta’s removal of two Facebook posts related to Australia’s Indigenous Voice to Parliament Referendum. The posts featured a cropped screenshot of an Australian Electoral Commission (AEC) statement on X, which explained that if a voter cast ballots at two different polling places within their electorate, both votes would be counted due to ballot secrecy protections. However, the screenshot omitted key information from the AEC, including that voting multiple times is an offense. The first post captioned the screenshot with “vote early, vote often, and vote NO,” implying that multiple voting was allowed. The second post accused the system of being rigged, likewise suggested that multiple voting was permissible, and urged followers to “smash the voting centres” alongside repeated “NO” slogans. The Board agreed with Meta that the posts violated the Coordinating Harm and Promoting Crime policy, as they advocated for multiple voting in the Indigenous Voice to Parliament Referendum (Voice Referendum), which risked undermining trust in the democratic process. The Board called on Meta to clarify its rules prohibiting voter fraud by publishing its definition of “illegal voting.”
*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.
On October 14, 2023, Australia held its Voice Referendum to determine whether to amend the Constitution to establish an Aboriginal and Torres Strait Islander body. This proposed body would have represented Indigenous Australians’ interests in Parliament and the Executive Government regarding matters affecting them. Days before the vote, two Facebook users shared a screenshot of a post from the Australian Electoral Commission’s (AEC) official X account. The screenshot showed the AEC’s statement: “If someone votes at two different polling places within their electorate and places their formal vote in the ballot box at each polling place, their vote is counted.” However, the screenshot excluded crucial context from the AEC, including that voting multiple times constitutes an offense.
The first user, a Facebook group administrator, posted the screenshot with the caption: “vote early, vote often, and vote NO.” The second user shared the same screenshot on their profile with an overlay text stating, “so you can vote Multiple times. They are setting us up for a ‘Rigging’ … smash the voting centres … it’s a NO, NO, NO, NO, NO.” This post also included a caption featuring the “stop” emoji and the words “Australian Electoral Commission.” Neither user shared the AEC’s important clarification that multiple voting is illegal.
Meta’s proactive automated systems detected both posts through its “keyword-based pipeline initiative,” which flags terms like “double vote” and “vote multiple times” to identify potential policy violations, including voter interference. After human review, Meta removed the posts for violating its Coordinating Harm and Promoting Crime policy. Both accounts received a standard strike and a 30-day feature limit, restricting them from posting, commenting in groups, creating new groups, or joining Messenger rooms.
The users subsequently appealed Meta’s decisions to the Oversight Board (OSB).
On May 9, 2024, the Oversight Board issued a decision on the matter. The main issue it analyzed was whether Meta’s decision to remove two posts—inviting people to vote multiple times during a referendum—was compatible with Meta’s content policies and human rights obligations.
The policy upon which the two posts were reviewed was Meta’s Coordinating Harm and Promoting Crime policy, which prohibits advocating for or instructing others on illegal voting/census participation. Meta’s policies against voting interference apply equally to elections and government-organized referendums. Meta defines illegal voting to include (but is not limited to): double voting, falsifying voting information or eligibility, and ballot theft. As an adaptive approach to enforce the policy during high-stakes democratic events, Meta deployed a keyword-based pipeline initiative, which is a global system that automatically detects and queues potentially policy-violating content for human review by scanning for specific keywords tailored to local contexts. This system was developed by regional misinformation teams and activated for Australia’s Voice Referendum through a virtual Integrity Operations Center, focusing primarily on enforcing policies against voter fraud (such as illegal voting instructions) and election misinformation (including false claims about voting procedures).
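Meta has not published the implementation of this pipeline, but its general shape (scan text for locale-specific keywords, then queue matches for human review rather than removing them automatically) can be illustrated with a minimal sketch. The Python below is purely illustrative: the keyword list, the Post structure, and the flag_for_human_review function are hypothetical stand-ins, not Meta’s actual system.

```python
from dataclasses import dataclass

# Hypothetical keyword list: the decision mentions terms such as
# "double vote" and "vote multiple times", but the actual list,
# matching logic, and locale handling are not public.
KEYWORDS = ["double vote", "vote multiple times", "vote early, vote often"]

@dataclass
class Post:
    post_id: str
    text: str

def flag_for_human_review(post: Post, keywords=KEYWORDS) -> bool:
    """Return True if the post text contains any watched phrase.

    Matching here is case-insensitive and whitespace-normalized. A
    flagged post is queued for human review, not removed automatically,
    mirroring the two-step process described above.
    """
    normalized = " ".join(post.text.lower().split())
    return any(kw in normalized for kw in keywords)

# Example: the first post's caption would be flagged and queued.
caption = Post("example-1", "vote early, vote often, and vote NO")
if flag_for_human_review(caption):
    print(f"Post {caption.post_id} queued for human review")
```

As the Board noted in its enforcement discussion, simple keyword matching of this kind is inherently limited, which is why flagged content is routed to human reviewers rather than actioned automatically.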
In their statements to the Board, both users asserted they were merely “sharing information posted by the AEC.” [p. 8] The second user further claimed their post served as a warning about potential referendum fraud, arguing that the system allowed double voting since polling stations did not require ID verification.
Meta, on the other hand, explained to the Board that both posts violated the Coordinating Harm and Promoting Crime policy prohibition on “advocating, providing instructions for, or demonstrating explicit intent to illegally participate in a voting or census process.” [p. 8] Regarding the first post, the company explained that the phrase “vote often” usually refers to illegal double voting. Meta concluded that the phrase was not humorous or satirical but instead promoted the user’s political stance by urging people to vote “no.” As for the second post, the company considered the phrase “smash the voting centres” to be violating, as it advocated illegal voting through double voting. Meta further clarified that if the second post were interpreted literally, it would also violate the Violence and Incitement policy, since it would constitute a threat of high-severity violence.
The Board selected this case to assess Meta’s content moderation on voting misinformation and voter fraud due to the high number of elections in 2024.
(1) Compliance with Meta’s content policies
I. Content Rules
The OSB concluded that both posts violated the Coordinating Harm and Promoting Crime policy, as they advocated for illegal voting. It stated that the phrase “vote often” in the first post, coupled with the screenshot of the AEC’s post on X, was a clear call to engage in the practice. Regarding the second post, the Board highlighted that while the user may have been expressing frustration with the AEC for allegedly allowing multiple voting, the caption and text overlay were more reasonably interpreted as a call for multiple voting. This interpretation was further supported by the fact that voting in Australia is mandatory and by the user’s request that people vote “No.”
The OSB recognized that the posts could have been deemed satirical; however, their satirical intent was not explicit. The Board concluded that the language of the captions and text overlay did not indicate that the posts were implicitly satirical, but rather called for illegal multiple voting. It emphasized that Meta’s humor or satire exception should only be applied in electoral contexts when content is explicitly humorous, to mitigate the risks associated with voter fraud. Therefore, the OSB held that neither post qualified for the satire exception.
Additionally, the Board considered that the posts did not qualify for the awareness-raising exception, as they went beyond calling attention to possible voter fraud to actively encourage others to engage in multiple voting. It underlined that the posts did not include the additional information shared by the AEC in the same thread on X (that double voting is an offense in Australia), thus decontextualizing the AEC’s posts to imply that multiple voting was permissible.
The OSB further responded to Meta’s remark regarding the literal interpretation of “smash,” noting that it was not applicable in this case due to a lack of contextual signals prompting such an interpretation. Hence, it considered that the second post did not violate Meta’s Violence and Incitement policy.
The Board also analyzed whether the posts violated Meta’s Misinformation policy, given that they decontextualized the AEC’s statement. However, it concluded that the Coordinating Harm and Promoting Crime policy was the applicable policy, as both users had encouraged voter fraud.
II. Enforcement Action
The OSB recognized Meta’s use of localized keyword-detection systems to enforce voting integrity policies during Australia’s Voice Referendum as effective, but recommended expanding this approach globally during elections. It also recommended that this system address voting interference through not just the Coordinating Harm and Promoting Crime and Misinformation policies, but also the Violence and Incitement policy. While acknowledging the system worked in this case, the Board highlighted the limitations of keyword-based detection and urged Meta to develop measurable success metrics for election integrity tools, building on its prior recommendation from the Brazilian General’s Speech case.
(2) Compliance with Meta’s human rights responsibilities
The OSB used the three-part test stipulated in Article 19 of the International Covenant on Civil and Political Rights (ICCPR) to analyze whether the decision to remove the posts was compatible with Meta’s human rights responsibilities. It noted that, under this framework, any restriction on expression must be prescribed by law, pursue a legitimate aim, and be necessary and proportionate. In previous cases, such as Armenians in Azerbaijan and Armenian Prisoners of War Video, the Board highlighted the UN Special Rapporteur on freedom of expression’s observation that, while the obligations of companies differ from those of States, companies should still respect their users’ right to freedom of expression given their impact on society.
I. Legality (clarity and accessibility of the rules)
The OSB emphasized that for a restriction to be prescribed by law, it must be based on clearly defined rules that are accessible to both users and reviewers, so that users can predict the consequences of posting their content. The Board said that the public-facing language of Meta’s Coordinating Harm and Promoting Crime policy was insufficiently clear, as it did not specify what constitutes illegal voting. Given the significance of elections as matters of public interest, where discussion on social media should be encouraged, the OSB emphasized the need for greater user awareness and compliance. Accordingly, it recommended that Meta incorporate its internal “illegal voting” definition into the publicly available policy.
II. Legitimate aim
On this point, the Board held that the Coordinating Harm and Promoting Crime policy serves two legitimate aims under international law: (1) protecting the right to vote and to participate in public affairs (Article 25, ICCPR), which qualifies as a legitimate aim justifying restrictions on expression under Article 19(3) of the ICCPR; and (2) preserving public order by preventing unlawful interference in democratic processes. To the OSB, the policy’s prohibition on encouraging voter fraud directly safeguards these fundamental rights while maintaining electoral integrity.
III. Necessity and proportionality
The Board argued that the removal of both posts complied with the necessity and proportionality requirements under Article 19 of the ICCPR. It noted that the content appeared “days before an upcoming referendum that marked a significant constitutional moment in Australia, especially for Aboriginal and Torres Strait Islander peoples.” [p. 15] While acknowledging political speech as fundamental to democracy and recognizing the users’ participation in public debate, the Board determined their calls for illegal voting undermined other Australians’ rights to vote and participate in public affairs.
The OSB noted that while the calls to “vote No” constituted protected political speech, the phrases “vote often” and “smash the voting centres” did not, as they encouraged illegal multiple voting. Consultations with experts revealed frequent claims that the Voice Referendum was rigged, as well as allegations of voter fraud. This, the Board held, reaffirmed Meta’s obligation to protect democratic processes by prohibiting voter fraud-related content, whose circulation could compromise electoral integrity by fostering an environment of distrust.
Finally, regarding Meta’s enforcement approach, the OSB deemed it reasonable to require clear user intent when applying exceptions (e.g., satire, awareness-raising). Finding no evidence that the phrases were rhetorical rather than literal advocacy for multiple voting, the Board concluded the removals were necessary and proportionate.
Ultimately, the Board upheld Meta’s decision to take down the two posts.
Policy advisory statement:
The Board recommended that Meta incorporate its definition of “illegal voting” into the public-facing language of the policy to ensure users clearly understand what content is prohibited under the “Voter and/or census fraud” section of the Coordinating Harm and Promoting Crime Community Standard.
Dissenting or Concurring Opinions:
A minority of the Board deemed the removal of the second post unnecessary and disproportionate, as Meta failed to demonstrate a direct and immediate link between the expression and an actual threat. They viewed the phrase as ambiguous, arguing its connection to voter fraud was neither direct nor imminent. Additionally, they believed content removal was not the least intrusive measure available to address alleged voter fraud. Some members emphasized that Meta bore the burden of proving necessity and proportionality but failed to do so. They stressed that companies must assess whether restrictions represent the least intrusive option and justify their actions under human rights standards. Specifically, Meta should have either explained why the removal was minimally intrusive or acknowledged its departure from UN standards and provided public justification. In the latter case, the minority noted the Board would then evaluate whether Meta’s justification risked distorting existing human rights norms.
Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.
This decision has a mixed outcome. Although the Board upheld Meta’s decision to remove the posts, thereby restricting expression, it did so to protect democratic processes and voting rights. Given the high number of elections taking place throughout 2024, the Board carefully balanced freedom of expression against the need to safeguard voting rights and protect electoral processes from social media-driven interference.
Case significance refers to how influential the case is and how its significance changes over time.
According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”