Facebook Community Standards, Violence And Criminal Behavior, Dangerous Individuals and Organizations
Oversight Board case of Libya floods
United States
Closed
Expands Expression
On February 27, 2024, the Oversight Board overturned, through a summary decision, Meta’s original decision to remove a video from Facebook expressing support for the victims of the floods in Libya caused by Storm Daniel and the collapse of two dams, a post that had been wrongfully flagged under the Dangerous Organizations and Individuals (DOI) policy. The user had posted a video featuring several images of army personnel during rescue operations following Storm Daniel in September 2023. Meta removed the post, arguing that it breached its DOI policy. The user appealed the decision, stressing that the post emphasized the unity of the Libyan people during crises despite all conflicts. After being notified of the appeal by the Board, the company reversed its original decision and restored the post. The Board noted that this case underlined the over-enforcement of the Dangerous Organizations and Individuals policy, which hindered users’ ability to express solidarity.
*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. Decisions, except summary decisions, are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.
In September 2023, a user posted a video on Facebook featuring two juxtaposed images. The background image depicted two individuals in military uniforms with badges, one of which read “Brigade 444 – Combat” in Arabic. The overlaid image showed two people pulling a third person out of the water. “The people on the sides had the Arabic words for ‘west’ and ‘south’ on their chests, while the person in the middle had the word ‘east.’” [p. 1]
In August 2023, armed clashes had broken out between the 444th Combat Brigade and the Special Deterrence Force, two militias that have been fighting for power since the 2011 overthrow of Muammar Gaddafi.
Meta originally removed the post on the grounds that it violated its Dangerous Organizations and Individuals (DOI) policy.
After the Board notified Meta of the user’s appeal, Meta reversed its original decision and restored the content, having found that it did not violate the DOI policy. In this context, the Board issued a summary decision. Summary decisions examine cases in which Meta has reversed its original decision. Although they do not have precedential value, they aim to provide transparency regarding Meta’s corrections and to call attention to areas in which the company could improve.
The main issue before the Oversight Board was whether the removal of a video—expressing solidarity with victims of the Libya floods—was consistent with Meta’s content policies.
In their submission to the Board, the user clarified that the video showed that Libya was “one people” with “one army” supporting the city of Derna after the floods caused by Storm Daniel and the dam collapses in September 2023.
For its part, Meta reversed its decision after the Board notified the company of the appeal. Meta held that the post did not violate its policies, as it did not refer to any designated organization or individual.
The Board emphasized that this case exemplified the over-enforcement of the DOI policy, especially through automated systems, which could hinder users’ ability to comment on current events on Meta’s platforms. The Board recalled its recommendation in the Öcalan’s isolation case, urging Meta to evaluate the automated enforcement of the DOI policy, which Meta declined to implement, arguing that the policy guidance in that case did not contribute to automated enforcement.
The Board reiterated two recommendations from the Breast cancer symptoms and nudity decision. The first was for Meta “to implement an internal audit procedure to continually analyze a statistically representative sample of automated removal decisions to reverse and learn from enforcement mistakes.” [p. 3]. The second recommendation was for Meta to expand “transparency reporting to disclose data on the number of automated removal decisions per Community Standard, and the proportion of those decisions subsequently reversed following human review.” [p. 3]. Meta reported progress regarding the implementation of a consistent accounting methodology for these metrics.
The Board also recalled its recommendation in the Punjabi concern over the RSS in India case, which urged Meta to improve transparency reporting on enforcement error rates by making the data viewable by country and language for each policy.
The Board concluded that “full implementation of its recommendations will help to decrease enforcement errors under the Dangerous Organizations and Individuals policy, reducing the number of users whose freedom of expression is infringed by wrongful removals.” [p. 4]. Thus, the Board overturned Meta’s original decision to remove the content and acknowledged its correction of the initial error.
Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.
The decision to overturn Meta’s initial removal of the Libya floods solidarity video significantly expands expression by recognizing the importance of context in content moderation. It underscores the Oversight Board’s emphasis on the need for Meta to fine-tune its automated moderation systems and policies to better distinguish content that genuinely violates its guidelines from content that expresses solidarity or dissent. This case illustrates the critical balance between enforcing policies and preserving the digital space as a platform for free expression, particularly in times of crisis.
Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.
The Board recalled recommendation no. 2 from the Öcalan’s isolation decision to urge Meta to evaluate its automated enforcement systems for content moderation.
The Board cited several recommendations from the Breast cancer symptoms and nudity case, urging Meta to expand its transparency reporting on removal decisions and to analyze that data to learn from enforcement mistakes.
The Board cited the Punjabi concern over the RSS in India case to urge Meta to improve transparency reporting on enforcement error rates.