Facebook Community Standards, Objectionable Content, Adult Nudity and Sexual Activity, Referral to Facebook Community Standards
Oversight Board Case of Educational Posts About Ovulation
International
Closed, Expands Expression
The Oversight Board issued a summary decision overturning Meta’s original decision to remove two posts (on Facebook and Instagram) aimed at educating users about ovulation and cervical mucus, both of which fell within the health-related exception to the Adult Nudity and Sexual Activity policy. Initially taken down by Meta’s automated systems, the posts were later reinstated after the Board selected the cases for review. The company acknowledged the removals were made in error. The Board stressed that such enforcement mistakes hinder access to essential, often hard-to-find information about women’s reproductive health, particularly in stigmatized areas like fertility and menstrual education. It highlighted the importance of ensuring content moderation systems do not disproportionately suppress medical or educational content related to women’s bodies and urged Meta to strengthen its detection tools and review processes to avoid similar errors in the future.
*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. The Board issues full decisions and summary decisions. Decisions, except summary decisions, are binding unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies. Summary decisions are a transparency mechanism, providing information to the public on Meta’s decision making and the Board’s recommendations relating to cases where Meta reversed its original decision on its own accord, after receiving notice from the Board about the appeal.
On March 15, 2023, a Facebook user in the United States replied to a post in a women’s health group with a comment featuring an image explaining various types of cervical mucus and their relation to fertility. The image included descriptive labels and was intended as a response to a question about Polycystic Ovary Syndrome (PCOS), fertility, and vaginal discharge. The post received no views or shares and was flagged once by Meta’s automated systems. The group where the post appeared is focused on supporting women in Pakistan dealing with reproductive health conditions like endometriosis, adenomyosis, and PCOS.
On March 7, 2023, an Instagram user shared a video showing vaginal discharge on a person’s fingers, accompanied by a Spanish-language caption titled “Ovulation – How to Recognize It?” The caption offered detailed information about changes in cervical mucus during ovulation and other signs of fertility, such as increased libido and changes in body temperature and sleep patterns. The account is dedicated to menstrual and vaginal health education. The video had over 25,000 views, no shares, and was also flagged once by Meta’s automated detection tools.
Initially, Meta removed both posts under its Adult Nudity and Sexual Activity policy, which restricts sexual imagery unless it serves a clear health- or medical-related purpose. The content was restored to Facebook and Instagram after the Oversight Board (OSB) brought the cases to Meta’s attention, at which point the company confirmed that the removals had been made in error.
On November 16, 2023, the Oversight Board issued a summary decision on the matter. The main issue it analyzed was whether Meta’s original removal of two posts with information about vaginal and reproductive health complied with the company’s Adult Nudity and Sexual Activity policy, which allows exceptions for content about medical or health-related issues.
In the first case, the user argued that information on the appearance and texture of cervical mucus is vital for women to track ovulation and fertility. They also emphasized that such content is often difficult to access for free, and that Meta’s removal contributed to the broader issue of limited access to accurate, non-stigmatizing health information for women.
Although Meta ultimately acknowledged its mistake and reinstated the content, the Board found that these cases highlighted persistent weaknesses in how Meta enforces the health-related exceptions to its nudity policy. On this point, the Board recalled earlier recommendations on similar issues. In the Breast Cancer Symptoms and Nudity case, the Board called on Meta to improve its automated detection of educational images with text overlays. In the Two Buttons Meme case, it recommended improving the appeals process, particularly for posts involving policy exceptions, so that such posts are prioritized for human review.
Meta has since implemented the first recommendation by deploying a new image-based classifier to better recognize contextual cues in health-related content. Over a 30-day period in 2023, the classifier routed 3,500 posts to human review instead of removing them automatically. However, Meta is still assessing the feasibility of improving how appeals involving policy exceptions are handled. The OSB considered that fully implementing both recommendations would reduce enforcement errors and better protect educational content on women’s reproductive health from unjustified removals.
In light of this, the Board overturned Meta’s original removal decisions and welcomed the company’s correction of its error following the OSB’s intervention.
Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.
The Oversight Board’s decision in this case expands freedom of expression by reaffirming that women have a right to access and share accurate, health-related information online, particularly about their reproductive health. By overturning Meta’s removal of two educational posts on vaginal discharge and ovulation, the Board emphasized the importance of protecting medical and health content under the company’s Adult Nudity and Sexual Activity policy. This decision reinforces the idea that content moderation must not disproportionately restrict expression related to women’s health, especially in areas that are already stigmatized and underserved. Ensuring that women and girls can access this information freely and without undue censorship is essential to both their right to health and their right to speak openly about their bodies.
Case significance refers to how influential the case is and how its significance changes over time.
According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”