Facebook Community Standards, Objectionable Content, Sexual Solicitation, Adult Nudity and Sexual Activity, Instagram Community Guidelines, Referral to Facebook Community Standards
Oversight Board Case of Gender Identity and Nudity
United States
Closed
Expands Expression
The Oversight Board issued a summary decision overturning Meta’s removals of 15 posts shared on Instagram and Facebook during Breast Cancer Awareness Month that featured medical and health-related content, such as illustrations of symptoms, mastectomy scars, and nipple tattoos. Meta initially removed the 15 posts, and each of the 15 users appealed the removal. When the Oversight Board informed Meta of the appeals, the company reconsidered its initial decisions and reinstated all the posts. While the Board welcomed Meta’s improvements in detecting breast cancer-related content, it found that enforcement errors continued to hinder users’ ability to raise awareness and access vital health information. The Board emphasized that such content is protected under Meta’s Adult Nudity and Sexual Activity Community Standard when shared for medical, educational, or awareness purposes. Acknowledging Meta’s correction of its initial errors, the Board reiterated the need for further improvements in enforcement accuracy and encouraged the implementation of prior recommendations to strengthen Meta’s ability to identify and reverse mistakes.
*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. The Board issues full decisions and summary decisions. Decisions, except summary decisions, are binding unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies. Summary decisions are a transparency mechanism, providing information to the public on Meta’s decision making and the Board’s recommendations relating to cases where Meta reversed its original decision on its own accord, after receiving notice from the Board about the appeal.
In October and November 2024, 15 users from 11 different countries—including Belgium, Canada, France, Italy, Japan, the UK, and the US—shared posts on Facebook and Instagram to raise awareness during Breast Cancer Awareness Month, also known as “Pink October.” These posts featured educational and personal content related to breast cancer symptoms, prevention, treatment, and recovery.
The posts fell into three categories: (1) six posts used illustrated depictions of visible female nipples to show early signs of breast cancer and encourage self-exams; (2) five posts showed mastectomy scars or nipple tattoos, often accompanied by messages of empowerment or recovery; and (3) four posts with real photographs showed visible female nipples, shared as part of personal testimonies or awareness campaigns, including relevant hashtags such as #PinkOctober and #Mastectomy.
Meta removed all 15 posts under its Adult Nudity and Sexual Activity Community Standard, which restricts the display of uncovered female nipples unless explicitly allowed under exceptions for “mastectomy,” “medical,” or “health” contexts. The users who appealed the company’s decision to the Oversight Board (OSB) argued that their content aimed to educate and destigmatize breast cancer, promote early detection, and support survivors’ healing journeys.
After the Board selected these cases for review, Meta reassessed its enforcement actions and acknowledged that all 15 removals were in error. It found that the content fell within the policy exceptions: cartoon imagery illustrating medical symptoms, mastectomy-related content, and educational or testimonial content about health. Consequently, Meta restored all the posts to its platforms.
The main issue before the Board was whether Meta’s removal of breast cancer awareness content from Facebook and Instagram complied with the company’s content policies and aligned with its human rights responsibilities. The OSB issued a summary decision to highlight how continuing errors in Meta’s enforcement of the exceptions to the Adult Nudity and Sexual Activity Community Standard hindered users’ ability to raise awareness about breast cancer on its platforms.
Referring to its previous decision in the Breast Cancer Symptoms and Nudity case, the Board held that removing content that raises awareness about early symptoms and prevention of breast cancer impacts not only users’ freedom of expression but also their right to health, since access to health-related information is an important part of that right, as the Board had previously noted in other summary decisions, such as Education Posts About Ovulation, Breast Self-Exam and Testicular Cancer Self-Check Infographics. Regarding breast cancer awareness content, the OSB had previously recommended that Meta reduce enforcement errors concerning the exceptions to its Adult Nudity and Sexual Activity Community Standard, for example by “implement[ing] an internal audit procedure to continuously analyze a statistically representative sample of automated content removal decisions to reverse and learn from enforcement mistakes” (recommendation no. 5).
The Board also recommended that Meta “improve the automated detection of images with text-overlay to ensure that posts raising awareness of breast cancer symptoms are not wrongly flagged for review” (recommendation no. 1). Meta stated that it had upgraded its text-overlay system and introduced a new health content classifier designed to better detect breast cancer-related material. These tools, in use since July 2021, have reportedly led to tangible changes. For example, between March 21 and April 18, 2023, the improvements resulted in around 1,000 additional posts being flagged for human review instead of being automatically removed. However, the Board noted that, even with these advances, large-scale content moderation can still produce enforcement mistakes, and it urged Meta to continue refining the accuracy of its policy enforcement.
Referring to the Breast Cancer Symptoms and Nudity case, the OSB had also recommended that Meta allow users to request human review when automated systems remove content under the Adult Nudity and Sexual Activity Community Standard. Meta declined to implement this recommendation, stating that most appeals are already reviewed by human moderators, except when capacity constraints do not permit it. The OSB underscored that in cases such as this one, where 15 posts were incorrectly removed, implementing this recommendation could significantly improve enforcement accuracy.
The Board urged Meta to keep enhancing its capacity to correctly identify content covered by the exceptions to the Adult Nudity and Sexual Activity policy. Although the company’s implementation of recommendation no. 1 has reportedly helped reduce enforcement mistakes, the OSB considered that adopting recommendation no. 5 was also necessary, given the number of errors identified. Applying these measures would further reinforce Meta’s ability to correct wrongful removals.
The Board concluded that Meta should not have removed the 15 posts raising awareness about breast cancer.
Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.
The Oversight Board welcomed Meta’s decision to reinstate the 15 posts, stressing the importance of ensuring that breast cancer awareness content remains accessible on its platforms. Such content plays a crucial role in enabling users to share vital health information, exercise their right to freedom of expression, and access knowledge that can support public health. By affirming that this type of content falls within policy exceptions and should stay online, the decision strengthens protections for important health-related speech.
Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.
Case significance refers to how influential the case is and how its significance changes over time.
According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”