Global Freedom of Expression

Oversight Board Case of Gender Identity and Nudity

Closed · Expands Expression

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    January 17, 2023
  • Outcome
    Oversight Board Decision, Overturned Meta’s initial decision
  • Case Number
    2022-009-IG-UA, 2022-010-IG-UA
  • Region & Country
    United States, North America
  • Judicial Body
    Oversight Board
  • Type of Law
    International Human Rights Law, Meta's content policies
  • Themes
    Facebook Community Standards, Objectionable Content, Sexual Solicitation, Adult Nudity and Sexual Activity, Instagram Community Guidelines, Referral to Facebook Community Standards
  • Tags
    LGBTI, Oversight Board Content Policy Recommendation, Oversight Board Enforcement Recommendation, Oversight Board on Meta Interstitials, Oversight Board Policy Advisory Statement, Oversight Board Transparency Recommendation


Case Analysis

Case Summary and Outcome

On January 17, 2023, the Oversight Board (OSB) overturned Meta’s original decision to remove two Instagram posts depicting bare-chested transgender and non-binary people. Both images were posted by a US-based couple who identify as transgender and non-binary. The first image, posted in 2021, depicted the couple bare-chested with flesh-colored tape covering their nipples. The second image, posted in 2022, showed one person clothed while the other was bare-chested and covering their nipples with their hands. The captions explained that the bare-chested person would undergo gender-affirming surgery to obtain a flatter chest. The posts were removed under Meta’s Sexual Solicitation and Adult Nudity policies after they were flagged by the company’s automated systems and reported by other users. However, Meta later recognized the removals were enforcement errors and restored the content after being notified of the case by the OSB. The Board found that Meta’s policies and their enforcement criteria were unclear, which, the OSB argued, led to enforcement errors that disproportionately impacted women and the LGBTQI+ community. The Board recommended that Meta modify its policies and internal guidance to include clearer definitions of what constitutes an “offer or ask” for sex and a “sexually suggestive pose,” among other measures.

*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.


Facts

In 2021, an image of two bare-chested people, with flesh-colored tape covering their nipples, was posted on Instagram. In 2022, another photo of the couple was posted on the same account. In it, one person was clothed while the other one was bare-chested—covering their nipples with their hand. The two people in the images are a US-based couple who identify as transgender and non-binary.

The pictures were accompanied by captions about how the person who was bare-chested in both photos would undergo gender-affirming surgery to obtain a flatter chest. The captions explained the couple’s plans to document the surgery as a way of discussing transgender healthcare issues. The couple announced they were holding a fundraiser to pay for the surgery after they were unable to secure insurance coverage.

Initially, the first image was classified by Meta’s automated system as unlikely to be violating. After three users reported the content for pornography and self-harm, human moderators reviewed it and, again, found it non-violating. However, after a fourth report, another human moderator removed the post, arguing that it violated the Sexual Solicitation Community Standard.

Similarly, the second image was considered twice by Meta’s automated systems and human moderators to be non-violating. Subsequently, two users reported the content, but the reports were closed automatically without human review. Meta’s automated systems identified the content for a third time (the company’s “automated Adult Nudity and Sexual Activity classifier flagged the content” [p. 5]) and sent it for human review. It was then that the content was removed under the Sexual Solicitation Community Standard.

Although the account owners appealed the removal decisions to Meta—and the content underwent further human review—the posts were not restored. This led the users to appeal the decisions before the Oversight Board (OSB). The Board decided to consider the two cases together in order to identify, and address, similar issues in Meta’s content policies.

After the Board asked the company “to provide a justification for its decision to remove the content, Meta identified the removals as ‘enforcement errors’ and restored the posts.” [p. 6]


Decision Overview

On January 17, 2023, the Oversight Board issued a decision on the matter. The OSB analyzed whether Meta’s original decisions to take down two posts depicting bare-chested transgender and non-binary people—posts that had initially been classified as non-violating—complied with Meta’s Sexual Solicitation and Adult Nudity and Sexual Activity policies, its values, and its human rights obligations.

The users submitted that they believed the company’s decisions stemmed from transphobia. They urged the Board to overturn Meta’s decisions as that would make Instagram a safer place for LGBTQI+ expression.

Meta recognized that both decisions were the result of “enforcement errors,” as the posts did not violate the Sexual Solicitation Community Standard since they only asked for donations to a fundraiser. Meta further recognized that the posts did not violate its Adult Nudity and Sexual Activity Community Standard considering “[t]he content in these cases was shared in an ‘explicitly non-binary or transgender context as evidenced by the overall topic of the content (undergoing top surgery) and the hashtags used.’” [p. 10]

As explained by the Board, Meta couldn’t provide a rationale behind why its reviewers thought the content violated the Sexual Solicitation policy. The company acknowledged that an overly technical application of the internal reviewer guidance might have led to the errors.

Compliance with Meta’s content policies

1. Sexual Solicitation Community Standard

For the OSB, implicit sexual solicitation, under Meta’s Sexual Solicitation Community Standard, requires two elements: “Content that contains an implicit offer or ask AND Sexually suggestive elements.” [p. 11] According to Meta’s Known Questions, which provide guidance to the company’s reviewers, “the list of contact information that triggers removal as an implicit offer includes social media profile links and ‘links to subscription-based websites (for example, OnlyFans.com or Patreon.com).’” [p. 11] In the case at hand, the Board noted that the users only provided a link to the fundraiser for their surgery. On this point, the OSB held that, in light of Meta’s broad internal criteria, the inclusion of this link could technically constitute an implicit offer or ask despite not violating the public-facing policy, which requires that the offer be sexual in nature.

Regarding the sexually suggestive element, the Board noted that the “Community Standard provides a list of sexually suggestive elements which includes poses. The Known Questions provide a list, described by Meta as exhaustive, of what are characterized as sexually suggestive poses, including nude ‘female breasts covered either digitally or by human body parts or objects.’” [p. 11]

Referring to the case at hand, the Board held that both contested images featured breasts covered by objects and human body parts. Nonetheless, the OSB noted that the depicted breasts belonged to persons who do not identify as women; rather, they identify as transgender and non-binary. The Board also held that the content was not sexually suggestive. Hence, the second element of the Sexual Solicitation policy was not met.

Taking into consideration the arguments outlined above, the OSB concluded that the removed posts did not violate Meta’s Sexual Solicitation Community Standard.

2. Adult Nudity and Sexual Activity Community Standard

The Adult Nudity and Sexual Activity Community Standard, the Board explained, “states that users should not post images of ‘uncovered female nipples except in the context of breastfeeding, birth giving and after-birth moments, medical or health context (for example, post-mastectomy, breast cancer awareness or gender confirmation surgery) or an act of protest.’” [p. 12] Meta has further instructed reviewers, through its Known Questions, to allow “imagery of nipples when shared in an explicitly female-to-male transgender, non-binary or gender-neutral context (e.g. a user indicates such gender identity), regardless of size or shape of breast.” [p. 12] As both contested images featured covered nipples, whether by objects or body parts, the Board opined that the images did not breach Meta’s Adult Nudity and Sexual Activity policy. Moreover, it argued that even if the nipples were uncovered, the posts still wouldn’t have violated the policy because “the images were shared with accompanying text that made clear the individuals identify as non-binary.” [p. 12]

Compliance with Meta’s Values

The Board considered that Meta’s decisions to remove the posts were inconsistent with the values of “Voice” and “Dignity,” as they were enforcement errors that disproportionately impacted discriminated-against groups. The decisions also “did not serve the value of ‘Safety’” [p. 12]: removing the content did not advance this value since the posts were unrelated to non-consensual image sharing, sex trafficking, and child abuse.

Compliance with Meta’s human rights responsibilities

1. Freedom of expression

The Board referred to Articles 2 and 19 of the International Covenant on Civil and Political Rights (ICCPR) to reiterate that the right to freedom of expression “is guaranteed to all people without discrimination.” [p. 13] Following the Human Rights Committee’s decision in Nepomnyashchiy v. Russia, the OSB held that discrimination on the grounds of gender identity is prohibited.

Subsequently, the Board stressed the importance of Instagram as a forum for the LGBTQI+ community to discuss their struggles and find support. Additionally, it highlighted that the contested content was crucial to the user as it brought to the fore their journey to undergo top surgery and their fundraising endeavors.

The Board employed the three-part test, outlined in Article 19(3) of the ICCPR, to analyze whether Meta’s original decision to remove the posts was compatible with its human rights obligations. According to the test, restrictions on freedom of expression “must meet the requirements of legality, legitimate aim, and necessity and proportionality.” [p. 13]

a. Legality (clarity and accessibility of the rules)

For the reasons explained below, the OSB considered that both the Sexual Solicitation and the Adult Nudity and Sexual Activity Community Standards were neither clear nor accessible to reviewers and users, which made them incompatible with Article 19 of the ICCPR.

i. Sexual Solicitation Community Standard

The Board reiterated that Meta’s Sexual Solicitation Community Standard “contained overbroad criteria in the internal guidelines provided to reviewers,” [p. 14] which led to over-enforcement.

Regarding the first element of the aforementioned policy—the “offer or ask” component—the Board noted that the “public-facing rules refer to a ‘method of contact’ for the soliciting party. However, the guidance for moderators, the Known Questions, state[d] that a ‘method of contact’ for an implicit ‘offer or ask’ includes social media profile links or links to third party subscription-based websites such as Patreon.” [p. 14] To the OSB, this was confusing because it was not clear for users whether links to other social media pages or payment platforms could be regarded as solicitation too. Public comments received by the Board exemplified this confusion as the people who submitted them highlighted their uncertainty about why “content including such third-party links was removed or led to their accounts being banned.” [p. 14]

Upon analyzing the second criterion—“the sexually suggestive element”—the Board said it was imprecise, excessive, and “inconsistent with Meta’s Adult Nudity and Sexual Activity policy.” [p. 14] As the OSB mentioned, the public-facing policy includes “‘sexually suggestive poses’ as a sexually suggestive element. The Known Questions then provide a detailed list of ‘sexually suggestive poses’ which includes being topless and covering breasts with hands or objects.” [p. 14] To the Board, this made it difficult for users to predict whether an image with covered breasts could be regarded as sexually suggestive. The confusion was furthered by the Adult Nudity policy, which allows topless photos where nipples are covered. The fact that many reviewers reached different decisions when assessing the contested content, the OSB said, suggested that the criterion was unclear not only to users but to reviewers as well.

ii. Adult Nudity and Sexual Activity Community Standard

The Board noted that the Adult Nudity and Sexual Activity Standard “is premised on sex and gender distinctions that are difficult to implement and contain exceptions that are poorly defined.” [p. 15] In its view, Meta’s rules focused on a binary division of the body (“male and female genitalia,” “female breasts” and “female nipples”). Considering this, the Board also held that Meta failed to explain how “the company handles content depicting intersex, trans or non-binary people.” [p. 15] Vagueness on this aspect, the OSB said, created uncertainty over how the rules are applied to people who do not fit within these binary parameters.

Furthermore, the Board considered that the company’s rules failed to recognize that perceptions of sex and gender are highly subjective and prone to many errors. Confusion in this regard was exacerbated by Meta’s “‘default to female principle’ whereby more restrictive policies applicable to female (as opposed to male) nudity are applied in situations of doubt.” [p. 15]

To the OSB, both the restrictions and the exceptions to rules on nipples perceived as female were too broad and confusing. The exceptions were rarely defined, if defined at all. The list of exceptions was also constantly growing and expected to grow further as expression evolves. Considering this, the Board held, in line with many public comments, that the exceptions created confusion—for example, “over whether permitted content under the exception could include pre-surgery photos (to create a before-and-after image) and images of trans women who have received breast augmentations. The internal guidelines and Known Questions make clear that this exception is narrower than the public guidance may be construed to imply.” [p. 16]

The Board argued that since Meta’s policies are premised upon binary distinctions of gender, challenges arise when trying to articulate an exception regarding gender confirmation surgery. Meta explained to the Board that the exception allowed uncovered “female nipples,” before top surgery, in an explicit “female to male” transgender, non-binary, or gender-neutral context. However, transgender women who have undergone top surgery were prohibited from sharing their uncovered nipples unless scarring over the nipple was present.

To the Board, the internal guidelines on surgical scarring and nipples were confusingly complicated. For example, the rules for mastectomies allow content where nipples are reconstructed, stenciled, or tattooed. Photos where at least one surgically removed breast is visible are also allowed, even if the other female nipple is visible too. Another confusing rule said that “[f]or mastectomies, scarring includes depiction of the area where the removed breast tissue used to be. The actual surgical scar does not need to be visible.” [p. 16]

Considering this, the Board acknowledged that reviewers likely struggled “to apply rules that require that they rapidly assess sex-specific characteristics of the depicted person to decide whether to apply female nipple rules, and then the gender of the person to determine if some exceptions apply, and then consider whether the content depicts the precursor or aftermath of a surgical procedure, which surgical procedure, and the extent and nature of the visible scarring, to determine whether other exceptions may apply.” [p. 16] As the OSB noted, the same image of female-presenting nipples could produce opposite content moderation decisions if posted by a cisgender woman (prohibited) or a non-binary person (permitted). For the Board, this complex set of rules and exceptions created uncertainty for users and left too much room for misapplication: “The lack of clarity for users and moderators inherent in this policy [made] the standard unworkable.” [p. 17]

b. Legitimate aim

As the OSB noted, for a restriction on freedom of expression to be valid, it must pursue one of the legitimate aims outlined by the ICCPR in Article 19, which include the protection of the rights of others and the protection of public morals.

i. Sexual Solicitation Community Standard

The Board recognized that the Sexual Solicitation Community Standard sought to protect the rights of others as it prohibits using Meta’s platforms to facilitate trafficking, coercion, and non-consensual sexual acts that could occur off-platform.

ii. Adult Nudity and Sexual Activity Community Standard

The OSB considered that the Adult Nudity and Sexual Activity policy aligns, to an extent, with the legitimate aim of protecting the rights of others as it seeks to protect minors and women from sexual exploitation and to prevent the spread of non-consensual content. Nevertheless, the Board questioned the policy’s aim—as explained by Meta—of protecting “community sensitivity.” The OSB held that this purpose could align, in principle, with the protection of “public morals.” That being said, it considered, following the Human Rights Committee’s General Comment 34, that this aim “has sometimes been improperly invoked by governmental speech regulators to violate human rights, particularly those of members of minority and vulnerable groups.” [p. 17-18]

Against this backdrop, the Board expressed concern about how Meta’s policies disproportionately impacted the freedom of expression of women and of transgender and non-binary people. It also expressed concern about how Meta’s nudity policy automatically sexualized the bodies of women and of trans and non-binary people, while cisgender men were treated differently.

Referring to the United Nations Special Rapporteur on violence against women (Report A/HRC/38/47), the OSB held that Meta could legitimately factor in the gendered impact that certain harms have when designing policies—as women are more likely to be victimized by the non-consensual digital dissemination of intimate images. Thus, it recommended the company “to limit gendered harms, both in the over-enforcement and under-enforcement of nudity prohibitions.” [p. 19]

c. Necessity and proportionality

The Board, guided by General Comment 34, clarified that “the principle of necessity and proportionality provides that any restrictions on freedom of expression ‘must be appropriate to achieve their protective function; they must be the least intrusive instrument amongst those which might achieve their protective function; [and] they must be proportionate to the interest to be protected.’” [p. 19] Considering this, the OSB found that Meta’s aforementioned policies restricted more content than necessary, in a manner that was not proportionate.

i. Sexual Solicitation Community Standard

According to the Board, the definitions of “implicit offer or ask” and “sexually suggestive poses,” as outlined in the policy, were overbroad “and bound to capture a significant amount of content unrelated to sexual solicitation.” [p. 18] Citing a UNESCO report—and referring to specific examples provided in public comments—the OSB also explained that the over-enforcement of policies against sharing explicit images could negatively impact digital sexual education.

ii. Adult Nudity and Sexual Activity Community Standard

The Board considered that the Adult Nudity and Sexual Activity policy was also disproportionate. It highlighted that Meta could have achieved the policy’s aims through less restrictive means than removal—such as warning screens and age-gating content to people over the age of 18. Such measures have been applied before under the same policy in cases of artistic depictions of sexual activities. The OSB emphasized that Meta “may also wish to engage automated and human moderators to make more refined, context-specific determinations of when nude context is actually sexual, regardless of the gender of the body it depicts.” [p. 20] Additionally, it advised Meta to employ a wider range of policy interventions to limit the visibility of explicit content to users who do not want to view it.

The Board drew attention to Meta’s Community Standards Enforcement Report for Instagram, which disclosed that 21% of the content removed under this policy was appealed and restored. This, the Board commented, highlighted the policy’s over-enforcement.

2. Nondiscrimination

Referring to the right to non-discrimination, as enshrined in Article 2 of the ICCPR, the OSB analyzed how the Adult Nudity and Sexual Activity policy, and its enforcement, disproportionately impacted women and LGBTQI+ people, as it severely limited the ways in which these groups expressed themselves. On this point, the Board underscored that international human rights bodies have not discussed how permitting or prohibiting depictions of consensual adult nudity impacts human rights. The Board highlighted Principles 18 and 20 of the UN Guiding Principles on Business and Human Rights, which emphasize that business enterprises should give special attention to human rights impacts on individuals from groups or populations at heightened risk of vulnerability or marginalization. Additionally, the Board referenced the Gender Dimensions Handbook, in which the United Nations Working Group on Business and Human Rights recommended that tech companies ensure that artificial intelligence and automation do not disproportionately harm women’s human rights.

The OSB reiterated its stance that Meta should be mindful of how its policies, and their enforcement, disproportionately impact marginalized groups. The Board referenced the Reclaiming Arabic Words case to highlight the risks that over-moderation poses to persecuted minority groups.

Meta’s practices in the case at hand particularly affected the expression of women and of trans and non-binary people, the Board said. The fact that the contested posts were identified multiple times by adult nudity and sexual activity automated systems—despite falling outside the scope of the policies—underscored, to the OSB, the disproportionate negative impact of the company’s rules and enforcement practices on the LGBTQI+ community.

Meta’s policy enforcement likewise impacted women disproportionately, considering “that up to 22% of images of women’s bodies that were removed from Instagram were apparent false positives.” [p. 22] The “default to female principle,” the Board held, affected women even further.

The Board advised Meta to rely on contextualized analysis to assess what content is sexual in nature to avoid gender identity-based discrimination. The OSB recognized Meta’s interests in prohibiting sexual and pornographic content on its platforms but emphasized that such business interests can and should be achieved through non-discriminatory practices.

Some members of the Board suggested that Meta should implement adult nudity policies that are not based on sex or gender—referring to Meta’s commitment to the Convention on the Elimination of All Forms of Discrimination Against Women. Other members agreed, in principle, that Meta should not rely solely on gender or sex to limit expression, while noting that distinctions based on protected characteristics were acceptable as long as they were reasonable, objective, and pursued a legitimate aim under the ICCPR—as decided by the Human Rights Committee in Nepomnyashchiy v. Russia.

In light of all the arguments presented above, the OSB overturned Meta’s decisions to remove the content, as it did not violate the company’s policies. Additionally, the Board found that the removal decisions were not compatible with Meta’s values or human rights obligations.

Policy Advisory Statement

1. Content Policy

The Board recommended that Meta “define clear, objective, rights-respecting criteria to govern the entirety of its Adult Nudity and Sexual Activity policy, ensuring treatment of all people that is consistent with international human rights standards, including without discrimination on the basis of sex or gender identity.” [p. 25] Furthermore, it advised Meta to conduct a comprehensive human rights impact assessment that includes the interests of diverse stakeholders, to review the implications of adopting the new criteria.

Additionally, the OSB recommended that Meta provide a more in-depth definition of an “offer or ask,” and of what constitutes a “sexually suggestive pose,” within the Sexual Solicitation Community Standard.

2. Enforcement

The Board advised the company “to ensure that Meta’s internal criteria for its Sexual Solicitation policy do not result in the removal of more content than the public-facing policy indicates” [p. 26]—and to avoid the removal of non-sexual content by requiring a more defined connection between the two elements of the aforementioned policy.


Decision Direction

Quick Info

Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Expands Expression

In this decision, the Board expanded expression in favor of historically marginalized communities by highlighting how Meta’s policies disproportionately impacted women and the LGBTQI+ community—discriminating against them and limiting their online expression. The Board’s Policy Advisory Statement includes promising remedies to address the harms created by the design of the Community Standards, curbing their overbroad and arbitrary enforcement to foster plurality in online spaces.

Global Perspective

Quick Info

Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or regional laws

  • United Nations Guiding Principles on Business and Human Rights (2011)

    The Board referred to this instrument to highlight Meta’s human rights responsibilities.

  • ICCPR, art. 2

    The Board referenced this provision to highlight Meta’s obligations towards the right to non-discrimination.

  • ICCPR, art. 19

    The Board cited this norm to analyze Meta’s responsibilities towards human rights through the lens of freedom of expression.

  • ICCPR, art. 26

    The Board referenced this provision to highlight Meta’s obligations towards the right to non-discrimination.

  • UNHR Comm., General Comment No. 34 (CCPR/C/GC/34)

    While employing the three-part test to assess if Meta’s actions legitimately restricted freedom of expression, the Board referred to the General Comment for guidance.

  • HR Committee, Toonen v. Australia, Communication No. 488/1992, U.N. Doc CCPR/C/50/D/488/1992 (1994)

    The Board referred to this case to underscore that sexual orientation and gender identity are unacceptable grounds for discrimination.

  • UNHRC, Kirill Nepomnyashchiy v. Russian Federation, Communication No. 2318/2013 (2018)

    The Board quoted this decision to highlight that differentiations based on the grounds listed by article 26 of the ICCPR are not discriminatory as long as they are based on reasonable and objective criteria pursuing a legitimate aim.

  • OSB, Reclaiming Arabic Words, 2022-003-IG-UA (2022)

    The Board referenced this case to reiterate that the over-moderation of minority groups poses a threat to their freedom of expression.

  • OSB, Wampum Belt, 2021-012-FB-UA (2021)

    The Board mentioned this case to analyze the challenges of applying policy exemptions.

  • OSB, Breast cancer symptoms and nudity, 2020-004-IG-UA (2021)

    The Board referred to this case to reiterate the vagueness of the Adult Nudity and Sexual Activity policy and its exceptions.

Case Significance

Quick Info

Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”


