
Oversight Board Case of Breast Cancer Symptoms and Nudity


Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    January 28, 2021
  • Outcome
    Overturned Meta’s initial decision
  • Case Number
    2020-004-IG-UA
  • Region & Country
    Brazil, International
  • Judicial Body
    Oversight Board
  • Type of Law
Meta’s content policies, International Human Rights Law
  • Themes
    Instagram Community Guidelines, Referral to Facebook Community Standards, Objectionable Content, Adult Nudity and Sexual Activity
  • Tags
    Oversight Board Policy Advisory Statement, Oversight Board Enforcement Recommendation, Oversight Board Transparency Recommendation


Case Analysis

Case Summary and Outcome

On January 28, 2021, the Oversight Board overturned Facebook’s (now Meta) decision to remove an Instagram post containing images of visible, uncovered female nipples intended to raise awareness about breast cancer and its symptoms. Facebook removed the post through an automated machine-learning classifier enforcing Facebook’s Community Standards on Adult Nudity and Sexual Activity. Although Facebook restored the post after recognizing an enforcement error, the Board issued a decision on the matter. It found that Facebook’s decision did not comply with its Community Standards, since adult nudity is allowed for educational and medical purposes such as raising awareness about breast cancer. Likewise, the Oversight Board considered that Facebook’s original decision affected users’ right to receive information about health-related issues and disproportionately impacted women, raising discrimination concerns.

*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. These decisions are binding, unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.


Facts

In October 2020, a user in Brazil posted on Instagram a single picture with eight photographs showing different breast cancer symptoms “with corresponding descriptions such as ‘ripples’, ‘clusters’ and ‘wounds’ underneath” [p. 4]. While three photographs included female breasts with the nipples covered by a hand or out of shot, in five of them female nipples were uncovered and visible. The image had “a title in Portuguese indicating that it was to raise awareness of signs of breast cancer. The image was pink, in line with ‘Pink October’, an international campaign popular in Brazil for raising breast cancer awareness” [p. 4]. The user did not share any additional commentary in the post.

An automated machine-learning classifier removed the post, “enforcing Facebook’s Community Standards on adult nudity and sexual activity, which also apply on Instagram” [p. 4].

The affected user appealed the decision to Facebook (now Meta), which owns Instagram. The company had previously stated that it cannot always offer users the option to appeal decisions due to a reduction in its review capacity caused by COVID-19. Additionally, Facebook said “that not all appeals will receive human review” [p. 4].

The user appealed Facebook’s decision before the Oversight Board. After the Board took the case and assigned it to a panel, “Facebook reversed its original removal decision and restored the post in December 2020” [p. 4]. The company claimed that the initial determination to remove the post was automated; upon reviewing its decision, however, it identified that there had been an enforcement error.

Since Facebook considered the issue moot after restoring the content, the company believed that the Board “should decline to hear the case” [p. 5]. Nonetheless, the Board decided to hear the case since irreversible harm had already occurred: “Facebook’s decision to restore the content in early December 2020 did not make up for the fact that the user’s post was removed for the entire ‘pink month’ campaign in October 2020” [p. 5].


Decision Overview

The Oversight Board analyzed whether Facebook’s decision to remove a user’s post, which included images of uncovered and visible female nipples intended to raise awareness about breast cancer, complied with the platform’s Adult Nudity and Sexual Activity Community Standard. Likewise, applying a three-part test, the Board assessed whether Facebook’s removal of the content complied with international human rights standards on freedom of expression.

In their submission, the user argued that the post was “part of the national ‘Pink October’ campaign for breast cancer prevention. It shows some of the main signs of breast cancer, which the user says are essential for early detection of this disease and can save lives” [p. 8].

For its part, Facebook recognized that removing the content was a mistake. The company also explained that its Community Standards apply to Instagram and that although they generally forbid visible and uncovered female nipples, “they are allowed for ‘educational or medical purposes’, including for breast cancer awareness” [p. 8].

Compliance with Community Standards

The Oversight Board began its argumentation by analyzing whether Facebook’s decision to remove the content complied with the platform’s Adult Nudity and Sexual Activity policy. The Board noted a discrepancy between Instagram’s Community Guidelines and the aforementioned Facebook Community Standard. Since Instagram’s Community Guidelines were quoted to the user upon notification that their content violated said rules, the Board considered that “[t]he differences between these rules warrant[ed] separate analysis” [p. 9].

The Board highlighted that Instagram’s Community Guidelines do not allow nudity on the platform, with few exceptions, such as “photos of post-mastectomy scarring and women actively breastfeeding” [p. 9]. These exceptions do not expressly “allow photos of uncovered female nipples to raise breast cancer awareness” [p. 9]. However, Facebook’s Community Standard on Adult Nudity and Sexual Activity “specifies that consensual adult nudity is allowed when the user clearly indicates the content is ‘to raise awareness about a cause or for educational or medical reasons’” [p. 10]. The Board considered that “the relationship between the two sets of rules, including which takes precedence, is not explained” [p. 9].

The Board recalled that Facebook’s Community Standard on Adult Nudity and Sexual Activity restricts users from posting uncovered female nipples or sexual activity content on the platform. Nonetheless, it noted that raising awareness and educational or medical reasons are exceptions to this rule: “The ‘do not post’ section of [Facebook’s] Community Standard lists ‘breast cancer awareness’ as an example of a health-related situation where showing uncovered female nipples is permitted” [p. 10].

Thus, “[a]ccepting Facebook’s explanation that the Community Standards operate on Instagram” [p. 10], the Board considered that the removed post fell squarely within the scope of the exception, and that Facebook’s decision to remove the content was therefore inconsistent with its own Community Standards.

Compliance with Facebook’s Values

Facebook’s Community Standards establish “Voice”, “Safety” and “Privacy” as fundamental values for the platform. The goal of “Voice”, which is a paramount value, is to foster a place for expression in which people are “able to talk openly about the issues that matter to them, even if some may disagree or find them objectionable” [p. 7]. “Safety” underscores the importance of making Facebook a safe place, where expression that threatens people or “has the potential to intimidate, exclude or silence others” is not allowed [p. 7]. “Privacy” aims at the protection of personal privacy and information by letting people “choose how and when to share on Facebook” [p. 7].

Taking this into consideration, the Board considered that Facebook’s decision to remove the user’s post did not comply with the company’s values. For the Board, the “value of ‘Voice’ clearly includes discussions on health-related matters and is especially valuable for raising awareness of the symptoms of breast cancer” [p. 10]. The possibility of sharing this type of content also contributes to the “Safety” of all the people suffering from, or vulnerable to, the disease, the Board argued. Lastly, the Oversight Board opined that “Privacy” was not affected, since “there is no indication that the pictures [posted by the user] included any non-consensual imagery” [p. 10].

Compliance with International Human Rights Standards

In line with a report from the UN Special Rapporteur on freedom of opinion and expression, the Board argued that “health-related information is particularly important”. As stated in General Comment No. 14 of the Committee on Economic, Social and Cultural Rights on Article 12 of the ICESCR, such information is protected “as part of the right to health” [p. 11]. The Board underscored the relationship between these two rights in the context of Brazil, “where awareness raising campaigns are crucial to promote early diagnosis of breast cancer” [p. 11].

With this in mind, the Board analyzed Facebook’s measure to remove the user’s content in relation to International Human Rights standards on freedom of expression, as laid out by Article 19 of the ICCPR. To do so, the Board analyzed the legality, the legitimacy of the aim, and the necessity and proportionality of the company’s restriction on freedom of expression. 

  • Legality

Following Human Rights Committee General Comment No. 34, the Board specified that “any rules restricting expression must be clear, precise and publicly accessible”. According to the Board, this requirement was not met, given the discrepancies between Facebook’s Community Standards and Instagram’s Community Guidelines: “That Facebook’s Community Standards take precedence over the Community Guidelines is also not communicated to Instagram users” [p. 11]. This inconsistency and lack of clarity were compounded by the fact that the removal notice in this case referenced only the Community Guidelines. Thus, the Board explained that “Facebook’s rules in this area therefore fail the legality test” [p. 11].

  • Legitimate aim

The Board noted that “any restriction on freedom of expression must be for a legitimate aim, which are listed in Article 19, para. 3 of the ICCPR” [p. 11]. The Adult Nudity and Sexual Activity Community Standard, as Facebook claimed, seeks to “prevent the sharing of child abuse images and nonconsensual intimate images on Facebook and Instagram” [p. 11]. For the Board, this is a legitimate aim under international human rights law, since restrictions on freedom of expression are valid to protect the “rights of others”. These include “the right to privacy of victims of non-consensual intimate image sharing (Article 17 ICCPR), and the rights of the child to life and development (Article 6 CRC)”, which, as highlighted in General Comment No. 13 of the Committee on the Rights of the Child, can be threatened in cases of sexual exploitation.

  • Necessity and proportionality 

The Board considered, in accordance with the Human Rights Committee General Comment No. 34, that measures that restrict “freedom of expression ‘must be appropriate to achieve their protective function; they must be the least intrusive instrument amongst those which might achieve their protective function; they must be proportionate to the interest to be protected’” [p. 12].

The Board showed concern that the content was wrongfully removed “by an automated enforcement system and potentially without human review or appeal” [p. 12]. Citing a report by the UN Special Rapporteur on freedom of expression, the Board highlighted the limitations of automated technologies in understanding context and the complexity of human communication. In the case at hand, the machine-learning classifier “failed to recognize the words ‘Breast Cancer’ that appear at the top left of the image in Portuguese” [p. 12].

Although, as the Board argued, automated technologies fulfill a vital role in content moderation, enforcement cannot rely solely on them. Citing a report from the UN Special Rapporteur on freedom of expression, the Board argued that “automated removals should be subject to both an internal audit procedure […] and appeal to human review should be offered” [p. 12].

The Oversight Board concluded that, without the necessary safeguards, automated content moderation is not a proportionate means “for Facebook to address violating forms of adult nudity” [p. 12].

The Board expressed concern that Facebook’s actions did not respect the principle of equality and nondiscrimination, as laid out in Human Rights Committee General Comment No. 34 on freedom of opinion and expression. For the Board, based on Article 1 of the CEDAW and Article 2 of the ICCPR, it is likely that reliance on automation “will have a disproportionate impact on women, thereby raising discrimination concerns”, since Facebook’s rules “treat male and female nipples differently” [p. 13]. Facebook’s actions in this case hindered the raising of awareness about breast cancer, jeopardizing both women’s right to freedom of expression and their right to health.

Finally, the Board considered it worrying “that Facebook does not inform users when their content is enforced against through automation, and that appeal to human review might not be available in all cases” [p. 13]. This indicates a lack of transparency and contravenes the responsibility of “business enterprises that engage in content moderation”, as expressed by the UN Special Rapporteur on freedom of opinion and expression, to provide remedy to clients and users.

For all these reasons, the Oversight Board overturned “Facebook’s original decision to take down the content, requiring the post to be left up” [p. 14], while noting its awareness that Facebook had already restored the removed content.

Policy Advisory Statement

In addition to overturning Facebook’s original decision, the Board urged Facebook to improve its automated system of image detection. It also recommended that Facebook be more transparent about the enforcement of the company’s Community Standards and the appeal process, by ensuring that “users are always notified of the reasons for the enforcement of content policies against them”, that they are “informed when automation is used to take enforcement action against their content”, and that they “can appeal decisions taken by automated systems to human review when their content is found to have violated Facebook’s Community Standard on Adult Nudity and Sexual Activity” [p. 14]. The Board also recommended that Facebook implement “an internal audit procedure to continuously analyze a statistically representative sample of automated content removal decisions to reverse and learn from enforcement mistakes”. Furthermore, the Board said Facebook should “disclose data on the number of automated removal decisions per Community Standard, and the proportion of those decisions subsequently reversed following human review” [p. 14].

The Board also recommended that Facebook revise the Instagram Community Guidelines to specify that the adult nudity ban is not absolute and “that visible female nipples can be shown to raise breast cancer awareness” [p. 15]. Lastly, the Board recommended that Facebook clarify that “Instagram Community Guidelines are interpreted in line with the Facebook Community Standards, and where there are inconsistencies, the latter take precedence” [p. 15].


Decision Direction

Quick Info

Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Expands Expression

By overturning Facebook’s decision to remove content intended to raise awareness about breast cancer, the Oversight Board widens the scope of freedom of expression, while protecting access to information, the right to health, and women’s right to nondiscrimination on an issue of the utmost public health importance. The Board’s decision also strengthens freedom of expression by recommending that Facebook implement safeguards, such as human review, on automated content moderation. By urging the company to be more transparent about the efficacy of its automated content removal, the Board also fosters a better environment for freedom of expression online.

Global Perspective

Quick Info

Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or regional laws

  • CEDAW, art. 1

    The Board analyzed Facebook’s human rights responsibilities through this precept on women’s rights and nondiscrimination.

  • CRC, art. 6

    The Board cited this precept to highlight the protection of children’s rights under international human rights law.

  • ICCPR, art. 2

    The Board analyzed Facebook’s human rights responsibilities through this precept on the right to nondiscrimination.

  • ICCPR, art. 19

    The Board analyzed Facebook’s human rights responsibilities through this precept on freedom of expression.

  • ICCPR, art. 19, para. 3

    The Board analyzed the legitimate aim requirement through this precept on freedom of expression.

  • ICCPR, art. 17

    The Board referred to this article to highlight that the protection of privacy can be a legitimate aim for measures that restrict freedom of expression.

  • UNHR Comm., General Comment No. 34 (CCPR/C/GC/34)

    The Board referred to this general comment to state the principle of equality and nondiscrimination and to define the legality requirement.

  • CESCR, General Comment No. 14: The Right to the Highest Attainable Standard of Health

    The Board referred to this general comment to highlight the relationship between the right to health and access to information.

  • Committee on the Rights of the Child, General Comment No. 13

    The Board referred to this general comment to underscore the threat of sexual exploitation to children’s rights.

  • United Nations Guiding Principles on Business and Human Rights (2011)

    The Board referred to this instrument to highlight Facebook’s human rights responsibilities as a business enterprise.

  • UN Special Rapporteur on freedom of opinion and expression, A/HRC/38/35 (2018)

    The Board referenced the report to point out the key role of remedies in content moderation.

  • UN Special Rapporteur on Freedom of Expression, report A/HRC/44/49 (2020)

    The Board referenced the report to underscore the relevance of health-related information.

  • UN Special Rapporteur on Freedom of Expression, report A/73/348 (2018)

    The Board referenced the report to highlight the limitations of automated technologies in content moderation and the importance of human review.

Case Significance

Quick Info

Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”

