Facebook Community Standards, Violence And Criminal Behavior, Restricted Goods and Services, Instagram Community Guidelines, Referral to Facebook Community Standards
Case Status: Closed. Outcome: Mixed.
The Oversight Board issued a decision holding that Meta should have removed an Instagram post discussing ketamine usage as a treatment for anxiety and depression. A verified user posted the content as a paid partnership with a provider whose medical licensing status was unclear. The post, which was reported and removed three times under Meta’s Restricted Goods and Services policy before being restored, raised concerns about promoting pharmaceutical drugs without clear medical context or disclosure. The Board found the content violated Meta’s Branded Content policies, which prohibit the promotion of drugs or drug-related products in paid partnerships, and its Restricted Goods and Services policy, which prohibits the promotion of non-medical drugs. It analyzed the removal’s compatibility with international freedom of expression standards, concluding that removal was necessary and proportionate to protect the right to health. The Board recommended that Meta clarify the meaning of the “paid partnership” label and the role of business partners in its approval, explicitly state that the promotion of drugs altering mental states is allowed only in a supervised medical setting, and improve its review process to ensure paid partnership content is assessed against all relevant policies.
*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. The Board issues full decisions and summary decisions. Decisions, except summary decisions, are binding unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies.
On 29 December 2022, a verified Instagram user published a post consisting of ten professional-quality drawings with text overlay describing their experience with a ketamine therapy provider. The provider was tagged as a co-author, making the content visible to both accounts’ followers. The images implied the user received “therapy sessions” for “treatment-resistant depression and anxiety,” featuring psychedelic imagery and depicting stages from pre-treatment difficulties to the administration of ketamine and a “reintegration” phase. One image praised the treatment as comparable to “any good trip,” while another endorsed the provider’s office and “extraordinary staff.”
The visuals and text did not indicate a formal medical diagnosis, medical supervision, or the provider’s licensed status. The post was labelled a “paid partnership” as required by Meta’s Branded Content policies, which require disclosure of commercial relationships. The caption described ketamine as a “medicine” for anxiety and depression at the provider’s U.S. offices and expressed belief in psychedelics as emerging mental health treatments, again without stating the treatment occurred in a licensed clinic or under medical oversight.
At the time of posting, the user had approximately 200,000 followers, and the post received about 85,000 views, 10,000 likes, and fewer than 1,000 comments. It was reported three times by different users. Less than 30 minutes after the first report, the content was removed following human review. However, the author appealed the decision, and a second human reviewer reinstated the post five hours later.
One hour after reinstatement, the post was reported again and removed almost immediately through human review, only to be reinstated once more within 30 minutes. The post was subsequently reported for a third time, triggering an automated enforcement decision based on precedent and prior human reviews. The system determined that the post violated Instagram’s Restricted Goods and Services policy and removed it.
Upon appeal by the creator, whose status as a “managed partner” grants access to specialized support from Meta, the company escalated the case for expert review and restored the post approximately six months later. This status applies to select individuals and organizations across sectors ranging from business to charities and provides enhanced support, including training, product guidance, and account management, to strengthen their platform presence and strategic alignment with Meta.
Meta referred the case to the Oversight Board, citing its significance amid rising U.S. interest in psychedelics for therapy and the difficulty in distinguishing between promotion of medically supervised use (potentially allowed) and unsupervised or recreational use (prohibited), especially without clear medical framing.
The Oversight Board issued a decision assessing whether a co-authored Instagram post promoting ketamine therapy, framed as a personal testimonial but labeled as a paid partnership, violated Meta’s Restricted Goods and Services and Branded Content policies—and whether its removal or reinstatement was compatible with the company’s human rights responsibilities, including the right to freedom of expression.
The posting user did not submit a statement despite being notified of the Board’s review. Meta, in contrast, stated that as medically supervised use of mind-altering substances becomes more widespread, its current policy framework may become increasingly untenable, given the growing number of users seeking to share experiences with legal drugs on its platforms. Meta requested the Board’s guidance on how to navigate this evolving area. The company explained that its Restricted Goods and Services policy contained overlapping and potentially conflicting definitions of “non-medical drugs” and “pharmaceutical drugs,” particularly in cases where a substance, such as ketamine, is legally prescribed under medical supervision. According to its internal guidelines, content that discusses or promotes the use of pharmaceutical drugs under medical supervision is permitted. In Meta’s view, the content at issue fell within this category, illustrating a user’s experience with a legal, medically supervised treatment for depression and anxiety. The company acknowledged that the case highlighted a tension between the definitions of pharmaceutical and non-medical drugs within its own policy framework.
Meta emphasized the value of public discussion around emerging treatments for mental health conditions like depression and anxiety, citing a 2022 review that found no overdose cases linked to the therapeutic use of ketamine in the United States. While acknowledging that positive portrayals of legal ketamine use could potentially encourage illicit use, the company maintained that this content did not violate its Restricted Goods and Services policy. Rather than treating the case as an exception, Meta viewed its decision to reinstate the post as consistent with the spirit of the policy, which allows space for discussions on medically supervised treatments.
Meta further clarified that the post was not removed under the Branded Content policy, which prohibits paid promotion of drugs, because enforcement is limited to cases where branded content is disclosed using Meta’s “Paid Partnership” tool with brand approval. In this case, the creator appeared to have permission to tag the brand without requiring approval, bypassing brand partner review. The company also noted that posts labeled as paid partnerships are not automatically flagged for specialist review, which contributes to the inconsistent enforcement of the Branded Content policy.
The Board submitted 22 written questions to Meta addressing a range of issues, including the user’s status as a managed partner, moderation appeals processes, the relationship between the user and the ketamine provider, the role of automation in content moderation, and how the company interprets the “spirit of the policy.” Meta responded to all of the Board’s questions.
The Board selected this Meta-referred case to clarify how the Restricted Goods and Services policy applied to content promoting medically supervised drug use and considered that it also raised important questions about “paid partnership” content and pharmaceutical promotion. In its assessment, the Board reviewed Meta’s content policies, as well as Meta’s values and human rights responsibilities.
The Board evaluated the case within the broader U.S. regulatory and medical context surrounding ketamine. Although ketamine is FDA-approved as an anesthetic and may be prescribed off-label for mental health treatment, its promotion is tightly regulated due to risks of misuse. While experts acknowledge growing evidence of its therapeutic potential, especially amid the mental health crisis, they also emphasized risks such as health harms and increasing recreational and black-market use. The Board also considered the rise of influencer marketing and ketamine clinics, including telehealth-based models, alongside heightened regulatory scrutiny by agencies like the FDA and FTC. Recent policy updates by Meta reflecting these concerns influenced the Board’s assessment of whether the post constituted impermissible promotion of a restricted substance.
1. Compliance with Meta’s Content Policies
a. Content Rules
The Board found that the contested content violated both Meta’s Branded Content policies and its Restricted Goods and Services policy.
Branded Content Policies
The Board determined that the Branded Content policies should have applied, given that the content was clearly labeled as a “paid partnership.” It expressed concern that Meta did not consider this element during its internal reviews or in its referral to the Board. Relevant details were only provided in response to questions from the Board. Nonetheless, the Board appreciated Meta’s engagement on questions concerning the use of Instagram by managed partners for the paid promotion of medical treatments. It also encouraged the company to share complete information about Branded Content and/or business partnerships when submitting cases for the Board’s consideration.
According to Meta’s Branded Content policies, “certain goods, services or brands may not be promoted with branded content,” including drugs and drug-related products. While the user’s treatment appeared lawful in the United States, the Board said that the content nonetheless promoted the use of ketamine, which is prohibited under this policy. The Board further clarified that the distinction between ketamine as a “pharmaceutical” or “non-medical” drug was irrelevant in this context; the post should have been removed based on the Branded Content policies alone, rather than evaluated under the Restricted Goods and Services policy.
Additionally, the Branded Content policies require that business partners promoting “pharmacies” or “prescription drugs” be authorized by Meta, a requirement that applies in a limited number of jurisdictions, including the United States. The company confirmed that the business partner in this case was not authorized, rendering the content in violation of the policy.
Restricted Goods and Services Policy
The Board agreed with Meta that the Restricted Goods and Services policy contained an internal inconsistency: it allowed the promotion of pharmaceutical drugs under medical supervision while prohibiting the promotion of drugs that alter mental states. This conflict arises when a pharmaceutical drug, such as ketamine, also has psychoactive effects. Meta argued that its value of “Voice” and the low likelihood of harm justified allowing the content as a legitimate discussion of a pharmaceutical drug.
However, the Board emphasized that this policy conflict should be resolved with reference to the requirement for a “supervised medical setting,” as reflected in Meta’s internal guidelines. The Restricted Goods and Services Community Standard defines pharmaceutical drugs as those that “require a prescription or medical professionals to administer,” while “non-medical drugs” are defined as “drugs or substances that are not being used for an intended medical purpose or are used to achieve a high.” The Board noted that the disjunctive “or” in the definition of non-medical drugs could result in a substance being misclassified, even if administered in a medical setting. Nonetheless, the Board interpreted the policies as intending a clear distinction: drugs capable of inducing a “high” may still be considered “pharmaceutical drugs” if administered under proper medical supervision.
The Board recommended that Meta revise its policy to explicitly permit content about mind-altering drugs when used under medical supervision. It further advised that indicators of such supervision—such as references to a medical diagnosis, mentions of licensed medical professionals, or the clinical status of the provider—be clearly outlined as requirements for content to be permitted.
In conclusion, the Board found the content violated the Restricted Goods and Services policy because it promoted a mind-altering drug (ketamine) without any clear indicators of medical supervision, contrary to Meta’s assertion. The post did not mention a formal medical diagnosis, identify the provider as medically licensed, or state that the treatment was carried out by qualified medical professionals.
b. Enforcement
Meta stated that its automated system flagged the content as violating the Restricted Goods and Services policy on 15 January 2023, after a third user report, drawing on prior enforcement actions against the same post. The company clarified that the specific tool used was its “restricted and regulated goods classifier,” a machine-learning model trained to detect violations of relevant platform policies.
Meta further informed the Board that these classifiers are retrained every six months using updated datasets that include appeal outcomes. The Board expressed concern about the six-month retraining delay and urged Meta to accelerate the incorporation of successful appeal outcomes into its automated systems. Although the automated system’s decision ultimately aligned with the Board’s interpretation of the Restricted Goods and Services policy, it did not reflect Meta’s own interpretation at the time.
Meta clarified that not all content labeled as a “paid partnership” is reviewed under the Branded Content policies, contributing to inconsistent enforcement. The Board further noted that this case illustrated a broader under-enforcement of Meta’s drug-related policies and recommended that Meta conduct a comprehensive assessment of its enforcement in this area.
c. Transparency
The Board asked Meta to clarify how the content qualified as a “paid partnership.” The company explained that the “paid partnership” label indicates the post is branded content for which the creator has received compensation—either monetary or in-kind—from a business partner. Creators are required to tag the relevant brand or business partner when publishing such content, whether using a creator, business, or personal account.
However, Meta clarified that the presence of the “paid partnership” label on Instagram does not necessarily mean the tagged business partner approved the content, as some creators have account-level permissions to use the label without prior approval. This may lead to user confusion. The Board recommended that Meta update its Transparency and Help Centers to clarify the meaning of the “paid partnership” label, noting that current explanations may give the misleading impression that all such content is approved by the tagged brand.
2. Compliance with Meta’s Human Rights Responsibilities
The Board concluded that Meta’s strict policies limiting branded content from promoting drugs and prohibiting the facilitation of trade in non-medical drugs aligned with its human rights responsibilities under the UN Guiding Principles on Business and Human Rights (UNGPs). Specifically, Principle 13 of the UNGPs requires companies to avoid causing or contributing to adverse human rights impacts and to mitigate risks where possible. Given the potential for harm to users’ health and the spread of misleading or unsafe health information, particularly in posts lacking a clear medical context, the Board considered Meta’s restrictions justified from a human rights perspective.
At the same time, the Board emphasized that content moderation must also respect users’ right to freedom of expression under Article 19 of the International Covenant on Civil and Political Rights (ICCPR). This includes the freedom to seek, receive, and share information of all kinds, which may extend to content discussing health treatments and even commercial communications such as advertising. In this case, the Board viewed the paid partnership post as a form of commercial expression and affirmed that such content still falls within the scope of Meta’s responsibility to protect expression under international human rights standards.
The Board also highlighted the importance of accessible health-related information, particularly for people with mental health conditions or disabilities. International human rights bodies, such as the UN Committee on Economic, Social and Cultural Rights and the Committee on the Rights of Persons with Disabilities, have underscored the role of information in enabling access to health care and ensuring equal participation in society. In light of this, the Board reiterated that social media companies have a responsibility to balance public health protections with the right to freely discuss treatments, particularly as mental health challenges become more prevalent globally.
The Board applied the three-part test used to assess restrictions on expression under international law, comprising legality, legitimate aim, and necessity and proportionality, to evaluate the company’s actions. The Board referenced guidance from the UN Special Rapporteur on freedom of expression, who has noted that companies, while not bound by the same legal obligations as governments, must still assess their content moderation practices through a human rights lens due to their significant impact on public discourse.
a. Legality
Under international human rights law, the principle of legality requires that rules restricting expression be clear, publicly accessible, and provide sufficient guidance for both users and content moderators. This applies equally to online platforms like Meta, which must ensure their policies are understandable and consistently enforced.
Branded Content Policies
The Board held that Meta’s Branded Content policies were sufficiently clear and accessible for users seeking to engage in paid partnerships. It concluded that the prohibition on “drugs or drug-related products” reasonably extends to services involving the administration of drugs. However, to improve clarity, particularly in light of the growing use of ketamine for mental health treatments, the Board recommended that Meta explicitly state that drug-based treatments and therapies are not permitted in paid partnership content.
The Board also noted that while the policy allows for the promotion of prescription drugs and pharmacies under paid partnerships, this is only permitted when the business partner is authorized by Meta. Although this exception is outlined, the relationship between the general prohibition and the exception could be communicated more clearly. Finally, the Board observed that two versions of the Branded Content policies exist—one on Meta’s Business Help Center and another on Instagram’s Help page. While their content is consistent, the Board suggested consolidating these versions to avoid duplication and enhance user understanding.
Restricted Goods and Services Policy
The Board found that the definitions of “non-medical drugs” and “pharmaceutical drugs” under Meta’s Restricted Goods and Services policy did not satisfy the principle of legality. The conflict between rules permitting content about pharmaceutical drugs under medical supervision and those prohibiting content involving mind-altering substances created ambiguity, particularly when such substances are medically prescribed. This lack of clarity made it difficult for content reviewers to apply the rules consistently and highlighted the need for more precise guidance.
The Board also raised concerns about the inconsistent enforcement of the policy, particularly regarding the prohibition on attempts to trade drugs. It referenced its previous Asking for Adderall decision, where it stressed that failure to remove violating content can lead to user confusion about what is and is not allowed on Meta’s platforms.
b. Legitimate Aim
Under Article 19(3) of the International Covenant on Civil and Political Rights (ICCPR), restrictions on freedom of expression must serve a legitimate aim. One such aim is the protection of public health. The Board found that both Meta’s Branded Content policies and Restricted Goods and Services policy align with this aim by prohibiting the promotion of non-medical drugs and the facilitation of drug trading. Additionally, the Board noted that these policies help safeguard the rights of others, including the right to health and to access reliable health-related information.
c. Necessity and Proportionality
The principles of necessity and proportionality require that restrictions on speech be appropriate to the legitimate aim pursued and represent the least intrusive means available. The Board acknowledged the growing prevalence of depression and the increasing use of ketamine as a treatment. However, it also recognized a parallel rise in ketamine misuse, making the removal of similar promotional content necessary and proportionate to prevent potential harm.
Branded Content Policies
The Board referred to the World Health Organization’s ethical guidelines, which discourage the advertisement of narcotic and psychotropic substances to the public to combat drug abuse. It also cited a Wall Street Journal report revealing that two telehealth companies—previous heavy advertisers on Meta’s platforms—were under a U.S. Department of Justice investigation for allegedly contributing to drug abuse through aggressive marketing.
Given this context, the Board considered that restricting paid partnerships promoting ketamine therapy was a necessary measure to mitigate risks of encouraging recreational use. Paid content involving health information carries a higher risk of being misused, particularly when promoted by influencers with large audiences who may include individuals in vulnerable mental health situations. The Board stressed the importance of distinguishing between commercial and non-commercial speech, as restrictions that may be disproportionate in the latter context can be justified when applied to paid advertising.
The Board also considered less restrictive alternatives, such as age-gating the content, but found that the risks were not confined to young audiences. Adults could also be influenced by so-called “patient influencers,” especially when the content glamorizes treatments that may not be appropriate or lacks safety warnings. It therefore concluded that Meta’s restrictions were proportionate in this case.
While agreeing that the post should not have been permitted as a paid partnership, the Board also expressed concern about Meta’s approach to such content. It noted that the “paid partnership” label serves only to disclose a financial relationship, without offering health disclaimers or risk information. In contrast, the company sometimes attaches information panels or links to authoritative sources on certain health-related posts. The lack of such measures here was inconsistent and concerning.
Restricted Goods and Services Policy
The Board held that the Restricted Goods and Services policy imposed a necessary and proportionate restriction to prevent drug misuse. Unlike the Asking for Adderall case, where the user was seeking medical advice, the content in this case actively promoted ketamine use without reference to medical supervision, posing greater risks to users’ safety.
Although the Board considered whether a more permissive approach could allow content referencing the “therapeutic” use of drugs, it ultimately rejected that view. The term “therapeutic” is too ambiguous, and enforcement based on such an elastic standard would be difficult. Moreover, data indicating increased illicit ketamine use in the United States reinforced the need for a cautious approach. The Board ultimately reaffirmed that any promotion of such drugs must include clear indicators of medical supervision.
The Board also addressed potential inconsistencies with its previous Ayahuasca Brew decision, in which it recommended allowing positive discussion of traditional or religious drug use. On this point, it found no conflict, noting that traditional and spiritual practices involve distinct cultural safeguards and carry dignity-related considerations not applicable to the commercial promotion of pharmaceutical treatments.
Considering these arguments, the Oversight Board overturned Meta’s decision to leave the post online and required its removal.
3. Policy Advisory Statement
a. Content Policies
The Board recommended that Meta update its Transparency Center and Instagram Help Center to clarify the meaning of the “paid partnership” label, specifically regarding whether tagged business partners have approved the content. Additionally, it advised the company to revise the language of its Restricted Goods and Services policy to explicitly permit content describing or promoting the use of pharmaceutical drugs that alter mental states, provided such use occurs in a supervised medical setting. To support consistent enforcement, the Board suggested that Meta define the term “supervised medical setting” and identify clear indicators—such as references to a medical diagnosis, a licensed healthcare provider, or medical personnel—that demonstrate such supervision.
b. Enforcement
The Board recommended that Meta strengthen its content review processes to ensure that all content labeled as a “paid partnership” is reviewed under all applicable policies, including the Branded Content policies. Since not all such content is currently assessed under this policy, the company should develop a mechanism that allows reviewers to escalate potential violations to specialist teams or automated systems trained to apply the Branded Content policies appropriately.
Furthermore, the Board encouraged Meta to conduct an internal audit of how it enforces both the Branded Content and Restricted Goods and Services policies in relation to the promotion, sale, and trade of drugs and drug-related products. While the Board acknowledged that Meta has established policies in place, it expressed concern over inconsistencies in their application. An audit would help identify enforcement gaps and ensure these policies are applied more consistently and effectively across the platform.
Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.
This decision yields a mixed outcome on expression. While it restricts expression by requiring the content’s removal, the Board affirmed that users should be allowed to discuss experiences with pharmaceutical drugs within a clearly defined, medically supervised context. It clarified that paid promotional content involving drug use warrants heightened scrutiny and must meet stricter standards to prevent harm and ensure policy clarity. The decision calls for better transparency around the “paid partnership” label and clearer guidance for users and reviewers.
Case significance refers to how influential the case is and how its significance changes over time.
According to Article 2 of the Oversight Board Charter, “For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.” In addition, Article 4 of the Oversight Board Charter establishes, “The board’s resolution of each case will be binding and Facebook (now Meta) will implement it promptly, unless implementation of a resolution could violate the law. In instances where Facebook identifies that identical content with parallel context – which the board has already decided upon – remains on Facebook (now Meta), it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well. When a decision includes policy guidance or a policy advisory opinion, Facebook (now Meta) will take further action by analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook (now Meta), and transparently communicating about actions taken as a result.”