Case Summary and Outcome
The Oversight Board issued a summary decision on September 13, 2023, overturning Meta’s original decision, under its Dangerous Organizations and Individuals and Hate Speech policies, to remove a video condemning statements made by rapper Ye (formerly known as Kanye West) that praised Hitler and denied the Holocaust. The Board emphasized that the content constituted counter-speech, an important form of expression that challenges hate and misinformation. It highlighted that enforcement errors like the one in this case negatively impact counter-speech and undermine users’ ability to respond to harmful narratives. Meta reversed its original decision and reinstated the video after the Board notified it of the user’s appeal.
*The Oversight Board is a separate entity from Meta and will provide its independent judgment on both individual cases and questions of policy. Both the Board and its administration are funded by an independent trust. The Board has the authority to decide whether Facebook and Instagram should allow or remove content. The Board issues full decisions and summary decisions. Decisions, except summary decisions, are binding unless implementing them could violate the law. The Board can also choose to issue recommendations on the company’s content policies. Summary decisions are a transparency mechanism, providing information to the public on Meta’s decision making and the Board’s recommendations relating to cases where Meta reversed its original decision on its own accord, after receiving notice from the Board about the appeal.
Facts
In January 2023, a user based in Türkiye posted a video on Instagram of an interview excerpt in English in which rapper Ye (formerly known as Kanye West) stated that he “likes” Adolf Hitler and denied the Holocaust. Ye’s statements in the video were followed by a TV reporter expressing outrage and speaking about his family members who were killed in the Holocaust. The video was subtitled in Turkish and accompanied by a caption that translates to “TV reporter responds to Kanye West.”
Meta removed the video for violating the Dangerous Organizations and Individuals (DOI) and Hate Speech policies. Under the first policy, praise of any designated individual, including Hitler, is prohibited on Meta’s platforms; however, the policy allows content about designated individuals if it reports on, neutrally discusses, or condemns them or their activities. Under the second policy, Holocaust denial is prohibited as a harmful stereotype historically linked to intimidation or discrimination on the basis of a protected characteristic; however, the policy recognizes that people may share someone else’s hate speech to condemn it or raise awareness.
The user appealed the removal decision to the Board.
Decision Overview
The main issue before the Board was whether Meta’s original decision to remove the video was compatible with its content policies and human rights obligations.
In their appeal to the Board, the user stressed that the video did not express support for Hitler and had been misinterpreted.
Meta, for its part, reversed its decision once the Board notified it of the appeal. Meta recognized that while the video contained praise of Hitler and Holocaust denial, the second part of the video clearly condemned these statements, satisfying the allowances under both policies.
The Board noted that this case exemplified an enforcement error in applying the exceptions under Meta’s DOI and Hate Speech policies. It highlighted that such errors could undermine speech that condemns hate speech and the glorification of dangerous individuals, and that protecting this type of counter-speech is essential to combating harmful expression.
The Board reiterated several of its previous recommendations. In the “Mention of the Taliban in news reporting” decision, it urged Meta to assess the accuracy of reviewers enforcing the reporting allowance under the DOI policy in order to identify the cause of enforcement errors; Meta has reported progress on implementing this recommendation. In the “Öcalan’s isolation” decision, the Board called on Meta to evaluate automated moderation under the DOI policy, a recommendation Meta declined to implement. Finally, the Board reiterated its recommendation from the “Wampum belt” decision that Meta conduct accuracy assessments on the Hate Speech policy allowances for counter-speech, which Meta has demonstrated it has implemented.
Ultimately, the Board overturned Meta’s original decision to remove the video and acknowledged Meta’s correction of the initial error. The Board stressed that the full implementation of its recommendations might reduce enforcement errors and better protect counter-speech.