Case Status: Closed. Outcome: Mixed Outcome.
The Paris Court of Appeal confirmed an order of the Paris Tribunal requiring Twitter to provide information on its measures to fight online hate speech. Six French organizations had approached the courts after their research indicated that Twitter removed under 12% of the tweets reported to it, and sought information on the resources Twitter dedicated to fighting online racist, anti-Semitic, and homophobic speech, incitement to gender-based violence, and the condoning of crimes against humanity. The Paris Tribunal had ordered Twitter to provide this information, and, despite Twitter’s argument in the Court of Appeal that it had no statutory obligation to disclose it, the Court held that the organizations were entitled to the information to enable them to determine whether to file an application under French law alleging that Twitter was not promptly and systematically removing hate speech from its platform.
On May 26, 2021, six French organizations advocating against racism, anti-Semitism, and homophobia brought an application against Twitter International Company and Twitter France for their failure to promptly and systematically remove racist, anti-Semitic, or homophobic messages posted on Twitter and reported by its users. The organizations were UEJF (Union des Etudiants Juifs de France), SOS Homophobie, SOS Racisme, AIJP (J’accuse! Action Internationale Pour la Justice), MRAP (Le Mouvement contre le Racisme et pour l’Amitié entre les Peuples), and LICRA (La Ligue Internationale contre le Racisme et l’antisémitisme). The organizations’ claims were based on several reports and bailiff’s findings from 2019 and 2020, which allegedly established that only 9 to 28 percent of hateful messages posted on Twitter were removed within 48 hours.
The French Law for Trust in the Digital Economy (LCEN) requires Internet service providers, under Articles 6-I.3 and 6-I.7, to act promptly to remove manifestly illicit content, to put in place a mechanism allowing Internet users to report hateful messages, and to make public the resources they devote to fighting online hate. The organizations planned to sue Twitter under Articles 6-I.3 and 6-I.7, but first brought a preliminary application under Article 145 of the French Code of Civil Procedure. Under Article 145, if there is a legitimate reason to preserve or establish, before any legal process, evidence of the facts upon which the resolution of a dispute depends, legally permissible preparatory inquiries may be ordered at the request of any interested party, by way of a petition or a summary procedure.
Before the Paris Tribunal, the organizations requested that the Tribunal appoint an expert to obtain all administrative, contractual, technical, or commercial documents related to the material and human resources put in place to fight the dissemination of content condoning crimes against humanity, homophobic hatred, incitement to gender-based violence, and abuses against human dignity. They also sought an order that Twitter submit information on the staff assigned to address complaints and data on how those complaints were processed, including the number of tweets removed and the number of cases of condoning crimes against humanity and incitement to racial hatred referred to the public prosecutor’s office over the previous three years.
On July 6, 2021, the Paris Tribunal ordered Twitter International Company to submit the required documents and data within two weeks and to pay 1,000 euros to each organization for every day of non-compliance, holding that the organizations had a legitimate reason to obtain information on how Twitter complied with its legal obligation to fight the online dissemination of hateful content. The Tribunal dismissed the case against Twitter France, holding that it had no role in moderating online content.
Twitter International Company appealed the decision to the Paris Court of Appeal.
Marie-Hélène Masseron, the President of the Court, delivered the judgment of the three-judge bench.
Twitter argued that the Paris Tribunal’s injunction was not based on precise, objective, and verifiable facts as required under Article 145 of the Code of Civil Procedure, and that the disclosure demanded by the organizations violated the right not to incriminate oneself and the specific guarantees of criminal proceedings [p. 11 of the decision of the Court of Appeal]. Twitter submitted that any request under Article 145 must rely on a legitimate reason, meaning that it must be relevant and useful for the purpose of a future proceeding, and argued that the information sought by the organizations could not establish the deficiency of Twitter’s moderation system and would not enable the organizations to engage Twitter’s liability under the relevant articles of the LCEN. As to the obligation to make public the means employed to fight online hate, Twitter argued that such an obligation was not specified in the texts and that, in any case, it published annual reports on the reporting of hateful content on social media.
The Attorney General at the Paris Court of Appeal submitted that the judgment of the Paris Tribunal should be confirmed by the Court.
The Court noted that, under Article 6-I.7 of the LCEN, Internet service providers are required to implement an easily accessible and visible mechanism enabling any person to alert them to hateful content on the Internet, to inform the public authorities of such alerts at the earliest opportunity, and to make public the means they use to fight illicit online activities. The organizations’ intended litigation was to establish whether Twitter was complying with its obligations under this law.
The Court stressed that the organizations were not required to demonstrate non-compliance at this preliminary stage. The Court referred to the studies, observations, and testimonies by members of the organizations submitted in the case file, including a report by l’Union des Etudiants Juifs de France and SOS Racisme which established that, between March 17 and May 26, 2022, only 126 of 1,110 allegedly hateful tweets were removed from the online platform – that is, only 11.4%. This inquiry was accompanied by the bailiff’s observations that, despite having been reported by the organizations, a large number of racist, anti-Semitic, and homophobic tweets had not been removed from the platform. Additionally, testimonies by members of the organizations indicated that the overwhelming majority of the tweets they had reported remained online.
The Court held that the organizations had sufficiently established the factual elements supporting the allegation that Twitter did not efficiently remove hateful content from its platform.
The data and information requested under the Article 145 procedure would thus provide the organizations with the evidence necessary for future legal action on whether Twitter had complied with its legal obligations under the LCEN. The Court found that the requested information and data were fully capable of establishing whether Twitter was committed to countering hate speech and, in particular, whether it promptly informed the public authorities of reported illicit activities and made public the means it deployed to fight online offences. The Court once again emphasized that the aim of the present procedure was not to establish whether Twitter had failed to meet its legal obligations but to preserve evidence of the facts upon which the plaintiffs would base their future legal claims. Finally, the Court added that the interim injunction was proportionate in that the Tribunal of first instance had limited its scope to the period between May 18, 2020 and July 6, 2021.
The Court also ruled that Twitter must disclose information on the number, location, nationality, and spoken language of the staff it employed to handle reports from users of the French Twitter platform, as well as the number of tweets reported for condoning crimes against humanity and incitement to hatred. Twitter was also required to disclose the criteria of its removal policy and to submit the record of tweets communicated to the French public authorities, particularly the Public Prosecutor’s Office.
Accordingly, the Court confirmed the judgment of the Paris Tribunal and ordered Twitter International Company to pay 1,500 euros to each of the organizations.