Defamation / Reputation
Glawischnig-Piesczek v. Facebook Ireland Limited
Case Status: Closed. Decision Direction: Contracts Expression.
The Third Chamber of the Court of Justice of the European Union (‘CJEU’) found that the E-Commerce Directive (‘the Directive’) does not preclude a Member State from ordering a hosting provider to remove or block content that has been declared unlawful, or content that is identical or equivalent to such unlawful information. The Court also held that the Directive does not preclude Member States from ordering such removal worldwide, and therefore left it to the Member States to determine the geographic scope of the restriction within the framework of the relevant national and international laws. The case arose in 2016 when an anonymous Facebook user in Austria shared an article together with a defamatory comment about the applicant, Eva Glawischnig-Piesczek, an Austrian politician. Ms. Glawischnig-Piesczek obtained an injunction from the Vienna Commercial Court requiring the removal of the infringing content as well as text with an equivalent meaning, after which Facebook disabled access to the impugned content in Austria. The Vienna Higher Regional Court upheld the injunction but limited the blocking of equivalent content to instances brought to Facebook’s attention by Ms. Glawischnig-Piesczek or a third party. Both parties appealed to the Austrian Supreme Court, which referred to the CJEU the questions of the scope of the content to be removed and the territorial scope of the removal. The Court found that monitoring for content identical to that which was declared illegal would fall within the allowance for monitoring in a “specific case” and thus would not violate the Directive’s prohibition on general monitoring. This allowance could also extend to equivalent content, provided the host was not required to “carry out an independent assessment of that content” and employed automated search tools for the “elements specified in the injunction.”
The applicant was Ms. Eva Glawischnig-Piesczek, former federal chairperson of the Austrian parliamentary party “die Grünen” (the Greens) and member of the Nationalrat (National Council of Austria). The defendant was Facebook Ireland Ltd., “which operates a global social media platform for users located outside the USA and Canada.” [para. 11]
On 3 April 2016, an anonymous Facebook user shared an article from the Austrian online news magazine oe24.at titled ‘Greens: Minimum income for refugees should stay’ and published a comment calling Glawischnig-Piesczek a “miese Volksverräterin” (lousy traitor of the people) and a “korrupten Trampel” (corrupt bumpkin), and her party a “Faschistenpartei” (fascist party). This generated a thumbnail on Facebook containing the title of the article and a photograph of Glawischnig-Piesczek. Both the post and the comment could be accessed by any Facebook user. On 7 July 2016, Glawischnig-Piesczek asked Facebook to delete the posts and to reveal the user’s identity. After Facebook neither deleted the posts nor revealed the user’s identity, Glawischnig-Piesczek sought an injunction in the Handelsgericht Wien (Commercial Court, Vienna, Austria). She argued that her right to control the use of her own image under § 78 Urheberrechtsgesetz (the Austrian Copyright Act) had been violated. She further claimed that the defamatory comment posted together with the picture infringed § 1330 Allgemeines Bürgerliches Gesetzbuch (ABGB, the Austrian Civil Code), which provides remedies for insult and defamation.
Facebook Ireland Ltd. argued that it was governed by Californian law (the law of its headquarters) or Irish law (the law of its European base), but in no case by Austrian law. Secondly, it invoked its host-provider privileges under the European Community’s E-Commerce Directive (ECD), which exempts host providers from liability for their users’ content. Facebook also argued that the impugned comments were protected by the right to freedom of expression under Art. 10 ECHR.
The Commercial Court ordered Facebook to ‘cease and desist from publishing’ [para. 15] Glawischnig-Piesczek’s photograph if the accompanying text ‘contained the assertions, verbatim and/or using words having an equivalent meaning’ [para. 14] to the defamatory comment. Facebook Ireland disabled access to the content in Austria. On appeal, the Oberlandesgericht Wien (Higher Regional Court, Vienna, Austria) upheld the order ‘as regards the identical allegations’ [para. 16] but held that the ‘dissemination of allegations of equivalent content had to cease only as regards those brought to the knowledge of Facebook Ireland by the applicant or by third parties’ [para. 16]. Both courts agreed that the defamatory comments implied that she was engaged in illegal activities without providing any evidence and were therefore harmful to Glawischnig-Piesczek’s reputation. Both parties appealed this judgment to the Oberster Gerichtshof (Supreme Court, Austria). The Supreme Court referred to the CJEU the questions of 1) whether, under Article 15 of the Directive, an injunction against a hosting provider could extend to statements that are identically worded and/or have equivalent content; and 2) whether such an injunction could apply worldwide.
On these questions, the Advocate General (AG) of the CJEU issued an advisory opinion in June 2019. The AG observed that Article 15(1) did not regulate the territorial scope of such an obligation; he therefore found that it did not preclude Member States from issuing injunctions requiring hosting providers to identify and filter, globally, not only information which has been declared illegal but also identical and/or equivalent information. However, because Article 15 of the Directive prohibits any general monitoring obligation, the Advocate General advised that while the injunction could extend to identical statements posted by any user, it should extend to equivalent information only where posted by the user who published the original unlawful content. Extending its scope to equivalent information posted by all users would necessitate general monitoring of content in violation of Article 15. The Advocate General also recognized that the term ‘equivalent information’ “gives rise to difficulties of interpretation,” but that based on its use it can be inferred to comprise content which “scarcely diverges from the original information” or to apply “to situations in which the message remains essentially unaltered.” [para. 67, emphasis in original]
Addressing the necessity of worldwide removal, the AG determined that when information is found to be illegal, the service provider could be obliged to monitor the information globally, depending on the nature of the content, since neither EU nor international law prevents injunctions from being applied extraterritorially. As defamation laws are not harmonized, national courts must weigh the competing rights and limit the extraterritorial effects of their injunctions to what is necessary and proportionate. In conclusion, the Advocate General advised that the effects of the injunction must be “clear, precise and foreseeable” and that it must balance the fundamental rights involved, taking into account the principle of proportionality.
The Third Chamber of the Court delivered the judgment.
The primary question before the Court was whether Article 15(1) of the Directive precluded a court from ordering a hosting provider to remove not only illegal information within the meaning of Article 14(1)(a) of the Directive, but also identically worded information and information with an equivalent meaning. The Court also had to determine the territorial applicability of such an obligation under the Directive.
The Court began by recalling that Article 14(1) of the Directive exempts information service providers from liability insofar as they have no knowledge of any illegal activity or information or act “expeditiously to remove or disable access to” it as soon as they become aware of it. Within that framework, national authorities and courts may establish procedures to remove or disable illegal content and require providers “to terminate or prevent an infringement”.
The Court considered that while Article 15(1) prohibits general monitoring of online content, including actively seeking facts or circumstances indicating illegal activity [para. 31], recital 47 allows for monitoring “in a specific case”, such as where particular content has been declared illegal. In the present case, Facebook Ireland had been notified of the illegal content but had failed to “expeditiously” remove or disable access to it. Recital 52 notes that the harm caused by information circulating on social media sites results from the “rapidity and […] geographic extent” with which it spreads to others through sharing and reproduction.
In light of the above, the Court held that the Directive did not preclude a Member State from ordering a hosting provider to remove information that has been held to be unlawful, as well as information that is identical or equivalent to such unlawful information, posted by any user. Monitoring for content identical to that which was found to be illegal would fall within the allowance for monitoring in a “specific case” and thus would not violate the general monitoring prohibition. The Court reasoned that this allowance could extend to ‘information with an equivalent meaning’ [para. 39], provided the host was not required to ‘carry out an independent assessment of that content’ [para. 45] and employed automated search tools for the “elements specified in the injunction.” [para. 46]
The Court defined “equivalent information” as content, posted by any user, which ‘essentially conveys the same meaning but is worded slightly differently’ [para. 41]. Such equivalent information would contain ‘specific elements which are properly identified in the injunction, such as the name of the person concerned’ [para. 45], the relevant circumstances, and ‘equivalent content to that which was declared to be illegal’ [para. 45].
The Court further held that recital 41 required a balance to be struck between the interests of the parties when issuing an injunction, which in the present case meant protecting Glawischnig-Piesczek’s reputation and honor without “imposing an excessive burden on the host provider.” [para. 45]
On the issue of the territorial applicability of such an injunction, the Court observed that Article 18(1) notably ‘does not make a provision’ [para. 50] for any territorial limitation on the effects of such injunctions. The Court therefore held that it is for the Member States to determine the geographic scope of the restriction [para. 52], as long as it remains within the “framework of the relevant international law” (see recitals 58 and 60).
The Court delivered its preliminary ruling on the interpretation of Article 15 of the Directive and returned the case to the Austrian Supreme Court to continue the main proceedings.
The judgment contracts expression in a number of ways. First, it requires social media intermediaries to employ automated search filters to remove identical posts as well as “equivalent” content without regard to the context in which they may be posted, and the filtering must take place across all users. Second, it places undue faith in the ability of such filters to correctly identify and remove the impugned content. Third, contrary to the opinion of the Advocate General, the judgment leaves the meaning of ‘equivalent information’ vague by not defining it in precise or foreseeable terms. It also leaves it to the national courts to define precisely the parameters of any “equivalent” content in the injunction, without considering that the relevant court may not fully appreciate the limitations of the filters even when the terms are precise. Fourth, the principle of proportionality and the balancing of fundamental rights required by the AG’s opinion are not clearly stated or stressed, although they may be implied in the Court’s understanding of “within the framework of relevant international law.” These shortfalls in the judgment may lead to overbroad removal of content, including legal content, ultimately infringing on freedom of expression.
Regarding the potential for global removal, ARTICLE 19 argues that “[t]he ruling also means that a court in one EU member state will be able to order the removal of social media posts in other countries, even if they are not considered unlawful there. This would set a dangerous precedent where the courts of one country can control what Internet users in another country can see. This could be open to abuse, particularly by regimes with weak human rights records.” Even in the present case, some of the content found to be illegal in Austria would likely have been found to be honest comment, and hence not unlawful, under UK defamation law.
Cathryn Hopkins, writing for Inforrm’s Blog, points out in an excellent analysis that on the positive side, the monitoring is restricted to content declared illegal by a court, meaning that it “does not relate to content that is the subject of a notice and take down request under Article 19 ECR that is complied with, or content that is subject to out of court settlement.” However, she warns that the ruling could lead to forum shopping since defamation laws are not harmonized. Plaintiffs may attempt to bring defamation cases in “claimant-friendly” jurisdictions knowing that they could result in global takedowns.