Global Freedom of Expression

Republic of Poland v. Parliament and Council

Closed • Mixed Outcome

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    April 26, 2022
  • Outcome
    Dismissed
  • Case Number
    C‑401/19
  • Region & Country
    Poland, Europe and Central Asia
  • Judicial Body
    Court of Justice of the European Union (CJEU)
  • Type of Law
    Intellectual Property/Copyright Law
  • Themes
    Content Regulation / Censorship
  • Tags
    Copyright, Filtering and Blocking

Case Analysis

Case Summary and Outcome

The Grand Chamber of the Court of Justice of the European Union (CJEU) dismissed the Republic of Poland’s action seeking annulment of Article 17 of Directive 2019/790 (the Copyright Directive), which established a new specific liability mechanism for online content-sharing service providers. Under Article 17, providers are directly liable where works and other protected subject matter are unlawfully uploaded by users of their services. The Grand Chamber examined the compatibility of the Copyright Directive’s filtering requirements with the Charter of Fundamental Rights of the European Union. Poland had argued that the provisions requiring providers to perform preventive monitoring by means of automated filtering tools did not provide adequate safeguards to ensure respect for the right to freedom of expression and information: faced with the risk of liability, platforms would resort to automated filtering that could undermine lawful online speech. Interpreting the provisions of the Copyright Directive for the first time, the Court held that the providers’ obligation to carry out a prior automatic review of content uploaded by users was accompanied by appropriate safeguards ensuring respect for users’ right to freedom of expression and information.


Facts

On 14 September 2016, the European Commission submitted a proposal for a directive on copyright in the Digital Single Market. The aim of that proposal was to adapt the EU rules in the field of literary and artistic property – copyright and rights related to copyright – in particular Directive 2001/29, to the evolution of digital technologies.

One of the provisions of the Copyright Directive, Article 17, governs the “use of protected content by online content-sharing service providers”. Under that provision, providers are directly liable where works and other protected subject matter are unlawfully uploaded by users of their services. The providers concerned may nevertheless be exempted from that liability. To that end, they are required, in accordance with the provisions of that article, to actively monitor the content uploaded by users in order to prevent the placing online of protected subject matter which rights holders do not wish to make available on those services. In determining whether a service provider has complied with its obligations, and in light of the principle of proportionality, account is taken of (a) the type, (b) the audience, and (c) the size of the service and the type of works or other subject matter uploaded by users of the service, and (d) the availability of suitable and effective means and their cost for service providers.

In the present case, by application dated 24 May 2019, the Republic of Poland (applicant) brought the present action, asking the Court to annul Article 17(4)(b) and (c) of the Directive, in so far as the wording “and made best efforts to prevent their future uploads in accordance with point (b)” is concerned. The applicant alleged an infringement of the right to freedom of expression and information guaranteed by Article 11(1) of the Charter of Fundamental Rights of the European Union (the Charter). According to Article 11(1) of the Charter, “everyone has the right to freedom of expression”, which includes “freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers”.

The applicant argued that the provisions of the Copyright Directive requiring providers to perform preventive monitoring by means of automated filtering tools did not provide adequate safeguards to ensure that the right to freedom of expression and information was respected: faced with such risks, platforms would resort to automated filtering that could undermine lawful online speech. In the alternative, the applicant submitted that the contested provisions (i.e. Article 17(4)(b) and (c)) could not be severed from the other provisions of Article 17 without altering its substance, and therefore asked the Court to annul Article 17 in its entirety. The European Parliament, for its part, contended that the action should be dismissed as unfounded.


Decision Overview

The Grand Chamber of the Court of Justice of the European Union delivered the judgment. The principal question before the Court was whether Article 17 of the Copyright Directive infringed Article 11(1) of the Charter, by failing to provide adequate safeguards to ensure that the right to freedom of expression and information was respected.

The applicant’s plea was based on the argument that, in order to be exempted from all liability for giving the public access to copyright-protected works or other protected subject matter uploaded by their users in breach of copyright, content-sharing service providers are required to carry out preventive monitoring of all content that users wish to upload. To do so, the service providers must use IT tools which enable the prior automatic filtering of that content. The applicant contended that, by imposing such measures, the provision limited the right to freedom of expression and information without justification [para. 24].

At the outset, the Court recalled that the liability of online content-sharing service providers for giving the public access to protected content, uploaded to their platforms by their users in breach of copyright, had previously been governed by Article 3 of Directive 2001/29 and Article 14 of Directive 2000/31. The Court referred to its earlier case law interpreting those provisions when assessing the validity of the liability regime introduced by Article 17 of the new Copyright Directive. First, the operator of a platform on which illegally posted content is available does not itself make a “communication to the public” of that content within the meaning of those provisions, unless it contributes, beyond merely making the platform available, to giving the public access to such content in breach of copyright. Second, the operator is exempt from liability only if it does not play an active role of a kind to give it knowledge of or control over the content uploaded to its platform, and it loses that exemption where it is aware that its users are committing such illegal acts [paras. 27 and 28]. However, given the volume of content now available online, the European legislature considered that a specific liability mechanism was necessary to foster the development of a fair licensing market between rights holders and service providers [para. 29].

Article 17(4) introduces a specific liability regime for situations where no authorization has been granted: a service provider can be exempted from liability only if it satisfies certain conditions. These conditions are threefold:

1) making best efforts to obtain an authorization;

2) making best efforts to ensure the unavailability of specific works and other protected subject matter for which the rights holders have provided the service providers with the relevant and necessary information; and

3) acting expeditiously, upon receiving a sufficiently substantiated notice from the rights holders, to disable access to, or to remove from their websites, the notified works or other protected subject matter, and making best efforts to prevent their future uploads [para. 35].

This specific liability regime is further specified in Articles 17(5) to (10) of the Directive. The applicant argued that the conditions set out in points 2 and 3 above limit the exercise of users’ right to freedom of expression and information as guaranteed by Article 11 of the Charter. In the applicant’s view, prior review carries the risk that lawful content will be blocked, and that blocking is all the more serious because it is determined by algorithms even before the content is disseminated [para. 41].

In this regard, the Grand Chamber observed that the rights guaranteed in Article 11 of the Charter have the same meaning and scope as those guaranteed in Article 10 of the European Convention on Human Rights (ECHR), which protects the right to freedom of expression and information. This is apparent from the Explanations relating to the Charter (OJ 2007 C 303) and follows from Article 52(3) of the Charter. Consequently, the Court concluded that the sharing of information on the internet via online content-sharing platforms falls within the scope of Article 10 ECHR and Article 11 of the Charter.

The Court observed that, according to the case law of the European Court of Human Rights (ECtHR), Article 10 ECHR protects not only the content of information but also the means of its dissemination, since any restriction on those means necessarily interferes with the freedom to receive and impart information, and the internet has become one of the principal means of disseminating information [paras. 46 and 47]. In determining whether the liability regime limits that right, the Court proceeded from the premise that service providers do not necessarily have authorization for all the protected content that may be uploaded to their platforms, and it noted that rights holders remain free to determine whether and under what conditions their subject matter is used [para. 48].

Furthermore, the Court concluded that providers who do not have an authorization must make “best efforts” to obtain one. First, it must be shown that the rights holders have sent them the relevant and necessary information enabling those best efforts; and second, that the rights holders have been notified that the content has been made available to the public [paras. 51 and 52]. On that basis, the Court held that the specific liability regime established by the Copyright Directive with respect to online content-sharing service providers inherently entails a limitation on the exercise of users’ right to freedom of expression and information [para. 58].

The Court also held that the Directive reflects ECtHR case law requiring that measures be strictly targeted, so as to enable effective protection of copyright without affecting users who are lawfully using the providers’ services [para. 81]. Echoing the Advocate General’s observations, the Court recalled that it had already held that a filtering system which might not adequately distinguish between lawful and unlawful content would be incompatible with the Charter, because it would fail to strike a fair balance between the right to freedom of expression and information and the right to intellectual property [para. 86]. Moreover, with regard to exceptions and limitations to copyright, the Directive requires Member States to ensure that users may make available content they have generated for specific purposes (such as quotation, criticism and review) [para. 87].

In light of the aforementioned reasons, the Court dismissed the plea of the applicant.


Decision Direction

Quick Info

Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Mixed Outcome

The judgment notes the tension between copyright filters and the right to freedom of expression, but it stops short of prohibiting upload filters as such. It remains to be seen what kinds of filtering measures the European courts would consider adequate if they are challenged. When Member States implement the Directive, they will have to ensure that national law gives service providers sufficient guidance to comply with Article 17.

Global Perspective

Quick Info

Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or regional laws

Case Significance

Quick Info

Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.
