The Italian Data Protection Supervisory Authority issued two interim measures restricting the social media platform TikTok from processing the data of users residing in Italy whose age could not be determined with certainty. The measures were issued following the accidental death of a child who allegedly died while trying to take part in the “Blackout challenge” on TikTok. European and Italian law provide specific protection to children regarding the processing of their personal data, and TikTok’s policy and its implementation of that policy were in violation of those rules.
Columbia Global Freedom of Expression notes that some of the information contained in this report was derived from secondary sources.
In January 2020, the Garante per la protezione dei dati personali – the Italian data protection Supervisory Authority (SA) – called on the European Data Protection Board (EDPB) to create a taskforce dedicated to the social media platform TikTok. In March 2020, the SA itself initiated an investigation into TikTok, focused on “data processing activities that would appear to fall short of the new legal framework applying to personal data protection”. On December 15, 2020, the SA brought formal proceedings, under note n. 47853, against TikTok, notifying the company of a number of violations of the European Union’s General Data Protection Regulation, 2016/679/EU (GDPR). The SA identified problems with the legal basis TikTok used to justify the processing and transfer of personal data abroad; the period for which personal data is stored; compliance with the principles of data protection by design and by default; and compliance with the rules adopted by the company to verify its users’ age (particularly for minors).
On December 20, 2020, the SA released a press release on its findings, criticizing TikTok’s “poor attention to the protection of children, easy-to-circumvent signup restrictions for kids; poor transparency and clarity in user information; and privacy-unfriendly default settings”. It noted that, although TikTok had a policy of denying access to its platform to children younger than 13 years old, this could be easily circumvented because the age verification procedure relied on the user’s self-declaration. As a result of the Coronavirus pandemic, TikTok requested and obtained an extension until January 29, 2021 to provide a written reply to the SA’s findings.
In January 2021, ten-year-old Antonella Sicomero died after she attempted the “blackout” challenge on TikTok – a challenge in which individuals choke themselves until they lose consciousness. On January 22, 2021, prompted by the outcry over Sicomero’s death, the SA issued the first of two measures (2021/20) imposing a temporary restriction on TikTok’s processing data of users residing in the Italian territory whose age could not be determined.
On January 26, 2021, TikTok provided a written reply setting out its intentions with regard to the SA’s December statement, and declared that it would implement the measure by: (a) re-verifying all users’ ages via a self-declaration of the data subject, without allowing further attempts once a user has declared themselves to be younger than 13 years old; (b) promoting an awareness campaign directed at minors’ parents; (c) doubling the number of Italian-speaking moderators entrusted with assessing the content published on the platform; and (d) improving the reporting function, which allows users to notify TikTok of profiles that belong, or appear to belong, to children younger than 13 years old. TikTok also stated that it would consider using artificial intelligence as a tool in support of the age verification procedure.
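The one-shot self-declaration gate described in point (a) can be sketched as a small state machine: a user who once declares an under-age birth date is locked out and given no further attempts. This is a minimal illustrative sketch, not TikTok’s actual implementation; the class and method names are hypothetical.

```python
from dataclasses import dataclass, field

MINIMUM_AGE = 13  # TikTok's contractual minimum age, per its terms of service


@dataclass
class AgeGate:
    """Hypothetical one-shot self-declaration gate: once a user declares
    an age below the minimum, no further attempts are accepted."""
    locked_out: set[str] = field(default_factory=set)

    def declare_age(self, user_id: str, age: int) -> bool:
        """Return True if the declared age grants access, False otherwise."""
        if user_id in self.locked_out:
            # A second attempt after an under-age declaration is refused,
            # regardless of the age now declared.
            return False
        if age < MINIMUM_AGE:
            self.locked_out.add(user_id)
            return False
        return True
```

As the SA’s December findings noted, a gate of this kind is only as strong as the self-declaration it relies on, which is why TikTok also floated artificial intelligence as a supporting verification tool.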
On February 8, 2021, TikTok started sending notifications to its users asking them to verify their age, and blocking the profiles of those who were younger than 13 years old.
On February 11, 2021, the SA issued a second measure (2021/61) restating what had been established on January 22, 2021.
The SA issued two decisions: 2021/20, on January 22, 2021; and 2021/61, on February 11, 2021. The SA is empowered by article 58(2)(f) of the GDPR to utilize “corrective power” to “impose a temporary or definitive restriction including a ban on processing”. Article 66(1) of the GDPR also empowers SAs “in exceptional circumstances” to “immediately adopt provisional measures intended to produce legal effects on its own territory with a specified period of validity which shall not exceed three months”. If an SA exercises its power under article 66(1) it must communicate the measure and the reasons for adopting it to the other SAs concerned (which, in this case, is Ireland’s Data Protection Commission), the Board and the Commission. Article 63 of the GDPR establishes a “consistency mechanism” to ensure consistency of application of the GDPR throughout the European Union, and requires that national SAs “shall cooperate with each other and, where relevant, with the Commission”. In terms of this consistency mechanism, upon request of “any supervisory authority the Chair of the Board or the Commission may request that any matter of general application or producing effects in more than one Member State be examined by the Board with a view to obtaining an opinion”.
The GDPR also addresses the age of consent for data processing. Article 8(1) applies “[w]here point (a) of Article 6(1) applies” – that is, where the lawfulness of the processing is grounded in the consent given by the data subject – in relation to information society services offered directly to a child. It provides that such processing is lawful where the child is “at least 16 years old”, and that “[w]here the child is below the age of 16 years, such processing shall be lawful only if and to the extent that consent is given or authorised by the holder of parental responsibility over the child”. Article 8(1) does allow member states to “provide by law for a lower age for those purposes provided that such lower age is not below 13 years”. Article 2-quinquies of the Italian Personal Data Protection Code – implementing article 8(1) of the GDPR – sets a minimum age of 14 years old to consent to the processing of personal data in relation to information society services. Given the provisions of the GDPR and the Italian Personal Data Protection Code, any consent given by users younger than 14 years old is not valid, and so, in those cases, TikTok’s processing is without legal basis and therefore unlawful.
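The interaction between the GDPR default, the member-state derogation, and parental authorisation can be expressed as a short rule. This is an illustrative sketch of the age-of-consent logic only, with hypothetical function names; it is not legal advice and abstracts away the many other conditions of lawful processing.

```python
from typing import Optional

GDPR_DEFAULT_AGE = 16  # Article 8(1) GDPR default age of digital consent
GDPR_FLOOR_AGE = 13    # lowest age a member state may lawfully set
ITALY_AGE = 14         # Article 2-quinquies, Italian Personal Data Protection Code


def member_state_age(national_age: Optional[int]) -> int:
    """Return the applicable age of digital consent for a member state.
    National law may lower the GDPR default, but not below 13."""
    if national_age is None:
        return GDPR_DEFAULT_AGE
    if national_age < GDPR_FLOOR_AGE:
        raise ValueError("national age may not be below 13 (Art. 8(1) GDPR)")
    return national_age


def consent_is_valid(age: int, national_age: Optional[int] = None,
                     parental_authorisation: bool = False) -> bool:
    """Consent is valid if the child meets the applicable threshold, or
    if the holder of parental responsibility gives or authorises it."""
    return age >= member_state_age(national_age) or parental_authorisation
```

Under the Italian threshold, `consent_is_valid(13, ITALY_AGE)` is false: a 13-year-old’s own consent supplies no legal basis, which is the gap the SA’s measures targeted.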
In measure 2021/20 the SA imposed on TikTok a temporary restriction on the processing of personal data of users, residing on the Italian territory, whose age could not be determined with certainty. Although the measure was preliminary, this restriction had immediate effect (subject to any further assessment carried out by the SA), and lasted until February 15, 2021.
The SA took into account that TikTok had not yet provided a written reply to the statement from the SA in December, and highlighted that the preliminary investigation carried out had brought to light serious shortcomings with regard to the age verification procedure adopted by the company. The SA made specific reference to three provisions which highlighted the importance of protecting children’s interests. It referred to article 24(2) of the Charter of Fundamental Rights of the European Union (“The rights of the child”), which states that “[i]n all actions relating to children, whether taken by public authorities or private institutions, the child’s best interests must be a primary consideration.” It also relied on recital number 38 of the GDPR which establishes that – with regard to personal data – “[c]hildren merit specific protection” because they “may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data”. Recital number 38 notes that processing of children’s personal data must be specifically protected for “the purposes of marketing or creating personality or user profiles and the collection of personal data with regard to children when using services offered directly to a child.” The SA also referred to article 25(1) of the GDPR (“Data protection by design and by default”), which requires that controllers of data must “implement appropriate technical and organisational measures … in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of this Regulation and protect the rights of data subjects.”
Although the provisional nature of the measure did not allow for an extended reasoning, Guido Scorza, a member of the SA, provided further information on the reasoning of the decision in an interview. He confirmed that TikTok acknowledged that its legal basis for processing users’ personal data is contractual, and that the sole purpose of TikTok processing the data is the performance of the contract between TikTok and the user (that is, so that TikTok can provide its services to the user), and that the contract stipulates that TikTok’s service is reserved for users who are at least 13 years old (or at least those who declare themselves to be 13 years old or older).
In the second measure, the SA noted that, as the notifications TikTok was sending to verify users’ ages had begun appearing only three days earlier, it was not possible to assess at that stage whether the measure adopted by TikTok was appropriate and effective. Accordingly, it extended the restriction established in the first measure to March 15, 2021.
Article 78 of the GDPR, article 152 of the Italian Personal Data Protection Code (Legislative Decree No. 196 of 30 June 2003), and article 10 of Legislative Decree No. 150 of 1 September 2011 allow an appeal against the SA’s decision to be filed in court within thirty days of the date of communication of the measure, or sixty days if the plaintiff resides abroad.
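The appeal window described above is a simple deadline computation. A minimal sketch, assuming the deadline runs in calendar days from the date of communication (the function name is hypothetical):

```python
from datetime import date, timedelta


def appeal_deadline(communication_date: date, resides_abroad: bool = False) -> date:
    """Last day to file an appeal against an SA decision: thirty days from
    communication of the measure, or sixty if the plaintiff resides abroad."""
    days = 60 if resides_abroad else 30
    return communication_date + timedelta(days=days)
```

For example, for the second measure communicated on February 11, 2021, a plaintiff residing in Italy would have until mid-March 2021 to file.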
Since issuing the two decisions, the SA has started to actively campaign in favor of stricter and more conscious parental controls on children’s activity on social media platforms. In the context of this campaign, the SA released a handbook detailing suggested behaviors that parents should adopt to protect children and prevent them from engaging in dangerous online interactions and activities.
Although they affect the operation of TikTok in Italy, the Supervisory Authority’s decisions strike a balance between protecting children’s data protection rights and social media platforms’ ability to operate, by requiring that social media companies abide by national, regional and international principles of data privacy, processing and protection.
The SA’s decisions may strongly influence other EU data protection authorities’ approach to age verification on social media platforms.