Global Freedom of Expression

Danny Mekić v. X (formerly Twitter)

Case Status: Closed
Decision Direction: Expands Expression

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    July 4, 2024
  • Outcome
    Access to Information Granted
  • Case Number
    C/13/742407 / HARK 23-366
  • Region & Country
    Netherlands, Europe and Central Asia
  • Judicial Body
    First Instance Court
  • Type of Law
    Civil Law
  • Themes
    Digital Rights, Privacy, Data Protection and Retention
  • Tags
    Twitter/X, Data Protection and Retention

Content Attribution Policy

Global Freedom of Expression is an academic initiative and, therefore, we encourage you to share and republish excerpts of our content so long as they are not used for commercial purposes and you respect the following policy:

  • Attribute Columbia Global Freedom of Expression as the source.
  • Link to the original URL of the specific case analysis, publication, update, blog, or landing page of the downloadable content you are referencing.

Attribution, copyright, and license information for media used by Global Freedom of Expression is available on our Credits page.

Case Analysis

Case Summary and Outcome

A Dutch court ruled that X (formerly Twitter) failed to adequately respond to an individual’s request for access to information regarding a temporary restriction on his account. In October 2023, Amsterdam-based researcher Danny Mekić was informed by some of his followers that his posts were not appearing in search results on X. In response, Mekić submitted a comprehensive data access request about the restriction, covering the origin and source of his personal data, automated decision-making, reputation scores, labels, and the “Guano” system. Unsatisfied with X’s response, Mekić petitioned the Court to order X to respond to his access request as stipulated by Article 15 of the General Data Protection Regulation (GDPR) and, under Article 22 of the GDPR, to supply specific information about automated decision-making and about how X processed and used his personal data. The Court held that X had not provided sufficient transparency or detail, especially regarding automated decision-making, and ordered the company to fully comply with the request within one month, imposing a daily penalty of €4,000 for non-compliance.


Facts

On October 11, 2023, X (formerly Twitter) imposed a temporary restriction on the account of Danny Mekić, an Amsterdam-based researcher and PhD candidate in Leiden, following a tweet he made that contained the term “child pornography.” The post criticized European plans against child pornography, quoting another user and linking to a news article, and read: “The chats of hundreds of millions of people will soon be scanned to track down a relatively small number of criminals, no matter how bad.” X’s automatic detection system flagged the tweet as potentially violating its policy on combating child abuse, resulting in the restriction. The restriction meant that Mekić’s account and posted messages temporarily did not appear in searches. X did not notify Mekić of the restriction; he only became aware of it when other users informed him that they could not find his account.

In response to this discovery, Mekić submitted a comprehensive data access request to X on October 13, 2023. He supplemented this request on October 15 with a detailed list of the specific information he sought, including the origin and source of his personal data, the recipients of his data, the identity of data controllers, and information about any automated decisions made regarding his account.

On October 16, 2023, following an additional review, X lifted the restriction on Mekić’s account, acknowledging that it was unjustified. However, X did not communicate this decision to Mekić. 

On November 14, 2023, X responded to Mekić’s access request, referring to various sections of its Privacy Policy. 

Unsatisfied with this response, on November 17, 2023, Mekić filed an application against X before the Amsterdam District Court (Rechtbank Amsterdam) arguing that X had not adequately addressed his access request. Mekić petitioned the Court to issue a provisionally enforceable order requiring X to respond to his access request as stipulated by Article 15 of the General Data Protection Regulation (GDPR) and also sought information about automated decision-making under Article 22 of GDPR. To ensure compliance, Mekić requested a daily penalty of €4,000 for any non-compliance by X.

On January 12, 2024, X provided Mekić with more detailed information about the restriction. It explained that his October 11 post had triggered an automated system designed to scan for content potentially associated with child sexual exploitation. It emphasized that the restriction was not the result of any third-party notification and that it applies its policy consistently, based on its terms and conditions and the law. X referred to an email sent on November 14, 2023, which outlined its content moderation approach, including its “Freedom of Speech, not Freedom of Reach” philosophy. That email also acknowledged the use of automated mechanisms to analyze posts potentially linked to child sexual exploitation, which had resulted in the search ban or “shadowban” on his account. It further noted that safeguards are in place to avoid restricting the accounts of journalists or researchers, though X continues to evaluate and adjust these mechanisms.


Decision Overview

Judge M. Wouters of the Amsterdam District Court delivered the decision. The primary issue before the Court was to determine whether X’s response to Mekić’s GDPR access request was sufficient for him to verify the accuracy and legality of how his personal data had been processed. 

Mekić argued that X had failed to adequately comply with his request, as X had provided neither general access to his data nor the specific data he had requested. He submitted that X had not furnished his personal data in the appropriate format and had failed to provide information about automated decisions, their underlying logic, significance, and expected consequences. Mekić argued that, in his access request, he had sought additional information and access, including the origin and source of his personal data, particularly concerning a human rights organization that allegedly contacted X about his account restriction. He requested information on data shared with the US National Center for Missing and Exploited Children (NCMEC), given X’s owner Elon Musk’s statement about automatic reporting to this US child abuse watchdog. Mekić sought a comprehensive copy of his personal data as processed by X in order to understand how his information was registered in its systems, and requested details about any blacklists, shadowbans, or search blacklists affecting his account, citing journalistic articles claiming that X used such measures. He also asked for information about the reputation scores X used when imposing restrictions and for access to the “Guano” system, which he described as a chronological record of all actions taken on an account. Through these specific requests, Mekić aimed to gain a complete understanding of how X processed and used his personal data, particularly in relation to the restriction placed on his account.

During the oral hearing, Mekić requested specific information from X, including details about interactions with organizations regarding his account restriction, whether X automatically reported him to the NCMEC, and a full copy of his personal data.

X argued that it had complied with Mekić’s access request and had provided information about the creation and nature of the content moderation decision on November 14, 2023. It emphasized that it had lifted the restriction on Mekić’s account on October 16, 2023, and reiterated that it had informed Mekić about the restriction: the post that led to the measure, the context in which the measure was imposed, the fact that this is done automatically, that the restriction was lifted after further investigation, and that no other restrictions were in place by the time of the hearing. During the proceedings, X argued that the other information Mekić requested could not be provided because doing so would amount to disclosing trade secrets. X also submitted that Mekić might use the information for journalistic purposes and that the decision was not automated because the system’s parameters were set by humans.

The Court established that X is a data controller and that Mekić, as an X user, has the right to request access to his personal data under Article 15 of the GDPR. This includes specific information about how his data is processed, the purposes of the processing, the recipients of the data, and details about automated decision-making that affects him. The Court referred to the GDPR’s definition of “personal data” as “any information relating to an identified or identifiable natural person”. Citing the decision of the Court of Justice of the European Union in Peter Nowak v. Data Protection Commissioner (2017), the Court observed that the concept of personal data is not limited to sensitive or private information but potentially extends to any kind of information, whether objective or subjective information in the form of opinions or assessments, provided that the information concerns the data subject. It noted that the latter condition is fulfilled when the information is linked to a particular person by virtue of its content, purpose, or effect, by which that person is reasonably identifiable.

On the right to information about automated decision-making, the Court emphasized that data subjects have the right to information about decisions that produce legal effects or significantly affect them. It noted that, under the GDPR, individuals have the right to be informed about the existence of automated decision-making processes, especially when such decisions are based solely on automated processing and produce legal effects or significantly impact them. According to the Article 29 Working Party (WP29) Guidelines on Automated Decision-Making and Profiling, data processing is considered significant if it has substantial effects on an individual’s circumstances, behavior, or choices, creates long-lasting consequences, or, in extreme cases, leads to exclusion or discrimination. [para. 4.3]

On the provision of personal data and transparency, the Court held that under Article 5 of the GDPR, personal data must be processed lawfully, fairly, and transparently. This requires that the information X provides to users like Mekić be simple, accessible, and understandable. Articles 12, 13, and 14 of the GDPR expand on the transparency principle and emphasize that clear and straightforward language should be used. The Court also referred to the WP29 Transparency Guidelines, which advocate the use of concrete and clear language and the avoidance of indefinite terms like “may,” “could,” or “possibly.” It noted that Article 15(3) of the GDPR requires X to provide Mekić with a copy of his personal data being processed, and that this copy must be complete and accurate so that Mekić can verify the accuracy and lawfulness of the processing and exercise his rights under the GDPR, such as rectification or erasure. Referring to the Court of Justice of the European Union’s judgment in F.F. v. Österreichische Datenschutzbehörde (2023), the Court held that the information must be delivered in a concise, transparent, and understandable form, as required by Article 12(1) of the GDPR, and that the copy should include complete documents or extracts, allowing Mekić to understand the context of the data processing. [paras. 4.4-4.6]

The Court held that X is obliged to take active steps to ensure that Mekić receives the necessary information; merely referring him to general terms of use is insufficient. In line with the Court of Justice of the European Union’s ruling in RW v. Österreichische Post (2023), access to “recipients or categories of recipients” under Article 15(1)(c) of the GDPR requires that X provide the identities of the recipients unless this is impossible or the request is manifestly unfounded or excessive. X may refuse access to protect the rights and freedoms of others, including its own interests such as trade secrets, as outlined in Article 15(4) of the GDPR and Recital 63, but such a refusal should not deprive Mekić of all relevant information. [paras. 4.7-4.9]

In assessing Mekić’s access request, the Court considered whether it was made for legitimate purposes or constituted an abuse of rights. The Court found no evidence of abuse, noting that Mekić, as a journalist pursuing a PhD in automated decision-making, has the right to access his personal data, and held that X’s argument that Mekić might use the information for journalistic purposes did not establish any abuse of rights. The Court noted that no bad intentions had emerged and that, even if Mekić had them, this would not in itself constitute an abuse of rights, as it would not establish that this was the only reason for exercising his right of access. Accordingly, the Court upheld Mekić’s right to access his data. [paras. 4.10-4.11]

The Court recognized that Mekić’s access request was both specific and general: it related primarily to the shadowban imposed on his account but also extended to other personal data concerning his account history. The Court found X’s initial response vague and general, and held that it was insufficiently transparent and concrete, as it merely referred Mekić to X’s Privacy Policy and failed to provide specific details on how his personal data was processed. The Court noted that this lack of detail prevented Mekić from verifying the accuracy and lawfulness of the data processing, and therefore did not comply with the GDPR’s transparency requirements. The Court emphasized that X must provide Mekić with a comprehensive copy of his personal data, including information on third parties who received his data and the context of the data processing, to enable him to exercise his rights effectively under the GDPR. [paras. 4.12-4.18]

The Court dismissed X’s argument about trade secrets, stating that X had neither specified which personal data might be compromised nor provided a clear explanation for withholding information; a general claim of trade secrets does not exempt X from its GDPR obligations, and Mekić had not requested sensitive elements, such as algorithms, that could constitute trade secrets. The Court added that X’s reference to the “X Data Archive” was untimely and irrelevant, since it had not been cited in earlier responses to Mekić’s request and since the archive, which primarily records user actions, does not satisfy X’s obligation to provide detailed information on data processing. The Court also rejected X’s claim regarding an email sent on November 14, 2023, holding that it was unrelated to the access request and therefore not considered in the case.

The Court found that the restriction X imposed on Mekić’s account was an automated decision, meaning it was made solely by a system without human intervention, and it therefore rejected X’s argument that the decision was not automated because the system’s parameters were set by humans. The Court stressed that the key issue was that the restriction itself was enforced through automation, with human involvement occurring only at a later stage, if a review was necessary. Since the automated decision significantly impacted Mekić, affecting his visibility and professional profile and potentially leading to serious consequences if linked to child abuse, the Court ruled that X had to be transparent about the automated decision-making process. It held that, under Articles 13 and 14 of the GDPR, X should have informed Mekić about the decision at the time it was made and provided details on the process, allowing Mekić to challenge the restriction effectively. It also held that X’s delayed and incomplete response to Mekić’s access request did not comply with Article 15(1)(h) of the GDPR, which requires transparency about automated decision-making. X was therefore required to provide Mekić with comprehensive information about the automated decision, including the factors considered in making it, to enable him to understand and contest it. The Court emphasized that X’s obligations under the GDPR include addressing automated decisions transparently and providing all necessary information to the affected individual. [paras. 4.23-4.27]

The Court rejected the claims about organizational contacts and automatic reports to NCMEC due to insufficient evidence. It did find that X had not sufficiently addressed Mekić’s request for a full copy of his personal data and ordered X to comply with this part of the request. Additionally, the Court required X to provide insight into reputation scores and labels, as it found X’s denial of their existence unconvincing based on evidence presented by Mekić. X was also ordered to grant access to data processed by the “Guano” system, which tracks account actions, as this constitutes personal data that Mekić needs to verify its accuracy and legality. [paras. 4.28-4.36]

Accordingly, the Court ordered X to fully comply with Mekić’s general access request under Article 15(1) of the GDPR within one month, including by providing detailed information about automated decision-making, reputation scores, labels, and the “Guano” system. A penalty of €4,000 was imposed for each day X failed to meet this deadline. The Court also ordered X to pay the costs of the proceedings, totaling €456.81.


Decision Direction

Quick Info

Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Expands Expression

The Court’s ruling in this case primarily addresses the intersection of data protection and transparency rather than directly impacting freedom of expression, but by enforcing these transparency requirements, it ensures that individuals have access to the critical information needed to challenge decisions that significantly affect them, thereby upholding principles of accountability and fairness.

While the decision does not explicitly focus on freedom of expression, it indirectly supports it by ensuring that automated decisions, which can restrict an individual’s online presence and professional visibility, are made transparently. This transparency allows affected individuals to understand and contest such decisions, which can be crucial for maintaining their ability to express themselves freely. Therefore, although the ruling primarily centers on data protection and transparency, it ultimately supports freedom of expression by preventing opaque and potentially unjust restrictions on individuals’ online activities.

See the related case: Danny Mekić v. X (formerly Twitter) | Shadow Banning

Global Perspective

Quick Info

Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or regional laws

  • CJEU, F.F. v. Österreichische Datenschutzbehörde, C-487/21 (2023)
  • CJEU, RW v. Österreichische Post, C-154/21 (2023)
  • CJEU, Peter Nowak v. Data Protection Commissioner, C-434/16 (2017)
  • EU, General Data Protection Regulation (GDPR), Art. 5
  • EU, General Data Protection Regulation (GDPR), Art. 12
  • EU, General Data Protection Regulation (GDPR), Art. 13
  • EU, General Data Protection Regulation (GDPR), Art. 14
  • EU, General Data Protection Regulation (GDPR), Art. 15
  • EU, General Data Protection Regulation (GDPR), Art. 22

Case Significance

Quick Info

Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.
