Global Freedom of Expression

Van Haga v. LinkedIn

Case Status: Closed

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    October 6, 2021
  • Outcome
    Decision Outcome (Disposition/Ruling), Other
  • Case Number
  • Region & Country
    Netherlands, Europe and Central Asia
  • Judicial Body
    First Instance Court
  • Type of Law
    Civil Law
  • Themes
    Content Moderation, Content Regulation / Censorship, Digital Rights, Political Expression, Integrity And Authenticity, Misinformation, Account Integrity and Authentic Identity
  • Tags
    Private entities, COVID-19, Members of the Legislative Branch, Public safety, Social Media, Content-Based Restriction, Individuals of public importance, Filtering and Blocking, Misinformation


Case Analysis

Case Summary and Outcome

In this case, the District Court of North Holland (‘the Court’) held that LinkedIn must reinstate a politician’s LinkedIn account. The plaintiff, Mr. Van Haga, is a Dutch politician and Member of Parliament whose LinkedIn account was disabled following complaints from users that nine posted messages contained misinformation. The Court observed that LinkedIn’s Covid-19 policy was vague and not clearly posted for users to follow. The Court further concluded that in view of the “essential nature of access to social media platforms as a means to make the right to freedom of expression effective in practice”, the termination of a social media service agreement (the restriction of an account) required extra safeguards, which LinkedIn had failed to provide. The Court therefore ordered the reinstatement of the account; however, the nine messages that had prompted the permanent restriction (for containing Covid-19 misinformation) did not have to be re-uploaded.


Facts

The plaintiff, Mr. Van Haga, is a Dutch politician and Member of Parliament. The defendants, LinkedIn Ireland Unlimited Company and LinkedIn Netherlands B.V., operate a professional online social network in the Netherlands and worldwide. Between 13 October and 18 November 2020, Mr. Van Haga posted three messages on the platform which LinkedIn later marked as “misinformation” and deleted, following complaints by other users. On 24 December 2020, LinkedIn temporarily restricted the plaintiff’s account, preventing him from logging in. The same day, Mr. Van Haga objected to the restriction via e-mail. On 30 December 2020, the restriction was lifted.

However, on 7 June 2021, LinkedIn again restricted the plaintiff’s account, this time permanently. The reason for the permanent restriction was nine messages that Mr. Van Haga had posted between 21 September 2020 and 5 June 2021. LinkedIn had marked these posts as “misinformation” and had deleted them shortly after complaints by several users. Through his attorney, Mr. Van Haga requested that the platform reinstate his profile. LinkedIn did not comply with the request, prompting him to initiate summary proceedings before the District Court of North Holland.

Decision Overview

The main issue before the Court was whether LinkedIn, by restricting the plaintiff’s LinkedIn account, had committed a breach of contract and/or had acted unlawfully against Mr. Van Haga (within the meaning of Article 6:162 of the Dutch Civil Code) and was therefore under an obligation to restore his account, including the nine messages that had been removed before the account’s restriction.

The plaintiff argued, first, that as an opposition politician he had a significant interest in exercising his right to freedom of expression, especially given his democratic duty to participate in public debate and to criticize the government. Second, he stated that the fact that LinkedIn had been acting in line with governmental guidance – thus creating a strong connection between a private social media platform and the State – would justify a greater horizontal effect of the right to freedom of expression as enshrined in Article 10 of the European Convention on Human Rights (ECHR). Third, he claimed that LinkedIn had not given any reasons for restricting his account. Lastly, he noted that the European Commission had clearly stated that freedom of expression should not be undermined by measures aimed at tackling misinformation.

LinkedIn, on the other hand, argued that it was trying to keep misinformation off its platform in response to the European Commission’s call on online platforms to tackle potentially harmful Covid-19 misinformation, and in response to the EU Code of Practice on Disinformation that LinkedIn’s parent company Microsoft had signed in 2018. LinkedIn further referred to the Proposal for a Digital Services Act (DSA), which stipulated that social media networks are not obliged to allow all (legal) user-generated content on their platforms (i.e., there is no ‘must-carry obligation’). In order to assess whether specific content could be classified as misinformation, LinkedIn had followed the guidelines communicated by authoritative health agencies: not only the World Health Organization (WHO) and the Dutch National Institute for Public Health and the Environment (RIVM), but also other authorities such as the European Centre for Disease Prevention and Control (ECDC), the Red Cross, the UN, the European Public Health Organization, UNICEF and the Dutch Ministry of Health, Welfare and Sport. Finally, it emphasized that users were allowed to spread critical messages and opinions on its platform, but that this did not extend to unsubstantiated medical claims. According to LinkedIn, all of the plaintiff’s deleted messages lacked scientific evidence.

In its assessment, the Court first noted that Article 10 ECHR addresses Member States of the Convention, not private parties such as LinkedIn. It also reiterated that, in principle, ECHR provisions do not have direct horizontal effect, meaning that a private party cannot directly invoke them against another private party. At the same time, the Court recognized that the content moderation policies of social media platforms had been influenced by governmental institutions, including the European Commission. It could even be argued that the infringement of the right to freedom of expression had been “instigated” by the government. According to the Court, this factor needed to be taken into consideration when determining how much freedom should be left to platforms in the moderation of content.

Next, the Court went on to assess LinkedIn’s policy of preventing Covid-19 misinformation on its platform. It found that LinkedIn’s choice to rely on authoritative governmental institutions as a means to establish whether content qualified as “harmful misinformation” could not be considered unreasonable, given that institutions like the WHO and RIVM are “independent” and “not operating under a politically driven mandate but under a politically neutral mandate to promote public health, combat the pandemic and/or publish advice based on scientific evidence”. The fact that their advice and views could change over time did not alter this conclusion, considering that scientific insights are constantly evolving. However, the Court observed that LinkedIn’s Covid-19 policy was – compared to those of other big platforms – barely put in writing. The only communication in this regard stated that users “may not share any content which fully contradicts the guidelines of leading, worldwide health organizations and governmental institutions for public health”. The Court did not find this statement informative, as it did not provide any guidance as to where LinkedIn drew the line between messages that fully contradicted these health guidelines and those that merely made critical remarks. During the court hearing, LinkedIn clarified that users were permitted to make critical remarks regarding these health guidelines, but that it did not allow unsubstantiated medical claims. However, the platform failed to convince the Court that this interpretation of its Covid-19 policy had been clear to the plaintiff at the time he posted the disputed messages. The Court thus held that the limited knowledge about LinkedIn’s Covid-19 policy had to be factored into the assessment of the removal of the disputed messages and the subsequent restriction of the plaintiff’s account.

After these general considerations about the Covid-19 policy, the Court turned to the specific measures that LinkedIn had taken against the plaintiff on the basis of this policy: the removal of messages and the permanent restriction of his account. According to the Court, such measures should have reflected a balance between the right to freedom of expression and the creation of a safe online environment. In this context, the Court observed that the forthcoming DSA contained relevant procedural frameworks to guide this balancing exercise, i.e., rules reflecting fundamental legal principles such as transparency, motivation and due care. The Court used these principles to evaluate how LinkedIn had proceeded in restricting Mr. Van Haga’s freedom of expression. Significantly, the Court noted that in view of the “essential nature of access to social media platforms as a means to make the right to freedom of expression effective in practice”, the termination of a social media service agreement (the restriction of an account) required extra safeguards. In other words, if LinkedIn wanted to terminate a user agreement to prevent the spread of misinformation on its platform, it should have done so with due care. LinkedIn, however, had failed to notify Mr. Van Haga about the removal of his posts and – apart from a single reference to the user agreement and general community policy – had not stated any substantiated reasons for the content removal or the restriction of the plaintiff’s account. The Court therefore held that the permanent restriction of Mr. Van Haga’s account had occurred without due care and thus ordered LinkedIn to reinstate his profile. In addition, the Court noted that having a “clear written policy” and a “procedure focused on communication with users and the exchange of viewpoints” is essential to give users the opportunity to learn.

With respect to the nine deleted messages, the Court stated that it would order these messages to be re-uploaded only if they did not contain any misinformation. Referring to commentary on European Court of Human Rights (ECtHR) case law, the Court held that where statements cause harm to others, a distinction must be made between factual statements and value judgments, and that harmful factual statements must have a sufficient factual basis. After assessing the nine deleted messages in more detail, the Court affirmed that LinkedIn had good reasons to believe that the messages contained harmful misinformation (for example, claims that young people do not get sick from Covid-19 and that Covid-19 vaccines are ‘experimental’) which could diminish the willingness of LinkedIn users to follow well-founded government advice and adhere to public health measures.

The fact that social media platforms all point to each other as ‘alternative channels’ for spreading (mis)information, so that essentially no channel is left, was – according to the Court – no reason to require platforms to host misinformation. If an evaluation showed that the dissemination of certain content must be prevented on one platform, it followed that other platforms should not be used as “megaphones” either. Moreover, as the Court pointed out, the user could still employ his own channels to bring messages to public attention.

Lastly, the Court highlighted that politicians not only have rights, but also obligations. During a worldwide pandemic, this meant that democratically established policies could be criticized but not undermined. Politicians should engage in respectful communication, make a clear delineation between facts and value judgments, and place their opinions within the correct scientific context.

In conclusion, the Court ordered LinkedIn to restore Mr. Van Haga’s account, excluding the nine removed messages, within three days of the judgment’s publication.

Decision Direction

Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Expands Expression

Contrary to previous Dutch case law on the removal and re-uploading of Covid-19 misinformation (Café Weltschmertz v. YouTube and Stichting Smart Exit, Stichting Viruswaarheid and Plaintiff sub 3 v. Facebook), the Court, in this case, explicitly recognized the importance of social media platforms as a means to effectively exercise the right to freedom of expression. Furthermore, it did not rely heavily on the criterion established in ECtHR’s Appleby v. United Kingdom—that there is no right to a forum of one’s choice. Instead, the Court focused on the procedural aspects of anti-misinformation measures, stressing that decisions about content removal and account restrictions must be made carefully by social media platforms, and in accordance with the principles of transparency, motivation and due care.

Global Perspective

Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Case Significance

Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

Official Case Documents
