Content Regulation / Censorship, Privacy, Data Protection and Retention, Defamation / Reputation
Hegglin v. Google
Closed. Mixed Outcome.
The Paris Tribunal of Commerce held Google Inc. responsible for failing to respond to an individual’s request to stop its “Google Suggest” function from associating his name with words referring to his criminal past. The Court held that the appearance of personal data in combination with other terms constitutes processing of personal data, namely “communication by transmission” and “dissemination,” and that Google Inc. was therefore obliged to accede to the request. The Court distinguished this request from one seeking to block access to the contents or to delist them: it sought only to stop the association of certain phrases with a name, and so did not infringe the right to communicate information, freedom of expression, or the freedom to receive and seek information.
In October 2012, a French man, Mr. X., sent a request to Google Inc. and Google France to stop his first name and surname from being associated, in searches, with a link and a publication that damaged his reputation. The request related to a specific term, “…”, which appeared in the suggestions proposed by the software tools “Google Suggest” and “related searches” when his name was entered into the search engine.
In a letter dated November 5, 2012, Google Inc. indicated that it could not satisfy his request since the “keywords” are generated automatically from user requests. In response, Mr. X. again requested the removal of the disputed link and, having received no response, brought proceedings against Google Inc. and Google France, seeking a ruling that Google Inc. had not respected his right of objection, in violation of the French Law on Information Technology and Liberties of 6 January 1978 (the IT Law).
Article 2 of the IT Law stipulates that it applies “to automated processing of personal data when the person responsible for it meets the conditions of Article 5.” It defines personal data as “any information relating to an identified natural person or a person who can be identified, directly or indirectly, by reference to one or more elements that are specific to him”. According to that definition, the first name and surname of Mr. X. are personal data allowing him to be identified. According to the same article, processing of personal data constitutes “any operation or any set of operations concerning such data, whatever the process used and in particular the collection, recording, organization, conservation, adaptation …, communication by transmission, broadcasting or any other form of making available …”. Under Article 5, the IT Law applies where the data controller, the entity that determines which data are collected and stored and the modalities of their processing, uses processing means located on French territory to process personal data.
Judge Bagnérés delivered the judgment for the Paris Tribunal of Commerce. The main issue before the Court was whether Google Inc. should remove the terms Mr. X. requested, and so whether Google should be held accountable for not respecting his right of objection.
Mr. X. argued that the link caused him serious prejudice and damaged his professional reputation. He invoked his right of objection on the basis that Google Inc. is the data controller as defined by the law. Mr. X. argued that the appearance of the “…” term in “Google Suggest” and “related searches” in combination with his name constitutes a processing of personal data, since it is a “communication by transmission” and a “dissemination” of data. He maintained that Google Inc. performs a preliminary sorting of the requests registered in its database, since it is undisputed that Google Inc. has taken steps to exclude pornographic, violent, or hateful terms; this, he argued, engages its responsibility for these functions.
Google Inc. submitted that the keywords displayed on the site do not contain any meaning about a person as they are only a reference to a possible search on the site. It stated that “the simple juxtaposition of a combination of keywords does not carry any information or judgment”. Google Inc. argued that it does not “assemble information in order to achieve an objective of its own” as the law requires, and so the display of keywords is not the result of Google’s will but is “random, depending on the previous searches of Internet users”. Google Inc. submitted that it is a company under California law, and that it has no technical means located on French soil for the processing of personal data. Google France argued that the search suggestion engine is managed exclusively by Google Inc, so the claims made against it as the French company are misdirected.
The Court rejected Google Inc.’s reliance on the French law of July 29, 1881 relating to the freedom of the press, holding that it was unfounded as Mr. X was not acting under that law but rather under the IT Law relating to the automated treatment of personal data.
The Court held that Article 9 of the European Data Protection Directive (Directive 95/46/EC) was not applicable. Article 9 requires that Member States provide for exemptions and derogations for the processing of personal data when it is necessary to “reconcile the right to privacy with the rules governing freedom of expression carried out solely for the purposes of journalism or artistic or literary expression”. Google Inc. had argued that its search engine falls within the scope of freedom of expression and so is covered by this article, but the Court held that as Mr. X. had not sought the removal of articles or the delisting of his name from those articles the right to freedom of expression was not impacted.
Google Inc. had argued that its processing means on French territory were used only for transit purposes. Opinion 1/2008 of the Article 29 Data Protection Working Party, adopted on April 4, 2008, specifies that the notion of “means” refers to all means, whether automated or not, used on the territory of a Member State for the purpose of processing personal data, such as cookies and other similar software.
The Court held that Google Inc. does have computer processing facilities established in France, and that these are not used only for transit purposes.
Google France, however, was excluded from liability because the means necessary for operating the search engine are not used by Google France, which is responsible only for the commercial management of the services.
The Court dismissed Google Inc. and Google France’s request for nullity and declared Mr. X.’s action admissible. It dismissed the case against Google France and ordered Google Inc. to remove, within 30 calendar days, the words “…” from the suggestions offered by the software tools “Google Suggest” and “related searches” implemented by the Google search engine when the name of Mr. X. is entered, subject to a penalty of 1,000 euros for each violation of this order. The Court dismissed Mr. X.’s claim for damages but ordered Google Inc. to pay Mr. X. the sum of 10,000 euros under Article 700 of the French Code of Civil Procedure.