
X v. Union of India

Closed • Mixed Outcome

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    April 20, 2021
  • Outcome
    Blocking or filtering of information
  • Case Number
    W.P.(CRL) 1082/2020
  • Region
    Asia and Asia Pacific
  • Judicial Body
    First Instance Court
  • Type of Law
    Cybercrime Law
  • Themes
    Cyber Security / Cyber Crime, Indecency / Obscenity, Intermediary Liability, Privacy, Data Protection and Retention
  • Tags
    Revenge Porn, Google, Social Media, Same or Similar Content, De-Index

Content Attribution Policy

Global Freedom of Expression is an academic initiative; we therefore encourage you to share and republish excerpts of our content, so long as they are not used for commercial purposes and you respect the following policy:

  • Attribute Columbia Global Freedom of Expression as the source.
  • Link to the original URL of the specific case analysis, publication, update, blog or landing page of the downloadable content you are referencing.

Attribution, copyright, and license information for media used by Global Freedom of Expression is available on our Credits page.

Case Analysis

Case Summary and Outcome

The High Court of Delhi, India, ordered the police to remove content unlawfully published on a pornographic website, ordered search engines to de-index that content from their search results, and directed all parties to take action to prevent further publication of identical or similar content. After a woman discovered that photographs from her social media accounts had been published on a pornographic website without her knowledge or consent, she filed a complaint with the police and the National Cyber-Crime Reporting Portal and then approached the High Court. The Court stressed the need for “immediate and efficacious” remedies for victims in cases like this, as well as the need to balance the obligations of internet intermediaries against the rights of users, and set out the types of directions a court can issue in such cases.


Facts

X, an Indian woman, discovered that photographs she had posted on her Facebook and Instagram accounts had been re-posted on a pornographic website, www.xhamster.com, by an unknown entity without her consent. X had the requisite privacy settings activated on those accounts. The photographs were not obscene or indecent in themselves, but X believed that their posting constituted an offence under section 67 of the Information Technology Act, 2000 (the IT Act) and that the derogatory captions added to them fell within other offences under the IT Act and the Indian Penal Code. Section 67 makes it an offence to publish and transmit material “that appeals to the prurient interests, and which has the effect of tending to deprave and corrupt persons, who are likely to see the photographs” [para. 2]. As soon as she discovered the photographs, X filed a complaint with the police and the National Cyber-Crime Reporting Portal, but neither authority acted and, within a week of their posting, the photographs had been viewed 15,000 times.

X then filed a petition before the High Court of Delhi. During interim proceedings, the Cyber Prevention Awareness and Detection Unit (CyPAD) submitted that, due to technological limitations and impediments, it could not assure the Court that it would be able to remove the photographs from the internet entirely. The Court made an interim order for the removal of the photographs, but X notified the Court that the photographs had been re-published and re-posted on other websites, nullifying the effect of the interim order.

To assist it, the Court appointed Dr. Pavan Duggal, an advocate specializing in cyber-law and cyber-crime, as amicus curiae.


Decision Overview

Justice Anup Jairam Bhambhani delivered the judgment of the High Court of Delhi. The Court described the main issues before it as: 1) when a party seeks the removal of content from the internet, “what directions are required to be passed by a court to make its order implementable and effective; and to which parties are such directions required to be issued?”; and 2) “[w]hat steps are required to be taken by law enforcement agencies to implement such directions issued by a court” so that the content does not “resurface” and the parties “do not succeed in brazenly evading compliance of such orders/directions with impunity?” [para. 11]. All parties made submissions on the directions they believed a court could issue in such cases.

Delhi Police argued that directions must be issued to intermediaries under section 79 of the IT Act, the Penal Code, and the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (the 2021 Rules) to “ensure successful removal of offensive and unlawful content from the world-wide-web and to prevent such content being re-posted, re-transmitted or republished” [para. 42]. The Police stated that law enforcement does make requests to intermediaries for information or for the removal of content, but that intermediaries do not always cooperate.

Google LLC explained that, as a search engine, it only indexed information – and so could not be categorized as a “publisher” – and described the function of a search engine as reactive rather than proactive: it could only remove content upon notification by courts or government agencies. It emphasized that it was technologically impossible for it to remove content where the issue is not the content per se but the context in which the content was published, and referred to Myspace Inc vs. Super Cassettes Industries 2017 (69) PTC 1 (Del) (DB), which had warned of the dangers to free speech when intermediaries, unable to evaluate the context of content, remove it wholesale. Google submitted that any prior restraint or “blanket ban” on the publication of content would infringe the right to freedom of expression and have a chilling effect, but that it had “no opposition to removing access to the offending content” and had processes in place to allow for requests for content to be removed, including a “dedicated webform for [Indian] Governmental Agencies to report content that is unlawful” [paras. 45-46]. On how a court can make an order effective, Google stressed that “the surest way to ensure that offending content is not accessible” is to remove it from the webpage on which it is hosted – but that only the owner of the website can do that [para. 57]. Google maintained that it is exempt from liability for third-party content on its search engine.

The Ministry of Electronics and Information Technology submitted that a court could make one of three directions: direct the intermediary to remove the content; direct the relevant government agency to have the content removed; or grant a petitioner the right to request intermediaries, government agencies or the police to “seek removal of unlawful content” [para. 65]. The Ministry also submitted that a petitioner could file a complaint before either the relevant government agencies or the “Grievance Officer” of the relevant intermediary, and highlighted that the Online Cyber-crime Reporting Portal had been expanded to cover all cybercrimes – including crimes of the kind at issue in the present case.

The Internet Service Providers Association of India (ISPAI) – a voluntary association of 82 service providers in India – explained that service providers are “mandated by law to comply with any ‘blocking orders’ issued by competent government authorities or a court” but that its members “cannot regulate the content of any intermediary” [para. 69]. The ISPAI emphasized that service providers “enable access to the internet but do not control the content hosted on websites or online platforms” and that, because of the encryption mechanisms in use, “it is technically impossible for an internet service provider to block unlawful content” as only the website can do that [para. 70]. The ISPAI’s suggestion to the Court was that service providers can comply with directions to block entire websites but cannot “sift or monitor content or to block content partially” [para. 71].

Facebook stressed that X had not sought any relief against it – despite the photographs having been taken from X’s Facebook and Instagram accounts – and that it had robust privacy processes to protect users’ content. Facebook confirmed that it remained willing to cooperate with law enforcement agencies and courts to remove any unlawful content.

The Court noted that interim relief is important in cases like this as “if the court is not in a position to pass effective and implementable orders and is unable to ensure that such orders are complied with at the interim stage, subsequent adjudication of the matter could well be rendered infructuous” [para. 8].

The Court conducted a thorough analysis of the legislative framework, including the IT Act, the Information Technology (Procedure and Safeguards for Blocking for Access of Information by Public) Rules 2009, the Information Technology (Intermediaries Guidelines) Rules 2011 (the 2011 Rules) and the 2021 Rules (which replaced the 2011 Rules). The Court focused on the effect of the exemption from liability for intermediaries in section 79 of the IT Act, which provides that an “intermediary shall not be liable for any third party information, data, or communication link made available or hosted by him” in specific circumstances.

The Court compared the 2011 and 2021 Rules and observed that the 2021 Rules “sharpened” the regime governing internet intermediaries. It noted that the Rules now require intermediaries to notify users that they cannot publish any information that “belongs to another person and to which the user does not have any right”, and require the intermediary not to host any unlawful information – including “information that is violative of decency or morality, upon receiving actual knowledge about such information in the form of a court order or on being notified by the appropriate government or its agency” [para. 15]. The Rules oblige the intermediary to remove content within 36 hours of receipt of a court order or notice from a government agency. Intermediaries are also required to provide any governmental agency with information within 72 hours of a request “for the purposes of investigation or cyber-security or protection, for the purposes of verification of identity, or for the prevention, detection, investigation or prosecution of offences” [para. 16]. The Rules also oblige an intermediary to nominate a “Grievance Officer” to enable a victim to make complaints to the intermediary. In addition, the 2021 Rules introduce the concept of a “significant social media intermediary” and impose specific obligations on that category of intermediary.

In summary, the Court concluded that, in the 2021 Rules, the Central Government had sharpened the liabilities of intermediaries in dealing with unlawful content; had reduced the time period for removing material from 1 month to 24 hours; and had removed the exemption from liability that intermediaries enjoyed under section 79 of the IT Act where an intermediary fails to remove unlawful content. The Court commented that the changes had been introduced by the Government “appreciating the fact that to effectively remove or disable access to unlawful content, it is imperative that action be initiated immediately since any delay in such action can render the same ineffective and futile” [para. 29].

The Court also analysed the IT Act’s provisions, noting that the IT Act provides “extraterritorial jurisdiction and overriding application” as long as the “computer, computer system or computer network involved are located within India” [para. 74]. It added that although section 79(1) exempted intermediaries from certain liability, this exemption was “not unqualified or unconditional and applied only if the intermediary fulfils certain conditions and obligations” [para. 78]. The Court also characterized the 2021 Rules as demonstrating that the legislature intended section 79 to impose penal consequences on an intermediary who fails to observe the law – and stressed that the IT Act and the Rules establish a regime under which intermediaries enjoy the exemption from liability only to the extent that they obey the Rules. Here the Court referred to Shreya Singhal vs. Union of India (2015) 5 SCC 1, which had held that “an intermediary would lose the exemption from liability that it enjoyed under section 79(1) if it did not ‘expeditiously remove or disable access to’ offending content or material despite receiving ‘actual knowledge’.” [para. 81]. The Court set out the requirements for the exemption under section 79 to apply: the intermediary’s function is limited to “providing access to a communication system”; the intermediary does not initiate the transmission, select its receiver, or select or modify the information transmitted; the intermediary “observes due diligence”; the intermediary has not “conspired, abetted or induced the commission of an unlawful act”; and the intermediary has not failed to “expeditiously remove or disable access” to content after receiving notice of its unlawfulness [para. 82]. It also noted that the IT Act makes the “directors, manager, secretary or other officer of a company” liable where a contravention of the Act or Rules is attributable to that person’s neglect [para. 84].

The Court examined comparative law, including the Australian case of X v. Twitter [2017] NSWSC 1300; the European Court of Justice cases of Google Spain SL, Google Inc. vs. Agencia Española de Protección de Datos (AEPD) Case C-131/12, ECLI:EU:C:2014:317 and Eva Glawischnig-Piesczek vs. Facebook Ireland Limited Case C-18/18, ECLI:EU:C:2019:821; and the Canadian cases of Google Inc. vs. Equustek Solutions Inc. et al 2017 SCC 34 and Equustek Solutions Inc. vs. Jack 2018 BCSC 610. The Court also noted that Indian courts have issued injunction orders in respect of online content, and referred to Swami Ramdev & Anr. vs. Facebook, Inc. & Ors 2019 SCC OnLine Del 10701, YouTube LLC & Anr. vs. Geeta Shrof 2018 SCC OnLine Del 9439, ABC vs. DEF & Ors CS(OS) No.160/2017, and Shreya Singhal vs. Union of India (2015) 5 SCC 1.

In response to Google’s submissions, the Court stressed that it did not contemplate any “prior restraint or blanket-ban” and that it was “conscious” that it should not place an “untenable burden” on an intermediary; any directions must be capable of being carried out and proportionate, removing only the offending content [paras. 63-64].

The Court highlighted that X’s photographs were not obscene or offensive in themselves, but because they had been taken from her social media accounts without her consent and uploaded to a pornographic website alongside “derogatory captions”, their publication constituted an offence under section 67 of the IT Act, as the “only purpose of posting the petitioner’s photograph on a pornographic website could be to use it to appeal to the prurient interests of those who are likely to see it” [para. 85]. It added that publication of these images would likely result in “ostracization and stigmatization”, which required an “immediate and efficacious remedy” [para. 85].

The Court stated that, for its order to be effective – “even within India” – an intermediary would have to block the content globally [para. 89]. With reference to Google Inc v. Equustek Solutions, the Court stressed that this is something a search engine can do with ease. It provided a list of directions which it believed would achieve a “fair balance between the obligations and liabilities of the intermediaries and the rights and interests of the aggrieved user/victim” and constitute an order that was “legal, implementable, effective and would enable meaningful compliance of the orders of a court without putting any impossible or untenable burden on intermediaries” [para. 90]. These directions could include directing: (i) the host website to remove the content within 24 hours; (ii) the host website to preserve all content for at least 180 days in case of any further investigation; (iii) the search engine to de-index the content; (iv) the intermediaries to “endeavour to employ pro-active monitoring” to remove identical content; (v) the law enforcement agencies to obtain all relevant information from the respective websites; (vi) the aggrieved party to provide law enforcement with all relevant information; (vii) that the aggrieved party may notify law enforcement to remove the same or similar content from any other website; (viii) the aggrieved party to notify the National Cyber-Crime Reporting Portal; and (ix) that the intermediary would lose the exemption from liability if it failed to adhere to the directions [para. 90].

The Court stressed that “directions issued by a court seized of a case such as the present one, must be specific, pointed and issued to all necessary parties”, and that this would ensure that – although the nature of the internet means that content can never be “completely removed” – content can be made “non-searchable” by de-indexing it [para. 91].

The Court held that it was clear that section 67 had been contravened by the reposting of X’s photographs on www.xhamster.com, and that it needed to direct “the State and other respondents to forthwith remove and/or disable access to the offending content … to the maximum extent possible” [para. 92]. It directed that: (i) X provide law enforcement with all relevant information; (ii) the Delhi Police remove or disable access to the offending content from “all websites and online platforms”; (iii) the search engines – Google, Yahoo Search, Microsoft Bing and DuckDuckGo – de-index all offending content from their search results; (iv) the search engines adopt proactive methods to identify and remove identical content; (v) the investigating officer provide all the other entities referred to in the order with the relevant information to allow them to remove or disable access to the content; (vi) the Delhi Police obtain all relevant information about the offending content from www.xhamster.com and the search engines for further investigation; (vii) X be permitted to request the investigating officer to remove the same or similar content from other websites; and (viii) any platform may approach the Court for clarification. The Court stressed that any failure to comply with the order would result in the intermediary losing its exemption from liability.


Decision Direction


Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Mixed Outcome

In balancing two competing rights, the Court stressed the need to protect the rights of internet intermediaries and not to impose overly onerous content-removal obligations on them, while recognizing the need to protect internet users from the consequences of unlawful cyber behavior.


Case Significance


Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.

