
Mrs X v. Union of India

Closed
Mixed Outcome

Key Details

  • Mode of Expression
    Electronic / Internet-based Communication
  • Date of Decision
    April 26, 2023
  • Outcome
    Judgment in Favor of Petitioner
  • Case Number
    W.P.(CRL)1505/2021
  • Region & Country
    India, Asia and Asia Pacific
  • Judicial Body
    First Instance Court
  • Type of Law
    Constitutional Law, Cybercrime Law
  • Themes
    Digital Rights, Intermediary Liability, Safety, Privacy Violations, Adult Sexual Exploitation
  • Tags
    Non-Consensual Intimate Images (NCII), Social Media


Case Analysis

Case Summary and Outcome

The Delhi High Court held that, in cases of Non-Consensual Intimate Images (NCII), intermediaries are required to remove all offending content from their platforms, and not just the specific URLs provided by users/victims. A woman had had explicit photographs of her posted online by a third party and had been unsuccessful in various attempts to have the images removed. The Court highlighted the damage the posting of NCII can cause, and how requiring victims to search the internet for new uploads of these images in order to request their removal can compound their trauma. The Court held that the Indian legislative framework requires intermediaries to make “reasonable efforts” to ensure that their users do not post content that does not belong to them or is obscene, and that intermediaries must apply technology to ensure that reposts of offending images are removed.


Facts

In December 2019, an Indian woman, Mrs X, met an Indian man, Richesh Manav Singhal, online. Mrs X was married and had a young son. Mrs X and Singhal exchanged contact numbers and in July 2020 he came over to her rented accommodation, “forced himself upon her” and transferred explicit photos of Mrs X (which she “had taken of herself for the purpose of sharing them with her husband”) from her phone to his. [p. 3] It appeared that Singhal had also involved Mrs X’s minor son in sexual acts.

Singhal threatened to leak Mrs X’s explicit photos and kill her son if she did not pay him “huge amounts of money”. [p. 3] After Mrs X had depleted her available funds and given Singhal jewelry, he carried out his threat, leaking the photos to pornographic websites without her consent.

In August 2021, Mrs X filed a police complaint against Singhal, on the grounds that he had made a YouTube channel in her name and posted daily explicit videos and photographs of her. Mrs X also approached Google, Microsoft, Bing, YouTube and Vimeo seeking the removal of the posts, and lodged complaints with “cybercrime.gov.in”. All those attempts at having the images removed were unsuccessful.

Mrs X then approached the High Court of Delhi, seeking the blocking of the sites carrying her intimate images. Although attempts had been made to remove all links to the images, they were “consistently being re-produced and re-uploaded”. [para. 6] The Court appointed a senior counsel, Saurabh Kirpal, as amicus curiae to assist it on the question of what directions it could issue to internet intermediaries (like Google and Microsoft), in cases where images removed by those intermediaries are re-uploaded, so as to protect the rights of individuals while respecting the intermediaries’ duties.

In March 2022, the Court learnt that Singhal had been arrested, that 83,000 explicit photos – including of Mrs X – had been found on a laptop, and that he was involved in other cases. The case therefore became moot, as Singhal would no longer be able to re-upload photos of Mrs X, but the Court decided to proceed with the matter “to ensure that the victims like [Mrs X] are not forced to approach the authorities/intermediaries including the search engine repeatedly for removal of any offending content”. [para. 10]

After judgment had been reserved, the government alerted the Court to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2022.


Decision Overview

Justice Subramonium Prasad delivered the High Court’s judgment. The main issue for determination was the liability of intermediaries like Google and Microsoft in removing Non-Consensual Intimate Images (NCII) from the internet.

Google argued that it had made every effort to prevent the presence of offensive content on its index, disabling re-uploads and removing problematic channels from YouTube [para. 35]. Regarding its search engine, Google LLC argued that it did not have any control over the content of third-party websites and merely indexed content made available by third parties on their websites/platforms [para. 35]. It stated that it would be futile to issue directions only to search engines and not to the third-party websites which were the primary sources of NCII content. While highlighting that Parliament did not intend to “task intermediaries with policing and monitoring of content under the garb of due diligence”, Google LLC submitted that “allowing intermediaries to apply its own mind to adjudge the legitimacy of online content will lead to chilling free speech, over-blocking as well as circumventing the fundamental right of online speech of third-parties” [para. 35].

Microsoft argued that its search engine, Bing, did not have technology that could automatically find and delete NCII, and so was only able to delete content after receiving notification; it submitted that “[i]t becomes incumbent upon the user/victim to work with webpage owners to remove the content from the internet in its entirety” [para. 36]. Microsoft further submitted that “requiring intermediaries to filter content would amount to excessive and unreasonable restrictions on the fundamental right to freedom of speech and expression” [para. 36].

The Ministry of Electronics and Information Technology (MEITY) argued that although the intermediaries are obligated to remove offending content within 24 hours of learning about it, “proactive monitoring and removal of content will adversely affect the freedom of speech and expression of other individuals having the same or similar name as that of the Petitioner” [para. 37].

The Delhi Police explained that it was taking measures to “monitor and prosecute offences against women and children on the internet”, including a website feature which assists women and children in lodging a complaint and the establishment of dedicated District Cyber Police Stations.

The amicus curiae explained that, under various provisions of the Information Technology Act, 2000 (IT Act) and the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (IT Rules), intermediaries were obligated to remove all offending content from their platforms, not just the specific URLs provided by users [p. 34]. According to him, intermediaries would retain immunity under the “safe harbour” protection only if they fulfilled their legal obligations as prescribed in Section 79(3)(b) [p. 34].

The Court defined Non-Consensual Intimate Images (NCII) as “sexual content that is distributed without the consent of those who are being depicted in the said content … [it] may or may not be taken with the consent of the individual involved, however, its dissemination is largely meant to be non-consensual and comes under the larger umbrella of cyber-harassment” [para. 13]. The Court acknowledged the colloquial term “revenge porn”, but noted that this is just one form of NCII. The Court acknowledged that the distribution of this content affects victims’ mental health and causes other “life disruptions” such as job losses and family rejection, even though the public generally regards these offences as less serious than sexual harassment and molestation [para. 13]. Given the increase in NCII cases and in internet accessibility, the Court stated that it was necessary to understand how the IT Act and its Rules address NCII abuse, and the role and obligations intermediaries have in the distribution of NCII and the prevention of its abuse.

In analyzing the existing legal framework, the Court noted that NCII is not explicitly defined in either the IT Act or the Rules. However, Rule 3(2)(b) does set out the intermediary’s grievance redressal mechanism and “more or less defines NCII as ‘any content which prima facie exposes the private area of any individual/shows such individual in full or partial nudity/shows or depicts such individual in any sexual act or conduct/is in the nature of impersonation in an electronic form, including artificially morphed images’” [p. 15]. This Rule is “not a charging offence”, but Section 66E of the IT Act criminalizes the violation of privacy where someone “intentionally or knowingly captures, publishes or transmits the image of a private area of any person without his or her consent, under circumstances violating the privacy of that person”. Section 67 of the Act “provides punishment for publishing or transmitting of obscene material in electronic form” [p. 17].

The Court examined the role of intermediaries in removing NCII and said that although the “originators” – those who publish the content – are responsible for uploading it, intermediaries are involved in its spread and ongoing presence online. Section 79 of the IT Act is a “safe harbour provision” and so exempts intermediaries from incurring liability. Rule 3 then places obligations on intermediaries, including making “reasonable efforts to cause the user of its computer resource not to host, display, upload, modify, publish, transmit, store, update or share any information that, inter alia, belongs to another person and to which the user does not have any right, or is obscene, pornographic, paedophilic, invasive of another’s privacy including bodily privacy, insulting or harassing on the basis of gender” [para. 23]. The Rule “categorically states that if any intermediary fails to observe the rules, the safe harbour protection … will stand vitiated and the intermediary shall stand liable for prosecution under any law” [para. 29]. The Court confirmed that search engines are intermediaries.

The Court described the uploading of the NCII in the present case as a “clear violation of the provisions of the IT Act and IT Rules … [and] a violation of the right to privacy” [para. 41]. The Court referred to the Supreme Court case of Puttaswamy v. Union of India in setting out the elements of privacy, including “communicational privacy which is reflected in enabling an individual to restrict access to communications or control the use of information which is communicated to third parties” and “informational privacy which reflects an interest in preventing information about the self from being disseminated and controlling the extent of access to information” [para. 41]. The Court also noted that an individual has the right to “exercise control over personal data [which] would also encompass an individual’s right to control their existence on the internet” [para. 42]. It added that the “right to privacy is also inextricably linked with the right to live a life with dignity” and that the expectation of privacy extends to domestic relationships [para. 43].

With reference to the Vysakh K.G. v. Union of India case, the Court noted the finding there that “in the digital context, ‘the right to delisting’ and ‘right to oblivion’ are facets of the right to be forgotten” [para. 44].

The Court then examined the obligations of the search engines – as intermediaries – in the present case. It rejected the search engines’ argument that, because they are not responsible for the material hosted on websites, requests for takedowns must be sent to the publishers and not to the search engines. The Court found that it was “at this stage that a search engine’s role in ensuring that one’s right to privacy is not contravened comes into prominence” [para. 45]. It referred to the Court of Justice of the European Union’s cases of Google Spain v. Agencia Española de Protección de Datos, which recognized the right to be forgotten, and Google LLC v. Commission Nationale de l’informatique et des libertes (CNIL), which found that “it was for the search engine operator to take, if necessary, sufficiently effective measures to ensure the effective protection of the fundamental rights of the [user]” [para. 47]. The Court noted that the Supreme Court of Justice of Argentina, in Da Cunha v. Yahoo de Argentina SRL, had found that the European Union’s and the US’s regimes conferred immunity on intermediaries only once they had obtained “actual knowledge” of any offence and did not require monitoring of third-party data [para. 48]. It referred to the Vysakh K.G. judgment again, stating that it had “succinctly observed that Google cannot be said to be content-blind to publications made online and it is not a mere passive conduit” [para. 49].

The Court stressed that Rule 3 of the IT Rules now “explicitly pronounces the obligation of the intermediary to not only ‘inform’, but to make ‘reasonable efforts’ to ensure that its users do not publish content that is prohibited” by the Rules [para. 50]. It referred to a paper by L.M. Hinman, “Searching Ethics: The Role of Search Engines in the Construction and Distribution of Knowledge”, and found that “search engines can no longer be said to be just providing access to knowledge, but are playing a central role in the constitution of knowledge itself” [para. 51]. It noted that the term “Google” has become a verb, meaning “search”. The Court commented that de-indexing a specific URL bears on the right to be forgotten because it actually makes it more difficult for a victim “to access the offending material if they are not already in possession of the specific URLs” [para. 51].

The Court held that there is a “social obligation” on intermediaries to be “proactive in de-indexing such links when it comes to its knowledge that such content is illegal” and that it is “unfathomable” that search engines “feign helplessness” when faced with links to illegal content [para. 52]. It described the idea that a victim must “sift through” content on the internet to alert search engines each time illegal content is identified as “unconscionable” and “untenable” [para. 52]. The Court rejected the argument that search engines did not have the technology to remove content without a victim having to re-approach them, identifying technology from Google, Meta and Microsoft that creates a “unique identifier/fingerprint/hash” of offending content and enables other copies of the images online to be identified and taken down. However, although the Court was “of the opinion” that large entities like Google and Microsoft “cannot abscond or withdraw from their duties to the public at large in the name of reducing the liability they might incur”, it accepted that there was a “negative impact” on the right to freedom of expression if those entities were obliged to proactively filter content. The Court noted that, irrespective of the goals of the technology, “its application may lead to consequences that are far worse and dictatorial” [para. 54].
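The judgment does not specify how this fingerprinting technology works; the following Python sketch is purely illustrative of the general hash-registry approach, and every name in it is hypothetical. Production systems such as Microsoft’s PhotoDNA or Meta’s PDQ use perceptual hashes, which also match resized or re-encoded copies, whereas the cryptographic SHA-256 digest used here matches only byte-identical files.

    import hashlib
    from pathlib import Path

    def fingerprint(path: Path) -> str:
        """Compute a hex digest that serves as the file's unique identifier."""
        return hashlib.sha256(path.read_bytes()).hexdigest()

    class NCIIRegistry:
        """Hypothetical registry of fingerprints of reported illegal content."""

        def __init__(self) -> None:
            self._blocked: set[str] = set()

        def register(self, path: Path) -> str:
            """Fingerprint a reported file; the digest returned could act as
            the victim's 'token or digital identifier' for follow-up requests."""
            digest = fingerprint(path)
            self._blocked.add(digest)
            return digest

        def should_block(self, path: Path) -> bool:
            """Check a new upload against the registry before it is published."""
            return fingerprint(path) in self._blocked

On this model, a re-upload of a byte-identical image is caught automatically at upload time, without the victim having to locate and report the new URL.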

In assessing whether the right to free speech had been violated, the Court accepted that it must assess the impact on the right to freedom of expression when it considers an individual’s right to privacy. The Court said that the key issue was the meaning of “such content” in Rule 3(2)(b) and whether it meant “a specific instance of identified NCII” or “all content of identical nature” [para. 55]. It interpreted the phrase to mean “all content”, so as to protect the victim, but held that the “all” refers only to content that has been reported [para. 55]. The Court referred to the case of X v. Union of India, which had required intermediaries to “engage in proactive monitoring and removal of NCII content that the Court had deemed to be illegal”, and accepted that an appeal had been filed against that judgment but that until its determination the initial order stands [para. 56].

The Court interpreted the Rules as creating additional obligations on intermediaries which “widens their scope of losing their safe harbour” [para. 57]. The Court described the search engines’ conduct as “lackadaisical” and found that Shreya Singhal v. Union of India was not applicable because, in the present case, there had already been a court order instructing them to take down the relevant content, and the IT Rules had not been in place when that case was decided. The Court held that if the framework were interpreted so that a victim had to approach search engines each time, that would “frustrate the purpose of the … IT Rules” [para. 58]. It emphasized that having to continually search the internet for URLs to report to search engines would subject victims to trauma. It also reasoned that the obligation on search engines is not “proactive” if they are only required to filter content after they have been informed about it. The Court described the argument that intermediaries do not host or publish content as irrelevant to the question of whether they are required to remove content, and found that it was “undeniable that they do have the ability, the capacity, and the legal obligation to disable access to the offending content” [para. 59]. It stated that “this responsibility of the search engine cannot be brushed under the carpet on the ground that it does not host content” [para. 59].

The Court “painfully” noted the “abysmal absence of a collaborative effort that should ideally be undertaken by the intermediaries and the State” [para. 60]. It commented that time was lost when entities shifted responsibility and that this encourages offenders because of the lack of consequences. The Court added that this impacts on the available legal avenues and causes emotional and reputational harm to the victim, especially as in India “NCII abuse does indeed lead to harrowing consequences and everlasting stigma for the victim” [para. 60].

The Court made several recommendations [para. 61]:

  • the victim should submit an affidavit to the Court with the “specific audio, visual images and key words” and the URLs she requests be taken down;
  • the intermediaries’ Grievance Officer (required by the IT Rules) “must be appropriately sensitised” to a broad definition of NCII abuse;
  • the Indian Government’s “Online Cybercrime Reporting Portal” must enable the victim to track the complaint;
  • the Delhi Police must register a formal complaint as soon as they receive information about NCII abuse;
  • all District Cyber Police Stations must have a dedicated officer to liaise with intermediaries and instruct them to cooperate;
  • there must be a fully-functioning helpline for reporting NCII abuse;
  • search engines must employ the technology they have to identify and take down material similar to the original complaint, and victims must be given a “token or digital identifier” to ensure that future related offending content is removed without a further court application;
  • intermediaries must publish reporting mechanisms for users on their websites;
  • intermediaries must follow the IT Rules’ timeframes; and
  • MEITY should develop a “trusted third-party encrypted platform”.

The Court stated that it “hopes that the directions and suggestions” it provided will be followed, and noted the “reluctance exhibited by intermediaries” despite a legal framework requiring the takedown of NCII content [para. 62]. The Court described the IT Act and IT Rules as “comprehensive and unambiguous in delineating the nature of obligations of intermediaries” [para. 64].

Accordingly, the Court disposed of the petition with these directions.


Decision Direction


Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Mixed Outcome

This case highlights the obligations and responsibilities of intermediaries, including search engines and social media platforms, concerning the removal of Non-Consensual Intimate Image (NCII) content. The ruling takes a balanced approach which protects the right to freedom of speech and expression under Article 19 of the Indian Constitution while ensuring that offensive and illegal content, such as NCII, is promptly taken down to safeguard individuals’ rights to privacy and dignity. The Court emphasized that intermediaries have a social obligation to be proactive in de-indexing links once they become aware that content is illegal, thereby addressing the impact on victims’ mental and emotional well-being. Additionally, the Court clarified the due diligence requirements for different types of intermediaries under the Information Technology Act and Rules, ensuring that they fulfil their responsibilities without infringing citizens’ constitutional rights.


Case Significance


Case significance refers to how influential the case is and how its significance changes over time.

The decision establishes a binding or persuasive precedent within its jurisdiction.
