On September 22, 2021, the US District Court for the District of Columbia held that Facebook must disclose materials relating to the incitement of ethnic hatred against the Muslim-minority Rohingya in Myanmar. In November 2019, the Republic of The Gambia had initiated proceedings against Myanmar claiming breach of its obligations under international law on account of its ill-treatment of the Rohingya minority. In January 2020, the International Court of Justice (“ICJ”) indicated provisional measures requiring Myanmar to prevent the commission of genocidal acts against the Rohingya Muslims. Given Facebook’s role as the main platform for online news in Myanmar at that time, The Gambia filed a discovery request in the US District Court for the District of Columbia to uncover public and private communications as well as documents concerning the content that Facebook had deleted following the genocide. The Court granted The Gambia’s request for de-platformed content and internal investigation documents, determining that Facebook’s deleted content was not subject to the non-disclosure rule of the Stored Communications Act (“SCA”), and that pages or posts accessible to the public prior to their deletion by Facebook fell within a statutory exception to the non-disclosure rule.
In 2012, Myanmar witnessed an outbreak of violent acts across Rakhine State, home of the Rohingya Muslims. The violence caused large-scale displacement affecting ethnic Rakhine and Muslims, including burning and looting of houses and summary executions. Multiple international human rights organizations, including the United Nations Human Rights Council (“UN Mission”), concluded that the violence in Rakhine was “pre-planned and instigated and that the Myanmar security forces were actively involved and complicit” [p. 3]. Beginning in October 2016, the Myanmar military conducted “clearance operations” resulting in mass targeted killings, executions, disappearances, detention and torture of Rohingya civilians, as well as rape and other sexual and gender-based violence.
One of the findings of the UN Mission in Myanmar pertained to the influential role Facebook played in disseminating articles, being “by far the most common social media platform in use in Myanmar” for online news [p. 4]. Myanmar officials frequently relied on Facebook to release news and information, and media outlets also used it as a primary channel for publishing stories. In response to these claims, Facebook, in October 2018, commissioned a human rights impact assessment (“HRIA”) of its presence in Myanmar. According to the HRIA report, Facebook “[was] the internet” in Myanmar and Myanmar officials were able to “credibly spread rumours about people and events” through use of the platform [p. 5]. Consequently, Facebook contributed immensely to shaping public perception against the Rohingya: its platform was used to spread anti-Muslim sentiments and disinformation, leading to communal violence and mob justice. Moreover, the hate campaign also saw several instances of the entire Rohingya community being branded as “illegal migrants”. For instance, on June 1, 2012, Zaw Htay, spokesperson for the President of Myanmar, posted a statement on his Facebook account equating Rohingyas with “terrorists”, contributing significantly to the 2012 violence which erupted a week later.
In 2018, Facebook issued an update on Myanmar, recognizing that it was “too slow to prevent misinformation and hate” [p. 6]. In August 2018, Facebook banned accounts of key individuals and organizations in Myanmar (including commander-in-chief of Myanmar’s armed forces Min Aung Hlaing and the military’s television network) and deleted independent news and opinion pages which covertly pushed the messages of the Myanmar military. Terming these “violations of its terms of service”, Facebook deleted 438 pages, 17 groups and 160 Facebook and Instagram accounts, followed by nearly 12 million people, for engaging in “coordinated inauthentic behaviour” to perpetuate misinformation and hate speech against the Rohingya [p. 6]. However, it preserved the content it deleted.
In November 2019, the Republic of The Gambia instituted proceedings in the International Court of Justice against Myanmar, to hold Myanmar accountable for the crime of genocide against the Rohingya under the 1948 Convention on the Prevention and Punishment of the Crime of Genocide. In January 2020, the ICJ indicated provisional measures requiring Myanmar to prevent the commission of genocidal acts against the Rohingya and the destruction of evidence. While The Gambia’s application was pending at the ICJ, on June 5, 2020, it filed a discovery request pursuant to 28 U.S.C. § 1782 with the US District Court for the District of Columbia, requesting electronic content, specifically documents and communications that were produced, drafted, posted or published by the individuals and government agencies whose Facebook accounts were suspended or terminated. The request also encompassed all documents of any related internal investigations conducted by Facebook into content policy violations concerning these individuals and entities. Notably, The Gambia also requested a Rule 30(b)(6) deposition of Facebook to help interpret the documents Facebook was asked to produce.
Facebook opposed the request, asserting that 18 U.S.C. § 2702 of the Stored Communications Act (“SCA”) barred it from disclosing the material. It also argued that the discovery request was unduly burdensome and that the information could be sought through other channels, asking the court to exercise its discretion to deny the request.
Magistrate Judge Zia Faruqui delivered the judgment of the United States District Court for the District of Columbia. The principal issue before the Court was whether the deleted content requested by The Gambia was subject to disclosure by Facebook under the provisions of the SCA.
According to 28 U.S.C. § 1782, a US federal court is authorized to order testimony or the production of documents “for use in a proceeding in a foreign or international tribunal” pursuant to a request by such a tribunal or “any interested person”. Essentially, three factors must be met in deciding whether to grant or deny the application: (1) the person resides or is found in the district, (2) the discovery requested will be used in a proceeding before a foreign or international tribunal, and (3) the request is made by an interested person [p. 7]. By contrast, under 18 U.S.C. § 2702 of the SCA, an entity providing an electronic communication service (“ECS”) to the public “shall not knowingly divulge to any person or entity the contents of a communication while in electronic storage by that service”. Under the SCA, a “user” is “any person or entity who—(A) uses an [ECS]; and (B) is duly authorized by the provider of such service to engage in such use”. Moreover, the SCA defines “person” expansively, to include “U.S. government agents and individuals” [p. 10].
Before the District Court, The Gambia raised a preliminary objection that Myanmar government officials were not “protected users” under the SCA and, therefore, were not entitled to protection against disclosure. It argued that by explicitly listing US government agents in the definition of “person”, Congress had sought to forgo protection for foreign government agents under the SCA. The Gambia’s attempt to partition these categories was unsuccessful, however, as the Court noted that all individuals, including foreign government agents, were entitled to protection under the SCA. Furthermore, Facebook was held to be an ECS for the purposes of the SCA.
With respect to whether the records sought by The Gambia qualified within the definition of “electronic storage”, the Court distinguished between two types of electronic storage: temporary storage (i.e. “any temporary, intermediate storage of a wire or electronic communication incidental to the electronic transmission thereof”) and “backup storage” (i.e. “any storage of such communication by an electronic communication service for purposes of backup protection of such communication”). Both parties agreed that the question before the Court was whether the records sought qualified as backup storage.
The Court delved into an interpretive exercise to determine the true meaning of the word “backup”, holding that the content deleted by Facebook was not in backup storage because it had been permanently taken off the platform, and no backup copy can exist without the original, thereby rendering the non-disclosure rule of the SCA inapplicable [p. 15]. Facebook had argued that the deleted content constituted backup storage since it remained on Facebook’s servers in proximity to where active content on the platform is stored, but the Court agreed with The Gambia’s reasoning that deleted content to which a user has no access does not constitute backup storage. The Court also considered the purpose for which the data was stored, finding that Facebook kept the data for self-reflection (it claimed it retained the records as part of an autopsy of its role in the Rohingya genocide) and not for backup. According to the District Court, Congress had restricted the scope of the SCA to protect backup storage, not all electronic storage. Notably, the Court cited numerous judgments in support of its view (for instance, Sartori v. Schrodt, 424 F. Supp. 3d 1121 (N.D. Fla. 2019); Flagg v. City of Detroit, 252 F.R.D. 346 (E.D. Mich. 2008); Theofel v. Farey-Jones, 359 F.3d 1070 (9th Cir. 2004); Hately v. Watts, 917 F.3d 785).
With respect to Facebook’s reliance on Hately specifically (where the court reasoned that “a wire or electronic communication is stored for ‘purposes of backup protection’ if it is a ‘copy’ or ‘duplicate’ of the communication stored to prevent, among other things, its ‘destruction’”, p. 17), the Court held that Facebook had itself destroyed the content, and while it retained offline access to it, that access was not meant to prevent the content’s destruction on the platform.
Facebook had also argued that a narrow interpretation of “backup storage” would have “sweeping privacy implications”, as even on the deactivation of a user’s account, the content of the user’s communications would become available for disclosure to anyone, including the US government [p. 18]. At the outset, the Court noted that Facebook’s concerns about disclosure damaging the right to privacy had little relevance here, since the coordinated inauthentic accounts (i.e., fake accounts that violated the terms of service) had no privacy rights vis-à-vis Facebook. More importantly, the Court deemed it necessary to balance the right to privacy against the need to uncover the cause of the Rohingya genocide, while also noting that the privacy implications were minimal, as the requested content otherwise still permeated social media.
Nonetheless, the Court did note that there were exceptions to the SCA which permitted disclosure of otherwise protected content, the most relevant being the “consent exception”, under which a provider is allowed to divulge the contents of a communication with the consent of the originator. The Gambia had also invoked a “provider protection” exception (that “a provider may divulge the contents of a communication . . . as may be necessarily incident to the rendition of the service or to the protection of the rights or property of the provider of that service”, p. 20) to argue for disclosure of sensitive content. Citing Facebook, Inc. v. Super. Ct., 417 P.3d 725, 751 (2018), the Court agreed with The Gambia that a court may compel production of communications excepted from SCA protection, even though the statute’s use of the word “may” might suggest that disclosure under an SCA exception is purely voluntary on the part of the provider.
With respect to the consent exception, while the Court agreed that there is no magic number of accessible viewers for content to trigger the exception, it instead looked to answer “whether the posts had been configured by the user as being sufficiently restricted that they are not readily available to the general public” [p. 22]. On the facts, the Court concluded that Myanmar officials intended their reach to be public, as making their accounts and pages private would have defeated their goal of inflaming hatred against the Rohingya. As a result, outside of private messages, the content that The Gambia requested fell within the consent exception and discovery was therefore appropriate. The Court did not concur with The Gambia’s provider protection argument, however.
Facebook also argued that The Gambia’s request was overbroad, offering no meaningful metric for identifying accounts, and unduly burdensome. The Court rejected this argument, noting that the scope of The Gambia’s request was very specific: de-platformed content dating back to 2012 that was relevant to the ICJ case (i.e. documents related to hate speech and [the] incitement to violence on Facebook). Such a review, in the Court’s view, presented minimal difficulties, as Facebook had publicly touted the strength of its Myanmar-language team and its content-review capabilities. Facebook’s request that the Court require The Gambia to first exhaust alternative avenues of discovery was dismissed as well; the Court stated that no law supported a quasi-exhaustion requirement in this case.
Finally, with respect to discovery of Facebook’s internal investigation documents, the Court, on similar grounds, ordered the platform to produce any non-privileged documentation related to its internal investigation. The Court noted that the internal investigation records requested by The Gambia had a viable purpose: illuminating how Facebook connected the seemingly unrelated inauthentic accounts to Myanmar government officials, and which accounts or pages were operated by the same Myanmar government officials or from the same government locations. The Gambia’s final request for a Rule 30(b)(6) deposition of Facebook was held to be unduly burdensome and was therefore rejected by the Court.
In conclusion, while the Court acknowledged that the SCA needed an update in line with current times, it nevertheless laid down a clear path for disclosure of deleted content, and Facebook was required to comply with The Gambia’s request for discovery of de-platformed content and related internal investigation documents.
Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.
The judgment expands expression.
This case presents a rare mix of privacy, content moderation and freedom of expression issues on the internet. Facebook’s content moderation in Myanmar was acknowledged by the UN Special Rapporteur on freedom of expression in a 2019 report as one of the crucial tools for moderating problematic content. Yet, Facebook’s resistance to the discovery request on grounds of non-disclosure and privacy was considered by the Court a restriction on freedom of expression. The Court in this case also acknowledged that “[t]he question of how social media platforms can respect the freedom of expression rights of users while also protecting [users] from harm is one of the most pressing challenges of our time” [p. 13]. At the same time, it acknowledged that the right to privacy in this case had to be balanced against the need to uncover the cause of the Rohingya genocide [p. 18].
Regardless, this case is an important win for several human rights accountability initiatives, such as The Gambia’s ICJ case, which continue to challenge Facebook’s arbitrary enforcement of its community standards as well as its content decisions concerning state actors in jurisdictions such as Myanmar.