Dr. Agnès Callamard gave the speech below at the UN event titled the “70th Anniversary of the Convention on the Prevention and Punishment of the Crime of Genocide and the International Day of Commemoration and Dignity of the Victims of the Crime of Genocide and of the Prevention of this Crime” on 7 December 2018.
History has told us, painfully, that large-scale human atrocities such as ethnic cleansing and genocide are usually preceded by sustained exposure to, and a routinization of, hatred, expressed and acted upon. This was particularly well documented in the case of Nazi Germany by the German philologist Viktor Klemperer. He wrote, “It isn’t only Nazi actions that have to vanish, but also the Nazi cast of mind, the typical Nazi way of thinking, and its breeding ground: the language of Nazism”[i].
Fast forward 70 years, to 2018: the world we inhabit has become digitally networked. Information technologies, and the Internet in particular, have enabled global public discourse on an unprecedented scale[ii]. They have reconfigured the public sphere so that, in the words of an Internet scholar, it is characterized by a “complex interaction of publics, online and offline, all intertwined, multiple, connected, but also transnational and global”[iii].
Speakers and categories of language and speech have multiplied exponentially as well, from legacy media to individuals to trolls to algorithms.
In this environment, how are we to respond to Viktor Klemperer’s warning? What is, or rather what are, the languages of Nazism in the networked 21st century? How do they influence ways of thinking? How do they spread, circulate and take hold of people’s minds, so that a Final Solution no longer feels unthinkable and is then acted upon? How should we respond to them while protecting the borderless circulation of ideas and freedom of expression in this formidable digital space?
I – WHAT DO WE KNOW?
When it comes to the digital, networked world, there is still much we do not know or understand. For instance, how does this still very new technology affect the human mind, or the process of socialization?
As far as atrocity crimes are concerned, there too there are many gaps in our understanding, but these are not directly related to the new technology. Hate speech and atrocity crimes have both been the object of much research and study. The relationship between the two has been the object of well-known court cases and decisions, from the Nuremberg tribunal[iv] to the International Criminal Tribunal for Rwanda[v]. Still, our understanding of the relationship between the language and speech of hatred, atrocity-justifying ideology and speech[vi], and the actual execution of atrocity crimes remains incomplete.
Nevertheless, the last few years have taught us many important lessons about how these issues operate in the online world.
The primary lesson is this: we should not let the seemingly chaotic and unruly nature of the digital world fool us. There are leaders, organization and accountability there too.
1. Organization and Planning
Atrocity crimes are not “spontaneous.” Their execution demands planning and organization, including in terms of justification and incitement.
In-depth research into the one 21st-century example of atrocity crimes fueled by social media – that of the Rohingya in Myanmar – has shown that the spread of hatred against the Rohingya through Facebook did not result from structural flaws of the Internet, driven by the general public and bots. The wide circulation of misinformation was not organic. It was planned and organized.
An investigation by The New York Times showed that the spread of hatred was the result of systematic and covert exploitation by the Myanmar Military, which created large Facebook followings for fake pages and accounts flooded with hate propaganda and disinformation[vii].
The UN-commissioned independent Fact Finding Mission (FFM) into Myanmar found rampant hate speech in Myanmar disseminated through Facebook[viii]. The FFM’s report found that the company’s response to the misuse of the Platform to spread hatred had been “slow and ineffective.” It called for an independent examination of the extent to which Facebook posts and messages had increased discrimination and violence.
2. The Leaders
We know from research conducted into atrocity crimes over the last century (and beyond) that killers are not “fanatical, wild-eyed ‘ideological’ warriors” but are, on the contrary, “ordinary people”, possibly “passive” and “indifferent”[ix].
The assimilation of beliefs depends upon people with moral authority such as political leaders, intellectuals, church and community elders, but also simply other peer-group members. And this applies as well to beliefs of hatred and fears[x].
Research with Rwandan genocide perpetrators showed that peer pressure from male neighbors and kin exerted more influence on their participation in the killing than did government and radio propaganda, but that the radio acted as a form of communication between elites, who then recruited on a personal or kin basis[xi].
As the Myanmar example shows, and as other research has demonstrated[xii], there are and were “digital leaders”. They are often leaders in the offline world as well.
3. Entrepreneurs of Hatred
There is little doubt that “The new modes of sociality enabled by the various so-called ‘new’ social media may be a central contributing factor to the growth and ubiquity of hate speech in the public sphere” [xiii].
The online speech of polarization entrepreneurs, like their offline speech, dehumanizes, attributes guilt, constructs threats, asserts the existence of hidden enemies, raises alarm about survival and the future, and constructs, bit by bit, final solutions.
And they disseminate and infuse these beliefs with rumors and factually incorrect narratives presented as facts.
4. Rumors and Lies
There too, the evidence points to a level of organization and planning that the seemingly chaotic digital space does not readily convey. Campaigns based on rumors, lies and vilification are well orchestrated[xvii].
A University of Oxford report released in July 2018[xviii] found “evidence of formally organized social media manipulation campaigns” in 48 countries in 2018, up from 28 a year earlier. All but one of the countries currently under some level of ICC investigation are included in this list.
“Cyber troops,” fake accounts and bots are employed to manipulate public opinion online, to manufacture consensus, and to subvert democratic processes. In most countries this involves the spread of junk news and misinformation during elections, military crises and complex humanitarian disasters.
Again, there is nothing “organic” or inherent to digital technology in this phenomenon. Instead, there are individuals and leaders keen to use the unprecedented capacities of the technology for ends varying from political opportunism to the spread of atrocity-justifying speech and ideology[xix].
II – RESPONSES AND PREVENTION
How do we respond to the misuse of digital technology and social media for the purpose of inciting and justifying genocide and atrocities? In the little time allotted to me this afternoon, I will highlight a few suggestions.
1. Risk Analysis and Early Warning

Social media companies must integrate well-resourced risk and early-warning analysis into their operations, allowing them to identify months ahead the possible “hot spots” – countries, communities or events at risk of the spread of hate, manipulation, propaganda and violence – which could ultimately lead to mass atrocities and genocide. Such early analysis will permit better preparation, better handling, fewer knee-jerk reactions, more proactive thinking and fewer heavy-handed reactive responses. These are complex phenomena demanding complex analysis, but on this journey the companies can benefit from expertise and experience outside their bubbles.

And this requires far more resources than the companies appear prepared to devote. For this, they must be called out.
Facebook has said it will have 100 Burmese content moderators by the end of 2018. But if the Military has, as the NYT article alleges, as many as 700 people working on these propaganda campaigns, Facebook is clearly “outgunned”[xx]. The company and other social media companies must do more, far more, to protect their platforms against such clear misappropriation of the technology.
2. Documentation – Identification of the Digital Leadership and of the Working of the Campaign of Hatred
Such investment is ongoing, as recent statements from the CEOs of Facebook, Twitter and Google testify, detailing for instance how many accounts have been closed or blocked[xxi]. These measures should be the object of regular reporting, along with data and information regarding the masterminds, the impact, and so on. Transparency regarding methodology, extent and impact, along with the existence of appeal processes, will go a long way towards establishing effective and trusted mechanisms, protective of an inclusive global public space[xxii].
Social media companies should also seek to preserve evidence relating to crimes under international criminal law. “Facebook has been criticized in the past for removing posts that could be evidence of war crimes, but it has confirmed it is ‘preserving data’ on the Burmese accounts and Pages it has removed in the latest rounds of takedowns. The FFM has called on Facebook to make this data available to judicial authorities to enable accountability”[xxiii].
3. Fact-checking and Responses
Here too, steps have been taken to address lies and propaganda spread through social media. Journalists and media houses are working with social media companies as fact-checkers, for instance[xxiv]. It will take a year or more, and proper independent assessment of the workings of these fact-checkers and other mechanisms, to determine whether they are effective.
Lies should not go undisturbed and unchallenged. The actual facts may not convince those who have already bought into the messages. But the global public space cannot be dominated by a single message, by a monologue.
It is, in many ways, everyone’s responsibility to denounce a lie.
External, transnational actors have a clear role to play to actually contest the claims of leaders, groups and individuals who are deploying the justificatory mechanisms to push for violence.
It is a primary responsibility of people with moral authority, such as the UN Secretary-General or the Special Adviser on the Prevention of Genocide, to respond to lies, including, or particularly so, when they are uttered by other individuals in positions of authority.
4. Supporting and Protecting
Interventions of this nature may often be more effective when done indirectly by supporting and protecting local actors, credible authorities, active civil society organizations or representatives and journalists who can respond effectively to incitement and atrocity-justifying ideologies: “Even if this only obstructs those who call for violence rather than stopping them outright, it could have significant life-saving impacts”[xxv].
We must be prepared to protect those who warn against the spread of incitement to violence, those who report atrocity crimes.
The international public outcry in response to the seven-year sentences handed to Myanmar journalists Wa Lone and Kyaw Soe Oo for reporting on major human rights violations against the Rohingya in Rakhine State is exactly what is needed. These interventions must be multiplied. Sustained.
5. Holding to Account
The fact that we are largely living in a digital, networked world does not mean that the old rules have no role, meaning or importance. Accountability remains at the heart of any strategy of prevention of and response to genocide. No one responsible for inciting mass atrocities, for inciting genocide, should ever think or imagine that by virtue of inciting online they can walk free. Those who, by omission or commission, have contributed to such crimes or failed to prevent them – they too will be held to account[xxvi].
Finally, let’s remember that the on-line and off-line worlds are largely integrated, connected. The previous speakers have referred to another 21st century genocide, that of the Yazidis, and lamented the absence of accountability.
And yet, many of those responsible for this genocide, ISIS leaders and fighters, are currently held in Iraqi prisons. They are charged with various crimes related to terrorism. But none of them are facing charges in relation to genocide or other atrocity crimes[xxvii].
This has to change. We can and must prioritize delivering accountability and justice to the Yazidi people for the crimes of genocide and other atrocities committed against them.
A friend of mine once told me that it is not enough to challenge a perverse narrative; you have to replace it with another, and the only thing that can displace a story is a story.
That too is part of how we respond to genocides and prevent them.
We tell a compelling, empowering and dignified story; more powerful than that told by the perpetrators of yesterday’s and today’s genocide.
A Story of connection across neighborhoods
A Story of humanity across differences
A Story of empathy against indifference
A Story of hope against hatred
[i] Viktor Klemperer, The Language of the Third Reich (1947), cited in Richard Ashby Wilson, “Inciting Genocide with Words,” Michigan Journal of International Law, Vol. 36, Winter 2015
[ii] UN Special Rapporteur on Freedom of Expression, A/HRC/35/38
[iii] Zeynep Tufekci, “Twitter and Tear Gas: The Power and Fragility of Networked Protest,” Yale University Press, p. 6
[iv] “The Trial of German Major War Criminals,” Proceedings of the International Military Tribunal Sitting at Nuremberg, Germany, 1 October 1946, Streicher
[v] See for instance Prosecutor v Akayesu
[vi] Jonathan Leader Maynard, “Rethinking the Role of Ideology in Mass Atrocities,” Terrorism and Political Violence 26/5 (2014): 821-841
[vii] “A Genocide Incited on Facebook, With Posts From Myanmar’s Military,” New York Times, https://www.nytimes.com/2018/10/15/technology/myanmar-facebook-genocide.html?action=click&module=Top%20Stories&pgtype=Homepage
[viii] Independent International Fact-Finding Mission on Myanmar, report released on 18 September 2018. https://www.ohchr.org/EN/HRBodies/HRC/MyanmarFFM/Pages/Index.aspx The report states, “In a context of low digital and social media literacy, the Government’s use of Facebook for official announcements and sharing of information further contributes to users’ perception of Facebook as a reliable source of information.”
[ix] Jonathan Leader Maynard, Op. cit., p.825
[x] Jonathan Leader Maynard, Op. cit., p.826-827
[xi] Charles Mironko, “The Effect of RTLM’s Rhetoric of Ethnic Hatred in Rural Rwanda,” in The Media and the Rwanda Genocide (Allan Thompson ed., 2007). Cited by Wilson, Op. cit.
[xii] See for instance research into the Philippines cyber hate and harassment. Jonathan Corpus Ong and Jason Vincent Cabanes, “Architects of Networked Disinformation: Behind the Scenes of Troll Accounts and Fake News Production in the Philippines,” The Newton Tech4Dev Network, February 5, 2018 http://newtontechfordev.com/wp-content/uploads/2018/02/ARCHITECTS-OF-NETWORKED-DISINFORMATION-FULL-REPORT.pdf; Propaganda War: Weaponizing the Internet, Rappler, October 3, 2016, https://www.rappler.com/nation/148007-propaganda-war-weaponizing-internet
[xiii] Sindre Bangstadt, Hate Speech: The Dark Twin of Free Speech, 2017, http://www.sindrebangstad.com/hate-speech-the-dark-twin-of-free-speech/
[xiv] Cass R. Sunstein, “Going to Extremes: How Like Minds Unite and Divide,” 2009
[xv] John Suler, “The Online Disinhibition Effect,” CyberPsychology & Behavior, 6/1/2004, ISSN: 1094-9313, Volume 7, Issue 3, p. 321
[xvi] See for instance, Cass Sunstein, “#RepublicDivided Democracy in the Age of Social Media,Princeton University press, 2017
[xvii] See for instance Ong and Cabanes, Op. cit., 2018
[xviii] “Challenging Truth and Trust: A Global Inventory of Organized Social Media Manipulation.” The report warned that the strategies and techniques employed by cyber troop operations violate the practices of democracy, and that they do in fact work, to democracy’s detriment.
[xix] Agnes Callamard, “The Control of “Invasive” Ideas in a Digital Age,” Social Research: An International Quarterly (Johns Hopkins University Press) 84 (1): 119-145, 2017
[xx] Evelyn Douek, “Facebook’s Role in the Genocide in Myanmar: New Reporting Complicates the Narrative,” 22 October 2018
[xxii] David Kaye, the UN Special Rapporteur on Freedom of Expression has issued a report (A/HRC/38/35) focusing on the regulation by social media companies with a range of recommendations. In particular, he has suggested that the companies must embark on radically different approaches to transparency at all stages of their operations, from rule-making to implementation and development of “case law” framing the interpretation of private rules. He has also recommended that companies must open themselves up to public accountability. “All segments of the ICT sector that moderate content or act as gatekeepers should make the development of industry-wide accountability mechanisms (such as a social media council) a top priority.” https://www.ohchr.org/EN/Issues/FreedomOpinion/Pages/ContentRegulation.aspx
[xxiii] Douek, Op. cit.
[xxiv] See for instance, “Checking in with the Facebook fact-checking partnership,” CJR, April 4, 2018
[xxv] Jonathan Leader Maynard, Op. cit., p.221
[xxvi] See analysis of FB legal responsibilities under ICL by Evelyn Douek, “Why Were Members of Congress Asking Mark Zuckerberg About Myanmar? A Primer,” 26 April 2018