Executive Summary
Until recently, the distribution of Child Sexual Abuse Material (CSAM) had largely been identified on the public web and the dark web. In these areas, the technology industry, regulators and policy makers have been making steady and measurable progress. Of late, however, CSAM distribution has shifted to end-to-end encrypted (E2EE) services, which offer high levels of anonymity, a negligible risk of prosecution and an extremely low barrier to entry.
The research group undertook this study, between June and July 2020, to assess E2EE platforms as a new mode of CSAM distribution. The objective of this research was to study and analyse technical, legal and policy frameworks that can help prevent the proliferation of CSAM on E2EE communication services. Most countries and commentators have taken an either-or approach to proposed solutions: either monitoring of content at the cost of user privacy, or no regulation and a free flow of objectionable content at alarming levels. This report presents an objective assessment of the effectiveness of reporting mechanisms on these platforms and identifies practical ways of tackling CSAM while balancing user privacy. In a dedicated section, the report proposes a model design and operationalisation strategy for reporting CSAM in an E2EE setting that does not violate user privacy. The researchers have also identified and highlighted other techno-legal and operational gaps and challenges associated with CSAM, including initiatives such as a National Tip-Line, mandatory "report CSAM" buttons for intermediaries in India, and a National Hash Register.
The report exhaustively studies different E2EE services, presenting primary findings from investigations into two platforms prevalent among Indian users: WhatsApp and Telegram. These platforms were selected for our primary research partly because of features that facilitate wider dissemination of CSAM, such as chat invite links that enable users across the world to join a chat group. Data was collected by deploying custom tools, followed by in-depth analysis of content and reporting mechanisms. In the study, we found over 100 instances of dissemination of CSAM in just 29 randomly selected adult pornography groups on WhatsApp, and 23 such instances across 283 channels analysed on Telegram. In due course, an addendum to this report, in the form of a white paper for legislators, will be published highlighting the need for and extent of the duty of care and liabilities of intermediaries in the context of E2EE services and CSAM.
Read the full report here.