The UK Department for Science, Innovation and Technology and the AI Safety Institute have highlighted the growing threat of AI-generated CSAM in the International AI Safety Report 2025.
Global Concerns and Laws Regarding Child Sexual Abuse Material
- CSAM: CSAM refers to material (images, videos, or audio) that depicts sexually explicit content involving children.
- WEF: The World Economic Forum (2023) raised alarms about AI’s ability to create lifelike images of children, stressing the potential for exploitation.
- Internet Watch Foundation: The foundation’s 2024 report underscores the rise in CSAM circulating on the open web, escalating concerns about child safety online.
- UK’s Legislative Response: The UK Government is taking a progressive step by introducing legislation that criminalizes the possession, creation, and distribution of AI tools capable of generating CSAM.
Current Legal Framework and Gaps
- Existing UK Laws on CSAM: Laws such as the Protection of Children Act 1978 and the Coroners and Justice Act 2009 criminalize the taking, distribution, and possession of indecent images of children.
- These laws do not address AI-generated CSAM, focusing instead on images of actual children, leaving a significant legal gap.
- Need for Legal Reform: Existing laws are incomplete in their application to AI-generated CSAM, which involves synthetic imagery created by artificial intelligence rather than actual children.
- India’s Framework – Section 67B of the IT Act, 2000: This section punishes the publication or transmission of material depicting children in sexually explicit acts in electronic form.
- POCSO Act, 2012: Sections 13, 14, and 15 of the Protection of Children from Sexual Offences (POCSO) Act, 2012 criminalize the use of children for pornographic purposes, prescribe punishment for such use, and penalize the storage or possession of pornographic material involving children.
- Bharatiya Nyaya Sanhita, 2023: Section 294 of the Bharatiya Nyaya Sanhita penalizes the sale, distribution, or public exhibition of obscene materials. Section 295 criminalizes selling, distributing, or exhibiting obscene materials to children.
- Inadequate: While India has a robust legal framework to address CSAM involving real children, it lacks sufficient provisions to tackle AI-generated CSAM.
Impact of the New Law
- Proactive Action: The new law will allow law enforcement to take action before CSAM is distributed, tackling the issue at an early stage.
- Deterrence: By criminalizing the creation and possession of AI tools capable of generating CSAM, the law serves as a deterrent, discouraging the development and use of such harmful technologies.
- Preventing Mental Health Damage: The law aims to mitigate the mental health impact on children by preventing exposure to or exploitation through AI-generated CSAM.
- Filling Legislative Gaps: It addresses the legal void regarding AI-generated imagery, ensuring that both real and synthetic images of children are protected under the law.
Rising Threat of Cybercrimes Against Children in India
- NCRB Report 2022: The National Crime Records Bureau (NCRB) Report 2022 revealed a significant increase in cybercrimes against children compared to previous years.
- National Cyber Crime Reporting Portal (NCRP): Under the Cyber Crime Prevention against Women and Children (CCPWC) scheme, the National Cyber Crime Reporting Portal (NCRP) recorded 1.94 lakh incidents of child pornography as of April 2024.
- Tip-Line Reports: In collaboration with the National Center for Missing & Exploited Children (NCMEC), USA, India receives cyber tip-line reports on CSAM.
- As of March 2024, 69.05 lakh cyber tip-line reports have been shared with the relevant States and Union Territories.
Way Forward
- Expanding the Definition: As per the NHRC Advisory (October 2023), the term ‘child pornography’ under the POCSO Act should be replaced with ‘CSAM’ to create a more expansive and inclusive definition.
- Defining ‘Sexually Explicit’ Content in IT Laws: Section 67B of the IT Act must explicitly define ‘sexually explicit’ to enable the real-time identification and blocking of CSAM content.
- Expanding Intermediary Liability: The definition of ‘intermediary’ under the IT Act must include:
- Virtual Private Networks (VPNs)
- Virtual Private Servers (VPS)
- Cloud Services
- This will impose statutory liability on these entities to comply with CSAM-related provisions.
- Integrating Technological Advancements: Statutory amendments are urgently needed to integrate the risks arising from emerging technological advancements, particularly those related to AI and cloud technologies.
- Adopting the UN Draft Convention: The Government of India must push for the adoption of the UN Draft Convention on ‘Countering the Use of Information and Communications Technologies for Criminal Purposes’ by the UN General Assembly.
- Replacement of the IT Act: The Ministry of Electronics and Information Technology has proposed the Digital India Act 2023, which aims to replace the two-decade-old IT Act.
- Inspiration from UK Legislation: The Digital India Act should draw inspiration from the UK’s upcoming legislation to specifically target AI-generated CSAM, ensuring comprehensive legal safeguards for children in the digital age.
Conclusion
India must undertake urgent legal and policy changes to address the growing challenges of CSAM, digital crimes, and AI-based exploitation. A comprehensive and adaptive legislative framework is essential to ensure a safe digital ecosystem for future generations.