January 17, 2026
Introduction: When the Victim Exists Only as an Image
India’s legal framework for child protection entered the digital age with the 2019 amendment to the Protection of Children from Sexual Offences Act (POCSO). The amendment expanded criminal liability to cover the online creation, possession, storage, and circulation of child sexual abuse material (CSAM). While this legislative move was forward-looking, it did not anticipate the disruptive effects of generative artificial intelligence.
A pressing question has now emerged: can an image be said to “involve a child” when no real child exists, no minor has been harmed, and no child participated in the creation of the material? This question is no longer hypothetical. Advances in AI now permit the generation of hyper-realistic images in which adult bodies are digitally altered, or entirely synthetic faces resembling children are created without any human model.
Such developments place classical criminal law principles under strain. Concepts such as intent, knowledge, causation, possession, and agency—traditionally grounded in human conduct—become blurred when content is produced by autonomous or semi-autonomous systems. Indian cyber law has not yet articulated how criminal responsibility should operate in this emerging landscape, leaving investigators, courts, and intermediaries without clear guidance.
The Breakdown of Traditional Victimhood in the Age of AI
Criminal law has historically assumed that sexual offences involve a real, identifiable victim. In the context of CSAM, this assumption is particularly strong: the offence is understood as one that violates the bodily integrity, dignity, and autonomy of an actual child.
Artificial intelligence disrupts this assumption entirely. It can generate images of fictional minors that are visually indistinguishable from photographs of real children. These images may never have involved a living person at any stage, yet they replicate the very harm the law seeks to prevent.
Indian jurisprudence has not yet addressed whether such synthetic depictions fall within the scope of POCSO. This silence creates an interpretive vacuum. If the existence of a real child is treated as mandatory, purely artificial images escape regulation—even though their social impact may be profound.
This creates a serious ethical and legal contradiction. POCSO is designed to prevent exploitation and reduce harm, not merely to respond after injury has occurred. Excluding synthetic imagery on the ground that no real child was involved risks undermining the statute’s core protective purpose.
The Ontological Dilemma: Does a Child Have to Exist?
At the heart of the debate lies an ontological problem: can the law recognise harm where the “victim” exists only as a representation? Traditional CSAM doctrine presupposes that a child must exist as a matter of fact. AI challenges this premise by producing content that simulates childhood without embodying it.
This dilemma is not merely philosophical. It affects enforcement, prosecution, and deterrence. Synthetic imagery contributes to the normalisation of child sexualisation, increases demand for real abuse material, and makes it significantly harder for authorities to distinguish lawful content from illegal content.
If the law insists on the physical existence of a child as a prerequisite, it inadvertently rewards technological sophistication. Offenders can evade liability by ensuring that their images are entirely artificial, even while producing material indistinguishable from real abuse.
Rethinking “Involvement”: From Biological Reality to Visual Representation
This challenge cannot be resolved through technical definitions of artificial intelligence or metaphysical debates about existence. It must be resolved through legal interpretation.
POCSO is not a statute concerned with individual identity in the manner of defamation or property law. It is a preventive and protective statute, aimed at dismantling systems of exploitation before they cause irreversible harm. From this perspective, the phrase “involving a child” must be interpreted in line with legislative purpose.
A purposive reading suggests that what matters is not whether a child exists, but whether the image presents itself as a child. The legal focus must therefore shift from biological existence to perceptual reality. If an image appears to depict a child, creates the impression of childhood, or symbolically represents a minor, it should fall within the scope of the offence.
This distinction allows the law to separate “child” as a biological category from “child-likeness” as a representational one. Once this shift is accepted, the apparent contradiction disappears.
Comparative Perspectives: Learning from Global Legal Responses
Other jurisdictions have already confronted this issue. Laws in several countries criminalise pseudo-photographs, virtual depictions, and fictional representations of minors engaged in sexual acts. These legal systems explicitly reject the argument that a real child must be proven to exist.
The rationale is clear. Synthetic CSAM causes serious societal harm even in the absence of direct physical victimisation. It reinforces harmful sexual norms, fuels demand for real abuse material, and undermines law enforcement by making detection and verification increasingly difficult.
India’s statutory language, though not explicit, is compatible with this approach. The emphasis on representation, depiction, and appearance in the amended provisions allows Indian courts to adopt an interpretation consistent with global best practices without rewriting the law.
Why Excluding Synthetic Imagery Would Undermine POCSO
A narrow interpretation of POCSO that excludes AI-generated imagery would significantly weaken the statute. It would create a loophole where the most technologically advanced offenders face the least legal risk—an outcome that contradicts both common sense and legislative intent.
The regulation of CSAM is not limited to compensating identifiable victims. It is equally about preventing the creation of an ecosystem in which the sexual exploitation of children—real or simulated—becomes normalised and widespread.
Treating synthetic depictions as legally irrelevant ignores their real-world consequences. It also places enforcement agencies in an impossible position, forcing them to prove the physical existence of a child in an era where visual evidence can no longer be trusted.
Conclusion: An Interpretive Solution, Not a Legislative Failure
The challenge posed by AI-generated child sexual imagery does not reveal a defect in POCSO. It reveals a statutory ambiguity that must be resolved through judicial interpretation.
By understanding “involvement of a child” as referring to visual appearance rather than physical existence, Indian courts can preserve the preventive purpose of the law while adapting it to modern technological realities. This approach avoids the need for immediate legislative overhaul and ensures continuity with established principles of purposive interpretation.
Until courts or Parliament provide explicit clarification, uncertainty will persist in enforcement and prosecution. However, the interpretive path forward is clear. In the age of artificial intelligence, the law must recognise that harm can be generated even when the victim exists only as an image.
Short Judicial Briefing / Policy Note
Subject: Applicability of the POCSO Act to AI-Generated Child Sexual Abuse Material
Issue
Whether AI-generated images depicting fictional minors fall within the scope of “involving a child” under the POCSO Act, 2012, as amended in 2019.
Key Concerns
- Synthetic imagery can be visually indistinguishable from real CSAM
- Absence of a real child creates enforcement ambiguity
- Narrow interpretation incentivises technological evasion
Legal Position
- POCSO is a preventive child-protection statute
- Purposive interpretation permits appearance-based liability
- Comparative jurisdictions criminalise fictional depictions
Recommended Interpretation
“Involving a child” should be construed to include any depiction that visually represents or creates the impression of a minor, irrespective of biological existence.
Policy Rationale
- Prevents normalisation of child sexualisation
- Avoids enforcement loopholes
- Aligns Indian law with global best practices
Conclusion
Courts may extend POCSO to synthetic CSAM through interpretation alone, ensuring continued protection of children in the age of artificial intelligence.