AI-Generated Child Sexual Imagery & POCSO: Why Indian Law Must Protect Against Appearance-Based Harm

January 17, 2026

Introduction: When the Victim Exists Only as an Image

India’s legal framework for child protection entered the digital age with the 2019 amendment to the Protection of Children from Sexual Offences Act (POCSO). The amendment expanded criminal liability to cover the online creation, possession, storage, and circulation of child sexual abuse material (CSAM). While this legislative move was forward-looking, it did not anticipate the disruptive effects of generative artificial intelligence.

A pressing question has now emerged: can an image be said to “involve a child” when no real child exists, no minor has been harmed, and no child participated in the creation of the material? This question is no longer hypothetical. Advances in AI now permit the generation of hyper-realistic images in which adult bodies are digitally altered, or entirely synthetic faces resembling children are created without any human model.

Such developments place classical criminal law principles under strain. Concepts such as intent, knowledge, causation, possession, and agency—traditionally grounded in human conduct—become blurred when content is produced by autonomous or semi-autonomous systems. Indian cyber law has not yet articulated how criminal responsibility should operate in this emerging landscape, leaving investigators, courts, and intermediaries without clear guidance. 

The Breakdown of Traditional Victimhood in the Age of AI

Criminal law has historically assumed that sexual offences involve a real, identifiable victim. In the context of CSAM, this assumption is particularly strong: the offence is understood as one that violates the bodily integrity, dignity, and autonomy of an actual child.

Artificial intelligence disrupts this assumption entirely. It can generate images of fictional minors that are visually indistinguishable from photographs of real children. These images may never have involved a living person at any stage, yet they replicate the very harm the law seeks to prevent.

Indian jurisprudence has not yet addressed whether such synthetic depictions fall within the scope of POCSO. This silence creates an interpretive vacuum. If the existence of a real child is treated as mandatory, purely artificial images escape regulation—even though their social impact may be profound.

This creates a serious ethical and legal contradiction. POCSO is designed to prevent exploitation and reduce harm, not merely to respond after injury has occurred. Excluding synthetic imagery on the ground that no real child was involved risks undermining the statute’s core protective purpose. 

The Ontological Dilemma: Does a Child Have to Exist?

At the heart of the debate lies an ontological problem: can the law recognise harm where the “victim” exists only as a representation? Traditional CSAM doctrine presupposes that a child must exist as a matter of fact. AI challenges this premise by producing content that simulates childhood without embodying it.

This dilemma is not merely philosophical. It affects enforcement, prosecution, and deterrence. Synthetic imagery contributes to the normalisation of child sexualisation, increases demand for real abuse material, and makes it significantly harder for authorities to distinguish lawful content from illegal content.

If physical existence is insisted upon as a prerequisite, the law inadvertently rewards technological sophistication. Offenders can evade liability by ensuring that their images are entirely artificial, even while producing material indistinguishable from real abuse. 

Rethinking “Involvement”: From Biological Reality to Visual Representation

This challenge cannot be resolved through technical definitions of artificial intelligence or metaphysical debates about existence. It must be resolved through legal interpretation.

POCSO is not a statute concerned with individual identity in the manner of defamation or property law. It is a preventive and protective statute, aimed at dismantling systems of exploitation before they cause irreversible harm. From this perspective, the phrase “involving a child” must be interpreted in line with legislative purpose.

A purposive reading suggests that what matters is not whether a child exists, but whether the image presents itself as a child. The legal focus must therefore shift from biological existence to perceptual reality. If an image appears to depict a child, creates the impression of childhood, or symbolically represents a minor, it should fall within the scope of the offence.

This distinction allows the law to separate child as a biological category from child-likeness as a representational one. Once this shift is accepted, the apparent contradiction disappears. 

Comparative Perspectives: Learning from Global Legal Responses

Other jurisdictions have already confronted this issue. Laws in several countries criminalise pseudo-photographs, virtual depictions, and fictional representations of minors engaged in sexual acts. These legal systems explicitly reject the argument that a real child must be proven to exist.

The rationale is clear. Synthetic CSAM causes serious societal harm even in the absence of direct physical victimisation. It reinforces harmful sexual norms, fuels demand for real abuse material, and undermines law enforcement by making detection and verification increasingly difficult.

India’s statutory language, though not explicit, is compatible with this approach. The emphasis on representation, depiction, and appearance in the amended provisions allows Indian courts to adopt an interpretation consistent with global best practices without rewriting the law. 

Why Excluding Synthetic Imagery Would Undermine POCSO

A narrow interpretation of POCSO that excludes AI-generated imagery would significantly weaken the statute. It would create a loophole where the most technologically advanced offenders face the least legal risk—an outcome that contradicts both common sense and legislative intent.

The regulation of CSAM is not limited to compensating identifiable victims. It is equally about preventing the creation of an ecosystem in which the sexual exploitation of children—real or simulated—becomes normalised and widespread.

Treating synthetic depictions as legally irrelevant ignores their real-world consequences. It also places enforcement agencies in an impossible position, forcing them to prove the physical existence of a child in an era where visual evidence can no longer be trusted. 

Conclusion: An Interpretive Solution, Not a Legislative Failure

The challenge posed by AI-generated child sexual imagery does not reveal a defect in POCSO. It reveals a statutory ambiguity that must be resolved through judicial interpretation.

By understanding “involvement of a child” as referring to visual appearance rather than physical existence, Indian courts can preserve the preventive purpose of the law while adapting it to modern technological realities. This approach avoids the need for immediate legislative overhaul and ensures continuity with established principles of purposive interpretation.

Until courts or Parliament provide explicit clarification, uncertainty will persist in enforcement and prosecution. However, the interpretive path forward is clear. In the age of artificial intelligence, the law must recognise that harm can be generated even when the victim exists only as an image. 

Short Judicial Briefing / Policy Note

Subject: Applicability of the POCSO Act to AI-Generated Child Sexual Abuse Material

Issue

Whether AI-generated images depicting fictional minors fall within the scope of “involving a child” under the POCSO Act, 2012.

Key Concerns

  • Synthetic imagery is indistinguishable from real CSAM
  • Absence of a real child creates enforcement ambiguity
  • Narrow interpretation incentivises technological evasion

Legal Position

  • POCSO is a preventive child-protection statute
  • Purposive interpretation permits appearance-based liability
  • Comparative jurisdictions criminalise fictional depictions

Recommended Interpretation

“Involving a child” should be construed to include any depiction that visually represents or creates the impression of a minor, irrespective of biological existence.

Policy Rationale

  • Prevents normalisation of child sexualisation
  • Avoids enforcement loopholes
  • Aligns Indian law with global best practices

Conclusion

Courts may extend POCSO to synthetic CSAM through interpretation alone, ensuring continued protection of children in the age of artificial intelligence.

By Adv Juhi Damodar
Juhi Damodar is an advocate based in Karnataka who works on family court matters, child protection law, and emerging issues at the intersection of technology and criminal jurisprudence. Her work focuses on purposive statutory interpretation and the legal challenges posed by artificial intelligence to traditional concepts of harm and victimhood.