What Happened
- India has a layered regulatory framework to protect children on social media, combining the Digital Personal Data Protection (DPDP) Act 2023, IT Rules 2021, POCSO Act 2012, and the Juvenile Justice Act.
- The DPDP Act 2023 defines a child as anyone under 18 years of age, and mandates verifiable parental or guardian consent before any platform can process a minor's personal data.
- Draft DPDP Rules 2025 specify two modes of parental verification: government-backed digital identity systems (DigiLocker) and virtual tokens for secure consent.
- NCRB cybercrime data shows rising incidence of online exploitation of children, underscoring enforcement urgency.
- Key enforcement gaps identified: children can easily bypass age-gating by misrepresenting their age, and platform compliance mechanisms remain self-regulatory with limited independent verification.
Static Topic Bridges
Digital Personal Data Protection (DPDP) Act 2023 — Child Data Provisions
The Digital Personal Data Protection Act, 2023 is India's first standalone data protection legislation, establishing a framework for the processing of digital personal data. Section 9 of the Act specifically addresses children's data, creating age-based protections more stringent than those applicable to adults.
- Child defined as: a person under 18 years of age (consistent with the age of majority under the Indian Majority Act, 1875).
- Mandatory requirement: Data Fiduciaries (platforms) must obtain verifiable parental/guardian consent before processing a child's personal data.
- Prohibited activities for children: targeted advertising, behavioural monitoring, tracking, and processing detrimental to children's well-being.
- Verification methods (Draft Rules 2025, Rule 10): DigiLocker-based identity verification; virtual tokens issued through trusted government-linked systems.
- Penalty for non-compliance with child data provisions: up to ₹200 crore.
- The Data Protection Board (established under the DPDP Act) will adjudicate complaints and impose penalties.
Connection to this news: The DPDP Act is the primary legislative backbone for regulating children's social media access — requiring platforms to gate content and features behind verifiable parental consent, though enforcement mechanisms are still being built.
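The consent-gating obligation can be pictured as a simple platform-side check: an adult may consent for themselves, while a child's data may be processed only once parental consent has been independently verified. The sketch below is illustrative only — the function names and the `parental_consent_verified` flag are hypothetical, and neither the Act nor the Draft Rules prescribe any particular API.

```python
from datetime import date
from typing import Optional

ADULT_AGE = 18  # DPDP Act 2023: a "child" is any person under 18 years


def age_on(dob: date, today: date) -> int:
    """Completed years of age on a given date."""
    years = today.year - dob.year
    if (today.month, today.day) < (dob.month, dob.day):
        years -= 1
    return years


def may_process(dob: date, parental_consent_verified: bool,
                today: Optional[date] = None) -> bool:
    """DPDP-style gate: adults can be processed on their own consent;
    a child's data additionally requires verifiable parental/guardian
    consent (e.g. a DigiLocker-backed check or a virtual token under
    Draft Rule 10)."""
    today = today or date.today()
    if age_on(dob, today) >= ADULT_AGE:
        return True
    return parental_consent_verified
```

Note that the gate depends entirely on a trustworthy `dob` — which is precisely the enforcement gap the notes describe, since a self-reported birthdate can be falsified at sign-up.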
POCSO Act 2012 and IT Rules 2021 — Online Child Safety Architecture
The Protection of Children from Sexual Offences (POCSO) Act, 2012 was enacted to address sexual exploitation of children. With the explosion of internet use, its provisions have been extended to cover online grooming and child sexual abuse material (CSAM). The IT (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, complement POCSO by placing obligations on digital platforms.
- POCSO Act 2012: Section 11(vi) is the provision invoked against cyber grooming — enticing a child for pornographic purposes or gratification, including through electronic means; punishable under Section 12 with imprisonment of up to 3 years and a fine.
- POCSO mandates child-friendly processes for reporting, evidence recording, and trial; cases tried in designated Special Courts.
- IT Rules 2021: Require Significant Social Media Intermediaries (SSMIs — platforms with 50 lakh+ registered users in India) to deploy automated tools to proactively identify CSAM (Rule 4(4)), remove flagged sexual-abuse imagery within 24 hours of a complaint (Rule 3(2)(b)), and appoint a Chief Compliance Officer, a nodal contact person, and a Resident Grievance Officer.
- IT Rules 2021 also require platforms to establish parental control tools, though these are currently self-regulated.
- The 2021 Rules cover three categories of online entities: intermediaries (including social media intermediaries), publishers of news and current affairs content, and publishers of online curated content (OTT).
Connection to this news: Despite POCSO and IT Rules 2021 creating obligations on platforms, the enforcement gap arises because age verification remains platform-driven — children routinely bypass restrictions by providing false birthdates.
Enforcement Gaps and Global Comparisons
India's framework is ambitious but faces structural implementation challenges. The ease of age misrepresentation is the central gap — unlike biometric-based or government-ID linked verification systems, most platforms rely on self-reported age during sign-up.
- COPPA (Children's Online Privacy Protection Act, USA): applies to children under 13; India's DPDP Act extends protection to under-18 — a broader, more protective threshold.
- UK Online Safety Act (2023): requires platforms to conduct age-assurance checks using technically robust methods; India's equivalent rules are still in draft.
- Global experiences show age-gating is circumvented at scale — children use older siblings' accounts, fake documents, or parental credentials.
- Platform-led voluntary measures (Instagram Teen Accounts, YouTube Kids) are present in India but not mandated; they offer some protection but cannot substitute for regulatory enforcement.
- NITI Aayog and parliamentary committees have flagged excessive screen time and online addiction among minors as a public health concern.
- The DPDP Act's Data Protection Board is not yet fully operational, leaving enforcement in a transitional phase.
Connection to this news: The gap between India's legislative intent and on-ground enforcement reflects a global challenge — the DPDP Act's parental consent requirement is sound in principle, but without robust technical age-assurance infrastructure, it risks being a paper protection.
Key Facts & Data
- DPDP Act 2023: Child defined as under 18 years; parental consent mandatory for data processing
- Section 9, DPDP Act: core provision protecting children's data
- Penalty for child data non-compliance: up to ₹200 crore
- Draft DPDP Rules 2025, Rule 10: verification via DigiLocker or virtual tokens
- POCSO Act 2012, Section 11(vi): cyber grooming — punishment up to 3 years + fine
- IT Rules 2021: SSMIs must proactively identify CSAM and remove flagged content within 24 hours of a complaint; grievance officers mandatory
- SSMIs defined as: platforms with 50 lakh+ registered users in India
- COPPA (US): protects children under 13; India's DPDP: under 18 (broader coverage)
- Prohibited for children under DPDP (Section 9(3)): tracking, behavioural monitoring, and targeted advertising directed at children