What Happened
- UK Prime Minister Keir Starmer summoned senior executives from Meta, X (formerly Twitter), TikTok, Snap, and YouTube to Downing Street on April 16, 2026, to press them on child online safety.
- Starmer warned the tech leaders: "Things can't go on like this. They must change because right now social media is putting our children at risk," signalling escalating government pressure after years of industry self-regulation.
- The UK government is actively consulting on Australia-style measures including a ban on under-16s using social media, screen-time curfews for teenagers, and limits on algorithmically-driven "doomscrolling" features.
- A public consultation on these proposals is open until May 26, 2026; Starmer has not ruled out legislation but is awaiting its outcome.
- The UK Parliament remains divided: the House of Lords has twice passed a proposal to ban social media for under-16s, while the House of Commons has rejected it twice.
- The meeting follows a US court ruling that held Meta and YouTube liable for harming a young woman, which Starmer cited as a catalyst for action.
- Greece has announced plans to ban under-15s from social media, and the EU is developing binding recommendations — reinforcing a global regulatory trend.
Static Topic Bridges
Australia's Online Safety Amendment (Social Media Minimum Age) Act 2024
Australia became the world's first country to legislate a hard age-based social media ban. The act, passed on 29 November 2024 and enforced from 10 December 2025, prohibits anyone under 16 from holding accounts on designated platforms.
- Platforms covered: Facebook, Instagram, Reddit, Snapchat, TikTok, X, Threads, Twitch, Kick, and YouTube.
- Enforcement mechanism: Monetary penalties on companies that fail to take "reasonable steps" to prevent underage accounts; maximum civil penalty is AUD 49.5 million (150,000 penalty units) per breach.
- Obligation rests on platforms, not on children or parents — companies must verify age, not users.
- 77% of Australians supported the age limit in a November 2024 YouGov poll.
Connection to this news: The UK government has explicitly cited Australia as the model it is studying; Starmer's consultation mirrors the political journey Australia took before passing its landmark legislation.
India's Digital Personal Data Protection Act, 2023 (DPDPA) — Children's Data Provisions
India's DPDPA (enacted August 2023; implementing rules notified in 2025) is the primary domestic framework regulating the personal data of children. Section 9 is the operative provision for child protection.
- Definition of "child": Anyone below 18 years of age — stricter than GDPR (13–16 years) or the US COPPA (13 years).
- Verifiable parental/guardian consent is mandatory before any Data Fiduciary processes a child's personal data.
- Absolute prohibition on tracking, behavioural monitoring, and targeted advertising directed at children — these are banned even if a parent consents (Section 9(3)).
- Processing that is "likely to cause any detrimental effect on the well-being of a child" is expressly prohibited.
- Penalty for violating children's data provisions (Section 9): up to ₹200 crore under the Schedule to the Act.
- Exemptions exist for health, education, and childcare institutions for limited, safety-related purposes (e.g., location tracking during school transport).
Connection to this news: India's framework regulates how platforms handle children's data but does not impose a minimum-age access ban. The UK and Australia model raises the question of whether India should move beyond data protection toward a structural access restriction — a policy gap relevant for Mains answer writing.
Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 — IT Rules 2021
IT Rules 2021, framed under the IT Act 2000, govern the liability and obligations of social media platforms operating in India.
- Created two categories: Social Media Intermediaries and Significant Social Media Intermediaries (SSMIs) — those with 50 lakh (5 million) or more registered Indian users.
- SSMIs must appoint a Grievance Officer, Nodal Contact Person, and Chief Compliance Officer (resident in India).
- All intermediaries must deploy automated tools to proactively identify Child Sexual Abuse Material (CSAM) and previously-removed content — a "best-effort" obligation.
- Safe harbour (immunity from third-party content liability) is conditional on observing due diligence under these rules.
- Content flagged by a court order or government notification must be removed within 36 hours (Rule 3(1)(d)); the 2022 amendment additionally requires certain categories of user complaints to be acted on within 72 hours.
Connection to this news: While IT Rules 2021 impose content moderation obligations, they do not mandate age verification or restrict children's access to platforms. The UK-Australia approach of platform-level liability for underage access represents a more structural intervention that India's framework currently lacks.
UNCRC General Comment No. 25 (2021) — Children's Rights in the Digital Environment
The UN Committee on the Rights of the Child adopted General Comment 25 in March 2021, the first authoritative UN document making explicit that children's rights under the Convention apply in the digital world.
- Reaffirms core UNCRC principles — non-discrimination, best interests of the child, right to life/survival/development, and respect for children's evolving capacities — in online contexts.
- All 196 states parties must now formally report on how they implement children's rights online, creating an accountability mechanism.
- Calls on governments to regulate the digital environment, require age-appropriate design, and prevent harmful data practices targeting children.
- Developed through consultation with over 700 children and young people aged 9–22 across 27 countries.
- Implementation remains uneven: low-income and conflict-affected states often lack the capacity to enforce GC25 principles.
Connection to this news: The UK's move, Australia's ban, and India's DPDPA protections can all be situated within the GC25 framework — states are under normative obligation to take affirmative steps to protect children's rights online, not merely reactive steps after harm occurs.
Global Regulatory Approaches — Comparative Landscape
A spectrum of regulatory models has emerged globally, ranging from data protection to outright access bans.
- Australia (2024): Hard age ban (under-16) on designated platforms; platform liability model; penalties up to AUD 49.5 million.
- EU — Digital Services Act (DSA): Article 28 requires all online platforms accessible to minors to implement "appropriate and proportionate" child safety measures (very large online platforms — 45 million+ EU users — face additional risk-assessment duties); Commission guidelines under Article 28(4) issued July 2025. The EU is piloting a privacy-preserving age verification app (ready April 2026).
- France: The "digital majority" law (July 2023) requires parental consent for under-15s to register on social media; the SREN Law (May 2024) makes age verification mandatory for adult content, enforced by the regulator ARCOM.
- Greece: Announced ban for under-15s on social media (2026).
- EU Parliament Proposal (November 2025): Minimum age of 16 for social media access, combined with privacy-preserving age verification — mirrors Australia model.
- UK: Consultation-first approach; Online Safety Act 2023 already imposes child safety duties on platforms; age-ban debate ongoing.
- India: DPDPA 2023 protects children's data; no minimum-age access restriction exists yet; IT Rules 2021 require CSAM filtering.
- USA: COPPA protects under-13s; no federal social media age ban; individual states pursuing restrictions.
Connection to this news: The UK's Downing Street summit signals convergence among Western democracies toward stricter platform accountability. India, as a major internet market with the world's largest youth population, will face similar regulatory pressure — making this a live policy debate for GS2 (Governance) and GS3 (Technology).
Key Facts & Data
- UK PM Keir Starmer met executives from Meta, X, TikTok, Snap, and YouTube on April 16, 2026 at Downing Street.
- Australia's ban (enacted November 2024, enforced December 2025) covers under-16s across 10 major platforms; penalty up to AUD 49.5 million per breach.
- 77% of Australians supported the age ban in a November 2024 YouGov survey.
- India's DPDPA 2023 defines a "child" as anyone under 18 — stricter than GDPR (13–16) and COPPA (13).
- Penalty for violating children's data provisions under DPDPA: up to ₹200 crore.
- IT Rules 2021: SSMIs in India are platforms with 50 lakh+ registered users.
- UNCRC General Comment 25 (March 2021): first UN document affirming children's rights apply in the digital world; 700+ children from 27 countries consulted.
- UK Online Safety Act 2023 already imposes child safety duties but stops short of an age ban.
- EU's DSA Article 28 applies to all online platforms accessible to minors; platforms with 45 million+ EU users are designated "very large online platforms" with additional duties. The EU age verification app was technically ready as of April 15, 2026.
- UK public consultation on a potential under-16 ban closes May 26, 2026.
- House of Lords (UK) voted twice for a ban; House of Commons rejected it twice — illustrating democratic tension between protection and access rights.
- India has the world's largest youth population (under-25), making any age-based social media restriction a policy question of significant scale domestically.