What Happened
- On 6 March 2026, Indonesia's Minister of Communication and Digital Affairs announced regulations barring children under 16 from accessing social media platforms categorised as "high-risk."
- The regulation sorts platforms into risk tiers: children aged 13-15 may access "lower-risk" platforms, while "higher-risk" platforms (YouTube, TikTok, Facebook, Instagram, Threads, X, Bigo Live, and Roblox) are limited to users aged 16 and above.
- Indonesia became the first country in Southeast Asia to impose nationwide social media restrictions on minors, following similar measures by Australia (December 2025) and Spain.
Static Topic Bridges
Global Movement Toward Online Child Safety Regulation
The regulation of children's access to digital platforms has emerged as a significant governance trend worldwide. Australia became the first country to require social media companies to block users under 16 from holding accounts (effective December 2025). The European Union's Digital Services Act (DSA), 2022, requires platforms to assess and mitigate risks to minors. The UK's Online Safety Act, 2023, imposes a duty of care on platforms regarding content harmful to children.
- Australia's Online Safety Amendment (Social Media Minimum Age) Act, 2024, requires platforms to take reasonable steps (including age assurance) to prevent under-16s from holding accounts, and imposes penalties for non-compliance.
- The EU DSA requires "very large online platforms" (over 45 million EU users) to conduct risk assessments for minors and implement age-appropriate design.
- China has restricted minors to 40 minutes of social media per day (2024), with a complete ban on use between 10 PM and 6 AM.
- Indonesia's approach uses a risk-tiered model, differentiating between platform types rather than imposing a blanket ban.
Connection to this news: Indonesia's regulation joins a growing global patchwork of child safety laws, each reflecting different approaches — from blanket age bans (Australia) to tiered access (Indonesia) to time limits (China).
India's Digital Personal Data Protection Act, 2023 — Child Data Provisions
Section 9 of India's Digital Personal Data Protection (DPDP) Act, 2023, establishes specific safeguards for the processing of children's personal data. The Act defines a "child" as any individual below 18 years of age and mandates verifiable parental consent before any processing of a child's data. It also prohibits tracking, behavioural monitoring, and targeted advertising directed at children.
- Data fiduciaries must obtain verifiable consent of the parent or legal guardian before processing a child's personal data.
- The Act prohibits tracking, behavioural monitoring, and targeted advertising directed at children on digital platforms.
- Processing of personal data likely to cause a detrimental effect on the well-being of a child is explicitly forbidden.
- Penalty for breach of Section 9 obligations: up to Rs 200 crore.
- The Central Government can exempt certain data fiduciaries from the parental consent requirement for processing children's data, if satisfied that they provide verifiably safe services.
Connection to this news: While Indonesia and Australia focus on restricting platform access by age, India's DPDP Act takes a data-centric approach — regulating how platforms handle children's data rather than banning access outright, representing a complementary but distinct model.
Right to Privacy and Digital Rights of Children
The Supreme Court of India in K.S. Puttaswamy v. Union of India (2017) recognised the right to privacy as a fundamental right under Article 21 of the Constitution. This has significant implications for children's digital rights: it creates a constitutional basis for protecting minors from surveillance, data exploitation, and harmful online content, while also raising questions about how to balance protection with children's evolving autonomy.
- The nine-judge bench unanimously held that privacy is intrinsic to the right to life and personal liberty under Article 21.
- The UN Convention on the Rights of the Child (CRC), ratified by India in 1992, recognises children's right to privacy (Article 16) and protection from exploitation (Article 36).
- General Comment No. 25 (2021) of the UN Committee on the Rights of the Child specifically addresses children's rights in relation to the digital environment.
- India's POCSO Act, 2012 (Protection of Children from Sexual Offences) complements digital safety measures by criminalising online child sexual exploitation.
Connection to this news: Indonesia's regulation reflects the global tension between protecting children from online harms and respecting their evolving digital autonomy — a tension that India's constitutional framework must also navigate as DPDP Act implementation progresses.
Key Facts & Data
- Indonesia's social media restrictions for under-16s came into effect on 28 March 2026.
- "High-risk" platforms banned for under-16s: YouTube, TikTok, Facebook, Instagram, Threads, X, Bigo Live, Roblox.
- Children aged 13-15 can access "lower-risk" platforms.
- According to UNICEF figures cited by Indonesia, about half of Indonesian children have encountered sexual content on social media.
- Australia was the first country globally to impose a social media ban for under-16s (December 2025).
- India's DPDP Act, 2023, Section 9 imposes penalties of up to Rs 200 crore for violations of child data protection provisions.