EU finds Meta failing to keep under-13s off Facebook, Instagram
What Happened
- The European Commission has issued a preliminary finding that Meta's Facebook and Instagram are in breach of the EU Digital Services Act (DSA) for failing to prevent children under 13 from accessing the platforms.
- Meta's own terms of service set 13 as the minimum age for both platforms, but the Commission found that age verification is effectively non-existent — children can enter a false date of birth at sign-up with no mechanism to verify accuracy.
- Approximately 10–12% of children under 13 in the EU are estimated to be using Instagram and Facebook, contradicting Meta's own internal risk assessments.
- The Commission also found that Meta's in-platform tool for reporting underage users is "difficult to use and not effective," requiring up to seven clicks to access, with no auto-fill of user details.
- If the preliminary finding is confirmed, the Commission can impose a fine of up to 6% of Meta's total worldwide annual turnover.
Static Topic Bridges
EU Digital Services Act (DSA)
The Digital Services Act (DSA) is an EU regulation that entered into force in November 2022 and became fully applicable in February 2024. It establishes a tiered framework of obligations for digital services — ranging from basic intermediary services to Very Large Online Platforms (VLOPs). VLOPs are defined as platforms with over 45 million monthly active users in the EU. Facebook and Instagram are formally designated VLOPs. The DSA requires VLOPs to conduct systemic risk assessments (including risks to children's mental health) and implement mitigation measures.
- DSA enacted: November 2022; fully applicable to all platforms: February 17, 2024
- VLOP threshold: 45 million monthly active EU users
- Key obligations for VLOPs: systemic risk assessment, content moderation transparency, algorithm audits, data access for researchers, age-appropriate design for minors
- Enforcement: European Commission has direct supervisory authority over VLOPs
- Maximum fine: 6% of global annual turnover; periodic penalty: 5% of average daily worldwide turnover per day of non-compliance
- First major DSA enforcement: €120 million fine imposed on X (formerly Twitter) in 2025
Connection to this news: The preliminary finding against Meta represents the Commission's exercise of its direct enforcement powers over VLOPs under Articles 34 and 35 of the DSA, which require platforms to assess and mitigate systemic risks — including risks arising from access by minors.
EU Digital Markets Act (DMA) — Distinction from DSA
The Digital Markets Act (DMA) is a companion regulation to the DSA, also enacted in 2022. While the DSA focuses on illegal and harmful content and platform accountability, the DMA targets gatekeepers — large platforms that act as essential digital infrastructure — and prohibits specific anti-competitive practices. The DMA does not deal with child safety; that falls under the DSA and GDPR.
- DMA entered into force: November 2022; applicable: May 2023 (gatekeeper obligations: March 2024)
- Gatekeeper threshold: €7.5 billion annual EU turnover (or €75 billion market capitalisation) + 45 million monthly EU end users + 10,000 yearly active EU business users
- Meta designated gatekeeper for Facebook, Instagram, WhatsApp, Messenger, and Meta Marketplace
- DMA prohibits self-preferencing, data combination across services without consent, and interoperability restrictions
Connection to this news: Meta faces simultaneous regulatory scrutiny under both the DSA (child safety/content moderation) and the DMA (competition/gatekeeper obligations), illustrating the EU's comprehensive two-track approach to platform regulation.
Child Online Safety — International and Indian Framework
Internationally, child online safety is governed by: the UN Convention on the Rights of the Child (UNCRC, 1989), which India ratified in 1992; the EU's General Data Protection Regulation (GDPR, 2018) which sets 16 as the default age of digital consent (member states may lower it to 13); the UK Online Safety Act 2023; and the US Children's Online Privacy Protection Act (COPPA, 1998, updated 2013) which sets 13 as the minimum age for data collection. In India, the Digital Personal Data Protection Act (DPDPA) 2023 requires verifiable parental consent before processing data of children (defined as under 18), and prohibits tracking, behavioural monitoring, or targeted advertising to children.
- UNCRC (1989): Article 16 (right to privacy); India ratified 1992
- GDPR (2018): Article 8 — digital age of consent 16 years (member states may set 13–16)
- DPDPA 2023 (India): Section 9 — verifiable parental consent for under-18; prohibition on targeted advertising to children
- UK Online Safety Act 2023: Ofcom is the regulator; separately, the ICO's Age Appropriate Design Code ("Children's Code", under the Data Protection Act 2018) applies to services likely to be accessed by children
- COPPA (USA, 1998): prohibits collection of personal data from children under 13 without parental consent
Connection to this news: The EU's DSA finding against Meta reflects a growing international consensus that self-declaration of age alone is insufficient for child protection online, directly informing implementation debates around India's DPDPA 2023 and its age-gating requirements.
India's IT Act 2000 and IT Rules 2021
India's Information Technology Act 2000 is the primary legislation governing digital intermediaries. Section 79 provides a safe harbour for intermediaries from liability for third-party content, subject to due diligence compliance. The IT (Intermediary Guidelines and Digital Media Ethics Code) Rules 2021 classify platforms as Significant Social Media Intermediaries (SSMIs) if they have over 50 lakh registered users, imposing obligations including a grievance redressal mechanism, content traceability for first originator, and appointment of nodal officers in India.
- IT Act 2000: Section 79 — intermediary safe harbour; Section 67B — punishment for publishing or transmitting child sexual abuse material online
- IT Rules 2021: SSMIs (>50 lakh users) must appoint Chief Compliance Officer, Nodal Contact Person, and Grievance Officer (India-based)
- DPDPA 2023: supersedes some IT Act provisions on data protection; Rules pending finalisation
- India lacks a dedicated age verification mandate for social media access (unlike the UK or EU)
Connection to this news: The EU's DSA enforcement against Meta for inadequate child age verification creates regulatory precedent that India's forthcoming DPDPA Rules will need to address, given Section 9's parental consent requirement for data processing of under-18 users.
Key Facts & Data
- EU DSA enacted: November 2022; fully applicable: February 17, 2024
- VLOP threshold under DSA: 45 million monthly active EU users
- Maximum DSA fine: 6% of total worldwide annual turnover
- Estimated proportion of under-13s using Facebook/Instagram in EU: 10–12%
- First DSA enforcement fine: €120 million (on X/Twitter, 2025)
- India's DPDPA 2023: verifiable parental consent required for processing data of under-18s (Section 9)
- GDPR (2018): default digital age of consent is 16; member states can set 13–16
- COPPA (USA): prohibits collection of personal data from children under 13 without parental consent
- UN Convention on the Rights of the Child (1989): India ratified 1992
- Meta's minimum age for Facebook and Instagram (own terms): 13 years