What Happened
- A New Mexico state court jury found Meta liable for nearly $375 million in civil damages after a landmark trial centred on allegations that the company knowingly harmed children's mental health and concealed evidence of child sexual exploitation on its platforms.
- After a nearly seven-week trial, the jury found that Meta had willfully violated the state's Unfair Trade Practices Act.
- The jury's specific findings included that Meta failed to disclose what it knew about enforcement failures around under-13 users, about the prevalence of content on teen suicide, and about the role of its algorithms in prioritising harmful or sensational material.
- Meta stated it "respectfully disagrees with the verdict" and intends to appeal.
- The case is considered a landmark because it is among the first state-level civil trials — rather than regulatory proceedings — to hold a major social media company financially accountable for platform design choices that harmed minors.
- This verdict comes amid a broader global wave of regulatory action against Big Tech over child safety, with the EU's Digital Services Act, Australia's social media age-ban law, and India's emerging framework for online child protection all in motion.
Static Topic Bridges
COPPA and the Global Regulatory Architecture for Children Online
The Children's Online Privacy Protection Act (COPPA), enacted in the United States in 1998 and effective from 2000, is the foundational framework for regulating how digital platforms handle the personal data of children under 13. It mandates verifiable parental consent before collecting, using, or disclosing a child's personal information, and requires operators to publish clear privacy policies. Enforcement has historically been sporadic: the largest action came only in 2019, when YouTube was fined $170 million for collecting viewing data from minors to serve targeted advertising. The New Mexico verdict goes beyond COPPA — rather than enforcing data-collection rules, it holds Meta accountable for algorithmic design choices and for concealing known harms. This "design liability" theory is increasingly influential and is shaping legislative efforts in the US (COPPA 2.0) and globally.
- COPPA (1998): Applies to websites and online services that collect personal data from children under 13, including foreign services directed at US users.
- COPPA 2.0 proposed: Would extend protections to teenagers (under 17).
- $170 million fine against YouTube (2019): Largest COPPA enforcement action at the time.
- "Design liability": The legal theory that platforms can be held liable not just for data violations but for harmful product design choices.
- EU Digital Services Act (DSA): Fully applicable since February 2024; prohibits advertising targeted at minors based on profiling of their personal data.
Connection to this news: The New Mexico verdict operationalises "design liability" — a concept that will shape how India and other jurisdictions approach platform regulation beyond simple data protection.
India's Regulatory Framework for Online Child Safety
India's Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (IT Rules 2021) impose due diligence obligations on social media intermediaries, including grievance redressal mechanisms. The Protection of Children from Sexual Offences (POCSO) Act, 2012 criminalises child sexual abuse material (CSAM) online, and the IT Act 2000 (Section 67B) separately penalises the electronic publication of obscene material involving children. However, India long lacked a comprehensive children's online privacy law equivalent to COPPA, and platform design accountability remains largely unaddressed in current law. The Digital Personal Data Protection (DPDP) Act, 2023 — with provisions restricting the processing of children's data and prohibiting targeted advertising to minors — is now India's closest equivalent to COPPA.
- DPDP Act 2023 (India): Defines "child" as below 18; prohibits processing of children's data without verifiable parental consent; bans behavioural monitoring and targeted advertising to minors.
- IT Rules 2021: Significant social media intermediaries (>5 million users) must appoint a Grievance Officer and remove CSAM within 24 hours.
- POCSO Act 2012: Sections 13–15 address online CSAM production and storage.
- India's approach is currently compliance-and-takedown focused, not design-liability focused.
Connection to this news: As India operationalises the DPDP Act, the New Mexico verdict's "design liability" precedent will be closely watched by Indian policymakers considering whether platform architecture itself should be a basis for regulatory action.
Social Media Algorithms and Mental Health: The Science
Research on the relationship between social media and adolescent mental health has grown substantially since 2017. Meta's internal research (leaked in the "Facebook Papers" of 2021) showed the company was aware that Instagram made body image issues "worse for one in three teen girls." The algorithmic amplification of sensational, conflict-driven, or body-image content is not incidental but a product design choice optimised for engagement metrics (time-on-platform, clicks, shares). Neurologically, adolescents are more susceptible to social comparison, peer validation loops, and fear-of-missing-out (FOMO) — making algorithm-driven content pipelines disproportionately harmful in this age group compared to adults.
- Facebook Papers (2021): Revealed Meta's internal findings that Instagram worsened body image for a significant portion of teenage girls.
- "Engagement-maximising algorithms": Designed to maximise time-on-platform; tend to surface emotionally provocative content.
- Adolescent neuroscience: Prefrontal cortex (executive control) not fully developed until mid-20s; heightens vulnerability to social comparison loops.
- Jean Twenge's research ("iGen", 2017): Documented correlation between smartphone and social media adoption (post-2012) and rise in US adolescent depression and anxiety.
- Australia (2024): Passed legislation banning children under 16 from social media platforms entirely.
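The design choice described above can be sketched as a toy example. Everything here — field names, weights, and function names — is hypothetical and illustrative, not a description of Meta's actual systems; it only shows why a ranker optimised purely for engagement tends to push provocative content to the top of a feed:

```python
# Illustrative sketch only: a toy engagement-maximising feed ranker.
# All signals and weights are invented for demonstration.

def engagement_score(post):
    """Score a post purely on predicted engagement signals.

    Note what is absent: no term penalises harmful or sensational
    content, so whatever best drives shares, comments, clicks, and
    watch time rises to the top of the feed.
    """
    return (
        2.0 * post["predicted_shares"]
        + 1.5 * post["predicted_comments"]
        + 1.0 * post["predicted_clicks"]
        + 0.5 * post["predicted_watch_seconds"]
    )

def rank_feed(posts):
    # Sort purely by predicted engagement, highest first.
    return sorted(posts, key=engagement_score, reverse=True)
```

In this framing, "design liability" asks whether the objective function itself — not just the content it surfaces — can be a basis for legal accountability, a question that compliance-and-takedown regimes do not reach.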
Connection to this news: The New Mexico trial established that Meta knew of these harms and continued the algorithmic design anyway — the core element of a willful violation finding, which is why damages were so large.
Key Facts & Data
- New Mexico civil damages verdict: ~$375 million against Meta (March 24, 2026).
- Trial duration: Nearly seven weeks.
- Meta's response: Plans to appeal.
- COPPA (1998): Protects children under 13 in the US.
- YouTube COPPA fine (2019): $170 million.
- India's DPDP Act 2023: Bans targeted advertising and behavioural monitoring of children under 18.
- Australia: Banned social media access for under-16s (2024 law).
- EU DSA: Prohibits profiling-based targeted advertising to minors on online platforms.
- NHRC data (India): Reported rising incidents of cyber-bullying and online exploitation of minors between 2022 and 2025.