What Happened
- A Parliamentary panel — the Committee on the Empowerment of Women (2025–26), chaired by D. Purandeshwari — found that no social media or digital platform has ever lost its safe harbour protection under the IT Act, despite documented cases of delayed compliance with government takedown notices.
- Platforms including X (formerly Twitter) and Snapchat have experienced delays in complying with content removal requests, yet continue to retain their intermediary immunity under Section 79 of the Information Technology Act, 2000.
- The panel's findings highlight a structural enforcement gap: the law conditions safe harbour on compliance, but the mechanism to revoke it has never been invoked.
- The committee also sharply criticised the Ministry of Home Affairs (MHA) for providing delayed, deflected, or inadequate responses to questions on fast-track courts, prosecution data for cybercrime, and proposals for a unified cybercrime law.
- The panel's investigation focused particularly on emerging threats from deepfakes and AI-generated content, and referenced the Sahyog Portal as a related accountability mechanism.
Static Topic Bridges
Section 79 of the IT Act, 2000 — Safe Harbour Provisions for Intermediaries
Section 79 of the Information Technology Act, 2000 grants intermediaries — including social media platforms, search engines, e-commerce platforms, and messaging apps — immunity from legal liability for third-party content hosted on their platforms. This "safe harbour" is conditional: platforms must not initiate or participate in the unlawful activity, must observe due diligence, and must remove or disable access to content upon receiving actual knowledge of its unlawful nature (through government orders or court directions). Section 79(3)(b) specifies that an intermediary that fails to expeditiously remove such content upon receiving notice loses this immunity and becomes liable. In practice, however, the parliamentary panel found that this revocation mechanism has never been operationalised: no platform has been stripped of safe harbour despite documented non-compliance.
- Section 79 IT Act: grants conditional immunity to intermediaries for user-generated/third-party content
- Conditions for retention: observe due diligence (under ITGDMEC Rules), not initiate or modify content, act on notices expeditiously
- Section 79(3)(b): government or court order → intermediary must remove content or loses safe harbour
- Sahyog Portal: government-run takedown mechanism operating under Section 79(3)(b) — concerns raised about lack of procedural safeguards
- IT Amendment Rules, 2026: tightened takedown window from 36 hours to 3 hours for certain categories of content
- No platform has ever had safe harbour revoked in India despite documented compliance delays
Connection to this news: The panel's finding exposes the gap between the law's stated consequences and its actual enforcement — safe harbour is theoretically conditional but practically unconditional, undermining the accountability framework for digital platforms.
Intermediary Guidelines and Digital Media Ethics Code — The Regulatory Framework
The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (ITGDMEC Rules, amended in 2023 and 2026) are the principal subordinate legislation governing social media platforms and digital news publishers in India. For "Significant Social Media Intermediaries" (SSMIs) — platforms with more than 5 million registered users — the Rules impose additional obligations: appointing a Resident Grievance Officer, Chief Compliance Officer, and Nodal Contact Person in India; publishing monthly transparency reports; enabling traceability of the originator of messages; and providing a grievance redressal mechanism. The 2026 amendment tightened the content removal timeline from 36 hours to 3 hours for content related to deepfakes, sexual violence, and national security threats.
- Significant Social Media Intermediaries (SSMIs): platforms with >5 million registered users — includes Meta, X, Google, Snapchat, WhatsApp
- SSMI obligations: Chief Compliance Officer (accountable for regulatory compliance), Resident Grievance Officer (accessible to users), monthly compliance reports
- Grievance redressal: platforms must acknowledge grievance within 24 hours; resolve within 15 days
- Content takedown: 36-hour general deadline; 2026 amendment tightened to 3 hours for specified harm categories
- OTT platforms (10–15 currently under examination by Ministry of Information and Broadcasting)
Connection to this news: The panel's critique reveals that even enhanced rules (with tighter timelines) have not been matched by enforcement action — the regulatory framework has become more stringent on paper without a corresponding record of consequence for non-compliance.
Parliamentary Oversight of Digital Governance and Cybersecurity
Parliamentary committees play a critical oversight role in India's governance structure, scrutinising government performance across sectors including technology, security, and law enforcement. The panel's sharp rebuke of MHA for inadequate responses to questions on fast-track courts, cybercrime prosecution data, and a unified cybercrime law reflects a broader accountability deficit in digital governance. Cybercrime in India has grown rapidly: approximately 65,000 cases were registered in 2022, while the National Cyber Crime Reporting Portal (NCRP) now receives over 17 lakh complaints in a single year (complaints, a broader measure than registered cases, far outnumber FIRs). A unified cybercrime law would consolidate existing provisions scattered across the IT Act, the Indian Penal Code (now Bharatiya Nyaya Sanhita), and the POCSO Act into a coherent framework, a proposal that has been discussed but not acted upon.
- National Cyber Crime Reporting Portal (NCRP): cybercrime.gov.in; handles financial fraud, social media crimes, child exploitation
- Indian Cyber Crime Coordination Centre (I4C): nodal agency under MHA for cybercrime coordination
- Fast-track courts for cybercrime: proposed but not operationalised uniformly across states
- Parliamentary standing committees: no legislative power, but can call witnesses and examine officials; their recommendations are not binding, though the government is expected to respond within 6 months
- Deepfakes: AI-generated synthetic media posing identity fraud, disinformation, and electoral integrity risks
Connection to this news: The MHA's evasive responses to the panel's questions on cybercrime prosecution and a unified law, combined with the safe harbour enforcement gap, reveal a dual failure: inadequate platform accountability and inadequate state response to emerging digital threats.
Key Facts & Data
- Finding: No platform has ever lost safe harbour under Section 79 IT Act despite documented takedown delays
- Platforms cited for non-compliance: X (formerly Twitter), Snapchat
- Committee: Committee on the Empowerment of Women (2025–26), chaired by D. Purandeshwari
- Safe harbour provision: Section 79, IT Act, 2000 — immunity conditional on due diligence and expeditious takedown compliance
- IT Amendment Rules, 2026: content removal window tightened from 36 hours to 3 hours (deepfakes, sexual violence, national security content)
- Sahyog Portal: government mechanism for platform compliance under Section 79(3)(b); criticised for lacking procedural safeguards
- MHA criticism: panel found responses to questions on fast-track courts, cybercrime prosecution, and unified cybercrime law delayed, deflected, or inadequate
- OTT platforms: 10–15 currently under Ministry of I&B examination