
Comply or lose safe harbour: MeitY’s draft amendment to IT Rules lets govt give platforms binding orders


What Happened

  • The Ministry of Electronics and Information Technology (MeitY) has circulated a draft amendment to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, proposing to make government advisories and Standard Operating Procedures (SOPs) binding on digital platforms.
  • The core provision: platforms that fail to comply with government directives will be deemed to have failed their "due diligence" obligations — and will consequently lose their safe harbour protection under Section 79 of the IT Act, 2000.
  • This marks a structural shift from the existing framework where MeitY could issue directions to platforms but had no rule-based mechanism to tie non-compliance directly to loss of legal immunity.
  • The amendment follows MeitY's February 2026 notification of IT Amendment Rules (focused on synthetic media/deepfakes), suggesting a broader regulatory tightening of platform accountability.
  • Civil society and legal experts have raised concerns that making safe harbour contingent on compliance with executive directions — rather than court orders — could enable government pressure on platforms to remove content without judicial oversight.

Static Topic Bridges

Safe Harbour Protection — Section 79 of the IT Act, 2000

Section 79 of the Information Technology Act, 2000 provides "safe harbour" immunity to intermediaries (social media platforms, search engines, messaging apps, e-commerce sites), shielding them from legal liability for third-party content hosted on their platforms. This immunity is conditional: platforms must observe "due diligence" as prescribed by the government, and must not initiate or participate in the unlawful act. In Shreya Singhal v. Union of India (2015), the Supreme Court struck down Section 66A and read down Section 79(3)(b), holding that platforms need act only on court orders or government notifications, not mere private complaints, to retain safe harbour.

  • Section 79 is often compared with Section 230 of the US Communications Decency Act, which grants US platforms near-absolute immunity; India's safe harbour, by contrast, has always been conditional on due diligence.
  • The Shreya Singhal judgment (2015): Section 66A struck down; Section 79 read down to require only court orders or government notifications (not individual complaints) for takedown.
  • IT Rules 2021 operationalised Section 79's "due diligence" requirement — mandating grievance officers, content moderation timelines, and compliance reports for Significant Social Media Intermediaries (SSMIs) with over 5 million users.
  • Current safe harbour condition: platforms must act within 36 hours of a lawful takedown order (reduced to 3 hours for certain categories under the February 2026 amendment).

Connection to this news: The draft amendment effectively upgrades government advisories and SOPs to near-court-order status — making non-compliance with executive directions a basis for stripping safe harbour, which critics argue reverses the Shreya Singhal framework's insistence on judicial oversight.


Intermediary Liability and Platform Regulation in India

India's platform regulation has evolved from a largely hands-off approach (pre-2021) to an increasingly assertive content governance framework. The IT Rules 2021 introduced mandatory grievance redressal mechanisms, monthly compliance reports, content takedown timelines, and a three-tier oversight structure for OTT/digital media. The February 2026 amendment on synthetic media mandated watermarking of AI-generated content and reduced takedown timelines for deepfakes. The current draft extends the compliance-for-immunity framework further — creating a regulatory architecture where safe harbour is no longer a baseline right but a contingent privilege conditioned on executive compliance.

  • IT Rules 2021 (Rule 3): "due diligence" obligations for all intermediaries — grievance officer, acknowledge complaints within 24 hours, resolve within 15 days.
  • SSMIs (Rule 4): Additional obligations — Chief Compliance Officer, Nodal Contact Person, monthly compliance reports, proactive content moderation for notified categories.
  • February 2026 Amendment: synthetic media/deepfake labelling; takedown timeline reduced from 36 hours to 3 hours for certain content; shift from "endeavour to deploy" to "deploy" technical measures.
  • Current draft: advisories and SOPs become binding; non-compliance = due diligence failure = loss of safe harbour.
  • Twitter (now X) v. Union of India (Karnataka HC, 2022): the platform challenged blocking orders issued under Section 69A of the IT Act; the High Court dismissed the petition in 2023, and related litigation over government takedown powers continues. The present amendment may generate fresh constitutional litigation.

Connection to this news: The proposed amendment would give the executive branch a powerful lever over platform content decisions — one that operates outside the judicial process that Shreya Singhal required as a safeguard.


Freedom of Speech and Platform Content Moderation

Article 19(1)(a) of the Constitution guarantees freedom of speech and expression to citizens; Article 19(2) allows reasonable restrictions on grounds including sovereignty, security of the state, public order, decency, and defamation. The question of whether government directives to platforms to remove content must pass the "reasonable restrictions" test — and be subject to judicial review — is a live constitutional issue. If platforms lose safe harbour for non-compliance with executive advisories (rather than court orders), they face strong incentives to over-comply, potentially suppressing legitimate speech to avoid regulatory risk.

  • Platforms themselves have no fundamental rights under Part III of the Constitution — they invoke freedom of speech indirectly as agents of user expression.
  • Shreya Singhal (2015): intermediary liability rules must meet Article 19(2) standards — vague or overbroad takedown mandates struck down.
  • Chilling effect: platforms over-complying with government directives (removing lawful content to avoid legal risk) is a recognised threat to online expression.
  • The proposed amendment's compliance mechanism has no built-in appeal or review process for platforms to contest advisories before they become binding.
  • Global parallel: the EU's Digital Services Act (DSA, 2023) imposes compliance obligations on platforms but requires transparency reports, independent audits, and due process before regulatory action.

Connection to this news: By conditioning safe harbour on compliance with executive advisories — without a clear judicial or quasi-judicial check — the draft raises Article 19 concerns about the structural pressure it creates on platform content decisions.


Key Facts & Data

  • Section 79, IT Act 2000: the legal basis for safe harbour protection for intermediaries in India.
  • IT (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021: operationalise Section 79's due diligence requirements.
  • February 2026 IT Amendment Rules: synthetic media labelling, 3-hour takedown for certain deepfakes, mandatory technical deployment for SSMIs.
  • Current draft: compliance with government advisories/SOPs linked directly to safe harbour retention — no existing rule-based mechanism for this linkage.
  • Shreya Singhal v. Union of India (2015): Supreme Court ruling that shaped India's intermediary liability framework.
  • SSMIs: platforms with over 5 million registered users — subject to the most stringent compliance obligations.
  • The draft amendment is in circulation for stakeholder comments as of March 30, 2026; it has not yet been gazetted.