What Happened
- The government tightened rules for social media platforms, mandating the takedown of unlawful content within three hours of receiving an order from a court or competent authority — down from the previous 36-hour window
- Clear labelling of all AI-generated and synthetic content is now mandatory for digital intermediaries
- The rules formally define "synthetically generated information" as audio, visual, or audiovisual content that is artificially created or altered to appear real, while excluding routine edits, accessibility features, academic content, and training material
- Platforms are required to embed permanent metadata or provenance markers into synthetic content wherever technically feasible (a simplified labelling sketch follows this list)
- The amendments to the IT (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 were notified on 10 February 2026 and become enforceable from 20 February 2026
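How such a label might be implemented is left to platforms. As a minimal sketch, assuming a platform stamps a machine-readable disclosure into a PNG's text metadata using Pillow, it could look like the following; the field names and wording are hypothetical, since the Rules mandate a permanent marker but prescribe no schema:

```python
# Illustrative sketch only, assuming Pillow (pip install Pillow).
# The keys "SyntheticContent" and "Disclosure" are hypothetical;
# the amended IT Rules require a permanent label or metadata marker
# but do not prescribe a specific key or schema.
from PIL import Image
from PIL.PngImagePlugin import PngInfo

def label_synthetic_png(src: str, dst: str) -> None:
    """Re-save a PNG with text-chunk metadata flagging it as synthetic."""
    img = Image.open(src)
    meta = PngInfo()
    meta.add_text("SyntheticContent", "true")
    meta.add_text("Disclosure", "AI-generated content")
    img.save(dst, pnginfo=meta)

label_synthetic_png("generated.png", "generated_labelled.png")
```

Plain text chunks like these are easy to strip on re-encoding, which is why the Rules hedge with "wherever technically feasible" and why signed provenance standards such as C2PA (covered below) exist.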
Static Topic Bridges
IT Act, 2000 — Regulatory Framework for Digital Governance
The Information Technology Act, 2000 is India's primary legislation governing electronic commerce, digital signatures, cybercrime, and intermediary liability. Modelled on the United Nations Commission on International Trade Law (UNCITRAL) Model Law on Electronic Commerce (1996), the Act has been substantially amended over time.
- Originally enacted in 2000 with 94 sections across 13 chapters; substantially amended by the IT (Amendment) Act, 2008
- Section 69A: Power to block public access to information online where the Central Government or an officer specially authorised by it is satisfied it is necessary or expedient in the interest of the sovereignty and integrity of India, defence of India, security of the State, friendly relations with foreign States, public order, or for preventing incitement to the commission of any cognizable offence
- Section 79: Safe harbour for intermediaries (conditional on due diligence compliance)
- Section 87: Power of the Central Government to make rules — the IT Intermediary Guidelines Rules are framed under this section
- The Act is administered by the Ministry of Electronics and Information Technology (MeitY)
- A comprehensive replacement — the Digital India Act — has been under discussion since 2023 but is yet to be introduced in Parliament
Connection to this news: The 2026 amendments are framed under the rule-making power of Section 87 of the IT Act, expanding the due diligence obligations that intermediaries must comply with to retain Section 79 safe harbour protection.
Global Approaches to AI Content Regulation
India's approach to regulating AI-generated content sits within a broader global trend of AI governance frameworks.
- EU AI Act (2024): World's first comprehensive AI law; classifies AI systems by risk level (unacceptable, high, limited, minimal); mandates transparency obligations including labelling of AI-generated content; deepfakes must be clearly disclosed
- US Executive Order on AI (October 2023): Required developers of powerful AI systems to share safety test results with the government and directed NIST to develop AI safety standards; rescinded in January 2025
- China's Deep Synthesis Regulations (2023): Require watermarking and labelling of AI-generated content; platforms must verify user identities; providers must maintain logs for regulatory inspection
- C2PA (Coalition for Content Provenance and Authenticity): Technical standard for embedding provenance metadata in digital content; adopted by Adobe, Microsoft, Google, and media organisations (a simplified manifest sketch follows this subsection)
- India's approach: Unlike the EU's comprehensive legislation, India uses executive rules under the existing IT Act framework; no dedicated AI law yet; the Digital India Act is expected to address AI regulation holistically
Connection to this news: India's 2026 amendment follows the global trend toward mandating transparency and labelling for AI content, though it operates through subordinate legislation (IT Rules) rather than a dedicated AI statute like the EU AI Act.
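To make the C2PA bullet concrete, here is a heavily simplified, illustrative view of the kind of provenance record a C2PA manifest carries. The `c2pa.actions` assertion label and the IPTC `trainedAlgorithmicMedia` source type are real C2PA concepts, but the layout below paraphrases the structure for readability; real manifests are signed binary (JUMBF) structures hash-bound to the asset, not loose JSON:

```python
# Simplified, illustrative sketch of what a C2PA provenance manifest
# records; NOT the exact C2PA schema.
import json

manifest_sketch = {
    "claim_generator": "example-genai-tool/1.0",  # hypothetical generator name
    "assertions": [
        {
            "label": "c2pa.actions",  # real C2PA assertion label
            "data": {
                "actions": [
                    {
                        "action": "c2pa.created",
                        # IPTC digital source type signalling AI-generated media
                        "digitalSourceType": "http://cv.iptc.org/newscodes/digitalsourcetype/trainedAlgorithmicMedia",
                    }
                ]
            },
        }
    ],
    # In a real manifest, an X.509-backed signature covers the claim,
    # which in turn hashes the assertions and the asset itself.
    "signature": "<issuer signature placeholder>",
}

print(json.dumps(manifest_sketch, indent=2))
```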
Content Moderation and Platform Accountability in India
The tension between platform self-regulation and government-mandated content moderation has been a recurring issue in India's digital policy landscape.
- Grievance Appellate Committees (GACs): Established under the October 2022 amendment to the IT Rules as a second-tier redressal mechanism; users dissatisfied with a platform grievance officer's decision can appeal to a GAC
- Government fact-check unit: The April 2023 amendment authorised the central government to notify a fact-check unit whose findings would be binding on intermediaries; challenged in the Bombay High Court in Kunal Kamra v. Union of India, where a division bench delivered a split verdict in January 2024 and the tie-breaking third judge struck the provision down as unconstitutional in September 2024
- Traceability requirement (Rule 4(2)): Significant social media intermediaries (SSMIs) providing messaging services must enable identification of the first originator of information when ordered by a court or an authorised government agency; challenged by WhatsApp in the Delhi High Court
- Three-hour window: The 2026 amendment's compressed takedown timeline is one of the strictest globally — comparable to Germany's NetzDG (Network Enforcement Act), which requires removal of "manifestly unlawful" content within 24 hours
Connection to this news: The reduction from 36 hours to 3 hours for takedown, combined with the loss of safe harbour for non-compliance, significantly increases the compliance burden on platforms operating in India and raises questions about proportionality and due process.
Key Facts & Data
- Previous takedown window: 36 hours; new window under 2026 amendment: 3 hours
- Definition scope: "Synthetically generated information" covers AI-generated audio, visual, and audiovisual content; excludes routine edits and academic material
- SSMI threshold: 50 lakh (5 million) registered users
- IT Act, 2000: 94 sections across 13 chapters; amended substantially in 2008
- EU AI Act: Adopted 2024; first comprehensive AI law globally; risk-based classification approach
- Shreya Singhal v. Union of India (2015): Section 66A struck down; Section 79 read down — "actual knowledge" requires court/government order
- Amendment effective date: 20 February 2026