
What has the government laid down on AI labelling? | Explained


What Happened

  • The Ministry of Electronics and Information Technology (MeitY) notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2026, on February 10, 2026, effective from February 20, 2026.
  • All AI-generated or synthetic content — including audio, visual, and audio-visual material — must now carry mandatory, non-removable labels identifying it as AI-generated.
  • The compliance window for platforms to remove unlawful content pursuant to a lawful order has been reduced from 36 hours to 3 hours.
  • Platforms are required to embed permanent metadata or provenance markers into AI-generated content so its origin can be traced even when shared across platforms.
  • Exemptions apply to routine, good-faith edits (colour correction, sound improvement, formatting) and AI-assisted educational or training content that does not misrepresent reality.
  • Significant Social Media Intermediaries (SSMIs) that knowingly allow harmful synthetic content to circulate risk losing safe harbour protection under Section 79 of the IT Act.

Static Topic Bridges

Section 79 of the IT Act, 2000: Safe Harbour Provision

Section 79 of the Information Technology Act, 2000, provides conditional legal immunity ("safe harbour") to intermediaries for third-party content hosted on their platforms. This provision is the cornerstone of India's internet regulation framework, shielding platforms like social media companies, e-commerce sites, and internet service providers from liability for user-generated content, provided they meet specified conditions.

  • Section 79 grants immunity to intermediaries if they function as neutral platforms and comply with due diligence requirements
  • Immunity is lost if the intermediary: (a) initiates the transmission, (b) selects or modifies the content, or (c) fails to remove unlawful content after receiving a court order or government notification
  • Supreme Court in Shreya Singhal v. Union of India (2015): interpreted "actual knowledge" to mean a court order, protecting platforms from private complaints
  • The IT (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (and subsequent amendments) define due diligence obligations platforms must follow to retain safe harbour
  • SSMIs (platforms with over 50 lakh registered users in India) face additional compliance obligations

Connection to this news: The 2026 amendment explicitly ties AI content labelling compliance to safe harbour protection — platforms that knowingly allow harmful synthetic content without labels risk losing Section 79 immunity, creating a powerful enforcement mechanism.

Deepfake Regulation and Digital Ethics

Deepfakes — AI-generated synthetic media that realistically depict individuals saying or doing things they never actually did — pose significant challenges to information integrity, democratic processes, and individual privacy. India's regulatory approach to deepfakes has evolved rapidly since 2023, when high-profile incidents involving manipulated videos of public figures drew national attention.

  • IT Rules 2026 define AI-generated content as: audio, visual, or audio-visual content created or modified using a computer resource that appears real and depicts people or events as authentic
  • Prohibited categories: child sexual abuse material (CSAM), non-consensual intimate imagery, false documents, misleading depictions of real individuals/events
  • Violations can lead to: immediate content removal, suspension/termination of user accounts, mandatory reporting to law enforcement
  • Provenance markers: permanent metadata embedded in content to enable cross-platform traceability
  • Global context: EU AI Act (2024) also mandates disclosure of AI-generated content; India's approach focuses more on intermediary liability

Connection to this news: The mandatory labelling and metadata requirements represent India's first comprehensive regulatory framework specifically targeting AI-generated content, moving beyond advisory guidelines to enforceable rules with clear consequences.
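The core idea behind a provenance marker — metadata that travels with a piece of content and makes tampering detectable — can be illustrated with a minimal hash-based sketch. This is not the format the 2026 Rules or any standard such as C2PA actually prescribes; the field names, function names, and generator string below are invented for illustration only.

```python
import hashlib

def make_provenance_manifest(content: bytes, generator: str) -> dict:
    """Build a minimal provenance record binding content to its origin.
    All field names here are illustrative, not drawn from the IT Rules
    or any real provenance standard."""
    return {
        "label": "AI-generated",
        "generator": generator,
        "content_hash": hashlib.sha256(content).hexdigest(),
    }

def verify(content: bytes, manifest: dict) -> bool:
    """Check that the content still matches the hash in its manifest."""
    return hashlib.sha256(content).hexdigest() == manifest["content_hash"]

media = b"...synthetic image bytes..."
manifest = make_provenance_manifest(media, generator="example-model-v1")
assert verify(media, manifest)              # untouched content passes
assert not verify(media + b"x", manifest)   # any alteration is detectable
```

In practice, real provenance schemes embed such records inside the media file itself and sign them cryptographically so the marker survives cross-platform sharing and cannot simply be rewritten; the sketch above only shows why a content hash makes alteration detectable.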

Information Technology Act, 2000: Evolution and Amendments

The Information Technology Act, 2000, enacted to give legal recognition to electronic commerce and address cybercrime, has been progressively expanded to regulate emerging technologies. Originally focused on e-governance and digital signatures, the Act was significantly amended in 2008 (IT Amendment Act, 2008) and has since been supplemented by subordinate rules addressing intermediary obligations, data protection, and now AI-generated content.

  • IT Act, 2000: India's primary legislation for cybercrime, e-commerce, and digital governance; enacted pursuant to the UNCITRAL Model Law on Electronic Commerce (1996)
  • Key sections: Section 43 (unauthorized computer access), Section 66 (computer-related offences), Section 69 (government interception powers), Section 79 (safe harbour)
  • IT (Intermediary Guidelines) Rules: first issued 2011, overhauled 2021, amended 2023 and now 2026
  • The 2021 Rules introduced: three-tier grievance redressal for digital media, mandatory traceability of message originators (for messaging platforms with 50 lakh+ users), and the concept of SSMIs
  • Digital Personal Data Protection Act (DPDP), 2023: complements IT Act provisions on data handling by intermediaries

Connection to this news: The 2026 amendment to the Intermediary Guidelines represents the latest evolution of India's IT regulatory framework, extending its reach to AI-generated content while maintaining the existing intermediary liability architecture.

Key Facts & Data

  • Amendment notification date: February 10, 2026; effective from February 20, 2026
  • Content takedown window: reduced from 36 hours to 3 hours
  • SSMI threshold: platforms with over 50 lakh (5 million) registered users in India
  • Safe harbour: Section 79, IT Act, 2000
  • Intermediary Guidelines: first 2011, overhauled 2021, latest amendment 2026
  • AI content definition: audio, visual, or audio-visual content created/modified by computer to appear real
  • Exemptions: good-faith routine edits, educational AI-assisted content without misrepresentation
  • EU AI Act comparison: enacted 2024; also mandates AI content disclosure but uses a risk-based classification approach