
Discussions with social media platforms over protection needed for society: Vaishnaw


What Happened

  • The Election Commission of India (ECI) held discussions with major social media platforms regarding the protection of political content and the prevention of electoral misinformation during the 2026 assembly elections in Assam, Kerala, Tamil Nadu, West Bengal, and Puducherry.
  • On March 19, 2026, the ECI convened a meeting with Chief Electoral Officers, police and IT nodal officers of poll-bound states, and representatives of social media platforms to ensure timely action against misinformation, disinformation, and fake news.
  • The Commission has mandated pre-certification of political advertisements on electronic and digital media through the Media Certification and Monitoring Committees (MCMCs) operating at the district and state levels.
  • Candidates must now disclose their official social media accounts in affidavits filed at the time of nomination — adding a layer of traceability for campaign activity.
  • AI-generated content, deepfakes, and "paid news" are under heightened scrutiny; the ECI has signalled a proactive approach to evolving campaign tactics.
  • Campaign expenditure reporting requirements include disclosure of spending on internet and social media, to be submitted within 75 days after election completion.

Static Topic Bridges

ECI's Authority Over Digital Campaign Conduct and Model Code of Conduct

The Election Commission of India's power to regulate campaign conduct — including digital campaigning — flows from Article 324, which gives it plenary authority over the superintendence, direction, and control of elections. The Model Code of Conduct (MCC), while not a statutory instrument, is backed by the ECI's constitutional authority and has been held by courts to be enforceable. In the digital age, the MCC has been progressively extended to cover social media activity: parties and candidates are expected not to post false, misleading, or defamatory content, and the ECI can direct removal of such content through the IT Act's intermediary mechanism.

  • The MCC comes into force from the date the election schedule is announced and remains operative until the completion of the election.
  • Violations of MCC related to social media (hate speech, deepfakes, false claims) are handled by MCMCs at the district level.
  • MCMCs: Media Certification and Monitoring Committees — set up in every state/UT and district during elections; oversee both paid political advertisements and organic content complaints.
  • The ECI's Systematic Voters' Education and Electoral Participation (SVEEP) programme also uses social media to promote voter awareness — demonstrating the dual use of these platforms.
  • Political parties are required to register their social media accounts with the ECI and submit social media expenditure accounts.

Connection to this news: The ECI's discussions with social media platforms are an exercise of its Article 324 mandate applied to the digital media landscape — the Commission is establishing whether existing voluntary compliance frameworks are sufficient or whether formal directions under the IT Act are needed.


IT Rules 2021 and the Safe Harbour Debate for Political Content

The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (IT Rules 2021) establish a tiered compliance framework for online platforms. Significant Social Media Intermediaries (SSMIs — platforms with more than 50 lakh registered users in India) have heightened obligations including appointing a Chief Compliance Officer, Nodal Contact Person, and Resident Grievance Officer; enabling identification of the "first originator" of information under certain conditions; and deploying technology-based measures to prevent certain prohibited content. Platforms that comply with these rules retain "safe harbour" protection — they are not liable for third-party content posted on their platforms.

  • Section 79 of the IT Act, 2000: Safe harbour for intermediaries — platforms are not liable for user-generated content if they act as neutral conduits and comply with due diligence requirements.
  • Section 66A of the IT Act was struck down by the Supreme Court in Shreya Singhal v. Union of India (2015) for being unconstitutionally vague.
  • IT Rules 2021 require SSMIs to take down content within 36 hours of receiving a court order or government notification.
  • The 2026 amendment (effective February 2026) introduced a three-hour takedown regime for specific categories of AI-generated harmful content.
  • The question of whether platforms have a special obligation to protect (as opposed to merely not remove) legitimate political content — including content by candidates and parties — is at the heart of the ECI's current discussions.

Connection to this news: The discussions are about the boundary between platform content moderation (which can suppress legitimate political speech) and ECI's mandate to prevent electoral manipulation — a tension that the existing legal framework does not fully resolve.


Deepfakes, AI-Generated Content, and Electoral Integrity

During India's 2024 Lok Sabha elections, AI-generated deepfake videos of politicians were widely circulated across social media platforms — raising concerns about their effect on voter behaviour. For the 2026 assembly elections, the threat has escalated: not only video deepfakes but also AI-generated images, audio clips, and even fake voter identity documents (as seen in the West Bengal EPIC case) are in play. The ECI's discussions with platforms focus on: (1) faster takedown of flagged deepfakes; (2) labelling of AI-generated political content; and (3) ensuring platforms do not suppress authentic political content in the name of content moderation.

  • MeitY issued an advisory in November 2023 requiring platforms to label AI-generated content and remove deepfakes within 36 hours.
  • The IT Rules 2026 amendment (effective February 2026) specifically defines "synthetic information" and mandates labelling and removal obligations.
  • During 2024 elections: Fake AI-generated video of Congress president Mallikarjun Kharge and AI-manipulated clips of other leaders were reported across platforms.
  • The ECI mandates pre-certification for political ads (including AI-generated ones) through MCMCs — but organic (non-paid) political content is harder to regulate.
  • Platforms like Meta, Google, and X (formerly Twitter) have their own AI content policies but enforcement during election periods in India has been inconsistent.

Connection to this news: The ECI's talks with platforms reflect an institutional recognition that voluntary platform policies are insufficient — and that some form of mandatory, election-period specific protocol for AI-generated political content is needed.


Freedom of Expression and the Right to Political Speech

Any regulation of political content on social media must navigate the constitutional guarantee of freedom of speech and expression under Article 19(1)(a) and the permissible restrictions under Article 19(2). Political speech — particularly criticism of the government and candidates — enjoys the highest constitutional protection. The Supreme Court in Indian Express v. Union of India (1985) held that the press (and by extension, political expression) has a special role in a democracy. Excessive regulation of political content risks chilling legitimate dissent and debate, particularly if the regulatory mechanism is controlled by the incumbent government.

  • Article 19(1)(a): Right to freedom of speech and expression.
  • Article 19(2): Permissible restrictions include sovereignty and integrity of India, security of the state, friendly relations with foreign States, public order, decency, morality, contempt of court, defamation, and incitement to an offence. Electoral manipulation is not explicitly listed but can fall under "public order."
  • The Supreme Court in Shreya Singhal (2015) set a high bar for restricting online speech: restrictions must be clearly defined, narrowly tailored, and subject to procedural safeguards.
  • The Editors Guild of India criticised the 2023 IT Rules amendment (government fact-checking of online content) as potentially enabling censorship.
  • The balance to strike: protecting voters from AI-enabled disinformation without enabling the ECI or the government to suppress legitimate political criticism.

Connection to this news: The ECI's discussions with platforms must be understood in this constitutional context — any resulting protocols that restrict political content must be narrowly tailored to the specific harm (electoral manipulation by AI-generated disinformation) rather than becoming broad content suppression tools.

Key Facts & Data

  • ECI-social media platform meeting: March 19, 2026 (Chief Electoral Officers, police and IT nodal officers, platform representatives).
  • Mandatory disclosures: Candidates must disclose official social media accounts in nomination affidavit.
  • Campaign expenditure: Must include social media/internet spending; reported within 75 days post-election.
  • Pre-certification requirement: All political ads on electronic and digital media require MCMC approval.
  • IT Rules 2021: SSMIs have obligations including 36-hour takedown, Chief Compliance Officer, Resident Grievance Officer.
  • IT Rules 2026 amendment (effective February 2026): Targets AI-generated synthetic content; three-hour takedown regime.
  • Section 79, IT Act: Safe harbour for intermediaries — conditional on compliance with due diligence rules.
  • Section 66A, IT Act: Struck down as unconstitutional in Shreya Singhal v. Union of India (2015).
  • Significant Social Media Intermediary (SSMI) threshold: 50 lakh+ registered users in India.
  • Nodal ministry: MeitY (Ministry of Electronics and Information Technology).