
India must institutionalise human control over AI systems for military use, test them rigorously like weapons: Lt Gen Shinghal


What Happened

  • At the India AI Impact Summit 2026, Lt Gen Vipul Shinghal stated that India must institutionalise human control over AI systems deployed for military purposes, emphasising that "AI can inform decisions, but only humans can exercise judgment and bear responsibility for them."
  • He cited a real-world example where a machine-generated analysis recommended an immediate strike during a high-tempo operation, but the commander paused to ask "What does the machine not know?" The AI had identified adversary troops but failed to detect an ongoing civilian evacuation, so the strike was halted.
  • Military leaders, defence scientists, industry executives, and academics converged on a central message: India must deploy AI as a force multiplier without surrendering moral agency, operational control, or strategic autonomy.
  • Speakers argued that AI increases leadership burden rather than reducing it, as compressed decision cycles raise the risk of escalation if human judgment is sidelined.
  • The summit underscored the need for institutionalised frameworks to ensure command authority remains paramount in AI-assisted military decision-making.

Static Topic Bridges

Lethal Autonomous Weapons Systems (LAWS) and International Regulation

Lethal Autonomous Weapons Systems (LAWS) are weapon systems that can select and engage targets without meaningful human intervention. The international community has been debating their regulation under the Convention on Certain Conventional Weapons (CCW) since 2013.

  • The Group of Governmental Experts (GGE) on LAWS, operating under the CCW, adopted eleven guiding principles on LAWS in November 2019.
  • India chaired the GGE on LAWS in 2017 and 2018 and is a party to all five protocols of the CCW.
  • India's position holds that: the CCW is the most appropriate forum for LAWS discussions; human responsibility must be retained across the entire life cycle of weapon systems; new rules should focus on weapon effects and proper use rather than banning enabling technologies.
  • India voted against the December 2023 UNGA resolution on LAWS, viewing calls for a legally binding instrument as premature.
  • Key opposing camps: some nations (Austria, Brazil, Chile, and others) advocate for a pre-emptive ban; major military powers (US, Russia, India, Israel) prefer technology-neutral principles preserving regulatory flexibility.

Connection to this news: Lt Gen Shinghal's emphasis on institutionalising human control aligns with India's consistent position at the CCW that human responsibility must be retained, while stopping short of supporting a legally binding ban on autonomous weapons.

AI in India's Defence Ecosystem

India has been integrating AI into its defence architecture through multiple institutional mechanisms, balancing capability development with ethical considerations.

  • The Defence AI Council (DAIC), chaired by the Defence Minister, provides strategic guidance for AI adoption in defence.
  • The Defence AI Project Agency (DAIPA) under the Department of Defence Production coordinates AI projects across the three services.
  • The Centre for Artificial Intelligence and Robotics (CAIR), a DRDO laboratory in Bengaluru, develops AI and robotics technologies for defence applications.
  • iDEX (Innovations for Defence Excellence) funds startups working on AI-enabled defence solutions through the Defence Innovation Organisation.
  • Applications include AI-powered surveillance systems along the Line of Control, autonomous unmanned platforms, predictive maintenance for military equipment, and intelligence analysis tools.
  • The Indian Army has deployed AI-based border monitoring systems and the Navy uses AI for maritime domain awareness.

Connection to this news: The India AI Impact Summit discussions reflect the military's growing reliance on AI tools and the simultaneous recognition that institutional safeguards must keep pace with technological adoption to prevent decision-making failures in high-stakes scenarios.

International Humanitarian Law (IHL) and Autonomous Weapons

International Humanitarian Law, also known as the law of armed conflict, governs the conduct of hostilities and protection of victims in armed conflicts. Its applicability to autonomous weapons is a key question in the LAWS debate.

  • The core IHL principles applicable to weapons include: distinction (between combatants and civilians), proportionality (expected civilian harm must not be excessive relative to the anticipated military advantage), precaution (feasible measures to minimise civilian casualties), and military necessity.
  • Article 36 of Additional Protocol I to the Geneva Conventions (1977) requires states to determine whether any new weapon or means of warfare would be prohibited under IHL.
  • The Martens Clause, present in the Hague Conventions, mandates that even in the absence of specific treaty provisions, parties to a conflict remain bound by the principles of humanity and the dictates of public conscience.
  • A central concern is whether autonomous systems can make the complex contextual judgments required by IHL, particularly the proportionality assessment.

Connection to this news: The example cited by Lt Gen Shinghal, where AI missed a civilian evacuation, directly illustrates why the IHL principles of distinction and precaution require human judgment that current AI systems cannot reliably replicate, reinforcing India's position that human control must be institutionalised.

Key Facts & Data

  • India chaired the GGE on LAWS in 2017 and 2018.
  • Eleven guiding principles on LAWS adopted: November 2019.
  • India's position: Human responsibility must be retained across the entire life cycle of weapon systems.
  • Defence AI Council (DAIC): Chaired by the Defence Minister.
  • CAIR (Centre for AI and Robotics): DRDO lab in Bengaluru for defence AI development.
  • CCW: India is party to all five protocols.
  • Article 36 of Additional Protocol I: Mandates legal review of new weapons.