
Supreme Court warns AI-generated judgments will amount to misconduct


What Happened

  • A Supreme Court bench comprising Justice PS Narasimha and Justice Alok Aradhe took suo motu cognisance on February 27, 2026, of an Andhra Pradesh trial court order that relied on four fabricated, AI-generated judgments to dismiss the defendants' objections in a property dispute
  • The Supreme Court characterised the reliance on AI-generated non-existent judgments as "misconduct" — elevating it from a technical error to a conduct issue with potential disciplinary consequences
  • This marks the first time India's apex court has formally escalated AI hallucinations in judicial proceedings from a procedural lapse to a matter of judicial discipline
  • When the defendants challenged the August 2025 trial court order, the Andhra Pradesh High Court acknowledged that the four cited Supreme Court judgments did not exist and were "AI-generated"
  • The Supreme Court issued notices to the Attorney General, Solicitor General, and the Bar Council of India, and appointed senior advocate Shyam Divan as amicus curiae to assist the court
  • A related pattern has emerged: in December 2024, the Bengaluru bench of the Income Tax Appellate Tribunal (ITAT) issued an order in the Buckeye Trust case citing three non-existent Supreme Court judgments and one fabricated Madras High Court ruling; the order was recalled within a week
  • Globally, over 500 AI hallucination litigation cases have been documented, including landmark US cases where lawyers were sanctioned for submitting AI-fabricated citations

Static Topic Bridges

Large language models (LLMs) such as ChatGPT and Gemini are capable of generating plausible-sounding but entirely fabricated information, a phenomenon called "AI hallucination." In legal contexts, this manifests as AI tools generating fake case citations, fabricated quotes from real judgments, and invented statutory provisions that appear authentic but have no existence in any law reporter or official database. The hallucination problem is particularly dangerous in the legal domain because judgments carry binding precedential weight: a court relying on a fake judgment is effectively deciding a case on a legal foundation that does not exist.

  • AI hallucination: the generation of false, fabricated, or misleading output by LLMs presented as factual
  • Occurs because LLMs predict statistically probable next tokens, not necessarily factually accurate information
  • In law, AI hallucinations typically manifest as: fake case citations, fabricated statutory references, invented quotes attributed to real judges
  • Unlike factual errors in other domains, fake citations in court orders constitute structural fraud on the judicial process
  • Rate of hallucination in legal tasks: studies suggest current LLMs hallucinate legal citations at a 20–88% rate depending on the model and task
  • Verification failure: hallucinated judgments often cannot be found in SCC, Manupatra, Indian Kanoon, or Westlaw databases

Connection to this news: The Andhra Pradesh trial court's four fake citations represent a textbook AI hallucination event — the court (or the parties appearing before it) used an AI tool to generate legal authorities without independently verifying them against authoritative databases.
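The token-prediction point above can be illustrated with a toy sketch. Everything below is invented for illustration: the party names, the generated citations, and the one-entry "verified database" standing in for SCC or Indian Kanoon. The point is that assembling citation-shaped strings from statistically plausible fragments, which is essentially what next-token prediction does, yields outputs that look authentic yet match nothing in an authoritative index.

```python
import random

# Fragments of the kind an LLM absorbs from training data.
# All names and citations here are invented for illustration only.
PARTIES = ["State of Maharashtra", "Union of India", "K. Ramaswamy", "P. Sharma"]
REPORTERS = ["SCC", "AIR", "SCR"]

# Stand-in for an authoritative database (SCC, Manupatra, Indian Kanoon, ...).
VERIFIED_DB = {"Kesavananda Bharati v. State of Kerala (1973) 4 SCC 225"}

def plausible_citation(rng: random.Random) -> str:
    """Assemble a citation-shaped string from plausible parts, the way
    next-token prediction does -- with no grounding in any real database."""
    a, b = rng.sample(PARTIES, 2)
    year = rng.randint(1975, 2020)
    volume = rng.randint(1, 12)
    page = rng.randint(1, 999)
    return f"{a} v. {b} ({year}) {volume} {rng.choice(REPORTERS)} {page}"

rng = random.Random(42)
fakes = [plausible_citation(rng) for _ in range(3)]

# Every generated citation "looks right" but fails verification:
unverified = [c for c in fakes if c not in VERIFIED_DB]
assert unverified == fakes  # none exist in the verified index
```

This is why format-level plausibility is no evidence of existence: only a lookup against an authoritative reporter or database can confirm a citation.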

Judicial Accountability and Misconduct: Constitutional and Statutory Framework

In India, judges of the Supreme Court and High Courts enjoy security of tenure and can only be removed through impeachment under Article 124(4) (Supreme Court) and Article 217(1)(b) (High Courts) — a deliberate constitutional protection for judicial independence. However, district and subordinate court judges are members of state judicial services, subject to disciplinary proceedings under the relevant state service rules and to High Court control under Article 235. The characterisation of AI citation reliance as "misconduct" (rather than mere error) is significant because it potentially subjects trial judges to departmental proceedings, adverse remarks, and career consequences.

  • SC judge removal: Article 124(4) — only by parliamentary address with special majority
  • HC judge removal: Article 217(1)(b) — similar presidential/parliamentary procedure
  • District judges (subordinate courts): under administrative supervision of respective High Courts per Article 235
  • Article 235: control over district courts and courts subordinate thereto vests in the High Court
  • Judicial misconduct for subordinate judges: handled through departmental enquiries under state service rules and High Court oversight
  • Bar Council of India rules: Rule 12 — lawyers must ensure scrupulous accuracy in citations; misrepresentation constitutes professional misconduct under Section 35, Advocates Act, 1961

Connection to this news: The Supreme Court's characterisation of AI hallucinations as "misconduct" — rather than "error" — directly engages Article 235 and High Court supervisory powers, potentially triggering action against the Andhra Pradesh trial judge through the judicial disciplinary mechanism.

The emergence of AI tools in legal practice has outpaced governance frameworks globally. India currently has no specific legislation governing AI use in legal proceedings, though the IT Act, 2000 and the Digital Personal Data Protection Act, 2023 provide tangential coverage. The Supreme Court's intervention represents judicial norm-setting in the absence of legislation. Bar councils and law schools are under pressure to develop AI-literacy curricula that emphasise verification protocols. Courts in the United States and Canada have already sanctioned lawyers for fabricated citations (Mata v. Avianca, 2023) and issued standing orders requiring lawyers to certify that AI-generated content has been verified — India's Supreme Court is moving in a similar direction.

  • No specific Indian legislation on AI use in courts as of 2026
  • Digital Personal Data Protection Act, 2023: addresses data privacy but not AI accuracy obligations
  • IT Act, 2000: no specific provision on AI-generated court submissions
  • Mata v. Avianca (SDNY, 2023): US court fined lawyers $5,000 for submitting ChatGPT-generated fake citations; judge sanctioned lawyers for failure to verify
  • Buckeye Trust v. PCIT (ITAT Bengaluru, December 2024): Indian tax tribunal recalled order containing four non-existent judgments generated by ChatGPT
  • Bar Council of India: issued notices in the case; expected to frame guidelines on AI use in legal proceedings
  • Supreme Court e-Committee: promotes AI tools built on verified court data — SUVAS (Supreme Court Vidhik Anuvaad Software) for translating judgments into Indian languages and SUPACE for AI-assisted legal research — reducing hallucination risk

Connection to this news: The Supreme Court's suo motu action effectively creates precedent-level guidance in the absence of legislation — placing the burden of verification on both judges and lawyers, and signalling that AI tools must be used only as drafting aids with mandatory human verification before any citation is used in a judicial proceeding.
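The verification burden described above can be sketched as a simple pre-filing gate: every citation in a draft must be confirmed against an authoritative index before the draft is used. The sketch below is a minimal illustration, not a real system — the in-memory index is seeded with two well-known citations purely as examples, and the invented "R. Sharma" citation stands in for a hallucinated authority; an actual workflow would query SCC, Manupatra, Indian Kanoon, or Westlaw.

```python
# Minimal pre-filing verification gate (illustrative sketch only).
# A real system would query SCC / Manupatra / Indian Kanoon / Westlaw
# rather than an in-memory set.

VERIFIED_INDEX = {
    "Kesavananda Bharati v. State of Kerala (1973) 4 SCC 225",
    "Maneka Gandhi v. Union of India (1978) 1 SCC 248",
}

def unverified_citations(draft_citations: list[str]) -> list[str]:
    """Return every citation that cannot be confirmed against the index.
    An empty result means the draft passes the human-verification gate."""
    return [c for c in draft_citations if c not in VERIFIED_INDEX]

draft = [
    "Maneka Gandhi v. Union of India (1978) 1 SCC 248",
    "R. Sharma v. State of Punjab (2011) 6 SCC 901",  # invented citation
]

flagged = unverified_citations(draft)
# 'flagged' contains only the invented citation; it must be removed
# or independently sourced before the draft is filed.
```

The design point is that the gate is a hard stop: a non-empty `flagged` list blocks filing, which operationalises "AI as drafting aid, human verification mandatory."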

Key Facts & Data

  • Case: Suo motu action by SC bench (Justice PS Narasimha and Justice Alok Aradhe), February 27, 2026
  • Trigger: Andhra Pradesh trial court order (August 2025) citing four non-existent Supreme Court judgments
  • SC's finding: Relying on AI-generated fake judgments = "misconduct" (not merely an error)
  • Notices issued to: Attorney General, Solicitor General, Bar Council of India
  • Amicus curiae appointed: Senior Advocate Shyam Divan
  • Related case: Buckeye Trust v. PCIT, ITAT Bengaluru (December 2024) — four fake judgments, order recalled within a week
  • Global AI hallucination cases documented: 508+ (as of early 2026)
  • US precedent: Mata v. Avianca, SDNY, 2023 — lawyers fined $5,000 for fake ChatGPT citations
  • Lawyer accountability under Indian law: Section 35, Advocates Act, 1961 (professional misconduct)
  • Judicial accountability: Article 235 — High Courts supervise subordinate judiciary
  • Key legal databases for verification: SCC, Indian Kanoon, Manupatra, Westlaw India