India's top court: clear rejection of AI use by courts
The Supreme Court of India voiced harsh criticism of fake judgment citations that were generated by artificial intelligence and relied upon in a court ruling. In August 2025, a junior judge adjudicating a disputed property case commissioned an official to survey the property to assist in the ruling. The defendants objected to the appointment, but the judge dismissed their concerns and proceeded to decide the case. The judgment's reasoning cited four past legal decisions – none of which actually existed; they were AI-generated fabrications, or so-called “hallucinations”.
The defendants took the case to the High Court, challenging the ruling on the grounds that it relied on AI-generated citations. Notably, although the High Court confirmed that the cited judgments did not exist, it treated the issue as a good-faith error and ultimately ruled in line with the lower court. While the High Court criticized the use of AI, emphasizing the need for human judgment over artificial intelligence, it held that the legal principles applied in the case remained sound and consequently found no grounds to overturn the lower court's judgment.
Following a further appeal by the defendants, the case reached the Supreme Court of India, which strongly criticized the lower courts and warned of the serious consequences such practices could have for the integrity of the adjudicative process. Expressing significant institutional concern, the Court described the junior judge's reliance on AI-generated citations as misconduct and announced that it would issue notices to the country's Attorney General and Solicitor General, as well as to the Bar Council of India.
While AI tools are becoming increasingly prevalent in everyday work, simplifying tasks and improving efficiency, serious concerns arise from their tendency to “hallucinate” – to invent sources of information outright. Similar incidents have occurred worldwide. In the United States in October 2025, courts encountered comparable problems with AI-generated legal citations, and in the United Kingdom, the High Court of England and Wales issued a stern warning to lawyers against relying on AI-generated case material after several incidents involving inaccurate or fictitious citations.