English Courts Warn on AI Hallucinations in Recent Rulings

Why it matters

English courts are cracking down on AI-generated fake case citations in legal filings. The High Court in R. (on the application of Ayinde) v Haringey LBC [2025] EWHC 1383 (Admin) flagged false authorities produced by generative AI and demanded verification against primary sources. The First-tier Tribunal in Elden v Revenue and Customs Commissioners [2026] UKFTT 41 (TC) went further, noting AI inaccuracies in a skeleton argument and imposing requirements on future submissions. Family court and commercial cases have followed the same pattern, with judges repeatedly encountering AI "hallucinations"—fabricated case names, citations, and holdings that never existed.

The Civil Justice Council launched a consultation in February 2026 on mandatory AI-use declarations for statements of case and skeleton arguments, with responses due April 14, 2026. The Judiciary issued updated guidance on AI hallucinations and bias in October 2025. Courts have signaled potential sanctions including contempt findings, perverting the course of justice charges, and regulatory penalties against legal professionals. No specific AI vendors have been named in the decisions.

Practitioners should expect imminent procedural rules requiring disclosure of AI use in document preparation. With the consultation deadline only days away, rule changes are likely by mid-2026. For litigators the risk is acute: courts now actively scrutinize citations and will penalize reliance on AI-generated authorities. The safer practice is to verify every case citation manually against primary sources before filing, and to disclose explicitly any AI involvement in document drafting.
