Federal prosecutor Rudy Renfer resigns over AI-generated errors in legal brief


Why it matters

On March 10, 2026, Assistant U.S. Attorney Rudy Renfer resigned during a hearing before U.S. Magistrate Judge Robert Numbers after admitting he used artificial intelligence to draft a brief in a healthcare benefits lawsuit. Renfer, a 30-year veteran prosecutor in the Eastern District of North Carolina, had accidentally deleted an earlier version of the brief and turned to AI to reconstruct it. The tool generated fabricated case citations and false quotations designed to support the government's position. Judge Numbers presented a slide deck documenting the inaccuracies and expressed doubt that the errors were unintentional.

The U.S. Attorney's Office for the Eastern District of North Carolina removed Renfer from the case and referred the matter to the Justice Department's Office of Professional Responsibility. U.S. Attorney Ellis Boyle subsequently issued a warning to staff about AI use. The specific details of what citations were fabricated and which cases were misrepresented remain unclear.

The incident reflects a broader pattern of AI-generated hallucinations in legal work. More than 800 similar cases have been tracked worldwide, most notably the 2023 episode in which New York lawyers relied on ChatGPT to draft a brief containing entirely fictional cases. Recent research shows AI legal tools hallucinate on between 16 and 82 percent of queries, depending on the tool and the task. Attorneys should assume that AI-drafted briefs require at least the same level of citation verification as human work, and likely more. The incident also follows a February 2026 ruling by Judge Jed Rakoff in the Southern District of New York denying attorney-client privilege to documents generated via Claude AI, a decision that exposes such communications to discovery when AI is involved in their creation.
