Legal · High Impact

635 Court Cases Now Cite AI Hallucinations

Hallucination Nation Staff · February 9, 2026 · 7 min read

The Original Case

In June 2023, attorneys Steven Schwartz and Peter LoDuca made international headlines when they submitted a legal brief citing six court cases that ChatGPT had produced for them.

There was just one problem: none of the cases existed.

Judge P. Kevin Castel described the citations as "bogus judicial decisions with bogus quotes and bogus internal citations." The attorneys were ordered to pay a $5,000 penalty and required to notify each judge falsely identified as the author of a fabricated opinion.

The Spread

By late 2025, a legal researcher had documented more than 635 court cases in which lawyers or litigants cited AI-hallucinated authorities. The problem is spreading faster than the legal profession can adapt.

Why It Keeps Happening

AI-generated legal citations are particularly dangerous because:

  1. Legal writing is formulaic — AI easily mimics the style
  2. Case names follow patterns — "[Plaintiff] v. [Defendant]" is easy to fake
  3. Citation formats are rigid — volume numbers, page numbers, court names
  4. Non-lawyers can't easily verify — legal databases aren't freely accessible
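That rigidity cuts both ways: the same formulaic structure that lets an AI fake a plausible citation also makes citations easy to parse mechanically as a first screening step. A minimal sketch — the regex is illustrative and matches only the most common reporter-citation shape, not the full Bluebook:

```python
import re

# A standard U.S. reporter citation is rigidly formatted:
#   [Plaintiff] v. [Defendant], [volume] [reporter] [page] ([court] [year])
CITATION_RE = re.compile(
    r"(?P<plaintiff>.+?) v\. (?P<defendant>.+?), "
    r"(?P<volume>\d+) (?P<reporter>[A-Za-z0-9.' ]+?) (?P<page>\d+) "
    r"\((?P<court_year>[^)]+)\)"
)

def parse_citation(text):
    """Return the citation's components as a dict, or None if the
    text doesn't match the standard pattern."""
    m = CITATION_RE.search(text)
    return m.groupdict() if m else None

# One of the six fabricated citations from the 2023 brief: it parses
# perfectly, which is exactly why format alone proves nothing.
fake = parse_citation(
    "Varghese v. China Southern Airlines Co., Ltd., 925 F.3d 1339 (11th Cir. 2019)"
)
print(fake["plaintiff"], fake["volume"], fake["reporter"])  # Varghese 925 F.3d
```

A well-formed citation tells you nothing about whether the case exists — which is the whole trap.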

The Professional Stakes

Consequences can include:

  • Court sanctions and fines
  • Malpractice claims
  • Bar disciplinary action
  • Damaged client cases
  • Career-ending reputational harm

The Rule

If an AI gives you a legal citation — verify it exists before you rely on it.
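One free way to run that check programmatically is CourtListener's citation-lookup API. This is a hedged sketch, not a definitive client: the endpoint URL and response shape are assumptions based on the public v3 REST API, so confirm them against CourtListener's documentation before relying on it.

```python
import json
import urllib.parse
import urllib.request

# Assumption: CourtListener's free citation-lookup endpoint, which accepts
# POSTed text and reports which citations resolve to real cases.
LOOKUP_URL = "https://www.courtlistener.com/api/rest/v3/citation-lookup/"

def make_lookup_request(text):
    """Build the POST request for a block of text containing citations."""
    data = urllib.parse.urlencode({"text": text}).encode()
    return urllib.request.Request(LOOKUP_URL, data=data)

def check_citations(text):
    """Send the request and return the parsed JSON response. Citations
    that resolve to nothing are candidates for hallucinations."""
    with urllib.request.urlopen(make_lookup_request(text)) as resp:
        return json.load(resp)

# Example (makes a live network call, so not run here):
# results = check_citations(
#     "Varghese v. China Southern Airlines Co., Ltd., 925 F.3d 1339 (11th Cir. 2019)"
# )
```

Treat a failed lookup as a red flag to investigate, not conclusive proof either way — and a successful lookup still doesn't mean the case says what the AI claims it says.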

Found this useful? Share it with someone who trusts AI too much.
