Corporate AI · High Impact

Deloitte's $440,000 Report Contained AI-Fabricated Citations

Hallucination Nation Staff · February 12, 2026 · 6 min read

The $440,000 Question

When the Australian government paid Deloitte nearly half a million dollars for a 237-page report on welfare compliance, they probably expected human expertise. Instead, they got some very creative fiction.

In February 2026, a University of Sydney academic discovered something peculiar while reviewing the report: citations that led nowhere. References to academic papers that didn't exist. And the crown jewel — a completely fabricated quote allegedly from a Federal Court judgment.

What Went Wrong

According to The Guardian's investigation, Deloitte admitted they had used Azure OpenAI GPT-4o during "early drafting" to fill "traceability and documentation gaps."

Translation: they let the AI make stuff up when they didn't have actual sources.

The fictional scholarship was so convincing that it had fooled everyone who reviewed the document — until someone actually tried to look up the sources.

The Fallout

  • Deloitte issued a corrected version with more than a dozen citations stripped out
  • They provided a partial refund to the Australian government
  • The incident made international headlines
  • Regulatory bodies began asking uncomfortable questions about AI use in government contracts

Why This Matters

This isn't just about one consulting firm getting caught. According to MIT's 2025 study, 95% of enterprise AI projects fail. Companies are throwing massive budgets at AI implementations that don't deliver measurable ROI.

The pattern is everywhere: people trust AI outputs without verification. If one of the Big Four consulting firms can submit AI hallucinations to a government client, what's happening in your organization?

The Checklist That Could Have Saved Deloitte $150,000+

  1. Click every citation link — does it actually go somewhere?
  2. Search quoted text — does the quote actually exist?
  3. Verify author credentials — are these real people who wrote these things?
  4. Cross-reference with primary sources — did you actually read what you're citing?
  5. Have a human expert review — someone who knows the field, not just the AI
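Step 1 of the checklist is easy to automate. Here is a minimal sketch (using only Python's standard library; the function name and script are ours, not anything Deloitte or the reviewers used) that takes a list of citation URLs and reports which ones actually resolve:

```python
import urllib.request
import urllib.error

def check_citation_links(urls, timeout=10):
    """Map each URL to True if it resolves (HTTP status < 400), else False.

    A False result doesn't prove a citation is fabricated -- links rot --
    but every False is a source a human should chase down manually.
    """
    results = {}
    for url in urls:
        try:
            # HEAD avoids downloading the whole page; some servers reject
            # requests without a User-Agent, so we set a simple one.
            req = urllib.request.Request(
                url, method="HEAD",
                headers={"User-Agent": "citation-checker/0.1"},
            )
            with urllib.request.urlopen(req, timeout=timeout) as resp:
                results[url] = resp.status < 400
        except (urllib.error.URLError, ValueError, OSError):
            # Malformed URL, DNS failure, timeout, or HTTP error:
            # flag it for human review.
            results[url] = False
    return results
```

Running this over a reference list turns "click every citation link" from an afternoon of tedium into a minute of triage. The remaining steps (verifying quotes, authors, and primary sources) still need a human.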

Found this useful? Share it with someone who trusts AI too much.
