Deloitte refunds Australia over AI errors in government report

SYDNEY, AUSTRALIA — Deloitte has agreed to refund part of its fee to the Australian government after errors were discovered in a report partially generated by artificial intelligence (AI), the Financial Times reported, reigniting debate over the accountability of consulting firms that increasingly rely on automated tools in their work.
The A$439,000 (US$290,000) “independent assurance review,” commissioned by Australia’s Department of Employment and Workplace Relations, aimed to assess flaws in a welfare system that automatically penalized jobseekers.
However, the report, originally released earlier this year, was found to contain several inaccuracies, including citations to non-existent academic papers.
AI involvement and errors acknowledged
Deloitte admitted that parts of the report had been generated using a large language model, specifically an “Azure OpenAI GPT-4o-based tool chain” licensed by the government department. In the corrected version, the firm clarified that the AI-assisted sections had produced errors in citations, references, and a summary of legal proceedings.
“The updates made in no way impact or affect the substantive content, findings and recommendations in the report,” Deloitte stated in the amended version.
While the consulting firm did not explicitly blame the AI tool for the inaccuracies, the Australian Financial Review reported that the document included fabricated references attributed to academics at the University of Sydney and at Lund University in Sweden.
The Department of Employment and Workplace Relations confirmed that Deloitte had agreed to repay the final installment of its contract.
“The matter has been resolved directly with the client,” Deloitte Australia said in a statement.
Industry concerns over AI ‘hallucinations’
The episode underscores growing concerns within the professional services industry about the reliability of AI-generated content, particularly the risk of so-called “hallucinations,” where AI systems produce convincing but false information.
The United Kingdom accountancy regulator had earlier warned that Big Four firms were “failing to keep track of how automated tools and AI affected the quality of their audits,” even as they expand their use for risk assessment and evidence collection.
Deloitte’s case illustrates the delicate balance between innovation and oversight as consultancies race to integrate AI to speed up analysis and reporting.
AI accountability and the outsourcing lens
The incident serves as a reminder to the global outsourcing and professional services industries that while AI can enhance efficiency, it also introduces new dimensions of accountability risk. Firms that depend on technology to streamline work must now strengthen human oversight to preserve trust and quality.
“AI can be great for driving efficiencies, but it cannot be fully trusted. The one thing more powerful, more trustworthy, more truthful, more capable than AI is—people powered by AI,” Kerry Hallard, Chief Executive Officer (CEO) of the Global Sourcing Association, said in a LinkedIn post addressing the Deloitte incident.
As outsourcing firms worldwide adopt AI to optimize cost and turnaround times, Deloitte’s experience in Australia stands as a cautionary tale: automation can accelerate delivery, but without rigorous verification, it can just as easily undermine credibility, the very foundation upon which the consulting business is built.

Independent