Expert Witness Report in FCA Case Afflicted with AI Hallucinations

By Eric Fader on July 31, 2025
The epidemic of out-of-control generative artificial intelligence in litigation filings has metastasized to a False Claims Act (FCA) lawsuit against a group of Utah anesthesiologists. On July 25, Mountain West Anesthesia, LLC and individual defendants in the case moved to bar the testimony of a medical billing expert whose report was riddled with AI-generated errors, including fabricated testimony from a government representative, fake citations to Medicare manuals, and nonexistent industry publications.

The suit alleged that Mountain West Anesthesia improperly billed Medicare and Medicaid for continuous anesthesia services supposedly rendered by physicians who actually were using their smartphones at the time. The relator (whistleblower) in the case, a vascular surgeon, said he personally witnessed this activity. Thomas J. Dawson III, an attorney whom the relator designated as an expert, admitted in a deposition that he had used ChatGPT to help him write the report.

In its motion, Mountain West said, “Courts have increasingly chastised experts and attorneys for blindly relying on AI to generate unreliable reports and other documents, and yet Mr. Dawson’s report stands out, even among those cases, for the remarkable number and extraordinary nature of the fabrications and errors.”

  • Posted in:
    Health Care
  • Blog:
    Rivkin Rounds
  • Organization:
    Rivkin Radler

Copyright © 2026, LexBlog. All Rights Reserved.