The most viral story in the legal industry in 2021 was the attorney who assured a judge that he was not, in fact, a cat.

The year 2023 may have produced this definitely human attorney's most potent challenger in the form of Mata v. Avianca. Plaintiff's counsel used ChatGPT to draft a court filing; ChatGPT "hallucinated" and supplied several fictitious case citations. After attempting to cover up the use of ChatGPT, the attorney was sanctioned, and the case directly led multiple judges across the country to ban the use of generative AI in their courtrooms.

But what are hallucinations, anyway? And can anything be done about them?