Cautionary Tale: The Real-World Risks of AI-Generated Legal Citations
Fake or “hallucinated” legal citations are appearing with increasing frequency, affecting everyone from major law firms to courts. Despite repeated warnings, many attorneys continue to rely on generative AI without verifying the results.
In a recent case before the Georgia Court of Appeals, Shahid v. Esaam, No. A25A0196, 2025 WL 1792657 (Ga. Ct. App. June 30, 2025), the wife challenged a final judgment and decree of divorce, which had been served by publication, on the grounds of improper service. In response, the husband submitted a brief citing 15 cases, approximately 11 of which were “hallucinated” or entirely fake. Remarkably, he even requested attorneys’ fees, relying on a fabricated case to support that claim. The trial judge accepted the husband’s argument and issued an order that relied in part on two of the hallucinated cases. On review, the Appellate Court made clear that it could not locate the case cited in support of attorneys’ fees by name, by citation, or by its purported holding, which was “a blatant misstatement of the law.” The Appellate Court ultimately vacated the order and remanded for further proceedings consistent with its opinion.
In a recent decision from the U.S. District Court for the Southern District of Florida, Versant Fund. LLC v. Teras Breakbulk Ocean Nav. Enter., LLC, No. 17-cv-81140 (S.D. Fla. 2025), two attorneys were sanctioned for submitting a fabricated, AI-generated legal citation during a supplemental proceeding. Serving as pro hac vice counsel and local counsel, respectively, the attorneys filed a response that cited a completely fictitious case with no basis in existing law. Opposing counsel identified the fake case, prompting the attorneys to withdraw it two weeks later. The pro hac vice counsel admitted to using AI for legal research without verifying the results, and local counsel admitted to filing the document without reviewing the citations. The court imposed monetary sanctions, held both attorneys jointly liable for the opposing party’s fees and costs, and required each to complete continuing legal education on AI ethics within 30 days.
These cases serve as a clear warning about the dangers of relying on AI tools without proper oversight, with the court in Shahid quoting Chief Justice John Roberts: “[A]ny use of AI requires caution and humility.” As artificial intelligence becomes more embedded in the legal profession, attorneys must ensure its use is both responsible and ethical. In Florida courts, the submission of AI-hallucinated case law can violate several key Rules of Professional Conduct, including Rule 4-3.1 (Meritorious Claims and Contentions), Rule 4-3.3 (Candor Toward the Tribunal), Rule 4-3.4 (Fairness to Opposing Party and Counsel), and Rule 4-8.4 (Misconduct). AI can assist with research, but it cannot replace an attorney’s duty of independent research and review.