What Ibach v. Stewart Means for Every Solo and Small-Firm Attorney Using AI in 2026
By Kim Xi Harris | Founder & CEO, Lex Arca Legal Vault | calculator.lex-arca.com
On April 24, 2026, the Alabama Supreme Court dismissed an appeal and ordered a solo practitioner to pay $17,200 in sanctions after his briefs cited dozens of cases that do not exist, all fabricated by AI. When he promised the court the error would not recur, the apology itself appears to have been drafted by AI: the next two citations in that same footnote were also fake. The court called the misconduct “particularly egregious.” His clients lost their appeal.
What Exactly Happened in Ibach v. Stewart?
W. Perry Hall is a solo practitioner in Mobile, Alabama. He was retained by Laurie Ibach and Mark Campbell — a niece and nephew — to appeal a summary judgment ruling against them in a family trust dispute. The Mobile Circuit Court had sided with their uncle, Bruce Stewart, the trustee of two living trusts. Hall took the appeal to the Alabama Supreme Court.
What he filed was not a brief. It was a catalog of fiction.
The court found Hall’s submissions contained “numerous invalid, inaccurate, and/or irrelevant citations to legal authorities.” Cases that don’t exist. Quotations from opinions that were never written. References to an out-of-state ruling that cannot be found in any database. The Alabama Supreme Court described it plainly: “They appear to be artificial-intelligence (‘AI’) ‘hallucinations,’ i.e., fake authorities created by an AI system.”
Then came the detail that even seasoned legal observers found hard to process.
In a reply brief, Hall included a footnote apologizing for the hallucinated citations and promising the court “the mistake will not recur.” In the very next sentence of that same footnote, he cited two more cases. Neither exists. Justice Greg Cook wrote in a special concurrence: “It is simply hard to imagine how this could occur absent, perhaps, using AI to craft the apology for having used AI.”
What Did the Court Actually Do About It?
The Alabama Supreme Court did not merely sanction Hall. It dismissed the appeal entirely. His clients — who had a legitimate family trust dispute — lost their right to be heard because their attorney’s filings were so compromised the court called the appeal “frivolous.”
The full package of consequences:
- $17,200 in attorneys’ fees and costs — paid to opposing counsel.
- Referral to the Alabama State Bar for professional discipline.
- Prohibition from filing anything before the Alabama Supreme Court without a co-signing attorney in good standing.
- The clients’ appeal — dismissed.
Associate Justice Chris McCool wrote for the majority: “This Court takes Hall’s misconduct very seriously… the improper use of AI in the plaintiffs’ briefs was widespread and particularly egregious.”
Justice Cook’s special concurrence added a note for the broader bench: while Hall’s conduct warranted extreme measures, dismissal of a client’s appeal is “a particularly strong sanction that should be used sparingly.” The implication was clear — this case was extraordinary enough to justify it.
Is This an Isolated Incident — or a Pattern?
It is a pattern. And it is accelerating.
Mata v. Avianca (S.D.N.Y., 2023) put AI hallucination sanctions on the national radar. Two attorneys were fined $5,000 for submitting ChatGPT-fabricated citations. At the time, courts treated it as a novel warning shot.
That window closed. As of 2026, there are more than 300 standing court orders governing AI use in legal filings. In the past year alone, documented sanctions have included: a Florida attorney ordered to pay $86,000; three attorneys at a 350-person national firm disqualified from a case and referred to state bars; and a DOJ attorney terminated in March 2026 after a pro se plaintiff caught fabricated citations in a federal brief.
Three weeks before Ibach v. Stewart, Judge Anna Manasco in the Northern District of Alabama issued a separate opinion in Rivera v. Triad Properties — sanctioning attorney Joshua Watkins and disqualifying his firm after repeated AI-hallucinated submissions. Her ruling was ordered published in the Federal Supplement.
Alabama has now been the venue for two landmark AI sanctions rulings in the same month. That is not a coincidence. It is a signal.
Why Does This Keep Happening to Solo Attorneys?
Because the tools most solo practitioners are using were not built for legal citation work. They were built for general language generation. ChatGPT, Copilot, and consumer-grade AI assistants have no mechanism to distinguish between a real case and a plausible-sounding fabrication. They generate text that looks authoritative because that is what they are trained to do.
The Alabama Supreme Court identified the root problem precisely: “The problem of fake citations in court filings is the result of attorneys failing to properly research and verify the results of AI-generated citations — in short, attorney negligence in checking his or her work.”
But there is a structural issue underneath the negligence: most AI tools operate on data the attorney does not control, cannot audit, and cannot verify. The model is retrieving from — or confabulating based on — a black box the attorney never sees.
ABA Formal Opinion 512, issued in 2024, makes the attorney’s personal verification obligation explicit and enforceable under Model Rules 1.1, 1.4, and 1.5. Seventy-five percent of U.S. attorneys are currently using AI. Only twenty-five percent have received formal AI ethics training. Forty-four percent of law firms have no formal AI governance policy in place.
Solo practitioners are the most exposed segment of the profession — high AI adoption, lowest institutional oversight, and no second attorney in the room to catch what the model invented.
What Does a Compliant AI Workflow Actually Look Like?
Compliance starts with the architecture of the tool itself — not just the attorney’s review habits.
The hallucination problem is not primarily a human error problem. Hall checked his work — or thought he did. He wrote an apology. He promised the error would not recur. It recurred. The problem is that he was working with a tool that had no grounding in verified case materials. It was generating plausible text, not retrieving confirmed law.
A litigation AI that works correctly operates on the attorney’s actual case file — the documents, transcripts, motions, and evidence the attorney has already assembled and verified. When the AI retrieves a citation, it is pulling from material the attorney uploaded. Not from a general language model’s probabilistic reconstruction of what case law might say.
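For readers who want to see the principle rather than take it on faith, here is a minimal Python sketch. It is illustrative only, not our production code: the class names, the citation regex, and the substring matching are simplifying assumptions. The point is that a citation in an AI draft counts as grounded only if it can be matched against documents the attorney actually uploaded, and everything else gets flagged for human verification before filing.

```python
# Illustrative sketch of document-grounded citation checking.
# Names, patterns, and matching logic are hypothetical simplifications.
import re
from dataclasses import dataclass


@dataclass
class CaseDocument:
    """A document the attorney has uploaded and personally verified."""
    title: str
    text: str


# Crude pattern for reporter-style citations, e.g. "123 So. 3d 456".
CITATION_PATTERN = re.compile(r"\b\d{1,4}\s+[A-Z][\w.\s]{1,20}?\s+\d{1,4}\b")


def extract_citations(draft: str) -> list[str]:
    """Pull citation-shaped strings out of an AI-generated draft."""
    return [m.group(0).strip() for m in CITATION_PATTERN.finditer(draft)]


def verify_against_case_file(draft: str, case_file: list[CaseDocument]) -> dict[str, bool]:
    """Treat a citation as grounded only if it appears verbatim in the
    attorney's own uploaded materials; flag everything else for review."""
    corpus = " ".join(doc.text for doc in case_file)
    return {cite: cite in corpus for cite in extract_citations(draft)}


if __name__ == "__main__":
    case_file = [
        CaseDocument("Deposition excerpt",
                     "The trustee relied on Smith v. Jones, 123 So. 3d 456 (Ala. 2010)."),
    ]
    draft = "Binding precedent includes Smith v. Jones, 123 So. 3d 456, and Doe v. Roe, 999 F.4th 1."
    for cite, grounded in verify_against_case_file(draft, case_file).items():
        print(f"{cite!r}: {'grounded in case file' if grounded else 'FLAG: verify before filing'}")
```

A real system would resolve citations against a verified reporter database rather than a substring match, but the design choice is the same either way: the model’s output is never its own authority.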
Lex Arca Legal Vault’s Neural Strategist is built on this principle. Analysis is grounded in the attorney’s own case documents, and the platform maintains a cryptographically timestamped audit trail of AI activity tied to actual file-access events, giving attorneys a verifiable record of what the system accessed and when. That record exists whether it is ever needed or not.
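The audit-trail idea can be sketched just as briefly. Again, this is an illustration rather than our actual record format, and the field names are assumptions: each file-access event carries a UTC timestamp and a hash that chains to the previous entry, so any later alteration of the history breaks the chain and is detectable.

```python
# Illustrative sketch of a tamper-evident, timestamped AI activity log.
# Field names and record structure are hypothetical.
import hashlib
import json
from datetime import datetime, timezone


def append_event(log: list[dict], actor: str, action: str, document_id: str) -> dict:
    """Append an event whose hash chains to the previous entry."""
    prev_hash = log[-1]["entry_hash"] if log else "0" * 64
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "actor": actor,              # e.g. the AI assistant or the attorney
        "action": action,            # e.g. "retrieved", "cited", "summarized"
        "document_id": document_id,  # the file in the attorney's case vault
        "prev_hash": prev_hash,
    }
    entry["entry_hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    log.append(entry)
    return entry


def verify_chain(log: list[dict]) -> bool:
    """Recompute every hash; a single altered field breaks the chain."""
    prev = "0" * 64
    for entry in log:
        body = {k: v for k, v in entry.items() if k != "entry_hash"}
        if entry["prev_hash"] != prev:
            return False
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != entry["entry_hash"]:
            return False
        prev = entry["entry_hash"]
    return True


if __name__ == "__main__":
    log: list[dict] = []
    append_event(log, "ai-assistant", "retrieved", "deposition_2025-03-12.pdf")
    append_event(log, "ai-assistant", "cited", "trust_agreement_1998.pdf")
    print("chain intact:", verify_chain(log))
```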
For attorneys under Florida Administrative Order 26-04, Texas personal certification requirements, or ABA Formal Opinion 512 obligations, that documented activity trail is not a nice-to-have. It is the compliance infrastructure the obligation assumes you have. Learn more about Lex Arca’s litigation intelligence platform for solo firms and how the ABA Opinion 512 compliance workflow was built into the platform from day one.
Key Takeaways
- On April 24, 2026, the Alabama Supreme Court dismissed the appeal in Ibach v. Stewart and ordered solo practitioner W. Perry Hall to pay $17,200 in sanctions after his briefs contained AI-hallucinated citations — including two more fabricated cases in the very footnote where he apologized for the hallucinations.
- This is not an isolated failure — it is the latest ruling in a documented pattern of escalating AI sanctions affecting solo and small-firm attorneys who use general-purpose AI tools without verified grounding in their actual case materials.
- ABA Formal Opinion 512 and jurisdiction-specific orders in Florida, Texas, and beyond make personal attorney verification of every AI-generated output a professional obligation enforceable under Model Rules 1.1, 1.4, and 1.5.
- Lex Arca Legal Vault provides a documented, verifiable AI activity trail designed to support attorney compliance workflows — grounding AI analysis in the attorney’s own verified case documents rather than general language model output.
- Calculate your firm’s billing leakage and get early access at calculator.lex-arca.com.
About the Author
Kim Xi Harris is the Founder and CEO of Lex Arca Legal Vault, an AI-native litigation intelligence and compliance platform for solo and small-firm attorneys. She is a Cornell Women’s Entrepreneur Program graduate, SBA Women in Business Champion Award recipient, WOSB certified, and holds five Google AI certifications. Calculate your firm’s billing leakage and join the VIP waitlist at calculator.lex-arca.com — or reach us at legalvault@lex-arca.com.
Case citation: Ibach v. Stewart, No. SC-2025-0106 (Ala. Apr. 24, 2026).