The decision may mark the first time a Canadian court has ruled on liability for contractors' AI use
The Alberta Court of Appeal told a Calgary lawyer who bans his firm from using generative AI that he “bears ultimate responsibility” for submitting a factum with hallucinated case law, even though the factum had been prepared by a third-party contractor who may have lied to the lawyer about not using a large language model tool.
The lawyer, Christopher Souster of Nimmons Law Office, told the appellate court that the contractor he hired to draft the factum also submitted his work late, leaving the firm limited time to properly verify each cited case before a filing deadline.
The court agreed to give Souster time to file an amended factum. But in a decision last week that partially favoured Souster’s client on unrelated issues, a unanimous appellate panel clarified that the lawyer – and any others who might find themselves in his position – bears the blame in such a scenario.
“The time needed to verify and cross-reference cited case authorities generated by a large language model must be planned for as part of a lawyer’s practice management responsibilities, especially during busy times and recognizing that exigencies may arise,” the panel said.
“Further, if a lawyer engages another individual to write and prepare material to be filed with the court, the lawyer whose name appears on the filed document bears ultimate responsibility for the material’s form and contents,” the panel added.
The appellate court’s decision is the latest contribution to a growing body of case law on AI hallucinations in court filings. Last year, the Supreme Court of British Columbia issued a landmark decision in Zhang v. Chen, the first known case involving AI-hallucinated case law in Canada; since then, the Federal Court of Canada and the Ontario Superior Court of Justice have also weighed in on the topic.
Amy Salyzyn, an associate professor at the University of Ottawa’s Faculty of Law who has written about AI and legal ethics, says the Alberta case, Reddy v. Saroya, marks the first time she’s seen a Canadian case that touches on third-party contractors’ use of generative AI to prepare court filings.
The case could set a precedent, she suggests.
“One thing I wonder about is whether or not going forward, lawyers who do use third-party services are now going to want to have contractual clauses that discuss AI use, and what the hiring lawyer believes is appropriate and what types of safeguards you might have in place,” Salyzyn says. She adds that a contract could also address missed deadlines.
However, Salyzyn says that “at the end of the day, the responsibility lies with the lawyer filing that factum, and they need to make sure they have whatever procedures or time buffers in place to make sure that they can be confident in what they’re filing with the court.”
Fraser MacLean, a lawyer at MacLean Family Law, agrees. MacLean represented one of the parties in Zhang v. Chen and was part of the team that discovered opposing counsel had cited hallucinated case law.
“No matter how good someone else may be who is preparing work on your behalf… it is the responsible lawyer’s duty to make sure that you put good work forward and have checked that work to ensure it is accurate,” MacLean says. This is the case “whether that work is prepared by a contractor, a paralegal, or another lawyer at your office.”
He adds that although Canada is a small common law and civil law jurisdiction, it is unfortunately “punching far above its legal weight class on legal AI negligence.”
The Court of Appeal’s decision
The dispute in Reddy v. Saroya dates back to 2012, when one of the parties sued the other, alleging breach of contract, fraudulent misrepresentation, and conspiracy in relation to their real estate business dealings. Souster, who inherited the file, submitted a factum to the Alberta Court of Appeal in December. A third-party contractor prepared that factum.
On Thursday, Souster told Canadian Lawyer he did not receive the contractor’s draft until the morning the factum was due, during an already busy period. The draft was twice the length permitted by the court and omitted key pieces of information, which took the firm hours to correct and left little time to review citations before the filing deadline. Souster, who was then unfamiliar with the concept of AI hallucinations, says it did not occur to him that the cited cases could be fake.
Souster says he had prohibited lawyers and staff at his firm from using generative AI tools since November 2024, based on guidance from the Law Society of Alberta that left it to law firms to decide whether or not they wanted to use the technology.
The motivation behind his policy was twofold. On the one hand, he wasn’t sure what would happen to solicitor-client information if it were plugged into a large language model. Souster says he also “didn’t understand AI well enough at the time that I was capable of implementing a policy that would protect the clients and the members of my firm.”
According to the Alberta Court of Appeal’s decision, Souster reached out to the contractor when opposing counsel flagged several cases cited in their factum, suggesting they did not exist. The contractor assured Souster that he did not use a large language model tool to prepare the factum. Souster later concluded that the contractor may have lied.
However, opposing counsel alleged abuse of process and asked for enhanced costs against Souster’s client, telling the court that the case was an opportunity for the appellate court to “emphasize to the legal profession that court submissions are a serious matter that require effort and careful attention.”
Ivan Ioudine, an associate at Hajduk LLP and one of the lawyers who asked for costs against Souster’s client, says he was shocked when he discovered the hallucinations.
“I didn’t expect somebody to use AI for submissions to the Court of Appeal, and I really didn’t expect that somebody wouldn’t review the case law to ensure that the case law was real,” he says.
“When the other lawyer just uses a program to generate his submissions, it’s a waste of everybody’s time,” he adds. “It’s a waste of a lot of money. It’s a waste of court time.
“I think that the courts need to make it very clear to the legal profession that this is unacceptable. The courts are overburdened as it is.”
In its decision, the Alberta Court of Appeal noted that the Law Society of Alberta and the Alberta courts offer several sets of AI guidelines for lawyers in the province, including an October 2023 notice issued by the latter on large language models.
While the consequences of not adhering to the October notice are “within the discretion of the panel or the individual judge involved with the matter,” the appellate court said that “counsel and self-represented litigants should not expect leniency where they have failed to adhere to clear and unambiguous requirements.”
The appellate court added that remedies could include striking submissions, cost awards, contempt proceedings, or a referral to the Law Society of Alberta.
Souster told Canadian Lawyer he hopes other lawyers can benefit from his experience and avoid making the same mistakes. “I have learned from this experience and continue to educate myself on AI to better protect myself and my firm and have altered my practice as a result,” he says. “As AI is used more commonly and is becoming a prime tool for the legal community, additional obligations arise that are particularly challenging for smaller firms [that] do not have the technology or resources to readily detect the use of AI.”
The lawyer adds that he has made several changes to his practice, such as including in his written submissions to the court an assurance that no AI platform was used to prepare the filing. If he does use AI-assisted research, he says, it will be independently verified and reviewed by a lawyer.
“I will also include protective language in any agreement with a contract lawyer, legal researcher, or employee, not as a replacement of my own due diligence, but in addition to it,” he adds.
Salyzyn applauds the Alberta Court of Appeal for tackling the issue of liability regarding third-party contractors’ AI use, especially in a written decision.
Such issues don’t “always arise in a written decision,” she says. “I think we’re at the point here where we need to make sure that all hands are on deck and get the word out about this problem that’s certainly not going away.”