Using artificial intelligence safely in family law

As more family lawyers embrace the potential of an AI-assisted practice, here is how they can protect themselves

By Michael McKiernan
Oct 08, 2025

For many family lawyers, Fraser MacLean’s first encounter with artificial intelligence would have been enough to put them off incorporating the emerging technology in their own practice.


Fraser MacLean

As counsel for the claimant in Zhang v. Chen, his team at MacLean Law helped uncover fake cases in a court filing by the respondent that ChatGPT had hallucinated. The subsequent ruling by BC Supreme Court Justice David Masuhara, in which he held MacLean’s opposing counsel personally liable for costs, went viral in legal circles in Canada and worldwide. 

“Before then, I had little or no experience with AI,” MacLean says.

With that spectacularly vivid cautionary tale in mind, MacLean has spent the 18 months or so since experimenting extensively with various AI tools – including legally focused services such as Alexi and other programs with a broader reach – to increase the efficiency of his practice.

In addition to routine tasks such as proofreading and producing document summaries, MacLean has also begun integrating AI into his litigation preparation. He bounces ideas around for examining key witnesses or seeks input before appearing in front of a judge, whether at trial or in chambers.

“The other week I ran through my submissions and asked [AI] for the 10 questions a judge is likely to ask me on them. It spat out some pretty good points,” he says. “AI may not say what you want it to say, and you don’t have to use it. However, there could be pieces of it that are helpful and brilliant.”

Ontario lawyer Russell Alexander was a much earlier adopter of AI and has even authored a book on the subject, documenting the impact of the technology on every aspect of family law practice, from marketing and client intake to the drafting of pleadings and financial briefs. 

“I went all-in on AI early on. I think it’s an exciting opportunity for the legal profession,” Alexander says. 

As the technology continues to develop, it’s becoming harder for family lawyers to ignore it, he adds.  

“Everyone has a foot in the game, and all the major legal service providers have some sort of AI assistant, depending on whatever their niche market is. We’re seeing a flood of software companies claiming they can make your job more efficient,” Alexander says.

In fact, as the Nova Scotia Barristers’ Society warned in its recently released AI Guide, lawyers may even be using AI without realizing it, noting, for example, that “Microsoft Copilot is now automatically enabled in Microsoft 365, and Adobe Acrobat requires users to manually disable its ‘AI Assistant.’”

Although family lawyers are not the only ones dealing with legal AI risks, they have been prominent in bringing them to public attention. Soon after BC’s Zhang v. Chen case, Ontario’s family law practitioners got their own AI wake-up call in the divorce case of Ko v. Li, when Justice Fred Myers ordered a show-cause hearing for possible contempt by the applicant’s counsel after she cited hallucinated cases in a factum. The same judge later dismissed the hearing, acknowledging the lawyer’s immediate apology and the “public shaming” she had endured in the meantime. 

At family law boutique Crossroads Law in Vancouver, founder Marcus Sixta explains that it’s impossible to say whether family lawyers are uniquely vulnerable to the dangers of AI. 

“That would be pure speculation, but what I would say is that family law is an area that can be overwhelming for lawyers,” he says. “You’re dealing with highly emotional cases and people, which brings with it a lot of stress. You’re also dealing often with very tight timelines.”

There’s no question that AI tools have the potential to relieve some of that stress, but only when used with caution, Sixta adds. 

Fraser MacLean was named one of The Top 25 Most Influential Lawyers in Canada. See the full list of winners here.

Here are some practical steps lawyers can take to protect themselves when integrating AI into their family law practice: 

Do your research

Before exploring or investing in an AI tool for their practice, “legal professionals need to understand what it can do and what it can’t do,” says family lawyer Brett Carlson, a partner in the Calgary office of Linmac LLP. 


Brett Carlson

In the haze of hype and hysteria that has followed the rise of generative AI, Carlson says lawyers can easily be left with an inflated sense of its capabilities. Simply investigating how a tool works and understanding its limitations – such as the propensity for hallucinated results – helps dispel the mystique and reduce the risk of overreliance.

“The most important thing for lawyers to recognize is that AI is not the saviour. It can be a great collaborative tool, but it’s not the solution to everything,” Carlson adds. 

Keep it confidential

According to Sixta, security and confidentiality are major concerns for family lawyers who upload data to AI programs, particularly publicly available models that use prompts to train and improve their results.

“We don’t know exactly how the program is going to use that information in future,” he says.

While the tech company’s data use and storage policies may offer some reassurance, there are no guarantees in an industry that traditionally takes pride in its “move-fast-and-break-things” approach. 

“However, you can use them in a way that reduces the risk substantially by removing all the confidential or identifying information from anything that you input and keeping it more general,” Sixta adds. 
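That kind of scrubbing can also be partly automated before anything reaches an AI service. What follows is a minimal, hypothetical sketch in Python illustrating the principle Sixta describes; the patterns, placeholder names, and sample text are all assumptions for illustration, not any particular firm’s workflow or vendor tool.

```python
import re

# Hypothetical, minimal sketch: strip common identifiers from text
# before it is pasted into a general-purpose AI tool. Pattern-based
# redaction is a starting point, not a substitute for human review.

PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "PHONE": re.compile(r"\b\d{3}[-. ]?\d{3}[-. ]?\d{4}\b"),
    "SIN": re.compile(r"\b\d{3}[- ]?\d{3}[- ]?\d{3}\b"),  # Canadian SIN format
}

def redact(text: str, client_names: list[str]) -> str:
    """Replace known party names and common identifiers with placeholders."""
    for name in client_names:
        text = re.sub(re.escape(name), "[PARTY]", text, flags=re.IGNORECASE)
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

# Example: a draft paragraph is generalized before being used in a prompt.
draft = "Jane Smith (jane.smith@example.com, 604-555-0199) seeks sole custody."
print(redact(draft, client_names=["Jane Smith"]))
# -> "[PARTY] ([EMAIL], [PHONE]) seeks sole custody."
```

Pattern matching of this kind will inevitably miss context-dependent identifiers, such as a child’s school or an unusual job title, which is one more reason the human oversight discussed below still matters.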

Put humans above machines 

In an increasingly digital world, the old-fashioned safeguards of supervision and verification are among the most crucial for family lawyers.

In some practices, Sixta says AI tools are effectively replacing paralegals, drafting affidavits and constructing financial statements for clients. 

“It’s the lawyer’s responsibility to review the work done by a legal assistant or a paralegal before filing with the court. The same applies with AI,” he says. “You can’t just sit back and let it take over your practice.”

When his team members present memos to MacLean, he insists on having the first page of every cited decision printed out to satisfy him that it is not a fake. But that is just the first stage of a review process that ensures any AI-assisted outputs accurately reflect the relevant facts and law.

“You’ve got to double-check all of it,” he says. “Maintain rigorous professional and human oversight at every step.”

Be transparent 

According to MacLean, sunlight is the best disinfectant when it comes to AI use in legal practice. He encourages his team to be open with each other about their use of the technology.

“If you ban it, that just pushes it underground. People are going to use it anyway, but outside of the firm’s data and security protocols, which can lead to more problems,” he says.

But courts and colleagues are not the only ones who need to be informed when AI has played a role in a lawyer’s provision of legal services: the person who ultimately pays the bill should also know how their information is being used.

“Our clients haven’t had any issues,” MacLean says. “They understand that this can be a helpful tool for our efficiency. And that in turn helps with keeping things economical for them.” 


Growth of AI hallucination cases worldwide

July–December 2023: 8

January–June 2024: 12

July–December 2024: 38

January–June 2025: 151

July–September 2025: 168

Source: French lawyer and data scientist Damien Charlotin’s database of reported legal decisions involving hallucinated content produced by generative AI, as of September 23, 2025.


Reported AI hallucination cases, since July 2023 

USA – 251

Australia – 28

Israel – 28 

Canada – 27 

United Kingdom – 15 
