
Do You See What I See? Fake AI Cases Can Result in Real Contempt

May 9, 2025 Litigation & Dispute Resolution Bulletin 4 minute read

The increased use of artificial intelligence (“AI”) in the legal profession has been a significant innovation. AI tools can deliver efficiency and create value for clients. At the same time, AI has brought new pitfalls, which, if ignored, may result in lawyers finding themselves in contempt of court. The Ontario Superior Court of Justice released a decision by Justice Myers[1] this week discussing the risks of AI tools generating fabricated cases[2] and the professional obligations lawyers have in making accurate representations to the court. The decision serves as a strong reminder to counsel that their professional obligations include verifying the legal authorities they rely on, and that the innovations offered by AI cannot be relied on to replace counsel’s own legal analysis. Reliance on AI-hallucinated cases in submissions to the court can constitute a breach of professional obligations and may amount to contempt in the face of the court.

The Ko v. Li Decision

With the advent of AI and the potential for AI hallucinations, Justice Myers took the opportunity to “quickly and firmly make clear that, regardless of technology, lawyers cannot rely on non-existent authorities or cases that say the opposite of what is submitted.”[3]

The motion before Justice Myers dealt with two estates applications and one family law application.[4] However, Justice Myers’ endorsement first addressed the “serious issue” that the applicant’s submissions appeared to contain AI hallucinations.[5]

The AI hallucinations in the factum included:

  • a citation hyperlink that directed to the wrong case;[6]
  • citations to cases that had nothing to do with the issue being argued;[7]
  • a broken hyperlink that led to a “404 Error” page on CanLII;[8] and
  • a citation to a case that reached a conclusion opposite to the one purported in the submissions.[9]

The cases cited in the factum and used during oral submissions could not be found on CanLII, Westlaw, Quicklaw or Google.[10] Counsel was also unable to provide copies of the cases from the printed materials relied on while making her submissions.[11]

Based on these deficiencies, Justice Myers noted that AI appeared to have generated the factum and counsel “might not have checked to make sure the cases were real or supported the propositions of law which she submitted to the court in writing and then again orally.”[12]

Justice Myers added that “[i]t should go without saying that it is the lawyer’s duty to read cases before submitting them to a court as precedential authorities. At its barest minimum, it is the lawyer’s duty not to submit case authorities that do not exist or that stand for the opposite of the lawyer’s submission…It is the litigation lawyer’s most fundamental duty not to mislead the court.”[13]

While Justice Myers found there may have been a grave breach of duties and contempt of court, there has been no final finding in this case. The lawyer is presumed innocent and will have an opportunity to submit evidence on the issue.[14]

Best Practices

Lawyers using AI in their practice should consult sources available to the profession, such as the LSO’s Technology Resource Centre, which maintains up-to-date resources.[15]

Regardless of AI use, lawyers must verify all citations they rely on. The Ontario Rules of Civil Procedure require lawyers to certify the authenticity of the authorities relied upon in a factum:[16]

A factum shall include a statement signed by the party’s lawyer, or on the lawyer’s behalf by someone the lawyer has specifically authorized, certifying that the person signing the statement is satisfied as to the authenticity of every authority cited in the factum.

The Federal Court has also published a notice to parties and the profession on the use of AI in court proceedings, a key element of which is disclosure of AI use to the court.[17]

When asked whether AI can hallucinate cases, ChatGPT confirmed that AI can generate “information—such as a court case citation, summary, or legal principle—that appears real but is entirely fabricated or inaccurate.”[18] ChatGPT further explained that common AI hallucinations in law include generating fake citations, distorting facts or rulings, or providing incorrect jurisdictional links. ChatGPT advises users to:

  • always verify legal information from trusted databases or court websites;
  • ask the AI tool to cite a source or confirm whether a case is real; and
  • use AI tools designed specifically for legal research that are integrated with verified legal content.

Notwithstanding ChatGPT’s suggestion that AI tools be asked to confirm authenticity, the ultimate responsibility for verification rests with counsel.

Takeaways

Courts and tribunals have previously grappled with the “fake cases” issue and emphasized its seriousness.[19] This recent Ontario decision goes a step further and cautions lawyers that reliance on a fake case may result in a finding of contempt against the lawyer.[20]

The Ontario decision also serves as a reminder of lawyers’ duties to the court, to their clients, and to the administration of justice:

  • to faithfully represent the law to the court;
  • not to fabricate case precedent and not to mis-cite cases for propositions that they do not support;
  • to use technology, conduct legal research and prepare court documents competently;
  • to supervise staff and review material prepared for counsel’s signature; and
  • to ensure human review of materials prepared by non-human technology such as generative AI.[21]

[1] Ko v Li, 2025 ONSC 2766 [“Ko”].
[2] The Law Society of Ontario’s white paper on Licensee use of generative artificial intelligence refers to hallucinations and notes that generative AI tools can provide responses that include information that is fabricated or otherwise inaccurate but which appears authentic (page 6).
[3] Ko at paras 26-27.
[4] Ibid at para 34.
[5] Ibid at para 1.
[6] Ibid at para 5.
[7] Ibid at paras 5 and 13.
[8] Ibid at para 6.
[9] Ibid at para 11.
[10] Ibid at paras 7 and 25.
[11] Ibid at para 8.
[12] Ibid at para 14.
[13] Ibid at paras 21-22.
[14] Ibid at para 31.
[15] Law Society of Ontario, “Technology Resource Centre: Using Technology”.
[16] R.R.O. 1990, Reg. 194, r 4.06.1(2.1).
[17] Federal Court, “Notice to the Parties and the Profession” (May 7, 2024).
[18] ChatGPT’s response to inquiry “Can artificial intelligence hallucinate case law?” (May 8, 2025).
[19] For example, Zhang v Chen, 2024 BCSC 285: “Citing fake cases in court filings and other materials handed up to the court is an abuse of process and is tantamount to making a false statement to the court. Unchecked, it can lead to a miscarriage of justice.” (para 29); see also decisions of the Trademarks Opposition Board: Industria de Diseño Textil, SA v Sara Ghassai, 2024 TMOB 150 at para 6; Monster Energy Company v Pacific Smoke International Inc., 2024 TMOB 211 at para 16.
[20] Ko at para 31.
[21] Ibid at paras 15-20.

by Guneev Bhinder, Samantha Gordon, Andrea Arbuthnot (Articling Student)

A Cautionary Note

The foregoing provides only an overview and does not constitute legal advice. Readers are cautioned against making any decisions based on this material alone. Rather, specific legal advice should be obtained.

© McMillan LLP 2025
