When lawyers lean on AI, fake cases could lead to a 'miscarriage of justice,' experts say



Artificial intelligence can produce false information that can make its way into court submissions

Muriel Draaisma · CBC News

· Posted: Jun 03, 2025 5:00 AM EDT


Justice Joseph F. Kenkel, a judge with the Ontario Court of Justice, recently ordered a criminal defence lawyer to refile his defence submissions, finding "serious problems" in them. Leaning on AI tools without verifying whether their output is accurate could lead to serious problems for lawyers and the public, experts say. (Christopher Katsarov/The Canadian Press)

Legal experts say an Ontario judge's disapproval of a lawyer who seemingly leaned on artificial intelligence to prepare court materials is putting the spotlight on the dangers of AI tools that can produce false or fictitious information.

That, in turn, can have real-life consequences, they say.

Fake cases, known as AI hallucinations, can make their way into legal submissions if a lawyer doesn't take further steps to make sure the cases actually exist, says Amy Salyzyn, an associate professor at the University of Ottawa's faculty of law.

Lawyers routinely suggest what past decisions, or case law, a court should apply in their clients' cases. Judges then determine which cases to consider.

The problem arises when lawyers use generative AI tools that can produce made-up information, Salyzyn says. A judge making a decision could therefore be presented with incorrect or false information.

"You don't privation a tribunal making a determination astir someone's rights, someone's liberty, someone's money, based connected thing wholly made-up," Salyzyn told CBC Radio's Metro Morning connected Friday.

"There's a large interest that if 1 of these cases did perchance sneak through. You could person a miscarriage of justice."


Amy Salyzyn, an associate professor at the University of Ottawa's faculty of law, says: 'You don't want a court making a decision about someone's rights, someone's liberty, someone's money, based on something entirely made up.' (Shabana Buwalda)

Her comments come after Justice Joseph F. Kenkel, a judge with the Ontario Court of Justice, ordered criminal defence lawyer Arvin Ross on May 26 to refile his defence submissions for an aggravated assault case, finding "serious problems" in them.

Kenkel said one case cited appeared to be fictitious, while several case citations referred to unrelated civil cases. Still other citations led to a case name that was not the authority for the point being made.

"The errors are galore and substantial," Kenkel said.

Kenkel ordered Ross to prepare a "new set of defence submissions" ensuring that: the paragraphs and pages are numbered; case citations include a "pinpoint cite" to the paragraph that explains the point being made; and case citations are checked and include links to CanLII, a non-profit organization that provides online access to legal decisions, or to other sites, to ensure they are accurate.

"Generative AI oregon commercialized ineligible bundle that uses GenAI indispensable not beryllium utilized for ineligible probe for these submissions," Kenkel said.  

CBC Toronto contacted Ross, but he declined the request for an interview, saying in a statement that he's "focused on complying with the court's directions."


The Pillars of Justice sculpture, by Edwina Sandys, is pictured in the McMurtry Gardens of Justice, outside the Ontario Court of Justice in Toronto, on June 13, 2022. (Esteban Cuevas/CBC)

French lawyer tracking cases with AI hallucinations

The case, known as R. v. Chand, is the second Canadian case to have been included on an international list, compiled by French lawyer Damien Charlotin, of legal decisions in "cases where generative AI produced hallucinated content." In many of the cases, the lawyers on the database used fake citations. The database identifies 137 cases so far.

In the list's first Canadian case, Zhang v. Chen, B.C. Justice D. M. Masuhara reprimanded lawyer Chong Ke on Feb. 23, 2024, for inserting two fake cases into a notice of application that were later discovered to have been created by ChatGPT. The judge, who described the errors as "alarming," ordered Ke to pay court costs but not special costs.

"As this lawsuit has unluckily made clear, generative AI is inactive nary substitute for the nonrecreational expertise that the justness strategy requires of lawyers," Masuhara wrote successful a ruling connected costs. "Competence successful the enactment and usage of immoderate exertion tools, including those powered by AI, is critical. The integrity of the justness strategy requires nary less." 

Salyzyn said the phenomenon of lawyers filing court materials that cite non-existent cases is a global one, and it's arising because AI tools such as ChatGPT are not information retrieval devices but tools that match patterns in language. The result can be inaccurate information that looks "quite real" but is in fact fabricated.

AI tools "can enactment things unneurotic that look similar ineligible cases. Sometimes they mightiness notation existent ineligible cases too, if it appears a batch successful the information that it has consumed. But fundamentally, the instrumentality is benignant of predicting the adjacent words to spell together, and sometimes it predicts and mixes unneurotic citations that look rather existent but don't accord with thing successful reality," she told Metro Morning.

Verification is key, law professor says

Salyzyn said lawyers are responsible to clients and the courts for the work they produce, but if they are going to rely on technology, they need to make sure that made-up information is not being passed along. Verification is key, she said.

"If lawyers are utilizing exertion to assistance their practices, they request to inactive verify what that exertion is producing," she said. 


An empty courtroom at Toronto's Old City Hall is shown here. (David Donnelly/CBC)

Nadir Sachak, a criminal defence lawyer with Sachak Law in Toronto, said AI is a resource that can be used by lawyers, but the lawyers are still ultimately responsible for what they submit to court.

"You amended marque definite that, if you're relying upon exertion similar AI, that it's done properly," Sachak said.

He said in the case, R. v. Chand, the judge had no issue with the quality of the defence presented, but it appears the lawyer involved had not reviewed the argument presented to court.

The use of AI also poses questions about how lawyers bill clients, Sachak said.

"Obviously, if 1 is acting ethically, 1 cannot simply measure a lawsuit for hours of enactment that the lawyer did not do, if the AI generated the material, let's say, successful 5 minutes," helium said. "One inactive has to marque definite that immoderate is presented is professional, done properly, diligently, and accurately."

In an email on Monday, the Law Society of Ontario said it cannot share information on any investigations it has undertaken, but said it has produced a white paper that provides an overview of generative AI, as well as guidance and considerations for lawyers on how its professional conduct rules apply to the delivery of legal services empowered by generative AI.

With files from Mercedes Gaztambide and Metro Morning
