B.C. lawyer accused of citing fake legal cases “hallucinated” by AI filed to B.C. Supreme Court


    A B.C. lawyer is accused of using artificial intelligence (AI) to prepare legal briefs in a civil case before the B.C. Supreme Court, citing fake case law fabricated by ChatGPT, an AI-powered chatbot developed by U.S. AI firm OpenAI. 

    Lawyer Chong Ke with Vancouver firm Westside Family Law allegedly used ChatGPT to prepare legal briefs she filed with the B.C. Supreme Court in support of a client’s application to take his children to China for a visit. The briefs cited cases that do not actually exist: fake case law fabricated by the chatbot, which Ke then submitted to the court.

    The fake case law was discovered by Vancouver lawyers Lorne and Fraser MacLean of the firm MacLean Family Law, the opposing lawyers in the civil case.

    Fact checking AI

    “The impact of the case is chilling for the legal community,” Lorne MacLean, K.C., told Global News. “If we don’t fact check AI materials and they are inaccurate it can lead to an existential threat for the legal system: people waste money, courts waste resources and tax dollars, and there is a risk that the judgments will be erroneous, so it’s a huge deal.”

    Global News reported that Ke apologized to the court, stating that she was unaware that AI chatbots like ChatGPT can be unreliable, and admitted that she did not check to see if the cases actually existed.

    The MacLeans told Global News they intend to ask the court to award special costs over the AI issue. However, Lorne MacLean said he’s worried this case could be just the tip of the iceberg.

    “One of the scary things is, have any false cases already slipped through the Canadian justice system and we don’t even know?”

    ChatGPT

    While AI chatbots like ChatGPT can be helpful tools for gathering and processing information, they are known to sometimes make up realistic-sounding but incorrect information. This phenomenon, known as “artificial hallucination,” results in the AI inserting plausible-sounding falsehoods into its results as it attempts to fill in gaps in its data. Research has suggested that modern AI-powered chatbots generate factual errors in as many as 27% to 46% of responses.

    While this is the first known case of AI-generated fake case law being submitted to a B.C. court, the issue has already come up in the U.S.

    In May 2023, a New York judge fined lawyers who had submitted a legal brief with imaginary cases dreamed up by ChatGPT. Much like Ke, those lawyers maintained it was a good-faith error caused by not understanding the limitations of AI-powered chatbots.

    “It sent shockwaves in the U.S. when it first came out in the summer of 2023 … shockwaves in the United Kingdom, and now it’s going to send shockwaves across Canada,” Lorne MacLean told Global News. “It erodes confidence in the merits of a judgment or the accuracy of a judgment if it’s been based on false cases.”

    AI should have lawyers on high alert

    Legal experts say the arrival of AI technology in Canada should have lawyers on high alert.

    The Law Society of BC warned lawyers about the use of AI and provided guidance in a practice resource document published in October 2023, advising them to closely review any work produced by an AI-powered chatbot.

    “The adoption of generative AI tools like ChatGPT-4 presents lawyers with exciting opportunities for efficiency and productivity,” the guidelines concluded. “However, it also introduces ethical challenges that demand vigilance and adherence to professional responsibilities. By understanding and addressing these challenges, lawyers can harness the power of generative AI in a responsible and ethical manner while upholding the integrity of the legal profession. Staying informed, maintaining transparency, and putting clients’ interests first are key to successfully navigating the ethical landscape of generative AI in the practice of law.”

    Interestingly, the Law Society of BC’s practice resource was, in part, generated by ChatGPT.

    B.C. Supreme Court Chief Justice Christopher E. Hinkson issued a directive last March telling judges not to use AI. Canada’s Federal Court followed suit in December.

    Do you need assistance with a legal matter? Clark Woods LLP is a full service law firm serving all of British Columbia. Talk to one of our lawyers today!