
A Word of Caution: Company Liable for Misrepresentations Made by Chatbot

March 4, 2024 Litigation & Dispute Resolution Bulletin 3 minute read

In a recent decision of the British Columbia Civil Resolution Tribunal (the “Tribunal”), the Tribunal found Air Canada liable for certain representations made to a customer by a chatbot on Air Canada’s webpage.

The chatbot provided the customer with incorrect information relating to Air Canada’s bereavement travel policy. As a result, the customer brought a claim against Air Canada for negligent misrepresentation. In its defence, Air Canada argued that it could not be held liable for information provided by its chatbot as the chatbot was a separate legal entity responsible for its own actions. The Tribunal rejected this argument, finding in favour of the customer.

Facts

On November 11, 2022, Mr. Moffatt’s grandmother passed away in Ontario. That same day, Mr. Moffatt visited Air Canada’s website to book a flight and interacted with Air Canada’s website chatbot.

A chatbot is an automated system that provides information to a person using a website in response to that person’s prompts and inputs.

Mr. Moffatt asked the chatbot about Air Canada’s bereavement fares. The chatbot advised that it was possible to submit a purchased ticket for a reduced bereavement rate within 90 days of the air travel ticket being issued.  Mr. Moffatt then booked his flights using ordinary rates, relying on the chatbot’s advice that he could submit a request to be reimbursed by Air Canada at a later date.

The chatbot was wrong.

Upon submitting his tickets for a reduced bereavement rate, Mr. Moffatt was informed by an Air Canada representative that the chatbot had provided misleading information. Unfortunately for Mr. Moffatt, Air Canada informed him that the company’s bereavement policy did not apply to requests for bereavement consideration after travel had been completed.

Is a Chatbot a Separate Legal Entity?

Air Canada attempted to defend the claim by arguing that it could not be held liable for inaccurate information provided by one of its agents, servants or representatives – including a chatbot. However, the Tribunal disagreed. Although a chatbot provides an interactive element, it is still part of Air Canada’s website, and Air Canada is responsible for all the information on its website, including information provided by its chatbot. It makes no difference whether the information comes from a static webpage or a chatbot.

Air Canada Found Liable for Negligent Misrepresentation

Due to the commercial relationship between Air Canada and Mr. Moffatt, the Tribunal found that Air Canada owed Mr. Moffatt a duty of care to ensure its representations were accurate and not misleading. Air Canada breached its duty by failing to take reasonable care to ensure its chatbot provided accurate information and was subsequently found liable for Mr. Moffatt’s damages – the difference between the price Mr. Moffatt paid for his tickets at the ordinary rates and Air Canada’s bereavement fare.

Implications for Artificial Intelligence Tools

It is not clear from the Tribunal’s decision whether the Air Canada chatbot used artificial intelligence technology to compose its responses. It may have been a rule-based chatbot providing pre-scripted responses based on keywords in customer queries (in which case, the error here could have been the result of an inaccurate or outdated script).
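For readers unfamiliar with the distinction, a rule-based chatbot of the kind described above can be sketched in a few lines. The sketch below is purely illustrative (the rules and replies are hypothetical, not Air Canada’s): every answer is pre-scripted and selected by keyword matching, so any error it produces is a script error rather than an AI “hallucination”.

```python
# Illustrative sketch of a rule-based chatbot. Each rule pairs a set of
# trigger keywords with a pre-scripted reply; the first rule whose
# keywords all appear in the user's query wins. All content is hypothetical.

RULES = [
    ({"bereavement", "fare"},
     "Bereavement fares must be requested and approved before travel."),
    ({"baggage", "allowance"},
     "Each passenger may check one bag free of charge."),
]

DEFAULT_REPLY = "Sorry, I can't answer that. Please contact an agent."

def respond(query: str) -> str:
    words = set(query.lower().split())
    for keywords, scripted_reply in RULES:
        if keywords <= words:  # every trigger keyword appears in the query
            return scripted_reply
    return DEFAULT_REPLY
```

Because the replies live in a fixed table, auditing such a system amounts to reviewing the scripts against current policy – which is also where an outdated script, as hypothesized above, would slip through.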

Still, the decision carries important implications for all types of chatbots, including those powered by artificial intelligence and large language models. The main takeaway? Companies can be held liable for anything their chatbot says.

Accordingly, companies should assess what type of chatbot is appropriate for their platform and consider whether a simple rule-based chatbot is a safer option. Companies should develop disclaimers and notices where chatbots are deployed informing users of the risks of hallucinations or inaccuracies (keeping in mind that this may not fully insulate the company from risk). Finally, if a company does intend to use an AI chatbot, it should test the chatbot carefully before implementation to reduce the chances of hallucinations, inaccuracies, or other issues arising.

Takeaways

  • This decision provides insight into what a court may deem to be a reasonable position to take in a dispute involving the dissemination of misinformation.
  • As with statements made by their employees and agents, companies may be presumptively responsible for the output of their chatbots.
  • This decision also shows that a judicial body will be reluctant to rule that technology can be “responsible” for misinformation and allow a company to escape liability on that basis.
  • Companies that implement a chatbot, or a similar customer-facing tool, on their websites should keep the tool as simple as possible for its required purposes, use disclaimers, and test the system rigorously before implementation.
  • If your company faces a claim involving chatbots or artificial intelligence tools, strongly consider settling out of court. While the claim here was for relatively nominal damages ($812.02), the resulting decision has already made international news.[1]

[1] See articles recently published about the incident by BBC and Forbes, for example.

By Robert Piasentin, Carina Chiu, Robbie Grant, Danielle Perris (Articling Student)

A Cautionary Note

The foregoing provides only an overview and does not constitute legal advice. Readers are cautioned against making any decisions based on this material alone. Rather, specific legal advice should be obtained.

© McMillan LLP 2024
