
A Word of Caution: Company Liable for Misrepresentations Made by Chatbot

March 4, 2024 Litigation & Dispute Resolution Bulletin 3 minute read

In a recent decision of the British Columbia Civil Resolution Tribunal (the “Tribunal”), the Tribunal found Air Canada liable for certain representations made to a customer by a chatbot on Air Canada’s webpage.

The chatbot provided the customer with incorrect information relating to Air Canada’s bereavement travel policy. As a result, the customer brought a claim against Air Canada for negligent misrepresentation. In its defence, Air Canada argued that it could not be held liable for information provided by its chatbot as the chatbot was a separate legal entity responsible for its own actions. The Tribunal rejected this argument, finding in favour of the customer.

Facts

On November 11, 2022, Mr. Moffatt’s grandmother passed away in Ontario. That same day, Mr. Moffatt visited Air Canada’s website to book a flight and interacted with Air Canada’s website chatbot.

A chatbot is an automated system that provides information to a person using a website in response to that person’s prompts and inputs.

Mr. Moffatt asked the chatbot about Air Canada’s bereavement fares. The chatbot advised that it was possible to submit a purchased ticket for a reduced bereavement rate within 90 days of the air travel ticket being issued. Mr. Moffatt then booked his flights using ordinary rates, relying on the chatbot’s advice that he could submit a request to be reimbursed by Air Canada at a later date.

The chatbot was wrong.

Upon submitting his tickets for a reduced bereavement rate, Mr. Moffatt was informed by an Air Canada representative that the chatbot had provided misleading information. Unfortunately for Mr. Moffatt, Air Canada informed him that the company’s bereavement policy did not apply to requests for bereavement consideration after travel had been completed.

Is a Chatbot a Separate Legal Entity?

Air Canada attempted to defend the claim by arguing that it could not be held liable for inaccurate information provided by one of its agents, servants or representatives – including a chatbot. However, the Tribunal disagreed. Although a chatbot provides an interactive element, it is still part of Air Canada’s website, and Air Canada is responsible for all the information on its website, including information provided by its chatbot. It makes no difference whether the information comes from a static webpage or a chatbot.

Air Canada Found Liable for Negligent Misrepresentation

Due to the commercial relationship between Air Canada and Mr. Moffatt, the Tribunal found that Air Canada owed Mr. Moffatt a duty of care to ensure its representations were accurate and not misleading. Air Canada breached its duty by failing to take reasonable care to ensure its chatbot provided accurate information and was subsequently found liable for Mr. Moffatt’s damages – the difference between the price Mr. Moffatt paid for his tickets at the ordinary rates and Air Canada’s bereavement fare.

Implications for Artificial Intelligence Tools

It is not clear from the Tribunal’s decision whether the Air Canada chatbot used artificial intelligence technology to compose its responses. It may have been a rule-based chatbot providing pre-scripted responses based on keywords in customer queries (in which case, the error here could have been the result of an inaccurate or outdated script).
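To illustrate the failure mode described above, a rule-based chatbot of this kind can be sketched in a few lines: responses are pre-scripted and selected by keyword matching, so an inaccurate or outdated script directly produces inaccurate answers. This is a hypothetical sketch only; the rules, wording, and function names below are invented for illustration and do not reflect Air Canada’s actual system.

```python
# Minimal sketch of a rule-based chatbot. Each rule maps a keyword to a
# pre-scripted response; if the script falls out of step with the actual
# policy, every matching query returns the stale answer.

SCRIPT = {
    "bereavement": (
        "Bereavement fares may be requested before travel; "
        "see the full policy for details."
    ),
    "baggage": "Each passenger may check one bag free of charge.",
}

DEFAULT = "Sorry, I don't have an answer for that. Please contact an agent."

def reply(query: str) -> str:
    """Return the first scripted response whose keyword appears in the query."""
    q = query.lower()
    for keyword, response in SCRIPT.items():
        if keyword in q:
            return response
    return DEFAULT
```

Note that such a system never "hallucinates" in the way a generative model can, but it is only as accurate as the script it was given, which is why the source of the error in this case remains an open question.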

Still, the decision carries important implications for all types of chatbots, including those powered by artificial intelligence and large language models. The main takeaway? Companies can be held liable for anything their chatbot says.

Accordingly, companies should assess what type of chatbot is appropriate for their platform and consider whether a simple rule-based chatbot is a safer option. Companies should develop disclaimers and notices where chatbots are deployed informing users of the risks of hallucinations or inaccuracies (keeping in mind that this may not fully insulate the company from risk). Finally, if a company does intend to use an AI chatbot, it should test the chatbot carefully before implementation to reduce the chances of hallucinations, inaccuracies, or other issues arising.

Takeaways

  • This decision provides insight into what a court may deem to be a reasonable position to take in a dispute involving the dissemination of misinformation.
  • As with the utterances of their employees and agents, companies may be presumptively responsible for the output of their chatbots.
  • This decision also shows that a judicial body will be reluctant to rule that technology can be “responsible” for misinformation and allow a company to escape liability on that basis.
  • Companies that implement a chatbot, or a similar customer-facing tool, on their websites should keep the tool as simple as possible for the required purposes, use disclaimers, and test the system rigorously before implementation.
  • If your company becomes involved in a dispute concerning chatbots or artificial intelligence tools, strongly consider settling out of court. While the claim was for relatively nominal damages ($812.02), the resulting decision has already made international news.[1]

[1] See articles recently published about the incident by BBC and Forbes, for example.

By Robert Piasentin, Carina Chiu, Robbie Grant, Danielle Perris (Articling Student)

A Cautionary Note

The foregoing provides only an overview and does not constitute legal advice. Readers are cautioned against making any decisions based on this material alone. Rather, specific legal advice should be obtained.

© McMillan LLP 2024
