A Word of Caution: Company Liable for Misrepresentations Made by Chatbot
In a recent decision of the British Columbia Civil Resolution Tribunal (the “Tribunal”), the Tribunal found Air Canada liable for certain representations made to a customer by a chatbot on Air Canada’s webpage.
The chatbot provided the customer with incorrect information relating to Air Canada’s bereavement travel policy. As a result, the customer brought a claim against Air Canada for negligent misrepresentation. In its defence, Air Canada argued that it could not be held liable for information provided by its chatbot as the chatbot was a separate legal entity responsible for its own actions. The Tribunal rejected this argument, finding in favour of the customer.
Facts
On November 11, 2022, Mr. Moffatt’s grandmother passed away in Ontario. That same day, Mr. Moffatt visited Air Canada’s website to book a flight and interacted with Air Canada’s website chatbot.
A chatbot is an automated system that provides information to a person using a website in response to that person’s prompts and inputs.
Mr. Moffatt asked the chatbot about Air Canada’s bereavement fares. The chatbot advised that it was possible to submit a purchased ticket for a reduced bereavement rate within 90 days of the air travel ticket being issued. Mr. Moffatt then booked his flights using ordinary rates, relying on the chatbot’s advice that he could submit a request to be reimbursed by Air Canada at a later date.
The chatbot was wrong.
Upon submitting his tickets for a reduced bereavement rate, Mr. Moffatt was informed by an Air Canada representative that the chatbot had provided misleading information. Unfortunately for Mr. Moffatt, Air Canada informed him that the company’s bereavement policy did not apply to requests for bereavement consideration after travel had been completed.
Is a Chatbot a Separate Legal Entity?
Air Canada attempted to defend the claim by arguing that it could not be held liable for inaccurate information provided by one of its agents, servants or representatives – including a chatbot. However, the Tribunal disagreed. Although a chatbot adds an interactive element, it is still part of Air Canada’s website, and Air Canada is responsible for all the information on its website, including information provided by its chatbot. It makes no difference whether the information comes from a static webpage or a chatbot.
Air Canada Found Liable for Negligent Misrepresentation
Due to the commercial relationship between Air Canada and Mr. Moffatt, the Tribunal found that Air Canada owed Mr. Moffatt a duty of care to ensure its representations were accurate and not misleading. Air Canada breached that duty by failing to take reasonable care to ensure its chatbot provided accurate information, and was found liable for Mr. Moffatt’s damages – the difference between the price Mr. Moffatt paid for his tickets at the ordinary rates and Air Canada’s bereavement fare.
Implications for Artificial Intelligence Tools
It is not clear from the Tribunal’s decision whether the Air Canada chatbot used artificial intelligence technology to compose its responses. It may have been a rule-based chatbot providing pre-scripted responses based on keywords in customer queries (in which case, the error here could have been the result of an inaccurate or outdated script).
Still, the decision carries important implications for all types of chatbots, including those powered by artificial intelligence and large language models. The main takeaway? Companies can be held liable for anything their chatbot says.
Accordingly, companies should assess what type of chatbot is appropriate for their platform and consider whether a simple rule-based chatbot is a safer option. Companies should develop disclaimers and notices where chatbots are deployed informing users of the risks of hallucinations or inaccuracies (keeping in mind that this may not fully insulate the company from risk). Finally, if a company does intend to use an AI chatbot, it should test the chatbot carefully before implementation to reduce the chances of hallucinations, inaccuracies, or other issues arising.
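To make the distinction concrete, the rule-based approach described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration – the keywords, policy text, and disclaimer wording are all invented for this example – showing how a scripted chatbot answers only from pre-vetted responses (which can be reviewed like any other website copy), appends a disclaimer to every reply, and falls back to a safe referral when it cannot match a query:

```python
# Hypothetical sketch of a rule-based chatbot: every possible response is
# pre-scripted and reviewable, a disclaimer is attached to each reply, and
# unmatched queries get a safe fallback instead of a generated answer.
# All keywords and policy text below are invented for illustration.

DISCLAIMER = (
    "Automated responses are provided for convenience and may contain "
    "errors. Please confirm details against the official policy page."
)

# Vetted answers keyed on query keywords, maintained by the policy team.
SCRIPTED_RESPONSES = {
    "bereavement": (
        "Bereavement fares must be requested before travel is completed. "
        "See the official bereavement policy page for details."
    ),
    "baggage": "Checked baggage allowances vary by fare type.",
}

FALLBACK = "I can't answer that. Please contact customer support."


def answer(query: str) -> str:
    """Return a vetted scripted response (or a safe fallback), plus a disclaimer."""
    lowered = query.lower()
    for keyword, response in SCRIPTED_RESPONSES.items():
        if keyword in lowered:
            return f"{response}\n\n{DISCLAIMER}"
    return f"{FALLBACK}\n\n{DISCLAIMER}"
```

Because the chatbot can only emit text that a human has approved, testing reduces to checking the keyword routing, and the disclaimer reaches the user on every reply – though, as noted above, a disclaimer alone may not fully insulate a company from liability.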
Takeaways
- This decision provides insight into what a court may deem to be a reasonable position to take in a dispute involving the dissemination of misinformation.
- As with statements made by their employees and agents, companies may be presumptively responsible for the output of their chatbots.
- This decision also shows that a judicial body will be reluctant to rule that technology can be “responsible” for misinformation and allow a company to escape liability on that basis.
- Companies that implement a chatbot, or a similar customer-facing tool, on their websites should keep the tool as simple as possible for the required purposes, use disclaimers, and test the system rigorously before implementation.
- If your company becomes involved in a dispute over chatbots or artificial intelligence tools, strongly consider settling out of court. While the claim here was for relatively nominal damages ($812.02), the resulting decision has already made international news.[1]
[1] See articles recently published about the incident by BBC and Forbes, for example.
By Robert Piasentin, Carina Chiu, Robbie Grant, Danielle Perris (Articling Student)
A Cautionary Note
The foregoing provides only an overview and does not constitute legal advice. Readers are cautioned against making any decisions based on this material alone. Rather, specific legal advice should be obtained.
© McMillan LLP 2024