
Can statements by client-facing AI Chatbots bind their owners? (Moffatt v. Air Canada)

Tuesday, March 5, 2024 | Noah Bonis Charancle | Litigation | Transportation & Logistics, AI

More and more corporations are using client-facing Artificial Intelligence Chatbots to respond to client inquiries. Although these Chatbots can enhance customer service by rapidly and cost-effectively answering simple queries, the question arises as to who is liable when these bots mistakenly provide misleading or outright false information to clients.

The BC Civil Resolution Tribunal’s recent decision in Moffatt v. Air Canada, 2024 BCCRT 149 discusses whether a Chatbot’s owner can be held liable for inaccurate information the bot provides. In that decision, Jake Moffatt, who was travelling due to his grandmother’s passing, booked flights through Air Canada from Vancouver to Toronto. While researching flights, he used Air Canada’s Chatbot, which the Tribunal described as an automated system capable of providing information in response to human prompts.

While interacting with the Chatbot, Mr. Moffatt was told that he could apply for a reduced bereavement fare for his travels by completing a ticket refund application form within 90 days of the date of ticket issue. The relevant messages from the Chatbot read:

Air Canada offers reduced bereavement fares if you need to travel because of an imminent death or a death in your immediate family.

If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form. (emphasis in original)

Of note, the words “bereavement fares” in the Chatbot’s message were actually a hyperlink inserted by the bot, which linked to Air Canada’s actual bereavement policy. Contrary to the bot’s statements, that policy said bereavement fare requests could not be made after travel had been completed. As such, the Chatbot had simultaneously given Mr. Moffatt the wrong information while providing a link to the correct policy.

Mr. Moffatt, relying on the Chatbot, booked two last-minute flights to and from Toronto. After he returned, he submitted his request for the bereavement fare to Air Canada within the 90 days indicated by the Chatbot. However, Air Canada declined to grant the bereavement fare, citing its actual bereavement policy, which did not allow for post-travel requests. Mr. Moffatt brought his complaint to the BC Civil Resolution Tribunal, where Air Canada argued it was not liable for information provided by the Chatbot.

In its decision, the Tribunal disagreed with Air Canada’s position and held that Mr. Moffatt was entitled to rely on the bot to provide him with accurate information. It also stated that Air Canada’s duty of care required it to ensure that the bot’s representations were accurate and not misleading. Finally, the Tribunal specifically addressed the fact that, even though the actual “bereavement fares” webpage was hyperlinked, the Chatbot formed part of Air Canada’s website and users should not have to cross-reference information on one webpage against another.

The Tribunal ultimately found that Air Canada had committed the tort of negligent misrepresentation, as Mr. Moffatt had relied on the Chatbot’s false statements to his own detriment. It awarded Mr. Moffatt damages equal to the difference between what he paid for the flights and what he would have paid had he been given the discounted fare.

Although this is only a BC Tribunal decision and is not binding on other tribunals or judges, it offers a glimpse into how courts and other tribunals may treat misleading AI Chatbot statements that customers rely upon. The decision recognizes a duty of care owed by companies operating Chatbots: they must take reasonable care to ensure the bots’ statements are accurate and not misleading, and they may be held liable when those statements are not. More specifically, the Tribunal stated at paragraphs 26 and 27:

Here, given their commercial relationship as a service provider and consumer, I find Air Canada owed Mr. Moffatt a duty of care. Generally, the applicable standard of care requires a company to take reasonable care to ensure their representations are accurate and not misleading.

Air Canada argues it cannot be held liable for information provided by one of its agents, servants, or representatives – including a chatbot. It does not explain why it believes that is the case. In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions. This is a remarkable submission. While a chatbot has an interactive component, it is still just a part of Air Canada’s website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot.

Going forward, companies using Chatbots should put safeguards in place to prevent their bots from making misleading statements. They should also develop frameworks to flag Chatbot messages that may contain mistakes or touch on sensitive topics, so that any confusion can be corrected before a customer relies on an inaccurate statement, as sketched below.
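By way of illustration only, the following Python sketch shows one way such a flag-and-review safeguard might work: scan each draft chatbot reply for policy-sensitive topics, append a disclaimer, and queue flagged exchanges for human review. This is a hypothetical example, not Air Canada’s actual system; the topic keywords, the review_reply function, and the disclaimer wording are all assumptions made for demonstration.

```python
import re

# Hypothetical topic patterns; real deployments would maintain these
# alongside the company's authoritative policy documents.
SENSITIVE_TOPICS = {
    "bereavement": re.compile(r"\bbereavement\b", re.IGNORECASE),
    "refunds": re.compile(r"\b(refund|reimburse)\w*\b", re.IGNORECASE),
    "fare rules": re.compile(r"\b(fare|discount|rate)s?\b", re.IGNORECASE),
}

# Illustrative disclaimer appended to flagged replies.
DISCLAIMER = (
    "Please confirm eligibility and any deadlines in our official policy "
    "before relying on this answer; it is not a binding quote."
)

def review_reply(reply: str) -> tuple[str, list[str]]:
    """Return the reply (annotated if flagged) and the topics that triggered review."""
    flagged = [topic for topic, pattern in SENSITIVE_TOPICS.items()
               if pattern.search(reply)]
    if flagged:
        # Append the disclaimer; a real system might also route the
        # exchange to a human agent before the customer acts on it.
        reply = f"{reply}\n\n{DISCLAIMER}"
    return reply, flagged

if __name__ == "__main__":
    draft = ("You may submit your ticket for a reduced bereavement rate "
             "within 90 days of the date your ticket was issued.")
    annotated, topics = review_reply(draft)
    print(annotated)
    print("Flagged for human review:", topics)
```

A keyword filter like this is deliberately simple; the point is the workflow (flag, disclaim, escalate to a human) rather than the matching technique, which could equally be a classifier or a check against the published policy text.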

Noah Bonis Charancle
Associate
T 416.865.6661
nbonischarancle@grllp.com

(This blog is provided for educational purposes only, and does not necessarily reflect the views of Gardiner Roberts LLP).
