- February 19, 2024
- 6 minutes read
Air Canada ordered to refund passenger after ‘misleading’ conversation with site’s AI chatbot
Air Canada was ordered to reimburse a passenger who was mistakenly promised a cheaper bereavement fare by the airline’s AI chatbot — a possible landmark decision as more companies turn to artificial intelligence for customer service.
Jake Moffatt, a Vancouver resident, had asked the airline’s support chatbot whether it offered bereavement rates following the death of his grandmother in November 2022.
The chatbot responded by telling the grieving grandson he could claim the lower price up to 90 days after flying by filing a claim.
Air Canada was ordered to pay a passenger an $812 refund after he was promised one by the company’s chatbot. Getty Images
However, the airline’s actual bereavement policy does not include a post-flight refund. It also says all discounts must be approved in advance.
Moffatt ended up booking a round-trip flight to Toronto for the funeral for around $1,200, but when he contacted Air Canada for the refund, he was told he wasn’t eligible, according to the court filing.
He sent numerous emails to Air Canada with screenshots of his conversation with the chatbot attached in an attempt to recover the money, the complaint said.
But on Feb. 8, 2023, an Air Canada representative informed him that the chatbot provided “misleading words” and that the company’s bereavement policy did not apply discounts retroactively.
Moffatt was told by the airline that it would update the chatbot so that its messages would align with the information that was posted to the company website.
The peeved passenger then filed suit against the airline, which claimed in court that the chatbot was a “separate legal entity” and was therefore responsible for its own actions.
Last week, a Canadian tribunal sided with Moffatt and ordered Air Canada to issue a refund for roughly $600.
“While a chatbot has an interactive component, it is still just a part of Air Canada’s website. It should be obvious to Air Canada that it is responsible for all the information on its website,” wrote Christopher Rivers, a member of British Columbia’s Civil Resolution Tribunal. “It makes no difference whether the information comes from a static page or a chatbot.”
“I find Air Canada did not take reasonable care to ensure its chatbot was accurate,” Rivers continued. “While Air Canada argues Mr. Moffatt could find the correct information on another part of its website, it does not explain why the webpage titled ‘Bereavement travel’ was inherently more trustworthy than its chatbot. It also does not explain why customers should have to double-check information found in one part of its website on another part of its website.”
On Monday, the chatbot, which launched last year, was not available on Air Canada’s site.
The Post reached out to Air Canada for comment.
Air Canada tried to avoid the refund by claiming that the chatbot offered “misleading words.” Getty Images