Man Defeats Air Canada in AI Chatbot Misinformation Case

In what is being seen as a landmark decision, the B.C. Civil Resolution Tribunal has ruled in favour of Jake Moffatt, a passenger who contested Air Canada’s refusal to honour a bereavement fare discount mistakenly promised by the airline’s online AI-powered chatbot.

The case underscores the growing legal challenges companies face as they increasingly rely on artificial intelligence for customer service.

In 2022, Moffatt booked a flight to Toronto to attend his grandmother’s funeral after Air Canada’s chatbot advised him that he could apply for a bereavement fare discount after purchasing a full-price ticket.

However, when he sought to claim the discount post-flight, an Air Canada representative informed him that the application for the discount could not be made retroactively. The airline later said that it was not liable for the chatbot’s error, suggesting the AI system operated as a separate legal entity.

Tribunal member Christopher Rivers, in his decision, found this argument by Air Canada to be “remarkable,” emphasizing that the airline owed a duty of accuracy to its customers, reports The Canadian Press.

“The applicable standard of care requires a company to take reasonable care to ensure their representations are not misleading,” Rivers wrote. He further criticized the airline’s stance that it could not be held responsible for the chatbot’s misinformation, stating, “It should be obvious to Air Canada that it is responsible for all the information on its website.”

Moffatt informed the tribunal that the total cost of his round-trip ticket was $1,640, while the bereavement fare he was promised amounted to approximately $760, resulting in an $880 discrepancy.

When requesting a partial refund, he also provided the tribunal with a screenshot of his conversation with the chatbot. Air Canada acknowledged that the chatbot’s advice had been “misleading,” yet still denied the refund request. Good thing he kept screenshots of the chatbot conversation to back up his claim.

Gabor Lukacs, president of the Air Passenger Rights consumer advocacy group, praised the decision as “cutting edge” and a significant win for consumer rights.

The ruling makes clear that Air Canada is responsible for the information provided by its AI chatbot. The company said in a statement it would comply with the tribunal’s decision.
