Air Canada’s use of an automated chatbot has come under scrutiny after a customer accused it of providing misleading information. In a recent case, Jake Moffatt consulted the chatbot about bereavement fares following the death of his grandmother. When he later requested a refund based on the chatbot’s advice, Air Canada denied his claim.
The airline initially defended the chatbot, arguing that it was “responsible for its own actions.” This argument did not sit well with the public or the tribunal. In its decision, a Civil Resolution Tribunal (CRT) member called Air Canada’s defense “remarkable” and emphasized that the airline is ultimately responsible for all the information on its website, including information provided by the chatbot.
Air Canada also argued that the correct information about bereavement fares was available elsewhere on its website. However, the CRT member countered that customers should not be expected to differentiate between the accuracy of information…
#AirCanada