Air Canada has been ordered to uphold a policy fabricated by its AI customer chatbot in a recent Civil Resolution Tribunal (CRT) dispute.
The decision is a cautionary tale: companies must ensure their AI chatbots provide accurate information, or risk being held liable in court.
The dispute arose after passenger Jake Moffatt booked a flight with Air Canada in November 2022 following a relative's death. While researching flight options, Moffatt asked the airline's chatbot about bereavement fare options.
The chatbot said Moffatt could apply for bereavement fares retroactively.
“If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form,” the chatbot’s response read, according to CRT.
The chatbot hyperlinked to a separate Air Canada webpage titled ‘Bereavement…