Artificial intelligence is all fun and games until someone gets hurt.
A cautionary tale is making headlines in British Columbia, where a small claims tribunal on Wednesday (Feb. 15) ruled that Air Canada is liable for the bad advice its AI-driven chatbot gave a customer about bereavement fares.
As reported by CBC News, Air Canada apparently tried to hold its own chatbot responsible for misleading the customer, arguing that the online tool was "a separate legal entity that is responsible for its own actions."
That argument didn’t exactly fly with the adjudicator.
“This is a remarkable submission,” Civil Resolution Tribunal (CRT) member Christopher Rivers wrote. “While a chatbot has an interactive component, it is still just a part of Air Canada’s website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot.”
Air Canada was ordered to pay…