A decision holding Air Canada liable for what its chatbot said is a reminder that companies need to be cautious when relying on artificial intelligence, experts say.
The B.C. Civil Resolution Tribunal decision issued Wednesday showed that Air Canada tried to deny liability when its chatbot gave misleading information about the airline’s bereavement fares.
“In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions,” tribunal member Christopher Rivers said in his decision.
“This is a remarkable submission,” he said.
Jake Moffatt brought the challenge after Air Canada refused to apply the lower bereavement fare to a flight he had already paid full price for, even though the chatbot had implied he could claim it after the fact. The airline denied the claim, saying he had to apply before taking the trip.
Rivers said in his decision that it should be obvious Air Canada is responsible for the information on its website, and in this case…