Air Canada Ordered to Pay for Its Chatbot’s False Info


Skift Take

The AI that powers the latest chatbot craze is still experimental, but not everyone knows that. This case raises the question of whether companies should be more cautious about deploying unreliable tech.

Air Canada is being forced to refund a passenger who said he got bad advice from a chatbot on the airline’s website. 

Jake Moffatt booked a flight through Air Canada following the death of his grandmother in November 2022. A chatbot on the airline's website suggested he could apply for a discounted bereavement fare after booking a flight at the regular price, which he later learned is not the airline's policy, according to a ruling in small claims court in British Columbia. 

Moffatt sued for reimbursement, and Air Canada argued it could not be held liable if the chatbot provided false information. 

The airline’s bereavement fare webpage, which the chatbot linked to in its answers, says that the policy does not apply after travel has already been completed.

Many travel companies have rolled out new chatbots over the past year or so. They are largely powered by generative AI, which has been shown to consistently become "confused" and produce errors.

Moffatt provided a screenshot of the chatbot’s advice, according to the ruling: “If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form.” 

Based on that advice, Moffatt booked a one-way flight from Vancouver to Toronto for $794.98 and a separate return flight for $845.38. Those flights departed on Nov. 12 and Nov. 16, respectively. He submitted the bereavement claim on Nov. 17. 

Moffatt and the airline exchanged messages for a couple of months about the issue. Even though an airline representative admitted that the chatbot had stated “misleading words,” according to the ruling, they were not able to resolve the issue before Moffatt filed a lawsuit. 

He also claimed to have been told by an airline representative that the fare should be $380. While that number was not proven in court, the airline did not provide any contrary information. 

The judge found that Air Canada “did not take reasonable care to ensure its chatbot was accurate” and ordered the airline to pay Moffatt $812.

The judge’s response to the airline’s defense:

“In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions. This is a remarkable submission. While a chatbot has an interactive component, it is still just a part of Air Canada’s website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot.”

Air Canada did not immediately respond to Skift's request for comment on the ruling.

