Air Canada must grant a customer a partial refund that was promised by a chatbot acting on behalf of the company.
The chatbot had promised the customer a refund, contrary to company policy, after he purchased a full-price ticket. The man had inquired about a discount for a last-minute booking due to a bereavement: his grandmother had died. The chatbot erroneously advised him to first book a ticket at the standard price and then apply for a refund within 90 days. Air Canada, however, refused to grant this payment.
Particularly interesting in this case is the airline's reasoning, which shifted responsibility entirely onto the chatbot. According to the ruling, Air Canada argued that it could not be held liable for information provided by one of its agents, employees, or representatives, including a chatbot. In effect, the company suggested that the chatbot was a separate legal entity responsible for its own actions. The tribunal did not follow this argument.
The arbitration decision states that it should have been clear to Air Canada that it is responsible for all information on its website, whether it originates from a static page or a chatbot. The legal dispute also raises a broader question about the use of artificial intelligence: who bears liability when an AI makes a faulty decision?
In this case, the company's chatbot had clearly violated company policy, so the responsibility lay with Air Canada. The airline must now grant the customer the promised partial refund. Whether this decision will significantly affect how companies deploy artificial intelligence remains to be seen. What is clear, however, is that responsibility and liability for a chatbot's actions ultimately rest with the company.