Air Canada has been ordered to reimburse a passenger, Jake Moffat, for a discount its chatbot erroneously promised him, as reported by the Washington Post.
Moffat bought tickets two years ago to attend his grandmother’s funeral, relying on Air Canada’s support chatbot, which mistakenly assured him he could receive a discount under the airline’s bereavement policy by paying full price upfront and filing a claim afterward.
On the strength of that promise, Moffat spent more than CA$700 on a next-day flight from Vancouver to Toronto and roughly CA$700 more on a return flight a few days later.
Air Canada, however, maintained that its bereavement policy requires customers to request discounted fares before traveling and does not provide refunds for travel already completed. According to the airline’s website, the policy is meant to offer flexibility for upcoming travel during difficult times.
The ruling by Canada’s Civil Resolution Tribunal (CRT) could set a precedent for holding companies accountable for misinformation delivered by the interactive tools they deploy in customer service, including chatbots powered by generative artificial intelligence.
The tribunal’s decision underscores the importance of accurate, transparent communication with customers, especially in sensitive situations such as bereavement travel.
As businesses lean more heavily on technology-driven customer service, ensuring those tools are reliable and accurate becomes essential to maintaining customer trust.