TECH Tribunal rules Air Canada must pay refund promised by AI chatbot

1911user

Veteran Member
Very interesting precedent.


Air Canada must pay refund promised by AI chatbot, tribunal rules
by Nick Robertson - 02/18/24 11:05 PM ET

Air Canada must pay a Vancouver man a partial refund for his flight ticket that was promised by the airline’s chatbot, a Canadian tribunal ruled Wednesday, in what could be a landmark case for the use of artificial intelligence in business.

In November 2022, following the death of his grandmother, Jake Moffatt asked the airline’s artificial intelligence support chatbot whether it offered bereavement fares. The chatbot said the airline does offer discounted fares and that Moffatt could receive the discount up to 90 days after flying by filing a claim.

The airline’s actual bereavement policy, however, does not include a post-flight refund, and specifically states the discount must be approved beforehand.

Moffatt booked and flew from Vancouver to Toronto and back for about $1,200, and later requested the promised discount of roughly half off, but was told by the airline’s support staff that the chatbot’s replies were incorrect and nonbinding.

Air Canada argued before the civil tribunal that the chatbot is a “separate legal entity” from the company, and that the airline could not be held responsible for the chatbot’s statements to customers.

Tribunal member Christopher Rivers ruled in favor of Moffatt on Wednesday, determining that the airline committed “negligent misrepresentation” and must follow through with the chatbot’s promised discount.

“This is a remarkable submission,” he wrote. “While a chatbot has an interactive component, it is still just a part of Air Canada’s website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot.”

Rivers ordered Air Canada to pay Moffatt the promised $483 refund plus nominal fees.

“I find Air Canada did not take reasonable care to ensure its chatbot was accurate,” Rivers continued. “While Air Canada argues Mr. Moffatt could find the correct information on another part of its website, it does not explain why the webpage titled ‘Bereavement travel’ was inherently more trustworthy than its chatbot. It also does not explain why customers should have to double-check information found in one part of its website on another part of its website.”

The support chatbot, launched last year, was not visible on the airline’s website as of Sunday.
 

Dobbin

Faithful Steed
Very interesting precedent.


Makes sense. The chatbot in effect becomes an "informational agent." And - like any agent of the company - the company is responsible for the agent's actions.

Dobbin
 

BadMedicine

Would *I* Lie???
I wonder when chatbot insurance will become available to protect businesses?
Well, computers/robots only do or say what you program them to, so a little prevention would go a long way on this one... like "Don't have it offer discounts, or LIE for that matter..."

Yeah, the airline deserved this one. How many people did the bot talk into taking flights before it got sued? Doubt he was the first.
 

Griz3752

Retired, practising Curmudgeon
I wonder when chatbot insurance will become available to protect businesses?
Just as soon as the underwriters can determine how much they can screw people-facing businesses out of.

Fear not - the business verticals will figure out how to incorporate those fees into the retail cost calcs.
 

Melodi

Disaster Cat
Notice how the company tried to get off the hook by claiming the AI is a "contractor." It is interesting that, at least in Europe, the tendency of Big Corporations to farm out nearly everything, especially deliveries, is getting them into hot water with the courts. There is a growing tendency, in both court rulings and revisions to the law, to say that if a worker wears your uniform and acts in your name, they are your employee. This isn't universal, but it is a growing trend, especially after places like Facebook were accused of hiring only "contractors" to review horrific content that needed to be "scrubbed" for many hours daily, and then claimed they were not responsible for any medical or other bills that resulted because the workers were all "contractors." Yet everything those workers did said "Facebook" on it, and they worked in the Facebook building, although they were denied access to most of the facility.

Drivers are another area where this is becoming an increasingly large problem, due to accidents and other serious issues. It is perfectly legal in most places for the Bigs to hire another company to do their shipping. But if the drivers wear the company's livery and act in its name, that is where the laws are being redrawn or enforced differently.

AI is likely to fall into a similar category.
 