Chapter 12

Air Canada's $812 Chatbot Mistake

A man was trying to get a bereavement discount. An AI told him he could. The airline said the bot was wrong. A court disagreed — and rewrote the rules on corporate responsibility for AI.

01 — The Setup · A Grieving Traveler

In November 2022, Jake Moffatt's grandmother passed away. He needed to book last-minute flights from Vancouver to Toronto for the funeral — never an inexpensive proposition.

Before booking, he visited Air Canada's website and asked the airline's AI chatbot whether he could get a reduced bereavement fare. The bot told him yes — and added a detail that would become the center of a legal battle: that he could book his ticket at full price now and apply for the bereavement discount retroactively within 90 days.

Moffatt took the bot at its word. He booked the tickets. He paid full price. Then he submitted his bereavement discount application — and Air Canada said no.

$812 · total the tribunal ordered Air Canada to pay
90 · days the chatbot said he had to apply
1st · major chatbot liability ruling in Canada

02 — The Conversation · What the Chatbot Said

This is a reconstructed version of the chatbot exchange, based on court filings and reporting. The key detail: the chatbot gave confident, specific, and incorrect legal/policy information.

Air Canada virtual assistant · November 2022 (reconstructed)
👤 Jake Moffatt
My grandmother just passed away. I need to travel to Toronto for the funeral. Do you have bereavement fares? Is there any way to get a discount for last-minute travel due to a death in the family?
✈️ Air Canada Virtual Assistant
I'm sorry to hear about your loss. Air Canada does offer bereavement fares for immediate family members. If you need to travel immediately, you can book your ticket at the current available fare and then apply for a bereavement fare reduction within 90 days of your return travel date by contacting our Bereavement Travel team.
💡 The critical error: Air Canada's actual bereavement policy did not allow retroactive applications. The discount had to be requested before booking. The chatbot invented a policy that did not exist.
👤 Jake Moffatt
So I can book now at full price and get the bereavement rate applied after I travel?
✈️ Air Canada Virtual Assistant
That's correct. You can book your travel now and submit a request for the bereavement fare reduction within 90 days of completing your travel. Please keep all documentation related to your bereavement.

03 — The Denial · "That Was the Bot's Opinion"

Moffatt flew to Toronto, attended his grandmother's funeral, and returned. He then submitted his bereavement discount application along with his documentation.

Air Canada denied it. The airline's customer service team explained that bereavement fares had to be requested before booking and could not be applied retroactively. The chatbot had been wrong.

But Moffatt had screenshots. He had followed the bot's explicit instructions. When he escalated his complaint, Air Canada took an extraordinary position: the chatbot was a separate legal entity from the airline, and the company could not be held responsible for the information the bot provided.

🏛️

Air Canada's Legal Argument

The airline argued in tribunal proceedings that its chatbot was "a separate legal entity that is responsible for its own actions." Air Canada claimed it could not be held liable for information the bot provided — even though the bot was operating on Air Canada's own website, under Air Canada's brand, answering questions about Air Canada's policies.

Moffatt filed a claim with British Columbia's Civil Resolution Tribunal — a small claims court designed to handle disputes without lawyers. The amount was just over $800. But the implications were enormous.

04 — The Ruling · The Court's Verdict

In February 2024, Tribunal Member Christopher Rivers issued his decision. He rejected Air Canada's argument entirely — and in doing so, established a significant legal precedent.

BC Civil Resolution Tribunal · February 2024
Moffatt v. Air Canada
Claimant: Jake Moffatt
Respondent: Air Canada
Claim: Negligent misrepresentation by chatbot
Air Canada's defense: Chatbot is a "separate legal entity"
Tribunal finding: Defense rejected. Air Canada liable.
Ordered to pay: $650.88 in damages, plus $36.14 in pre-judgment interest and $125 in tribunal fees ($812.02 total)
Key finding: "While a chatbot has an interactive component, it is still just a part of Air Canada's website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot."

05 — Significance · Why This Case Matters

On its face, this was an $812 dispute. But the tribunal's reasoning has implications that extend far beyond one airline ticket.

⚖️

A New Legal Standard

The ruling establishes that companies cannot disclaim liability for their chatbots' hallucinations simply because the source was automated. If your AI provides incorrect information to a customer who reasonably relies on it, that's on you.
🏢

Corporate Accountability

Companies can no longer treat chatbot errors as a "computer said so" shield. If you deploy it on your platform under your brand, you own its outputs.

🤖

Hallucination Has Real Costs

AI hallucinations — confident fabrications — aren't just embarrassing. When a customer relies on one, the company deploying the AI bears the consequence.

📋

Policy Accuracy Is a Product Requirement

Companies that use AI to answer policy questions have a legal obligation to ensure those AI systems are accurate — or to clearly disclaim their limitations.

🌍

Global Regulatory Pressure

This Canadian ruling arrived as the EU AI Act and similar regulations were being finalized. It reinforced the global regulatory direction: AI providers are responsible for their systems.

The dollar amount was small. The principle was not. For the first time, a court looked at an AI chatbot and said: "Someone built this thing. Someone deployed it. And someone has to be responsible for what it says."