Chapter 03

The Ruling That Made AI Liable

When Air Canada's chatbot gave a grieving passenger wrong bereavement fare information, the airline called the bot 'a separate legal entity.' A Canadian tribunal rejected that defense — and established the first major precedent that companies are liable for what their AI systems say.

✓ Verified · BC Civil Resolution Tribunal decision is publicly available · Air Canada acknowledged the ruling

01 — The Setup · A Grieving Traveler

In November 2022, Jake Moffatt's grandmother passed away. He needed to book last-minute flights from Vancouver to Toronto for the funeral — never an inexpensive proposition.

Before booking, he visited Air Canada's website and asked the airline's AI chatbot whether he could get a reduced bereavement fare. The bot told him yes — and added a detail that would become the center of a legal battle: that he could book his ticket at full price now and apply for the bereavement discount retroactively within 90 days.

Moffatt took the bot at its word. He booked the tickets. He paid full price. Then he submitted his bereavement discount application — and Air Canada said no.

$812.02
total Air Canada was ordered to pay (damages, interest, and fees)
90
days the chatbot said he had to apply
1st
major chatbot liability ruling in Canada

02 — The Conversation · What the Chatbot Said

This is a reconstructed version of the chatbot exchange, based on court filings and reporting. The key detail: the chatbot gave confident, specific, and incorrect information about the airline's own policy.

Air Canada virtual assistant · November 2022 (reconstructed)
👤
Jake Moffatt
My grandmother just passed away. I need to travel to Toronto for the funeral. Do you have bereavement fares? Is there any way to get a discount for last-minute travel due to a death in the family?
✈️
Air Canada Virtual Assistant
I'm sorry to hear about your loss. Air Canada does offer bereavement fares for immediate family members. If you need to travel immediately, you can book your ticket at the current available fare and then apply for a bereavement fare reduction within 90 days of your return travel date by contacting our Bereavement Travel team.
💡 The critical error: Air Canada's actual bereavement policy did not allow retroactive applications. The discount had to be requested before booking. The chatbot invented a policy that did not exist.
👤
Jake Moffatt
So I can book now at full price and get the bereavement rate applied after I travel?
✈️
Air Canada Virtual Assistant
That's correct. You can book your travel now and submit a request for the bereavement fare reduction within 90 days of completing your travel. Please keep all documentation related to your bereavement.

The Gap — Two parallel policy streams flow undisturbed. A chatbot particle bridges them — and the divergence it creates widens until it becomes its own reality.

03 — The Denial · "That Was the Bot's Opinion"

Moffatt flew to Toronto, attended his grandmother's funeral, and returned. He then submitted his bereavement discount application along with his documentation.

Air Canada denied it. The airline's customer service team explained that bereavement fares had to be requested before booking and could not be applied retroactively. The chatbot had been wrong.

But Moffatt had screenshots. He had followed the bot's explicit instructions. When he escalated his complaint, Air Canada's position was extraordinary: the chatbot was a separate legal entity from the airline, and the company could not be held responsible for the information the bot provided.

🏛️
Air Canada's Legal Argument

The airline argued in tribunal proceedings that its chatbot was "a separate legal entity that is responsible for its own actions." Air Canada claimed it could not be held liable for information the bot provided — even though the bot was operating on Air Canada's own website, under Air Canada's brand, answering questions about Air Canada's policies.

Moffatt filed a claim with British Columbia's Civil Resolution Tribunal — an online tribunal designed to resolve small claims without lawyers. The amount at stake was just over $800.

04 — The Ruling · The Court's Verdict

In February 2024, Tribunal Member Christopher Rivers issued his decision. He rejected Air Canada's argument entirely — ruling that a company is liable for information its chatbot provides, just as it would be for information from a human agent.

BC Civil Resolution Tribunal · February 2024
Moffatt v. Air Canada
Claimant: Jake Moffatt
Respondent: Air Canada
Claim: Negligent misrepresentation by chatbot
Air Canada's defense: Chatbot is a "separate legal entity"
Tribunal finding: Defense rejected. Air Canada liable.
Ordered to pay: $812.02 (damages, interest, and tribunal fees)
Key finding: "It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot."

05 — Significance · Why This Case Matters

On its face, this was an $812 dispute. But the tribunal's reasoning applies to every company that deploys a customer-facing chatbot.

⚖️
A New Legal Standard

The ruling establishes that companies cannot disclaim liability for their chatbots' hallucinations simply because the source was automated. If your AI provides incorrect information to a customer who reasonably relies on it, that's on you.
🏢

Corporate Accountability

Companies can no longer treat chatbot errors as a "computer said so" shield. If you deploy it on your platform under your brand, you own its outputs.

🤖

Hallucination Has Real Costs

AI hallucinations — confident fabrications — aren't just embarrassing. When a customer relies on one, the company deploying the AI bears the consequence.

📋

Policy Accuracy Is a Product Requirement

Companies that use AI to answer policy questions have a legal obligation to ensure those AI systems are accurate — or to clearly disclaim their limitations.

🌍

Global Regulatory Pressure

This Canadian ruling arrived as the EU AI Act and similar regulations were being finalized. It reinforced the global regulatory direction: AI providers are responsible for their systems.

The dollar amount was small. The principle was not. A tribunal looked at an AI chatbot and said: someone built this thing, someone deployed it, and someone has to answer for what it says.

What If?

What if this $812 ruling is the seed of an entirely new body of law — one where every AI system deployed by a corporation carries the same legal weight as a human employee's promise? Not just chatbots, but hiring algorithms that must justify every rejection in court, diagnostic AIs that carry malpractice liability, financial advisors built on language models that owe fiduciary duty. The Air Canada ruling didn't just say 'you can't blame the computer.' It said something far more radical: if you put an AI in front of a customer, you are making a promise. And promises, in law, have consequences. The question is whether the legal system can scale that principle faster than companies can deploy AI systems that make new kinds of promises nobody has tested yet.


