01 — The Setup
A Grieving Traveler
In November 2022, Jake Moffatt's grandmother passed away. He needed to book last-minute flights from Vancouver to Toronto for the funeral — never an inexpensive proposition.
Before booking, he visited Air Canada's website and asked the airline's AI chatbot whether he could get a reduced bereavement fare. The bot told him yes — and added a detail that would become the center of a legal battle: that he could book his ticket at full price now and apply for the bereavement discount retroactively within 90 days.
Moffatt took the bot at its word. He booked the tickets. He paid full price. Then he submitted his bereavement discount application — and Air Canada said no.
02 — The Conversation
What the Chatbot Said
This is a reconstruction of the chatbot exchange, based on court filings and reporting. According to the tribunal's decision, the bot told Moffatt: "If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form." The key detail: the chatbot gave confident, specific, and incorrect policy information.
[Visualization: "The Gap" — two parallel policy streams flow undisturbed; a chatbot particle bridges them, and the divergence it creates widens until it becomes its own reality.]
03 — The Denial
"That Was the Bot's Opinion"
Moffatt flew to Toronto, attended his grandmother's funeral, and returned. He then submitted his bereavement discount application along with his documentation.
Air Canada denied it. The airline's customer service team explained that bereavement fares had to be requested before booking and could not be applied retroactively. The chatbot had been wrong.
But Moffatt had screenshots. He'd followed the bot's explicit instructions. When he escalated his complaint, Air Canada's position was extraordinary: the chatbot was a separate legal entity from the airline, and the company couldn't be held responsible for information it provided.
Moffatt filed a claim with British Columbia's Civil Resolution Tribunal, an online tribunal that resolves small claims without lawyers. The amount at stake was just over $800.
04 — The Ruling
The Tribunal's Verdict
In February 2024, Tribunal Member Christopher Rivers issued his decision. He rejected Air Canada's separate-entity defense entirely, calling it "a remarkable submission," and found the airline liable for negligent misrepresentation: a company is responsible for information its chatbot provides, just as it would be for information from a human agent. In Rivers' words, "It makes no difference whether the information comes from a static page or a chatbot."
05 — Significance
Why This Case Matters
On its face, this was an $812 dispute. But the tribunal's reasoning applies to every company that deploys a customer-facing chatbot.
Corporate Accountability
Companies can no longer treat chatbot errors as a "computer said so" shield. If you deploy a chatbot on your platform under your brand, you own its outputs.
Hallucination Has Real Costs
AI hallucinations — confident fabrications — aren't just embarrassing. When a customer relies on one, the company deploying the AI bears the consequence.
Policy Accuracy Is a Product Requirement
Companies that use AI to answer policy questions owe customers reasonable care that those answers are accurate, or a clear disclaimer of the system's limitations. The ruling doesn't prescribe how; one engineering approach is sketched below.
Global Regulatory Pressure
This Canadian ruling arrived as the EU AI Act and similar regulations were being finalized. It reinforced the global regulatory direction: AI providers are responsible for their systems.
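What "reasonable care" looks like in code is still an open question, but one pattern follows directly from the tribunal's logic: keep a single source of truth for policy, and never let the bot improvise around it. The Python sketch below is purely illustrative; every name in it (POLICY_STORE, BotAnswer, answer_policy_question) is hypothetical and not drawn from Air Canada's actual system.

# Illustrative sketch only: all names here are hypothetical,
# not taken from Air Canada's system.
from dataclasses import dataclass

# Single source of truth: the same policy text the human agents use,
# not a second, parallel stream that can drift out of sync.
POLICY_STORE: dict[str, str] = {
    "bereavement-fares": (
        "Bereavement fares must be requested before booking. "
        "They cannot be applied retroactively."
    ),
}

@dataclass
class BotAnswer:
    text: str
    policy_id: str | None  # provenance: which policy backs this answer
    escalated: bool        # True when the bot hands off instead of guessing

def answer_policy_question(topic: str) -> BotAnswer:
    """Answer only from the canonical store; otherwise escalate to a human."""
    policy = POLICY_STORE.get(topic)
    if policy is None:
        # No grounded source: refuse to improvise. Answering anyway is
        # exactly the failure mode at the center of the Moffatt case.
        return BotAnswer(
            text="I can't confirm that policy; let me connect you to an agent.",
            policy_id=None,
            escalated=True,
        )
    return BotAnswer(text=policy, policy_id=topic, escalated=False)

print(answer_policy_question("bereavement-fares"))
print(answer_policy_question("retroactive-discounts"))  # unknown topic: escalates

The design choice that matters is the provenance field: every answer either names the policy that backs it or ends in an explicit hand-off. Under the ruling's logic, the confident unsourced answer is the liability.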
The dollar amount was small. The principle was not. A tribunal looked at an AI chatbot and said: someone built this thing, someone deployed it, and someone has to answer for what it says.
What if this $812 ruling is the seed of an entirely new body of law — one where every AI system deployed by a corporation carries the same legal weight as a human employee's promise? Not just chatbots, but hiring algorithms that must justify every rejection in court, diagnostic AIs that carry malpractice liability, financial advisors built on language models that owe fiduciary duty. The Air Canada ruling didn't just say "you can't blame the computer." It said something far more radical: if you put an AI in front of a customer, you are making a promise. And promises, in law, have consequences. The question is whether the legal system can scale that principle faster than companies can deploy AI systems that make new kinds of promises nobody has tested yet.