Chapter 06

The Hurricane That AI Saw First

DeepMind's GraphCast predicted Hurricane Lee's Nova Scotia landfall nine days out — three days before traditional models converged. It ran in under a minute on a single chip.

✓ Verified — Published in Science (Lam et al., Nov 2023) · ECMWF operational adoption confirmed Feb 2025 · NHC Tropical Cyclone Report AL132023

01 — The Storm · Category 5

On September 5, 2023, a low-pressure system organized into Tropical Depression Thirteen in the Atlantic Ocean. Within 24 hours it was a hurricane. Within 48 hours it was a major hurricane. By September 7, Hurricane Lee had reached Category 5 — the maximum — with sustained winds of 165 miles per hour after intensifying by 85 mph in a single day over record-warm Atlantic waters.

The forecast problem at that range was not the storm's strength. It was its path. At nine or more days out, traditional weather models showed significant disagreement about where Lee would make landfall. The scenarios ranged from a direct New England hit to a curve out to sea to a track into the Canadian Maritimes. For emergency managers in any of those regions, the difference between preparing and not preparing is measured in days.

A model built by Google DeepMind in London — running on a single computer chip — had already made its call.

Cat 5: 165 mph sustained winds at peak intensity
9 days: GraphCast predicted Nova Scotia landfall before it happened
<1 min: to produce a 10-day global forecast on a single chip
90%+: of weather variables where GraphCast outperformed the gold standard

The Forecast — A storm spiral rotates slowly. One blue prediction line traces a path to the target point days before the others. Scattered lines converge on the same answer. The target pulses when they arrive.

02 — The Prediction · Nine Days Out

On approximately September 7, a live version of GraphCast deployed on the European Centre for Medium-Range Weather Forecasts website predicted that Hurricane Lee would make landfall in Nova Scotia. Nine days before it happened.

At that point, traditional physics-based forecast models — the systems that run on supercomputers and have defined meteorology for decades — still showed substantial uncertainty about the track. The cone of possible landfalls included much of the eastern seaboard. GraphCast was specific. Nova Scotia.

It was not alone. Two other AI weather models — NVIDIA's FourCastNet and Huawei's Pangu-Weather — converged on similar predictions. Three independently developed AI systems, none using physics equations, agreed on the same answer.

Three days later, around September 10, the traditional models converged. Nova Scotia.

On September 16, Hurricane Lee — by then weakened to a post-tropical cyclone with 70 mph winds — made landfall on Long Island, Nova Scotia. Exactly where the AI had said it would, nine days earlier.

Hurricane Lee — Forecast Convergence

GraphCast (AI)
Called Nova Scotia: ~Sept 7
Lead time: 9 days
Computation: <1 min
Hardware: 1 TPU chip

Traditional (HRES)
Called Nova Scotia: ~Sept 10
Lead time: 6 days
Computation: hours
Hardware: supercomputer

The three-day lead matters. Hurricane preparation — evacuation orders, supply positioning, infrastructure reinforcement — runs on a calendar of days, not hours. Three additional days of confidence in a forecast track is not an incremental improvement. It is the difference between a plan that works and one that is still being debated when the storm arrives.

03 — The Machine · Data, Not Physics

GraphCast is a neural network trained to predict the weather. Not by simulating the physics of the atmosphere — the approach that has defined numerical weather prediction since the 1950s — but by learning patterns from four decades of what the weather has actually done.

The training data came from ECMWF's ERA5 reanalysis archive: 38 years of global weather observations (1979 to 2017) derived from satellite imagery, radar, weather stations, and atmospheric measurements, processed into a consistent dataset. From that, GraphCast learned a model of cause and effect — given the current state of the atmosphere, what comes next.

The architecture is a graph neural network operating on a multi-scale mesh of over one million grid points at 0.25-degree resolution, roughly 28 kilometers at the equator. It predicts five surface variables and six atmospheric variables at 37 vertical pressure levels. A 10-day global forecast takes under one minute on a single Google TPU v4 chip.
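The grid and rollout figures above can be sanity-checked with simple arithmetic. The sketch below is illustrative only, not DeepMind's code: it computes the size of a 0.25-degree latitude/longitude grid and the number of autoregressive steps in a 10-day forecast, assuming the 6-hour step interval described in the GraphCast paper.

```python
# Back-of-the-envelope check of GraphCast's grid, plus the step count of its
# autoregressive rollout (illustrative arithmetic, not DeepMind's API).

# A 0.25-degree latitude/longitude grid: 721 latitude rows (90S to 90N
# inclusive) by 1440 longitude columns.
lat_points = 180 * 4 + 1   # 721
lon_points = 360 * 4       # 1440
grid_points = lat_points * lon_points
print(grid_points)         # 1038240 -> "over one million grid points"

# GraphCast steps the atmosphere forward 6 hours at a time, feeding each
# prediction back in as the next input. A 10-day forecast is 40 such steps.
STEP_HOURS = 6
FORECAST_DAYS = 10
steps = FORECAST_DAYS * 24 // STEP_HOURS
print(steps)               # 40 autoregressive steps per 10-day forecast
```

Each of those 40 steps predicts all eleven variables at every grid point at once, which is why the whole forecast fits in under a minute on one chip.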

ECMWF's High Resolution Forecast system, the gold standard in operational meteorology, needs hours on a supercomputer to match that output.

Forecast time: <1 min vs. hours
Hardware: 1 chip vs. hundreds
Accuracy: 90%+ of targets won

The results, published in the journal Science on November 14, 2023, by lead author Remi Lam and 17 co-authors: GraphCast outperformed HRES on more than 90 percent of 1,380 verification targets (combinations of weather variables and forecast lead times). In the troposphere, the layer extending from the surface up to roughly 6 to 20 kilometers where most weather occurs, that number was 99.7 percent.

DeepMind open-sourced the code and trained model weights on GitHub the same day.

04 — The Limits · What It Cannot Do

GraphCast has specific, acknowledged limitations that explain why it supplements rather than replaces traditional forecasting — at least for now.

It produces deterministic forecasts only. A single prediction, not a probability distribution. Traditional ensemble systems run dozens of slightly varied simulations to map the range of possible outcomes. Meteorologists use those ranges to communicate uncertainty — the familiar "cone" in hurricane forecasts. GraphCast has no cone. It has a dot.
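The cone-versus-dot distinction can be made concrete with a generic illustration (not any operational system): perturb the initial state slightly, run the same toy "model" on each copy, and read uncertainty off the spread of the results. The chaotic logistic map below stands in for the atmosphere's sensitivity to initial conditions.

```python
# Why ensembles matter: a deterministic model gives one trajectory (a dot);
# an ensemble of perturbed runs gives a spread (a cone). Toy illustration.
import math
import random

random.seed(42)

def toy_model(x, steps=40):
    """Chaotic logistic map standing in for a forecast model:
    tiny differences in the input grow rapidly over the steps."""
    for _ in range(steps):
        x = 3.9 * x * (1 - x)
    return x

x0 = 0.400
deterministic = toy_model(x0)   # the single "dot" forecast

# Ensemble: 50 members, each starting from a slightly perturbed initial state.
members = [toy_model(x0 + random.gauss(0, 1e-4)) for _ in range(50)]
mean = sum(members) / len(members)
spread = math.sqrt(sum((m - mean) ** 2 for m in members) / len(members))

print(f"single forecast: {deterministic:.3f}")
print(f"ensemble mean:   {mean:.3f}  spread: {spread:.3f}")
```

Despite initial perturbations of only 0.0001, the members diverge widely, which is exactly the information the deterministic run cannot provide.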

Its resolution, 28 kilometers per grid cell, is too coarse for neighborhood-scale weather. Localized rainfall, individual thunderstorms, micro-terrain effects — these happen at scales smaller than GraphCast's grid. Remi Lam, the lead author, acknowledged this limitation.

"When a model gets something wrong, I want to be able to look at the details and figure out why." — Aaron Kennedy, meteorologist, University of North Dakota

The interpretability problem is fundamental. Physics-based models have traceable reasoning — if the forecast is wrong, you can identify which physical process was modeled incorrectly. GraphCast is a black box. It produces answers. It does not explain them.

And there is the climate change question. GraphCast learned weather from 1979 to 2017. Peter Dueben at ECMWF warned that climate change could erode that foundation — the atmosphere of 2030 may behave differently enough from the atmosphere of 1979–2017 that historical patterns lose predictive power.

Kim Wood, a hurricane researcher at the University of Arizona, noted that AI models excel at the events they see most often in training data but may struggle with events outside the training distribution — storms like Hurricane Otis, which in October 2023 intensified from a tropical storm to a Category 5 hurricane in 24 hours, defying historical norms.

These are real constraints. They are also, notably, constraints that the institutions running the world's best physics models considered manageable enough to adopt the technology anyway.

05 — The Adoption · The Institution Follows

ECMWF — the institution whose forecasts GraphCast was benchmarked against and outperformed — did not dispute the results. It began running GraphCast experimentally on its public charts website in November 2023, alongside its traditional forecasts.

Then it went further. ECMWF built its own AI forecasting system, called AIFS, using the same graph neural network approach that GraphCast had demonstrated. On February 25, 2025, AIFS went operational — running side by side with ECMWF's traditional Integrated Forecasting System as a full production weather forecast.

The performance gains were substantial. AIFS improved tropical cyclone track predictions by up to 20 percent over physics-based models. It achieved approximately a 1,000-fold reduction in the energy required to produce a forecast. On July 1, 2025, ECMWF made its ensemble AI forecasts operational as well — addressing the probabilistic gap that had been GraphCast's most significant limitation.

"This milestone will transform weather science and predictions." — Florence Rabier, Director-General, ECMWF · February 2025

The structural point is not that GraphCast was accurate. It is that the institution which built the most respected physics-based weather prediction system on Earth looked at a data-trained model from an AI lab and concluded that the approach was worth adopting. Not as a supplement. As a parallel operational system.

Weather forecasting — the field defined by physics simulation, the field where supercomputers running fluid dynamics equations were supposed to reign indefinitely — became the first major computational domain where physics-first modeling was not supplemented by machine learning, but matched by it. Trained on data alone.

What If?

GraphCast predicts weather at 28-kilometer resolution, ten days out. Suppose the next generation predicts at one kilometer, thirty days out. When that happens, the entity that controls the forecast controls the economy. A commodity trader with a 30-day precipitation map at field-level resolution can price crop futures before the farmer sees rain. An insurer with month-ahead storm tracks at neighborhood resolution can reprice policies before the homeowner checks the weather app. A military planner with a month of atmospheric visibility data at tactical resolution can schedule operations around windows that the other side cannot see.

Weather has always been public infrastructure: forecasts issued by government agencies, available to everyone simultaneously. But AI weather models are software. They can be private. They can be proprietary. They can be fast enough that the entity running them acts on the information before the public forecast is even issued.

AI can already predict the weather better. The only question that matters is who gets the prediction first, and what they do with the three days, or three weeks, before everyone else finds out.


