Supercomputers and Super Surprises

On Wednesday, January 19, 2000, the US National Weather Service installed a new supercomputer. This impressive machine, one of the world's two fastest weather computers, is five times faster than the machine it replaced at crunching the gigabytes of continually updated atmospheric information fed to it from instruments mounted on satellites, balloons, ships, ocean buoys and land stations. Weather Service forecasters were confident that using that mountain of data in the most sophisticated computer weather models in existence would provide a 10 percent improvement in their ability to predict temperatures, humidity and rainfall. NWS director John W. Kelly, Jr., stated that the new machine "put us closer to becoming America's no surprise weather service."

Just five days later, America's weather service was very surprised. A mild storm was brewing over northern Florida and Georgia, and the forecasters were, in the words of National Centers for Environmental Prediction director Louis Uccellini, "watching it like a hawk." Three different computer models predicted that the storm would move eastward into the Atlantic Ocean, giving Washington perhaps a 40 percent chance of light snow and less than one inch of total accumulation. Meanwhile, oblivious to the models, the storm suddenly doubled in size and made a sharp left turn. By Tuesday it had become the worst winter snowstorm in four years, hitting every major city on the eastern seaboard with what was later described as "a classic Northeaster" that dumped enough white stuff to close airports in Washington, Baltimore, Philadelphia, New York and Boston.

In fact, by Monday evening the forecasters had noticed a growing discrepancy between the real weather pattern they could see and the model-generated forecast. As soon as they fed the supercomputer new data about what the storm was actually doing, it performed a superfast about-face, and NWS issued a winter storm warning at 10 p.m. But even that warning understated the severity of the storm, which dropped twice as much snow as the computer expected based on historical data relating snowfall to water content.

The forecasters were stunned. They had been confident in their early predictions because their weather model had successfully predicted the blizzard of 1996, but in 2000 the new, improved software running on the new, improved computer fell flat. The fastest computer we have, programmed with the latest equations meteorology can provide and supplied with an abundance of data from one of the most closely monitored environments on Earth, cannot reliably forecast how the weather will behave just one day ahead.

Clearly, our understanding of weather formation has a long way to go before we can account for all the tricks Mother Nature has up her sleeve. That conclusion is even more relevant when we consider attempting to forecast how weather patterns may change worldwide on a time scale of decades or centuries. Warnings that we face a general global warming that will produce catastrophic climate changes are inspired by computer models that are far coarser than those used by local or regional forecasters. Global circulation models typically use a single point to represent a chunk of the earth's atmosphere 150 miles square and almost half a mile high, a coarseness the sketch below makes concrete. That means they cannot describe, even crudely, weather systems much smaller than several hundred miles across. Phenomena like the small, rapidly intensifying storm that grew into January's massive snowstorm and disrupted the eastern United States do not even register. We must wonder about the perspective of alarmists who want to force billions of people to change their lifestyles today because they don't like what those models say may happen 50 years from now.
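A rough back-of-the-envelope calculation shows just how coarse that grid is. The only figure below taken from this article is the 150-mile grid spacing; the Earth radius of roughly 3,959 miles and the rule of thumb that a model needs about four grid cells to resolve a feature at all are outside assumptions, so treat this as an illustrative sketch rather than a description of any particular model.

```python
# Back-of-the-envelope look at what a global circulation model with
# ~150-mile horizontal grid spacing can and cannot resolve.
# Assumptions (not from the article): Earth radius ~3,959 miles, and
# the common rule of thumb that a feature must span roughly four grid
# cells before a model resolves it even crudely.

import math

EARTH_RADIUS_MI = 3959.0
GRID_SPACING_MI = 150.0      # horizontal cell size cited in the text
CELLS_TO_RESOLVE = 4         # rough rule of thumb: ~4 cells per feature

# Total horizontal grid points needed to tile the Earth's surface
surface_area_sq_mi = 4 * math.pi * EARTH_RADIUS_MI ** 2
cell_area_sq_mi = GRID_SPACING_MI ** 2
n_cells = surface_area_sq_mi / cell_area_sq_mi

# Smallest feature the grid can represent, per the rule of thumb
min_resolvable_mi = CELLS_TO_RESOLVE * GRID_SPACING_MI

print(f"Horizontal grid cells covering the globe: ~{n_cells:,.0f}")
print(f"Smallest crudely resolvable feature: ~{min_resolvable_mi:.0f} miles")
```

On those assumptions, the entire global atmosphere is reduced to fewer than nine thousand horizontal points per vertical layer, and anything much smaller than about 600 miles across is effectively invisible to the model.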
