It’s Raining HPC – How Supercomputing is Shaping Weather Forecasts


In this special guest feature, Meike Chabowski from SUSE writes that modern weather forecasting powered by HPC is saving lives.

Meike Chabowski

Historically, your local weather forecaster seemed to be the only person who could be wrong most of the time and still keep a job. But with a new generation of computing power, today’s meteorologists are armed with a variety of resources to shape their daily projections.

So who do we blame when it rains on a supposed-to-be-sunny day? Increasingly, the answer is “nobody,” because HPC has greatly improved forecast accuracy. Consider that in 1940, the chance of an American being killed by lightning was about one in 400,000; today it is one in 11 million. Or that when the National Hurricane Center tried to predict where a hurricane would make landfall three days in advance in the late 1980s, it missed by an average of 350 miles; this hurricane season, its average miss is less than 100 miles. Supercomputing, together with our interpretation of its outputs, is a critical reason for this degree of accuracy.

Equally responsible for this improvement are the resources behind the technology. Beyond shaping our weekend camping trips and BBQs, weather forecasting is big business: it predicts growing seasons and wind patterns for energy production, and it helps mitigate the next major natural disaster. And it is just getting started. Growing investment in the forecasting space has opened the door for HPC in three critical ways:

Faster
The number of weather-related collection points continues to increase dramatically, as more data on the ground helps feed the processors that monitor the skies. Forecasting requires vast amounts of observational data gathered from land, sea and air via satellites and weather balloons, with thousands of weather stations across the globe linked and their data pooled. Collectively, these gauges produce more than a million weather-related observations every day.

In order to meet that data demand, greater processing power is needed from the supercomputers analyzing that information.

The Met Office in the UK, for example, currently uses an IBM supercomputer that can perform more than 1,000 trillion calculations per second. That power lets it compile hundreds of thousands of weather observations from around the world, which provide the starting point for running an atmospheric model containing more than a million lines of code. The supercomputer takes a large amount of energy to run and maintain (about 2.5 MW of electricity), but its power consumption is small compared with the socio-economic benefits it delivers, including a reduction of 20 million tons of carbon dioxide emissions per year.

Consider this level of processing power compared to the human brain. Supercomputing performance is measured in FLOPS, or “Floating Point Operations Per Second.” In 2008, the first supercomputer reached the “petaflop” scale (10^15, or a quadrillion, operations per second), and it is projected that by 2018-2020 the “exaflop” scale (10^18, or a quintillion, operations per second) will be reached. This means that within four to six years, a supercomputer will run roughly 100 times faster than a human brain operating at peak capacity, currently estimated at around 10^16 operations per second.
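
As a quick back-of-the-envelope check of that comparison (the brain figure is a rough, commonly cited estimate rather than a measured benchmark), the arithmetic works out as follows:

```python
# Rough compute scales, in operations per second.
PETAFLOP = 10**15        # petascale, first reached around 2008
EXAFLOP = 10**18         # exascale, projected in the article
BRAIN_ESTIMATE = 10**16  # commonly cited rough estimate for the human brain

print(f"Exascale vs. petascale: {EXAFLOP // PETAFLOP}x")             # 1000x
print(f"Exascale vs. brain estimate: {EXAFLOP // BRAIN_ESTIMATE}x")  # 100x
```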

When speed is combined with higher-resolution imagery, it can save lives. Consider a project several years ago between the National Oceanic and Atmospheric Administration (NOAA), the University of Oklahoma and the Pittsburgh Supercomputing Center, which runs SUSE Linux Enterprise Server on the supercomputer Blacklight, an SGI UV 1000 cc-NUMA shared-memory system composed of 256 blades, each holding two eight-core Intel Xeon X7560 (Nehalem) processors, for a total of 4,096 cores. The computer conducts “ensemble forecasting,” running 10 weather models simultaneously to lessen the impact of any single model’s errors on tornado and thunderstorm forecasts. Seven years ago, one set of detailed, granular forecasts took 700 processors running overnight; today, the fastest computers in the world are 1,000 to 1,200 times faster than they were just a few short years ago. Faster computations mean faster response times during a dynamic, fast-moving severe weather event such as a tornado or thunderstorm.
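
To illustrate the ensemble idea itself, here is a toy sketch (not the NOAA/PSC code; the model and its parameters are invented for illustration): several runs start from slightly perturbed initial conditions, and combining them both damps any single run’s error and hints at how uncertain the forecast is.

```python
import random

def toy_model_run(initial_temp, perturbation):
    """Hypothetical stand-in for one numerical weather model run:
    evolves a temperature estimate from perturbed initial conditions."""
    temp = initial_temp + perturbation
    for _ in range(24):                # 24 hourly time steps
        temp += random.gauss(0, 0.2)   # stand-in for model dynamics
    return temp

def ensemble_forecast(initial_temp, members=10):
    """Run several perturbed ensemble members and combine them: the mean
    lessens the impact of any one run's error, and the spread gives a
    rough measure of forecast uncertainty."""
    runs = [toy_model_run(initial_temp, random.gauss(0, 0.5))
            for _ in range(members)]
    mean = sum(runs) / len(runs)
    spread = max(runs) - min(runs)
    return mean, spread

mean, spread = ensemble_forecast(20.0)  # start from 20 degrees C
print(f"Ensemble mean: {mean:.1f} C, spread: {spread:.1f} C")
```

In production systems each member is a full atmospheric model run in parallel across thousands of cores, which is why the processing power described above matters.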

Customized
Supercomputing has also ushered in the era of the personalized, mobile forecast. In 2006, IBM announced a supercomputer that could produce forecasts at a resolution of one kilometer. Eight years later, Weather.com’s mobile app tells us the exact forecast for our town: it can be raining down the street but not on your doorstep. Supercomputers continue to push the boundaries of the customized forecast. Can you imagine having an accurate forecast not just for your city, but for your own house? It’s not as far off as you might think.

Efficient
Supercomputing may be big business, but while computing speeds are going up, prices are coming down. It is largely scalable, open infrastructures that allow research organizations to do more with less, making their investments in science go further. With many of these organizations under pressure to operate on limited resources, technology that delivers the power they demand at a manageable cost is invaluable.

Weather forecasting has been a driver of computing investment since the 1950s, but we’re at a unique confluence where large amounts of memory, high processing speeds and a rapid ability to scale have combined to take the science to new heights. It’s largely the underlying, open infrastructure of these systems that has made these advancements possible. So while supercomputing can’t change the weather, it is changing lives as it helps people prepare for the weather’s impact.

Meike Chabowski is a Product Marketing Manager for Enterprise Linux Servers at Novell. Her responsibilities include Linux for HPC, Mainframes and Retail. Meike holds an M.A. in Science of Mass Media and Theatre, as well as an M.A. in Education from University of Erlangen-Nuremberg/Germany, and in Italian Literature and Language from University of Parma/Italy.