Epic HPC Road Trip Continues to NCAR


In this special guest feature, Dan Olds from OrionX continues his Epic HPC Road Trip series with a stop at NCAR in Boulder.

Dan Olds from OrionX hits the road for SC18 in Dallas.

What affects us every single day of our lives, yet is something we can do nothing to control? No, the answer isn’t spam, it’s the weather. And the organization that knows more about weather and climate than just about anyone is NCAR – the National Center for Atmospheric Research.

Housed in a lofty research facility high above Boulder, Colorado, in a building designed by famed architect I.M. Pei, NCAR conducts research into meteorology, climate science, atmospheric chemistry, and the societal impacts of climate-related issues.

On my visit, I had the great fortune to interview Richard Loft, the Director of NCAR’s Computational & Information Systems Laboratory.

Highlights:

  • In our interview, we start the discussion by talking about NCAR’s mission and how they now have responsibility for modeling Earth systems – a very complex job – and one that can only be done with supercomputing technology.
  • We also discussed the NCAR-Wyoming Supercomputing Center, where I stopped for a few photos on the way to Boulder.
  • We also talked about how the lab is investigating the use of GPUs for climate modeling on its GPU-accelerated systems.
  • The bulk of our conversation was centered on my question “Where do you see HPC going?” This launched a discussion of how their simulations work and what they need from future system technology. Increasing model precision/resolution while also increasing throughput is becoming harder, because per-core speeds have stalled even as core counts climb. In other words, new chips aren’t providing the performance gains we’ve become accustomed to over the years.
    • One possible solution to sidestep the lag in technology advances? Machine learning and AI. Using machine learning, you can fit the model to the data rather than fitting the data to a model. This can radically reduce the number and complexity of simulations – essentially informing simulations before they’re run and cutting out much of the processing a traditional simulation would require (see the sketch after this list).
    • This combination of AI and traditional simulation has huge implications for nearly every research and enterprise HPC organization. Richard does a great job of explaining how this would work and the impact on their computational workflow at NCAR. We also discuss the potential downsides of neural networks and AI when it comes to modeling. Check out the video for the whole conversation.
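To make the idea concrete, here is a minimal sketch of the surrogate-modeling pattern Richard describes – not NCAR’s actual code or workflow. It assumes Python with NumPy and scikit-learn, and the function and parameter names (expensive_simulation, surrogate, etc.) are hypothetical: a small neural network is fitted to a handful of expensive simulation runs, then used as a cheap stand-in to screen many candidate settings so only the most promising ones are handed to the full simulation.

```python
# Hypothetical sketch of an ML surrogate informing expensive simulations.
# Assumes numpy and scikit-learn; names and toy physics are illustrative only.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)

def expensive_simulation(params):
    """Stand-in for a costly model run: maps parameters to one diagnostic."""
    x, y = params
    return np.sin(3 * x) * np.cos(2 * y) + 0.1 * x * y

# 1. Run the real simulation on a small design of experiments.
train_params = rng.uniform(-1.0, 1.0, size=(200, 2))
train_out = np.array([expensive_simulation(p) for p in train_params])

# 2. Fit the surrogate to the data (the model is fitted to the data,
#    rather than forcing the data into a fixed model).
surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=5000,
                         random_state=0)
surrogate.fit(train_params, train_out)

# 3. Use the cheap surrogate to screen thousands of candidate settings.
candidates = rng.uniform(-1.0, 1.0, size=(10_000, 2))
predicted = surrogate.predict(candidates)
shortlist = candidates[np.argsort(predicted)[-5:]]

# 4. Spend full-simulation time only on the shortlisted candidates.
for p in shortlist:
    print(p, expensive_simulation(p))
```

In this toy setup the surrogate replaces roughly 10,000 full runs with 200 training runs plus 5 follow-ups; the same pattern, scaled up, is what makes the AI-plus-simulation combination so attractive for HPC workflows.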

A big thank you goes out to Richard Loft and everyone at NCAR who made this interview possible.

Many thanks go out to Cray for sponsoring this journey. We’re now 1,238 miles into our road trip. Next stop is NREL, just a few miles away in Golden, Colorado.

Dan Olds is an Industry Analyst at OrionX.net. An authority on technology trends and customer sentiment, Dan Olds is a frequently quoted expert in industry and business publications such as The Wall Street Journal, Bloomberg News, Computerworld, eWeek, CIO, and PCWorld. In addition to server, storage, and network technologies, Dan closely follows the Big Data, Cloud, and HPC markets. He writes the HPC Blog on The Register, co-hosts the popular Radio Free HPC podcast, and is the go-to person for the coverage and analysis of the supercomputing industry’s Student Cluster Challenge.
