HPE to Build $35M+ NCAR Supercomputer for Extreme Weather Research

Hewlett Packard Enterprise (HPE) this morning said it has won a $35+ million contract to build a supercomputer for the National Center for Atmospheric Research (NCAR), a federal geoscience R&D center for meteorology, climate change and solar activity. HPE said the CPU/GPU-powered system, funded by the National Science Foundation, is expected to deliver 3.5x the […]

Deep Learning for Predicting Severe Weather

Researchers from Rice University have introduced a data-driven framework that formulates extreme weather prediction as a pattern recognition problem, employing state-of-the-art deep learning techniques. “In this paper, we show that with deep learning you can do analog forecasting with very complicated weather data — there’s a lot of promise in this approach.”

Interactive Supercomputing with Jupyter and Dask

Anderson Banihirwe from NCAR gave this talk at SciPy 2019. “This talk demonstrates how to use Dask and Jupyter on large high-performance computing (HPC) systems to scale and accelerate large interactive data analysis tasks — effectively turning HPC systems into interactive big-data platforms. We will introduce dask-jobqueue which allows users to seamlessly deploy and scale dask on HPC clusters that use a variety of job queuing systems such as PBS, Slurm, SGE, and LSF. We will also introduce dask-mpi, a Python package that makes deploying Dask easy from within a distributed MPI environment.”
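The dask-jobqueue workflow described in the talk can be sketched in a few lines. This is an illustrative example, not code from the talk itself: the queue name, core and memory counts, and walltime are assumed placeholder values that a real user would replace with values appropriate to their HPC system, and the snippet requires a PBS scheduler to actually submit jobs.

```python
# Sketch of interactive scaling with dask-jobqueue on a PBS system.
# Cluster parameters below are illustrative assumptions, not NCAR settings.
from dask_jobqueue import PBSCluster
from dask.distributed import Client

# Each "job" here is one PBS batch job that hosts Dask workers.
cluster = PBSCluster(
    queue="regular",      # assumed PBS queue name
    cores=36,             # cores requested per batch job
    memory="100GB",       # memory requested per batch job
    walltime="01:00:00",
)
cluster.scale(jobs=4)     # submit 4 PBS jobs to act as Dask workers
client = Client(cluster)  # connect the interactive session to the workers

# Analysis then proceeds interactively, e.g. from a Jupyter notebook:
import dask.array as da
x = da.random.random((100_000, 100_000), chunks=(5_000, 5_000))
result = x.mean().compute()  # work is distributed across the PBS jobs
```

Equivalent cluster classes exist for the other schedulers mentioned in the talk (SLURMCluster, SGECluster, LSFCluster), so the same interactive session can move between HPC systems by swapping the cluster object.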

Epic HPC Road Trip Continues to NCAR

In this special guest feature, Dan Olds from OrionX continues his Epic HPC Road Trip series with a stop at NCAR in Boulder. “Their ability to increase model precision/resolution and to increase throughput at the same time is becoming more difficult over time due to core speed slowing down as more cores are added. In other words, new chips aren’t providing the same increase in performance as we’ve become accustomed to over the years.”

NOAA and NCAR team up for Weather and Climate Modeling

The United States is making exciting changes to how computer models will be developed in the future to support the nation’s weather and climate forecast system. NOAA and the National Center for Atmospheric Research (NCAR) have joined forces to help the nation’s weather and climate modeling scientists achieve mutual benefits through more strategic collaboration, shared resources and information.

Advances in the Fields of Atmospheric Science, Climate, and Weather

Susan Bates from NCAR gave this talk at the Blue Waters Summit. “For the past five years, the Blue Waters Project has provided an invaluable platform for research in the fields of atmospheric science, climate, and weather. The computationally intensive numerical models running on Blue Waters push the limits of model resolution and/or capability in first-of-their-kind simulations.”

Building a GPU-enabled and Performance-portable Global Cloud-resolving Atmospheric Model

Richard Loft from NCAR gave this talk at the NVIDIA booth at SC17. “The objectives of NCAR’s exploration of accelerator architectures for high performance computing in recent years have been to 1) speed up the rate of code optimization and porting and 2) understand how to achieve performance portability on codes in the most economical and affordable way.”

Job of the Week: Software Engineer at NCAR

NCAR in Boulder is seeking a Software Engineer in our Job of the Week. “This position focuses primarily on the development of tools to meet the needs of the NCAR/IT community, and the design, writing, implementation, and support of systems monitoring tools necessary for the management of the computer infrastructure. Support will also be provided to the research community for the development of web-based analysis tools and general web programming.”

Cheyenne Supercomputer Triples Scientific Capability at NCAR

The National Center for Atmospheric Research (NCAR) this month is launching operations of one of the world’s most powerful and energy-efficient supercomputers, providing the nation with a major new tool to advance understanding of the atmospheric and related Earth system sciences. Named “Cheyenne,” the 5.34-petaflop system is capable of more than triple the amount of scientific computing performed by the previous NCAR supercomputer, Yellowstone. It is also three times more energy efficient.

Video: Introduction to the Cheyenne Supercomputer

Cheyenne is a new 5.34-petaflops, high-performance computer built for NCAR by SGI. Cheyenne will be a critical tool for researchers across the country studying climate change, severe weather, geomagnetic storms, seismic activity, air quality, wildfires, and other important geoscience topics. In this video, Brian Vanderwende from UCAR describes typical workflows in the NCAR/CISL Cheyenne HPC environment as well as performance […]