Janice Coen from NCAR gave this Invited Talk at SC16. “The past two decades have seen the infusion of technology that has transformed the understanding, observation, and prediction of wildland fires and their behavior, as well as provided a much greater appreciation of their frequency, occurrence, and attribution in a global context. This talk will highlight current research in integrated weather–wildland fire computational modeling, fire detection and observation, and their application to understanding and prediction.”
Pamela Hill from NCAR/UCAR presented this talk at the DDN User Group at SC16. “With the game-changing SFA14K, NCAR now has the storage capacity and sustained compute performance to perform sophisticated modeling while substantially reducing workflow bottlenecks. As a result, the organization will be able to quickly process mixed I/O workloads while sharing up to 40 PB of vital research data with a growing scientific community around the world.”
In this video, Dave Hart, CISL User Services Manager, presents: Cheyenne – NCAR’s Next-Generation Data-Centric Supercomputing Environment. “Cheyenne is a new 5.34-petaflops, high-performance computer built for NCAR by SGI. The hardware was delivered on Monday, September 12, at the NCAR-Wyoming Supercomputing Center (NWSC) and the system is on schedule to become operational at the beginning of 2017. All of the compute racks were powered up and nodes booted up within a few days of delivery.”
“PSyclone was developed for the UK Met Office and is now a part of the build system for Dynamo, the dynamical core currently in development for the Met Office’s ‘next generation’ weather and climate model software. By generating the complex code needed to make use of thousands of processors, PSyclone leaves the Met Office scientists free to concentrate on the science aspects of the model. This means that they will not have to change their code from something that works on a single processing unit (or core) to something that runs on many thousands of cores.”
NOAA and its partners have developed a new forecasting tool to simulate how water moves throughout the nation’s rivers and streams, paving the way for the biggest improvement in flood forecasting the country has ever seen. Launched today and run on NOAA’s powerful new Cray XC40 supercomputer, the National Water Model uses data from more than 8,000 U.S. Geological Survey gauges to simulate conditions for 2.7 million locations in the contiguous United States. The model generates hourly forecasts for the entire river network. Previously, NOAA was only able to forecast streamflow for 4,000 locations every few hours.
“A new supercomputer, dubbed Cheyenne, is expected to be operational at the beginning of 2017. The new high-performance computer will be a 5.34-petaflop system, meaning it can carry out 5.34 quadrillion calculations per second. It will be capable of more than 2.5 times the amount of scientific computing performed by Yellowstone.”
Today DDN announced that the National Center for Atmospheric Research (NCAR) has selected DDN’s new SFA14K high-performance hyper-converged storage platform to power its Cheyenne supercomputer, delivering the performance and capacity needed for scientific breakthroughs in climate, weather, and atmospheric science. “Having a centralized, large-scale storage resource delivers a real benefit to our scientists,” said Anke Kamrath, director of the operations and services division at NCAR’s computing lab. “With the new system, we have a balance between capacity and performance so that researchers will be able to start looking at the model output immediately without having to move data around. Now, they’ll be able to get right down to the work of analyzing the results and figuring out what the models reveal.”
In this video, Patrick Nichols from the Computational and Informational Systems Laboratory presents: Introduction to the Yellowstone Supercomputer. Yellowstone is NCAR’s 1.5-petaflops high-performance IBM iDataPlex cluster, which features 72,576 Intel Sandy Bridge processor cores and 144.6 TB of memory.
“Since Hurricane Katrina made landfall in 2005, storm prediction technology has seen dramatic forward movement, from improved software to better use of observations and increased computing power – all aimed at giving emergency decision makers more time and specifics to help protect lives and property. The expert panelists in this Congressional Briefing outline research advances that have led to better forecasting of hurricane and tropical storm weather and impacts. And they spotlight research directions that hold promise for future improvements.”
Over at Live Science, Shannon Hall writes that a new global map of the world’s oceans is so visually stunning that it could be mistaken for art. Computed on LANL supercomputers, the simulation is a component of the DOE’s Accelerated Climate Model for Energy (ACME), which is expected to be the most complete climate and Earth system model once it is finished.