Video: Quantum Computing and Quantum Supremacy at Google

John Martinis from Google presents: Quantum Computing and Quantum Supremacy. “The goal of the Google Quantum AI lab is to build a quantum computer that can be used to solve real-world problems. Our strategy is to explore near-term applications using systems that are forward compatible with a large-scale universal error-corrected quantum computer. In order for a quantum processor to be able to run algorithms beyond the scope of classical simulations, it requires not only a large number of qubits but also low error rates on readout and logical operations such as single- and two-qubit gates.”
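
The scale argument here is easy to make concrete: a classical state-vector simulation must store 2^n complex amplitudes, so memory doubles with every added qubit. A quick back-of-the-envelope sketch (ours, not Google's):

```python
# Memory needed to hold the full state vector of an n-qubit system:
# 2**n complex amplitudes at 16 bytes each (complex128).
def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (30, 40, 50, 72):  # 72 qubits = Google's Bristlecone processor
    print(f"{n:3d} qubits -> {statevector_bytes(n) / 2**30:,.0f} GiB")
```

At roughly 50 qubits the state vector no longer fits in the memory of any existing supercomputer, which is what makes supremacy experiments plausible at that scale.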

Video: Moving Towards Personalized Medicine at The UberCloud

“In the last six years, UberCloud has performed 200 cloud experiments with engineers and scientists and their complex applications. In a series of challenging high performance computing applications in the Life Sciences, UberCloud’s HPC containers have recently been packaged with several scientific workflows and application data to simulate complex phenomena in the human heart and brain. As the core software for these HPC cloud experiments, we are using the (containerized) Abaqus solver running in a fully automated multi-node HPE environment in the Advania HPC Cloud.”

Video: Big Data Assimilation Revolutionizing Weather Prediction

In this video from the HPC User Forum in Tucson, Takemasa Miyoshi from RIKEN presents: Big Data Assimilation Revolutionizing Weather Prediction. “A new project harnessing data from a Japanese satellite called Himawari-8 could improve weather forecasting and allow officials to issue life-saving warnings before natural disasters. The breakthrough is the result of pairing data collected by Japan’s Himawari-8 weather satellite with a program run on a supercomputer at the RIKEN science institute.”
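
The “assimilation” step Miyoshi’s group is known for belongs to the ensemble Kalman filter family (their production system uses a local ensemble transform Kalman filter at vastly higher dimension and update frequency). Here is a toy stochastic EnKF analysis step, with illustrative sizes and values:

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(X, y, H, R):
    """One stochastic ensemble Kalman filter analysis step.
    X: (n_state, n_ens) forecast ensemble;  y: (n_obs,) observations
    H: (n_obs, n_state) observation operator;  R: (n_obs, n_obs) obs-error covariance
    """
    n_ens = X.shape[1]
    Xp = X - X.mean(axis=1, keepdims=True)           # ensemble perturbations
    Pf = Xp @ Xp.T / (n_ens - 1)                     # sample forecast covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)   # Kalman gain
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n_ens).T
    return X + K @ (Y - H @ X)                       # analysis ensemble

# Toy example: 3-variable state, 20 ensemble members, one variable observed
X = rng.normal(0.0, 1.0, size=(3, 20))
H = np.array([[1.0, 0.0, 0.0]])
R = np.array([[0.1]])
Xa = enkf_update(X, np.array([0.5]), H, R)
print(Xa.mean(axis=1))   # ensemble mean pulled toward the observation y = 0.5
```

The real system repeats an update like this every 30 seconds across a grid with millions of variables, which is why it needs a supercomputer.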

How Exascale will Move Earthquake Simulation Forward

In this video from the HPC User Forum in Tucson, David McCallen from LBNL describes how exascale computing capabilities will enhance earthquake simulation for improved structural safety. “With the major advances occurring in high performance computing, the ability to accurately simulate the complex processes associated with major earthquakes is becoming a reality. High performance simulations offer a transformational approach to earthquake hazard and risk assessments that can dramatically increase our understanding of earthquake processes and provide improved estimates of the ground motions that can be expected in future earthquakes.”
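
At the heart of these simulations are stencil computations over enormous 3D grids. The toy 1D wave-equation kernel below (ours, purely illustrative) shows the basic pattern that 3D elastic-wave production codes scale up to exascale with realistic geology and fault ruptures:

```python
import numpy as np

# Toy 1D wave-equation kernel (u_tt = c^2 u_xx) using a leapfrog
# finite-difference scheme with fixed (zero) boundaries.
nx, nt = 400, 800
c, dx = 3000.0, 10.0          # wave speed (m/s), grid spacing (m)
dt = 0.5 * dx / c             # satisfies the CFL stability condition
u_prev = np.zeros(nx)
u = np.zeros(nx)
u[nx // 2] = 1.0              # impulsive "source" at the domain center

for _ in range(nt):
    lap = np.zeros(nx)
    lap[1:-1] = u[:-2] - 2 * u[1:-1] + u[2:]            # second spatial derivative
    u_next = 2 * u - u_prev + (c * dt / dx) ** 2 * lap  # leapfrog time step
    u_prev, u = u, u_next

print("peak amplitude after", nt, "steps:", abs(u).max())
```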

The Use of HPC to Model the California Wildfires

Ilkay Altintas from the San Diego Supercomputer Center gave this talk at the HPC User Forum. “WIFIRE is an integrated system for wildfire analysis, with specific regard to changing urban dynamics and climate. The system integrates networked observations such as heterogeneous satellite data and real-time remote sensor data, with computational techniques in signal processing, visualization, modeling, and data assimilation to provide a scalable method to monitor such phenomena as weather patterns that can help predict a wildfire’s rate of spread.”
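
WIFIRE feeds its real-time observations into established fire-behavior models; the sketch below is not WIFIRE’s model, just a toy cellular automaton showing how a wind-biased spread probability translates into a rate of spread:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy cellular-automaton fire spread: each step, fire jumps to unburned
# neighbors with a probability biased by wind direction (easterly here).
N = 50
UNBURNED, BURNING, BURNED = 0, 1, 2
grid = np.zeros((N, N), dtype=int)
grid[N // 2, N // 2] = BURNING
p_spread = {(0, 1): 0.6, (0, -1): 0.1, (1, 0): 0.3, (-1, 0): 0.3}

for step in range(40):
    burning = np.argwhere(grid == BURNING)
    for r, c in burning:
        for (dr, dc), p in p_spread.items():
            rr, cc = r + dr, c + dc
            if 0 <= rr < N and 0 <= cc < N and grid[rr, cc] == UNBURNED:
                if rng.random() < p:
                    grid[rr, cc] = BURNING
        grid[r, c] = BURNED

print("cells burned:", int((grid == BURNED).sum()))
```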

Quantum Computing at NIST

Carl Williams from NIST gave this talk at the HPC User Forum in Tucson. “Quantum information science research at NIST explores ways to employ phenomena exclusive to the quantum world to measure, encode and process information for useful purposes, from powerful data encryption to computers that could solve problems intractable with classical computers.”

Using the Titan Supercomputer to Develop 50,000 Years of Flood Risk Scenarios

Dag Lohmann from KatRisk gave this talk at the HPC User Forum in Tucson. “In 2012, a small Berkeley, California, startup called KatRisk set out to improve the quality of worldwide flood risk maps. The team wanted to create large-scale, high-resolution maps to help insurance companies evaluate flood risk on the scale of city blocks and buildings, something that had never been done. Through the OLCF’s industrial partnership program, KatRisk received 5 million processor hours on Titan.”
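
The end product of a stochastic catalog like this is an exceedance-probability curve: simulate many synthetic years of events, then read off the losses at the return periods insurers care about. A toy illustration with made-up numbers (not KatRisk’s model):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy stochastic catalog: annual maximum flood losses drawn from a
# heavy-tailed distribution, converted into return-period losses --
# the basic product a catastrophe model derives from simulated years.
years = 50_000
annual_loss = rng.pareto(2.5, size=years) * 10.0   # illustrative losses ($M)

for rp in (10, 100, 250, 1000):                    # return periods in years
    loss = np.quantile(annual_loss, 1 - 1 / rp)
    print(f"1-in-{rp:>4} year loss: ${loss:,.1f}M")
```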

Video: HPC Use for Earthquake Research

Christine Goulet from the Southern California Earthquake Center gave this talk at the HPC User Forum in Tucson. “SCEC coordinates fundamental research on earthquake processes using Southern California as its principal natural laboratory. The SCEC community advances earthquake system science by synthesizing knowledge of earthquake phenomena through physics-based modeling, including system-level hazard modeling, and by communicating our understanding of seismic hazards to reduce earthquake risk and promote community resilience.”

Video: Addressing Key Science Challenges with Adversarial Neural Networks

Wahid Bhimji from NERSC gave this talk at the 2018 HPC User Forum in Tucson. “Machine Learning and Deep Learning are increasingly used to analyze scientific data, in fields as diverse as neuroscience, climate science and particle physics. On this page you will find links to examples of scientific use cases using deep learning at NERSC, information about what deep learning packages are available at NERSC, and details of how to scale up your deep learning code on Cori to take advantage of the compute power available from Cori’s KNL nodes.”
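
For readers wondering what “adversarial” means in practice: a generative adversarial network trains two models against each other, a generator that fabricates samples and a discriminator that tries to tell them from real data. A minimal PyTorch sketch on toy 1D data (illustrative, not NERSC’s code):

```python
import torch
import torch.nn as nn

# Minimal GAN: the generator learns to mimic samples from N(3, 1)
# while the discriminator tries to tell real samples from fake ones.
G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    real = torch.randn(64, 1) + 3.0        # stand-in "science data": N(3, 1)
    fake = G(torch.randn(64, 8))

    # Discriminator step: label real as 1, fake as 0
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: fool the discriminator into labeling fakes as real
    g_loss = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

print("generated mean/std:", fake.mean().item(), fake.std().item())
```

In scientific settings the same adversarial idea is used, for example, to generate fast surrogate detector simulations or realistic climate fields.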

Agenda Posted for April HPC User Forum in Tucson

The HPC User Forum has posted the speaker agenda for its upcoming meeting in Tucson. Hosted by Hyperion Research, the event takes place April 16-18 at Loews Ventana Canyon. “The April meeting will explore the status and prospects for quantum computing and the use of HPC for environmental research, especially natural disasters such as earthquakes and the recent California wildfires. As always, the meeting will also look at new developments in HPDA-AI, cloud computing and other areas of continuing interest to the HPC community. A special session will look at the growing field of processors and accelerators supporting HPC systems.”