Video: An Update on HPC at CSCS

Thomas Schulthess from CSCS gave this talk at the HPC User Forum. “CSCS has a strong track record in supporting the processing, analysis and storage of scientific data, and is investing heavily in new tools and computing systems to support data science applications. For more than a decade, CSCS has been involved in the analysis of the many petabytes of data produced by scientific instruments such as the Large Hadron Collider (LHC) at CERN. Supporting scientists in extracting knowledge from structured and unstructured data is a key priority for CSCS.”

Video: HPC and the Living Heart Project

In this video, HPE’s Jean-Luc Assor talks about the company’s involvement in the Living Heart Project – and explains how high performance computing is saving lives. The Stanford Living Heart Project is uniting industry-leading researchers, doctors, educators, and technology manufacturers to develop a new standard for drug testing with hybrid HPC.

Visualizing and Simulating Atomic Structures with CUDA

In this video, John Stone from the University of Illinois, Urbana-Champaign discusses the role of CUDA and GPUs in processing large datasets to visualize and simulate high-resolution atomic structures. CUDA does this by allowing researchers to describe hundreds of thousands to millions of independent, data-parallel work units and write software that executes on those work units, all while achieving peak hardware performance.
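The execution model Stone describes can be illustrated with a minimal sketch (a hypothetical example, not code from the talk): each CUDA thread processes one independent work unit – here, scaling one atom’s coordinates – and the grid is sized so that millions of such units run in parallel.

```cuda
// Minimal sketch, assuming a hypothetical per-atom operation: each thread
// handles one independent, data-parallel work unit (one atom's coordinates).
#include <cstdio>

__global__ void scale_atoms(float *x, float *y, float *z, float s, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // global work-unit index
    if (i < n) {                  // guard: the grid may overshoot the dataset
        x[i] *= s;
        y[i] *= s;
        z[i] *= s;
    }
}

int main() {
    const int n = 1 << 20;        // ~1 million independent work units
    float *x, *y, *z;
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    cudaMallocManaged(&z, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.f; y[i] = 2.f; z[i] = 3.f; }

    int threads = 256;
    int blocks = (n + threads - 1) / threads;  // enough blocks to cover all units
    scale_atoms<<<blocks, threads>>>(x, y, z, 0.5f, n);
    cudaDeviceSynchronize();      // wait for the GPU before reading results

    printf("x[0] = %.2f\n", x[0]);
    cudaFree(x); cudaFree(y); cudaFree(z);
    return 0;
}
```

The kernel body says nothing about how many threads exist; the researcher only describes the work for one unit, and the hardware schedules hundreds of thousands of them concurrently.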

Podcast: Advancing Scientific Research with HPC

In this podcast, Dana Chang from Atipa Technologies joins Conversations in the Cloud to discuss Atipa’s Polaris High Performance Computing and Visualization (HPCV) platform. “Today’s scientists need faster insights and must visualize increasingly large data sets. The Atipa Polaris HPCV Platform uses 2nd Generation Intel Xeon Scalable processors and the Intel Rendering Framework for CPU-based rendering. This enables balanced performance for simulation and modeling or simulation and visualization. The pre-validated hardware of Intel Select Solutions saves time that would have been spent assessing and procuring configurations.”

Supercomputing Structures of Intrinsically Disordered Proteins

Researchers using the Titan supercomputer at ORNL have created the most accurate 3D model yet of an intrinsically disordered protein, revealing the ensemble of its atomic-level structures. “The combination of neutron scattering experiments and simulation is very powerful,” Petridis said. “Validation of the simulations by comparison to neutron scattering experiments is essential to have confidence in the simulation results. The validated simulations can then provide detailed information that is not directly obtained by experiments.”

Supercomputing Earth’s Geomagnetism with Blue Waters

Researchers are using the Blue Waters supercomputer at NCSA to better understand geomagnetic variations and their underlying mechanisms, so that better forecasting models can be developed. “Without Blue Waters (or any comparable computing facilities), we would have to scale back our ensemble size in order to complete all simulations within a reasonable time frame. This would certainly limit our ability to achieve meaningful research and application goals.”

ENERXICO to Empower Mexican Energy Sector with HPC

Recently launched in Barcelona, ENERXICO is a new project jointly funded by the European Union and the government of Mexico to solve real-world engineering problems in the energy sector. “Mexico, through the ENERXICO project and the European collaboration, aims to develop exascale-ready application codes, some of which Pemex plans to use as part of their production environment. This is a major effort that could boost HPC applications in Mexico in the energy sector.”

A New World of Simulation for Oil & Gas

In this special guest feature from Scientific Computing World, Gemma Church writes that the oil and gas market aims to use more simulation to ensure sound decision making as the world makes better use of renewable energy. “Oil and gas companies are increasingly relying on digital technology to improve their operational efficiency, reduce manpower through automation, and develop autonomous systems for drilling and offshore platforms, with increasing use of remotely operated systems and equipment for inspection.”

A Simulation Booster for Nanoelectronics

Researchers from ETH Zurich have developed a method that can simulate nanoelectronics devices and their properties realistically, quickly and efficiently. This offers a ray of hope for the industry and data centre operators alike, both of which are struggling with the (over)heating that comes with increasingly small and powerful transistors – and with the high resulting electricity costs for cooling.

Deep Learning at Scale for the Construction of Galaxy Catalogs

A team of scientists is now applying the power of artificial intelligence (AI) and high-performance supercomputers to accelerate efforts to analyze the increasingly massive datasets produced by ongoing and future cosmological surveys. “Deep learning research has rapidly become a booming enterprise across multiple disciplines. Our findings show that the convergence of deep learning and HPC can address big-data challenges of large-scale electromagnetic surveys.”