
Supercomputing Complex Materials with QMCPACK

An effort within the Exascale Computing Project is developing quantum Monte Carlo (QMC) software called QMCPACK to find, predict, and control materials and their properties at the quantum level. The ultimate aim is to achieve unprecedented, systematically improvable accuracy by leveraging the memory and compute capabilities of the forthcoming exascale computing systems.
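To give a flavor of what "systematically improvable accuracy" means in the QMC family, here is a minimal variational Monte Carlo sketch for a toy 1D harmonic oscillator. This is a hypothetical illustration, not QMCPACK's actual algorithm or API: the trial wavefunction psi(x) = exp(-alpha*x^2), the local-energy formula, and all parameter names are assumptions for the example (units with hbar = m = omega = 1).

```python
import math
import random

def vmc_energy(alpha, n_samples=20000, step=1.0, seed=1):
    """Metropolis sampling of |psi(x)|^2 for psi(x) = exp(-alpha*x^2),
    averaging the local energy E_L(x) = alpha + x^2 * (0.5 - 2*alpha^2)."""
    rng = random.Random(seed)
    x, total = 0.0, 0.0
    for _ in range(n_samples):
        trial = x + rng.uniform(-step, step)
        # Accept the move with probability |psi(trial)|^2 / |psi(x)|^2
        if rng.random() < math.exp(-2.0 * alpha * (trial * trial - x * x)):
            x = trial
        total += alpha + x * x * (0.5 - 2.0 * alpha * alpha)
    return total / n_samples
```

At the optimal alpha = 0.5 the local energy is constant and the estimate recovers the exact ground-state energy 0.5; away from it, more samples systematically shrink the statistical error bar, which is the sense in which QMC accuracy is improvable with more compute.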

Job of the Week: Research Computing Administrator at University of Oxford

Would you like to manage and develop a state-of-the-art HPC facility to help enable cutting-edge biomedical research? We have an exciting opportunity for a Research Computing Administrator to join us at the Kennedy Institute of Rheumatology (KIR). The KIR is a world-class research centre that is situated between the Wellcome Centre for Human Genetics and the Big Data Institute at the University of Oxford.

Podcast: How Exascale Computing Could Help Boost Energy Production

In this podcast, Tom Evans, technical lead for ECP’s Energy Applications projects, discusses the motivations, progress, and aspirations on the path to exascale. “Evans describes the unprecedented calculations expected at the exascale, the example of taking wind energy simulations much further, and the movement toward the use of more-general-purpose programming tools.”

Simulating Hurricanes with the Blue Waters Supercomputer

In this video, researchers describe how they are using the Blue Waters supercomputer at NCSA to figure out the effects of a changing climate on the frequency and intensity of hurricanes. “We couple the atmosphere model to the ocean because tropical cyclones get their energy from the warm, near-surface waters. We wanted to see what is the effect of climate-induced changes in the ocean on the storms themselves when we go to a fully coupled, high-resolution climate model configuration.”

Video: A Full Week of Fueling Innovation at ISC 2019

In this video, Michael Feldman of The Next Platform and Florina Ciorba of University of Basel look back on ISC 2019. “ISC 2019 brought together 3,573 HPC practitioners and enthusiasts interested in high performance computing, storage, networking, and AI. The theme of this year’s conference was Fueling Innovation.”

Podcast: HPC Market Eyes $44B in 5 Years

In this podcast, the Radio Free HPC team looks at new projections from Hyperion Research that have the HPC+AI market growing to $44B in five years. “The industry is hitting on all cylinders, benefiting from the Exascale race, AI coming to the enterprise, and its customary slow but always steady growth. The big news continues to be AI fundamentally bringing HPC closer to the mainstream of enterprise computing whether it is on-prem, in a co-location facility, or in a public cloud.”

New Paper Surveys Micron’s Automata Processor

“Micron’s automata processor (AP) exploits massively parallel in-memory processing capability of DRAM for executing NFAs and hence, it can provide orders of magnitude performance improvement compared to traditional architectures. This paper presents a survey of techniques that propose architectural optimizations to AP and use it for accelerating problems from various application domains such as bioinformatics, data-mining, network security, natural language, high-energy physics, etc.”
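The core computation the Automata Processor accelerates, simulating a nondeterministic finite automaton (NFA), can be sketched in a few lines of software for comparison. This is a hypothetical toy, not the AP's programming model: the transition-table format and the example pattern ("contains the substring ab") are assumptions; the AP evaluates all active states in parallel in DRAM, whereas this loop does so serially.

```python
def run_nfa(transitions, start, accept, text):
    """Simulate an NFA by tracking the set of currently active states.
    transitions maps (state, symbol) -> set of next states."""
    active = {start}
    for ch in text:
        nxt = set()
        for state in active:
            nxt.update(transitions.get((state, ch), set()))
        # Re-seed the start state so the pattern can match at any offset
        active = nxt | {start}
        if active & accept:
            return True
    return bool(active & accept)

# Hypothetical NFA accepting any string containing "ab":
# state 0 --a--> state 1 --b--> state 2 (accepting)
nfa = {(0, "a"): {1}, (1, "b"): {2}}
```

The inner loop over `active` is exactly the per-symbol, all-states-at-once step that the AP performs in one memory cycle, which is where its orders-of-magnitude advantage over a serial CPU implementation comes from.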

NSF Funds $10 Million for ‘Expanse’ Supercomputer at SDSC

SDSC has been awarded a five-year grant from the NSF valued at $10 million to deploy Expanse, a new supercomputer designed to advance research that is increasingly dependent upon heterogeneous and distributed resources. “As a standalone system, Expanse represents a substantial increase in the performance and throughput compared to our highly successful, NSF-funded Comet supercomputer. But with innovations in cloud integration and composable systems, as well as continued support for science gateways and distributed computing via the Open Science Grid, Expanse will allow researchers to push the boundaries of computing and answer questions previously not possible.”

Intel Labs Unveils Pohoiki Beach 64-Chip Neuromorphic System

At the DARPA ERI summit this week, Intel Labs director Rich Uhlig unveiled “Pohoiki Beach” – a 64-Loihi-chip neuromorphic system capable of simulating eight million neurons. Now available to the broader research community, Pohoiki Beach enables researchers to experiment with Intel’s brain-inspired research chip, Loihi, which applies principles found in biological brains to computer architectures.

Evolve Project in the EU to Tackle Big Data Processing

The European Commission is planning to tackle the challenges of big data processing with the Evolve project, part of the Horizon 2020 Research and Innovation program. Evolve aims to take concrete steps toward bringing big data, high-performance computing, and cloud computing technologies together in a testbed that will increase researchers’ ability to extract value from massive and demanding datasets. This could change the way big data applications are built by enabling researchers to process large amounts of data much faster.