Supercomputing Graphene Applications in Nanoscale Electronics

Researchers at North Carolina State University are using the Blue Waters Supercomputer to explore graphene’s applications, including its use in nanoscale electronics and electrical DNA sequencing. “We’re looking at what’s beyond Moore’s law, whether one can devise very small transistors based on only one atomic layer, using new methods of making materials,” said Professor Jerry Bernholc of North Carolina State University. “We are looking at potential transistor structures consisting of a single layer of graphene, etched into lines of nanoribbons, where the carbon atoms are arranged like a chicken wire pattern. We are looking at which structures will function well, at a few atoms of width.”
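
The “chicken wire” pattern Bernholc describes is graphene’s honeycomb lattice; a nanoribbon is simply a strip of that lattice only a few atoms across. As a rough illustration of the geometry (a minimal sketch using the textbook bond length, not the group’s first-principles simulation code), the following Python snippet generates carbon positions for a small honeycomb strip:

    # Illustrative only: carbon positions on a honeycomb (graphene)
    # lattice. The bond length is the standard textbook value, not a
    # parameter taken from the study.
    import math

    BOND = 1.42  # carbon-carbon bond length, angstroms

    def honeycomb_strip(n_cells, n_rows):
        """Atom (x, y) positions for a strip n_cells long, n_rows wide."""
        atoms = []
        for i in range(n_cells):       # along the ribbon
            for j in range(n_rows):    # across the ribbon width
                x = math.sqrt(3) * BOND * (i + j / 2)
                y = 1.5 * BOND * j
                atoms.append((x, y))          # A sublattice site
                atoms.append((x, y + BOND))   # B sublattice site
        return atoms

    ribbon = honeycomb_strip(n_cells=6, n_rows=3)  # a few atoms wide
    print(len(ribbon), "atoms")

Each interior atom sits exactly one bond length from its three nearest neighbors, which is the defining feature of the honeycomb arrangement; the ribbon width is set by n_rows.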

HACC: Fitting the Universe inside a Supercomputer

Nicholas Frontiere from the University of Chicago gave this talk at the DOE CSGF Program Review meeting. “In response to the plethora of data from current and future large-scale structure surveys of the universe, sophisticated simulations are required to obtain commensurate theoretical predictions. We have developed the Hardware/Hybrid Accelerated Cosmology Code (HACC), capable of sustained performance on powerful and architecturally diverse supercomputers to address this numerical challenge. We will investigate the numerical methods utilized to solve a problem that evolves trillions of particles, with a dynamic range of a million to one.”
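
HACC couples a long-range particle-mesh gravity solver with short-range direct or tree methods to reach trillions of particles. The basic task it performs, advancing particles under their mutual gravity, can be shown with a toy direct-summation leapfrog integrator (a minimal O(N^2) sketch, nothing like HACC’s production algorithms):

    # Toy gravitational N-body step: direct summation plus leapfrog.
    # Illustrative only; direct summation scales as O(N^2), which is
    # why codes like HACC use particle-mesh and tree methods instead.
    import numpy as np

    G = 1.0           # gravitational constant in code units (assumed)
    SOFTENING = 1e-2  # softening length to avoid singular forces

    def accelerations(pos, mass):
        # pairwise separations d[i, j] = pos[j] - pos[i]
        d = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]
        r2 = (d ** 2).sum(axis=-1) + SOFTENING ** 2
        inv_r3 = r2 ** -1.5
        np.fill_diagonal(inv_r3, 0.0)   # no self-force
        w = mass[np.newaxis, :, np.newaxis] * inv_r3[:, :, np.newaxis]
        return G * (d * w).sum(axis=1)  # acceleration on each particle

    def leapfrog_step(pos, vel, mass, dt):
        # kick-drift-kick: second-order accurate and symplectic
        vel = vel + 0.5 * dt * accelerations(pos, mass)
        pos = pos + dt * vel
        vel = vel + 0.5 * dt * accelerations(pos, mass)
        return pos, vel

    rng = np.random.default_rng(0)
    pos = rng.random((256, 3))   # toy problem: 256 particles, not trillions
    vel = np.zeros((256, 3))
    mass = np.ones(256)
    for _ in range(10):
        pos, vel = leapfrog_step(pos, vel, mass, dt=1e-3)

The leapfrog scheme is the standard choice in cosmological N-body work because it conserves phase-space structure well over the very long integrations these simulations require.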

TACC Podcast Looks at AI and Water Management

In this TACC podcast, Suzanne Pierce from the Texas Advanced Computing Center describes her upcoming panel discussion on AI and water management and the work TACC is doing to support efforts to bridge advanced computing with Earth science. “It’s about letting the AI help us be better decision makers. And it helps us move towards answering, discussing, and exploring the questions that are most important and most critical for our quality of life and our communities so that we can develop a future together that’s brighter.”

Call for Papers: International Workshop on In Situ Visualization

The 3rd International Workshop on In Situ Visualization has issued its Call for Papers. Held in conjunction with ISC 2018, WOIV 2018 takes place June 28 in Frankfurt, Germany. “Our goal is to appeal to a wide-ranging audience of visualization scientists, computational scientists, and simulation developers, who have to collaborate in order to develop, deploy, and maintain in situ visualization approaches on HPC infrastructures. We hope to provide practical take-away techniques and insights that serve as inspiration for attendees to implement or refine in their own HPC environments and to avoid pitfalls.”

PASC18 Keynote to Focus on Extreme-Scale Multi-Physics Earthquake Simulations

Today the PASC18 conference announced that Alice-Agnes Gabriel from Ludwig Maximilian University of Munich will deliver a keynote address on earthquake simulation. “This talk will focus on using physics-based scenarios, modern numerical methods and hardware specific optimizations to shed light on the dynamics, and severity, of earthquake behavior. It will present the largest-scale dynamic earthquake rupture simulation to date, which models the 2004 Sumatra-Andaman event – an unexpected subduction zone earthquake which generated a rupture of over 1,500 km in length within the ocean floor followed by a series of devastating tsunamis.”

Supercomputing the Origin of Mass

In this video, Professor Derek Leinweber from the University of Adelaide presents his research in Lattice Quantum Field Theory, revealing the origin of mass in the universe. “While the fundamental interactions are well understood, elucidating the complex phenomena emerging from this quantum field theory is fascinating and often surprising. My explorations of QCD-vacuum structure featured in Professor Wilczek’s 2004 Physics Nobel Prize Lecture. Our approach to discovering the properties of this key component of the Standard Model of the Universe favors fundamental first-principles numerical simulations of QCD on supercomputers. This field of study is commonly referred to as Lattice QCD.”
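
Lattice QCD replaces continuous spacetime with a discrete grid and represents the gluon field as “link” variables connecting neighboring sites; the simplest gauge-invariant observable is the plaquette, the product of links around an elementary square. As a heavily simplified sketch (a compact U(1) theory on a 2D lattice rather than SU(3) in 4D, and a random field rather than an importance-sampled configuration), the average plaquette can be computed like this:

    # Toy lattice gauge theory sketch: average plaquette of a random
    # compact U(1) field on a 2D periodic lattice. Real lattice QCD
    # uses SU(3) matrices in four dimensions and Monte Carlo-sampled
    # field configurations.
    import numpy as np

    L = 8  # lattice extent
    rng = np.random.default_rng(1)
    # link angles theta[mu, x, y] for directions mu = 0 (x) and 1 (y)
    theta = rng.uniform(0.0, 2.0 * np.pi, size=(2, L, L))

    def average_plaquette(theta):
        t_x, t_y = theta
        # plaquette angle at site n:
        #   theta_x(n) + theta_y(n + x) - theta_x(n + y) - theta_y(n)
        plaq = (t_x
                + np.roll(t_y, -1, axis=0)
                - np.roll(t_x, -1, axis=1)
                - t_y)
        return np.cos(plaq).mean()   # for U(1), Re U_plaq = cos(angle)

    print(average_plaquette(theta))  # near zero for a random "hot" start

The same plaquette construction, built from 3x3 complex matrices on a 4D lattice, is the basic ingredient of the gauge action used in production Lattice QCD simulations.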

Reconstructing Nuclear Physics Experiments with Supercomputers

For the first time, scientists have used HPC to reconstruct the data collected by a nuclear physics experiment—an advance that could dramatically reduce the time it takes to make detailed data available for scientific discoveries. “By running multiple computing jobs simultaneously on the allotted supercomputing cores, the team transformed 4.73 petabytes of raw data into 2.45 petabytes of ‘physics-ready’ data in a fraction of the time it would have taken using in-house high-throughput computing resources, even with a two-way transcontinental data journey.”

Hayward Fault Earthquake Simulations Increase Fidelity of Ground Motions

Researchers at LLNL are using supercomputers to simulate the onset of earthquakes in California. “This study shows that powerful supercomputing can be used to calculate earthquake shaking on a large, regional scale with more realism than we’ve ever been able to produce before,” said Artie Rodgers, LLNL seismologist and lead author of the paper.

HPC4Manufacturing Program Seeks Industry Proposals

The Department of Energy is seeking industry proposals for public/private projects aimed at applying high performance computing to industry challenges for the advancement of energy innovation. “We are seeing some significant successes with orders of magnitude reduction in simulation times and higher fidelity simulations that more closely match the reality of the manufacturing process. With this solicitation we plan to continue to expand the reach of our program to new companies to help solve new and different problems.”

PASC18 Keynote to Focus on Kilometre-Scale Earth System Simulations

Today the PASC18 conference announced that Nils P. Wedi from ECMWF will be one of its keynote speakers. “This talk will illustrate the need for and practicality of producing ensembles of km-scale simulations, summarize progress on accelerating state-of-the-art global weather and climate predictions, and discuss outstanding issues and future directions on producing and analysing big weather data while balancing time-critical customer needs with energy- and time-to-solution.”