SDSC Awarded NSF Grant for Triton Stratus

The National Science Foundation has awarded SDSC a two-year grant worth almost $400,000 to deploy a new system called Triton Stratus. “Triton Stratus will provide researchers with improved facilities for utilizing emerging computing paradigms and tools, namely interactive and portal-based computing, and scaling them to commercial cloud computing resources. Researchers, especially data scientists, are increasingly using tools such as Jupyter notebooks and RStudio to implement computational and data analysis functions and workflows.”
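As a minimal sketch of the notebook-style workflow the award targets (the file measurements.csv and its site/value columns are hypothetical placeholders, not part of the project), a few lines of Python with pandas show the kind of interactive, cell-by-cell analysis Jupyter and RStudio users iterate on before scaling out:

```python
# Minimal notebook-style analysis sketch; the data file and columns are
# hypothetical stand-ins for any tabular research dataset.
import pandas as pd

# Load the dataset interactively, as one would in a Jupyter cell.
df = pd.read_csv("measurements.csv")

# Summarize, then group -- quick exploratory steps that researchers refine
# iteratively before scaling the workflow to cloud resources.
print(df.describe())
print(df.groupby("site")["value"].mean())
```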

NSF Funds $10 Million for ‘Expanse’ Supercomputer at SDSC

SDSC has been awarded a five-year grant from the NSF valued at $10 million to deploy Expanse, a new supercomputer designed to advance research that is increasingly dependent upon heterogeneous and distributed resources. “As a standalone system, Expanse represents a substantial increase in the performance and throughput compared to our highly successful, NSF-funded Comet supercomputer. But with innovations in cloud integration and composable systems, as well as continued support for science gateways and distributed computing via the Open Science Grid, Expanse will allow researchers to push the boundaries of computing and answer questions previously not possible.”

Michael Zentner to Lead SDSC Sustainable Scientific Software Group

Today the San Diego Supercomputer Center (SDSC) announced the appointment of Michael Zentner as director of Sustainable Scientific Software, effective immediately. “Having worked with Michael as a co-PI of SGCI [the Science Gateways Community Institute] since 2016, I’m confident that he has the skills needed to move the institute forward as PI,” said Nancy Wilkins-Diehr. “I’m thrilled that Michael has accepted this position with SDSC, and am excited about the expanded role he’ll play as the Center sharpens its focus on software sustainability.”

Video: Supercomputing Dynamic Earthquake Ruptures

Researchers are using XSEDE supercomputers to model multi-fault earthquakes in the Brawley fault zone, which links the San Andreas and Imperial faults in Southern California. Their work could help predict the behavior of earthquakes that threaten the lives and property of millions of people. “Basically, we generate a virtual world where we create different types of earthquakes. That helps us understand how earthquakes in the real world are happening.”
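As a toy illustration only (the Brawley-zone studies run full 3-D dynamic-rupture codes on XSEDE systems; nothing below is the team's actual method), a 1-D finite-difference wave sketch in Python hints at how such virtual earthquakes propagate across a simulated grid:

```python
# Toy 1-D wave-propagation sketch in arbitrary units; illustrative only.
import numpy as np

nx, nt = 400, 800               # spatial grid points, time steps
dx, dt, c = 1.0, 0.004, 200.0   # grid spacing, time step, wave speed
                                # (c*dt/dx = 0.8 keeps the scheme stable)

u_prev = np.zeros(nx)           # displacement at the previous step
u = np.zeros(nx)
u[nx // 2] = 1.0                # point "source" standing in for a rupture

for _ in range(nt):
    lap = np.zeros(nx)
    lap[1:-1] = u[2:] - 2 * u[1:-1] + u[:-2]          # discrete Laplacian
    u_next = 2 * u - u_prev + (c * dt / dx) ** 2 * lap
    u_prev, u = u, u_next

print("peak displacement after run:", u.max())
```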

Call for Presentations: MVAPICH User Group in August

The 7th annual MVAPICH User Group (MUG) meeting has issued its Call for Presentations. MUG will take place August 19-21, 2019, in Columbus, Ohio. “MUG aims to bring together MVAPICH2 users, researchers, developers, and system administrators to share their experience and knowledge and learn from each other. The event includes keynote talks, invited tutorials, invited talks, contributed presentations, open MIC session, hands-on sessions with MVAPICH developers, etc.”
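For readers new to the project: MVAPICH2 implements the standard MPI interface, so a minimal mpi4py program runs unchanged under it. A sketch (the filename hello_mpi.py is arbitrary):

```python
# Minimal MPI sketch via mpi4py; launch under MVAPICH2 (or any MPI) with e.g.
#   mpirun -np 4 python hello_mpi.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

# Each rank reports its host; rank 0 gathers and prints the summary.
names = comm.gather(MPI.Get_processor_name(), root=0)
if rank == 0:
    print(f"{size} ranks running on nodes: {sorted(set(names))}")
```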

Sylabs boosts HPC Containers with SingularityPRO 3.1

Today Sylabs announced the release of SingularityPRO 3.1 in what the company is calling a watershed moment for enterprise customers everywhere. “SingularityPRO 3.1 is the most highly anticipated release of our enterprise software ever,” said Gregory Kurtzer, founder and CEO of Sylabs. “With this release, we’re rapidly advancing container science, making it a truly opportune time for those seeking to containerize the most demanding enterprise performance computing workloads in the most trusted way.”

Video: Managing large-scale cosmology simulations with Parsl and Singularity

Rick Wagner from Globus gave this talk at the Singularity User Group. “We package the imSim software inside a Singularity container so that it can be developed independently, packaged to include all dependencies, trivially scaled across thousands of computing nodes, and seamlessly moved between computing systems. To date, the simulation workflow has consumed more than 30M core hours using 4K nodes (256K cores) on Argonne’s Theta supercomputer and 2K nodes (128K cores) on NERSC’s Cori supercomputer.”
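A minimal sketch of that Parsl-plus-Singularity pattern, with a hypothetical image name (imsim.sif) and command (run_sim) standing in for the real workflow; on an HPC system, the thread-pool executor below would be swapped for one of Parsl's cluster executors:

```python
# Sketch of dispatching containerized tasks with Parsl; image and command
# names are hypothetical, not the project's actual ones.
import parsl
from parsl import bash_app
from parsl.config import Config
from parsl.executors import ThreadPoolExecutor

parsl.load(Config(executors=[ThreadPoolExecutor(max_threads=4)]))

@bash_app
def simulate(sensor_id, stdout="sim.out", stderr="sim.err"):
    # Each task runs inside the container, so all dependencies travel
    # with the image rather than depending on the host environment.
    return f"singularity exec imsim.sif run_sim --sensor {sensor_id}"

futures = [simulate(i, stdout=f"sim_{i}.out", stderr=f"sim_{i}.err")
           for i in range(8)]
for f in futures:
    f.result()  # block until each containerized task completes
```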

Podcast: Supercomputing Synthetic Biomolecules

Researchers are using HPC to design potentially life-saving proteins. In this TACC podcast, host Jorge Salazar discusses this groundbreaking work with the science team. “The scientists say their methods could be applied to useful technologies such as pharmaceutical targeting, artificial energy harvesting, ‘smart’ sensing and building materials, and more.”

Supercomputing the Complexities of Brain Waves

Scientists are using the Comet supercomputer at SDSC to untangle the complexities of brain waves. With a goal of better understanding human brain development, the Healthy Brain Network (HBN) project is currently collecting brain scans and EEG recordings, as well as other behavioral data, from 10,000 New York City children and young adults – the largest such sample ever collected. “We hope to use portals such as the EEGLAB to process this data so that we can learn more about biological markers of mental health and learning disorders in our youngest patients,” said HBN Director Michael Milham.
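As a self-contained sketch of the kind of spectral analysis that toolboxes like EEGLAB automate at scale (the signal here is synthetic, not HBN data), the Python below estimates alpha-band power from an EEG-like trace using Welch's method:

```python
# Estimate alpha-band (8-12 Hz) power from a synthetic EEG-like signal.
import numpy as np
from scipy.signal import welch

fs = 256                       # sampling rate in Hz, typical for EEG
t = np.arange(0, 10, 1 / fs)   # ten seconds of signal
# Synthetic "recording": a 10 Hz alpha rhythm buried in noise.
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)

# Welch's method averages periodograms over windows for a stable spectrum.
freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)
alpha = (freqs >= 8) & (freqs <= 12)
print("alpha-band power:", np.trapz(psd[alpha], freqs[alpha]))
```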

SDSC and Sylabs Gather for Singularity User Group

The San Diego Supercomputer Center (SDSC) at UC San Diego and Sylabs.io recently hosted the first-ever Singularity User Group meeting, attracting users and developers from around the nation and beyond who wanted to learn more about the latest developments in the open source Singularity project. Now in use on SDSC’s Comet supercomputer, Singularity has quickly become an essential tool for improving researcher productivity, easing the development and portability challenges of working with complex scientific software.