Sylabs boosts HPC Containers with SingularityPRO 3.1

Today Sylabs announced the release of SingularityPRO 3.1 in what the company is calling a watershed moment for enterprise customers everywhere. “SingularityPRO 3.1 is the most highly anticipated release of our enterprise software ever,” said Gregory Kurtzer, founder and CEO of Sylabs. “With this release, we’re rapidly advancing container science, making it a truly opportune time for those seeking to containerize the most demanding enterprise performance computing workloads in the most trusted way.”

Video: Managing large-scale cosmology simulations with Parsl and Singularity

Rick Wagner from Globus gave this talk at the Singularity User Group meeting. “We package the imSim software inside a Singularity container so that it can be developed independently, packaged to include all dependencies, trivially scaled across thousands of computing nodes, and seamlessly moved between computing systems. To date, the simulation workflow has consumed more than 30M core hours using 4K nodes (256K cores) on Argonne’s Theta supercomputer and 2K nodes (128K cores) on NERSC’s Cori supercomputer.”
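For readers curious what that pattern looks like in practice, here is a minimal Parsl sketch (our own illustration, not the team's workflow code; the imsim.sif image name and the command-line flags are hypothetical, and it assumes Singularity is installed on the worker nodes). It fans independent containerized tasks out across workers:

import parsl
from parsl import bash_app
from parsl.config import Config
from parsl.executors import HighThroughputExecutor

# A small local test configuration; the production runs instead targeted
# scheduler-managed nodes on Theta and Cori.
parsl.load(Config(executors=[HighThroughputExecutor(max_workers=4)]))

@bash_app
def simulate(visit, stdout=None, stderr=None):
    # Each task runs the simulator inside the container, so every worker
    # sees the same dependencies. Image name and flags are hypothetical.
    return f"singularity exec imsim.sif imsim --visit {visit}"

# Fan out many independent tasks, then block until all of them finish.
futures = [simulate(v, stdout=f"imsim-{v}.out", stderr=f"imsim-{v}.err")
           for v in range(10)]
for f in futures:
    f.result()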

Podcast: Supercomputing Synthetic Biomolecules

Researchers are using HPC to design potentially life-saving proteins. In this TACC podcast, host Jorge Salazar discusses this groundbreaking work with the science team. “The scientists say their methods could be applied to useful technologies such as pharmaceutical targeting, artificial energy harvesting, ‘smart’ sensing and building materials, and more.”

Supercomputing the Complexities of Brain Waves

Scientists are using the Comet supercomputer at the San Diego Supercomputer Center (SDSC) to better understand the complexities of brain waves. With a goal of better understanding human brain development, the Healthy Brain Network (HBN) project is currently collecting brain scans, EEG recordings, and other behavioral data from 10,000 New York City children and young adults – the largest such sample ever collected. “We hope to use portals such as the EEGLAB to process this data so that we can learn more about biological markers of mental health and learning disorders in our youngest patients,” said HBN Director Michael Milham.

SDSC and Sylabs Gather for Singularity User Group

The San Diego Supercomputer Center (SDSC) at UC San Diego, and Sylabs.io recently hosted the first-ever Singularity User Group meeting, attracting users and developers from around the nation and beyond who wanted to learn more about the latest developments in an open source project known as Singularity. Now in use on SDSC’s Comet supercomputer, Singularity has quickly become an essential tool in improving the productivity of researchers by simplifying the development and portability challenges of working with complex scientific software.
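As a toy illustration of that portability claim (a sketch of our own that drives the Singularity command line from Python; the lolcow example image is the one Sylabs publishes in its public container library):

import subprocess

# Pull a prebuilt image once; library:// URIs resolve against Sylabs'
# public container library.
subprocess.run(["singularity", "pull", "lolcow.sif",
                "library://sylabs/examples/lolcow"], check=True)

# The resulting .sif file is a single portable artifact: the same command
# runs it unchanged on a laptop, a campus cluster, or a Comet compute node.
subprocess.run(["singularity", "exec", "lolcow.sif",
                "cowsay", "portable science"], check=True)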

Supercomputing Sea Fog Development to Prevent Maritime Disasters

Over at the XSEDE blog, Kim Bruch from SDSC writes that an international team of researchers is using supercomputers to shed new light on how and why a particular type of sea fog forms. Through simulation, they hope to provide more accurate fog predictions that help reduce the number of maritime mishaps. “The researchers have been using the Comet supercomputer based at the San Diego Supercomputer Center (SDSC) at UC San Diego. To date, the team has used about 2 million core hours.”

Supercomputer Simulations help fight Dengue Virus

Researchers are using supercomputers to combat the Dengue virus and related diseases spread by mosquitoes. “Advanced software and the rapid calculation speeds of both Comet and Bridges made the current simulations possible. In particular, the graphics processing units (GPUs) on Comet enabled the team to simulate the motion more efficiently than possible if they had used only the central processing units (CPUs) present on most supercomputers.”
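To make the GPU-versus-CPU point concrete, here is a minimal molecular dynamics sketch using OpenMM (our illustration under stated assumptions, not the team's actual code or software stack; input.pdb stands in for a hypothetical solvated structure). It prefers a CUDA-capable GPU and falls back to CPUs when none is present:

from openmm import LangevinMiddleIntegrator, Platform
from openmm.app import PDBFile, ForceField, Simulation, PME, HBonds
from openmm.unit import kelvin, picosecond, picoseconds, nanometer

pdb = PDBFile("input.pdb")  # hypothetical solvated structure
forcefield = ForceField("amber14-all.xml", "amber14/tip3p.xml")
system = forcefield.createSystem(pdb.topology, nonbondedMethod=PME,
                                 nonbondedCutoff=1*nanometer,
                                 constraints=HBonds)
integrator = LangevinMiddleIntegrator(300*kelvin, 1/picosecond,
                                      0.004*picoseconds)

# Run on a GPU when available, as on Comet's GPU nodes; otherwise use CPUs.
try:
    platform = Platform.getPlatformByName("CUDA")
except Exception:
    platform = Platform.getPlatformByName("CPU")

sim = Simulation(pdb.topology, system, integrator, platform)
sim.context.setPositions(pdb.positions)
sim.minimizeEnergy()
sim.step(1000)  # a short stretch of dynamics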

The Use of HPC to Model the California Wildfires

Ilkay Altintas from the San Diego Supercomputer Center gave this talk at the HPC User Forum. “WIFIRE is an integrated system for wildfire analysis, with specific regard to changing urban dynamics and climate. The system integrates networked observations such as heterogeneous satellite data and real-time remote sensor data, with computational techniques in signal processing, visualization, modeling, and data assimilation to provide a scalable method to monitor such phenomena as weather patterns that can help predict a wildfire’s rate of spread.”

BioBurst: Leveraging Burst Buffer Technology for Campus Research Computing

In this video from the DDN User Group at SC17, Ron Hawkins from the San Diego Supercomputer Center presents: BioBurst — Leveraging Burst Buffer Technology for Campus Research Computing. Under an NSF award, SDSC will implement a separately scheduled partition of TSCC with technology designed to address key areas of bioinformatics computing including genomics, transcriptomics, […]

Comet Supercomputer Assists in Latest LIGO Discovery

This week’s landmark discovery of gravitational and light waves generated by the collision of two neutron stars eons ago was made possible in part by signal verification and analysis performed on Comet, an advanced supercomputer at the San Diego Supercomputer Center (SDSC). “LIGO researchers have so far consumed more than 2 million hours of computational time on Comet through OSG – including about 630,000 hours each to help verify LIGO’s findings in 2015 and the current neutron star collision – using Comet’s Virtual Clusters for rapid, user-friendly analysis of extreme volumes of data, according to Würthwein.”