Data Storage Best Practices for Life Science Workflows

“Unchecked data growth and data sprawl are having a profound impact on life science workflows. As data volumes continue to grow, researchers and IT leaders face increasingly difficult decisions about how to manage this data yet keep the storage budget in check. Learn how these challenges can be overcome through active data management and leveraging cloud technology. The concepts will be applied to an example architecture that supports both genomic and bioimaging workflows.”

HPC Advisory Council China Conference Returns to Xi’an Oct. 26

The HPC Advisory Council has posted the agenda for its upcoming China Conference. The event takes place Oct. 26 in Xi’an, China. “We invite you to join us on Wednesday, October 26th, in Xi’an for our annual China Conference. This year’s agenda will focus on deep learning, artificial intelligence, HPC productivity, advanced topics and futures. Join fellow technologists, researchers, developers, computational scientists and industry affiliates to discuss recent developments and future advancements in High Performance Computing.”

Video: User Managed Virtual Clusters in Comet

Rick Wagner from SDSC presented this talk at the 4th Annual MVAPICH User Group. “At SDSC, we have created a novel framework and infrastructure by providing virtual HPC clusters to projects using the NSF sponsored Comet supercomputer. Managing virtual clusters on Comet is similar to managing a bare-metal cluster in terms of processes and tools that are employed. This is beneficial because such processes and tools are familiar to cluster administrators.”

Video: Mellanox Powers Open Science Grid on Comet Supercomputer

“We are pioneering the area of virtualized clusters, specifically with SR-IOV,” said Philip Papadopoulos, SDSC’s chief technical officer. “This will allow virtual sub-clusters to run applications over InfiniBand at near-native speeds – and that marks a huge step forward in HPC virtualization. In fact, a key part of this is virtualization for customized software stacks, which will lower the entry barrier for a wide range of researchers by letting them project an environment they already know onto Comet.”

New Ocean Current Simulations Reflect Climate Change

Researchers are using the Gordon supercomputer at SDSC to paint a new picture of global warming’s impact on the complex processes that drive ocean mixing in the vast eddies swirling off the California coast. “Nearly a fifth of the worldwide ocean productivity is in these zones, and no one has really looked with this level of detail at the climate change implications for these precious marine areas,” said Renault.

Live Report from LUG 2016 Day 3

In this special guest feature, Ken Strandberg offers this live report from Day 3 of the Lustre User Group meeting in Portland. “Rick Wagner from the San Diego Supercomputer Center presented progress on his team’s replication tool, which copies large blocks of storage from object storage to their durable disaster-recovery storage system. Because rsync is not a tool for moving massive amounts of data, SDSC created recursive worker services running in parallel, with each worker handling a directory or group of files. The tool uses available Lustre clients, a RabbitMQ server, Celery scripts, and bash scripts.”
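The core idea in that report is decomposing one huge copy job into many per-directory tasks that run in parallel, rather than walking the tree serially as rsync does. As a rough illustration only (SDSC's actual tool dispatches Celery tasks through RabbitMQ to Lustre clients; this sketch substitutes a local thread pool, and all paths and function names are hypothetical):

```python
# Minimal sketch of the one-worker-per-directory replication pattern.
# A thread pool stands in for the Celery/RabbitMQ worker fleet described
# in the report; the copy step here is a plain local copytree.
import shutil
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

def copy_directory(src: Path, dst_root: Path) -> str:
    """One worker task: replicate a single directory subtree."""
    shutil.copytree(src, dst_root / src.name, dirs_exist_ok=True)
    return src.name

def replicate(source_root: Path, dst_root: Path, workers: int = 8) -> list[str]:
    # Enqueue one task per top-level directory, mirroring the recursive
    # per-directory decomposition instead of a single serial traversal.
    dirs = [p for p in source_root.iterdir() if p.is_dir()]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda d: copy_directory(d, dst_root), dirs))
```

In the real system each task would land on whichever worker node picks it up from the queue, so throughput scales with the number of Lustre clients rather than with a single client's bandwidth.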

Comet Supercomputer at SDSC Helps Confirm Gravitational Wave Discovery

The NSF-funded Comet supercomputer at SDSC was one of several high-performance computers used by researchers to help confirm the discovery of gravitational waves before a formal announcement was made.

Second Intel Parallel Computing Center Opens at SDSC

Intel has opened a second parallel computing center at the San Diego Supercomputer Center (SDSC), at the University of California, San Diego. The focus of this new engagement is on earthquake research, including detailed computer simulations of major seismic activity that can be used to better inform and assist disaster recovery and relief efforts.

Gordon Supercomputer Aids Search for New Antibiotics

Researchers using the Gordon Supercomputer at SDSC have identified a class of possible antibiotics with the potential to disable previously drug-resistant bacteria. In essence, these new agents were found to attack the bacteria along two fronts: their external lipid cell wall and their internal machinery responsible for generating cellular energy in the form of adenosine triphosphate (ATP).

Video: Dell Panel Discussion on the NSCI initiative from SC15

In this video from SC15, Rich Brueckner from insideHPC moderates a panel discussion on the NSCI initiative. “As a coordinated research, development, and deployment strategy, NSCI will draw on the strengths of departments and agencies to move the Federal government into a position that sharpens, develops, and streamlines a wide range of new 21st century applications. It is designed to advance core technologies to solve difficult computational problems and foster increased use of the new capabilities in the public and private sectors.”