Flatiron Institute to Repurpose Gordon Supercomputer

The majority of SDSC’s data-intensive Gordon supercomputer will be used by the Simons Foundation for ongoing research following completion of the system’s tenure as an NSF resource on March 31. “We are delighted that the Simons Foundation has given Gordon a new lease on life after five years of service as a highly sought-after XSEDE resource,” said SDSC Director Michael Norman, who also served as the principal investigator for Gordon. “We welcome the Foundation as a new partner and consider this to be a solid testimony to Gordon’s data-intensive capabilities and its myriad contributions to advancing scientific discovery.”

SDSC Seismic Simulation Software Exceeds 10 Petaflops on Cori Supercomputer

Researchers at SDSC, working in collaboration with Intel Corporation, have developed a new seismic software package that has enabled the fastest seismic simulation to date. SDSC’s ground-breaking run sustained 10.4 petaflops on earthquake simulations using 612,000 Intel Xeon Phi processor cores of the new Cori Phase II supercomputer at NERSC.
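As a rough back-of-envelope check on those figures, the sustained rate works out to roughly 17 gigaflops per Xeon Phi core; the sketch below simply derives that from the two numbers reported above (the per-core value is arithmetic, not a reported result):

```python
# Back-of-envelope: sustained performance per core for the Cori Phase II run.
# Both inputs come from the announcement above; the per-core figure is derived, not reported.
sustained_pflops = 10.4          # sustained petaflops for the seismic simulation
cores = 612_000                  # Intel Xeon Phi cores used on Cori Phase II

flops_per_core = (sustained_pflops * 1e15) / cores
print(f"~{flops_per_core / 1e9:.1f} GFLOPS per core sustained")
# -> ~17.0 GFLOPS per core sustained
```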

Over 10,000 Users and Counting for Comet Supercomputer at SDSC

Today the San Diego Supercomputer Center (SDSC) announced that the Comet supercomputer has easily surpassed its target of serving at least 10,000 researchers across a diverse range of science disciplines, from astrophysics to redrawing the tree of life. “In fact, about 15,000 users have used Comet to run science gateway jobs alone since the system went into production less than two years ago.”

Data Storage Best Practices for Life Science Workflows

“Unchecked data growth and data sprawl are having a profound impact on life science workflows. As data volumes continue to grow, researchers and IT leaders face increasingly difficult decisions about how to manage this data yet keep the storage budget in check. Learn how these challenges can be overcome through active data management and leveraging cloud technology. The concepts will be applied to an example architecture that supports both genomic and bioimaging workflows.”
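As one illustration of what “active data management” can look like in practice, here is a minimal sketch, assuming a POSIX project directory and an S3-compatible archive bucket (the paths, bucket name, and age threshold are hypothetical and are not taken from the webinar), that migrates files untouched for a set period to object storage:

```python
# Minimal sketch of tiering cold data to object storage (hypothetical paths and bucket).
# Assumes boto3 is installed and credentials are configured in the environment.
import os
import time
import boto3

PROJECT_DIR = "/projects/genomics"   # hypothetical project directory
BUCKET = "lab-archive-bucket"        # hypothetical S3 bucket
MAX_AGE_DAYS = 180                   # hypothetical "cold data" threshold

s3 = boto3.client("s3")
cutoff = time.time() - MAX_AGE_DAYS * 86400

for root, _, files in os.walk(PROJECT_DIR):
    for name in files:
        path = os.path.join(root, name)
        if os.path.getmtime(path) < cutoff:            # untouched longer than the threshold
            key = os.path.relpath(path, PROJECT_DIR)   # preserve directory layout in the bucket
            s3.upload_file(path, BUCKET, key)
            print(f"archived {path} -> s3://{BUCKET}/{key}")
```

In a production workflow the upload would typically be followed by verification and either deletion or replacement of the local copy with a stub, but the core idea is the same: policy-driven movement of data between storage tiers.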

HPC Advisory Council China Conference Returns to Xi’an Oct. 26

The HPC Advisory Council has posted the agenda for its upcoming China Conference. The event takes place Oct. 26 in Xi’an, China. “We invite you to join us on Wednesday, October 26th, in Xi’an for our annual China Conference. This year’s agenda will focus on deep learning, artificial intelligence, HPC productivity, advanced topics, and futures. Join fellow technologists, researchers, developers, computational scientists and industry affiliates to discuss recent developments and future advancements in High Performance Computing.”

Video: User Managed Virtual Clusters in Comet

Rick Wagner from SDSC presented this talk at the 4th Annual MVAPICH User Group meeting. “At SDSC, we have created a novel framework and infrastructure by providing virtual HPC clusters to projects using the NSF-sponsored Comet supercomputer. Managing virtual clusters on Comet is similar to managing a bare-metal cluster in terms of processes and tools that are employed. This is beneficial because such processes and tools are familiar to cluster administrators.”

Video: Mellanox Powers Open Science Grid on Comet Supercomputer

“We are pioneering the area of virtualized clusters, specifically with SR-IOV,” said Philip Papadopoulos, SDSC’s chief technical officer. “This will allow virtual sub-clusters to run applications over InfiniBand at near-native speeds – and that marks a huge step forward in HPC virtualization. In fact, a key part of this is virtualization for customized software stacks, which will lower the entry barrier for a wide range of researchers by letting them project an environment they already know onto Comet.”
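For readers unfamiliar with SR-IOV, it lets one physical InfiniBand adapter expose multiple PCIe virtual functions that can be handed directly to virtual machines, which is what makes near-native speeds possible. As a rough illustration only (this is not SDSC’s tooling), a minimal sketch that lists the virtual functions a Linux host’s PCI devices currently expose, via the standard sysfs interface, might look like this:

```python
# Minimal sketch: report SR-IOV virtual-function counts exposed by PCI devices via sysfs.
# Reads only standard Linux sysfs attributes; illustrative, not SDSC's tooling.
import glob
import os

for vf_file in glob.glob("/sys/bus/pci/devices/*/sriov_numvfs"):
    device = os.path.basename(os.path.dirname(vf_file))
    with open(vf_file) as f:
        enabled = f.read().strip()
    with open(os.path.join(os.path.dirname(vf_file), "sriov_totalvfs")) as f:
        total = f.read().strip()
    print(f"{device}: {enabled} of {total} virtual functions enabled")
```

On a host with no SR-IOV-capable devices the script simply prints nothing; on a Comet-style node each enabled virtual function is what gets passed into a guest so MPI traffic bypasses the hypervisor’s network stack.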

New Ocean Current Simulations Reflect Climate Change

Researchers are using the Gordon supercomputer at SDSC to paint a new picture of global warming’s impact on the complex processes that drive ocean mixing in the vast eddies swirling off the California coast. “Nearly a fifth of the worldwide ocean productivity is in these zones, and no one has really looked with this level of detail at the climate change implications for these precious marine areas,” said Renault.

Live Report from LUG 2016 Day 3

In this special guest feature, Ken Strandberg offers this live report from Day 3 of the Lustre User Group meeting in Portland. “Rick Wagner from the San Diego Supercomputer Center presented progress on his team’s replication tool that allows copying large blocks of storage from object storage to their disaster recovery durable storage system. Because rsync is not a tool for moving massive amounts of data, SDSC created recursive worker services running in parallel to have each worker handle a directory or group of files. The tool uses available Lustre clients, a RabbitMQ server, Celery scripts, and bash scripts.”
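The report names the components (Celery workers fed through RabbitMQ, with each worker handling a directory) but not the code itself; a minimal sketch of that per-directory fan-out pattern, with hypothetical paths, task names, and copy command, could look like the following:

```python
# Minimal sketch of the parallel, per-directory replication pattern described above
# (Celery tasks queued through RabbitMQ); paths, names, and copy command are hypothetical.
import os
import subprocess
from celery import Celery

app = Celery("replicate", broker="amqp://localhost//")  # RabbitMQ broker

@app.task
def replicate_directory(src_dir: str, dst_dir: str) -> None:
    """Copy one directory's files, then fan out a separate task per subdirectory."""
    os.makedirs(dst_dir, exist_ok=True)
    for entry in os.scandir(src_dir):
        if entry.is_dir(follow_symlinks=False):
            # Each subdirectory becomes its own task, so many workers copy in parallel.
            replicate_directory.delay(entry.path, os.path.join(dst_dir, entry.name))
        else:
            subprocess.run(["cp", "-p", entry.path, dst_dir], check=True)

# Kick off replication of a (hypothetical) top-level directory:
# replicate_directory.delay("/lustre/project", "/durable/project")
```

The key design choice is the same one the report describes: instead of a single rsync walking the whole tree, the work is decomposed by directory and queued, so as many workers as are available can drain the queue concurrently.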

Comet Supercomputer at SDSC Helps Confirm Gravitational Wave Discovery

The NSF-funded Comet supercomputer at SDSC was one of several high-performance computers used by researchers to help confirm the discovery of gravitational waves before a formal announcement was made.