SDSC Picks Habana AI Training and Inference Chips for Voyager HPC System

The San Diego Supercomputer Center has selected Habana Labs’ AI training and inference accelerators for SDSC’s Voyager supercomputer, scheduled to be in service this fall. Habana said the HPC system, housed at the University of California, San Diego, will utilize Habana’s interconnectivity technology to scale AI training capacity with 336 Habana Gaudi training processors, which […]

XSEDE-Allocated Supercomputers, Comet and Stampede2, Accelerate Alzheimer’s Research

By Kimberly Mann Bruch, San Diego Supercomputer Center Communications Since 2009, Daniel Tward and his collaborators have analyzed more than 47,000 images of human brains via MRI Cloud — a gateway created to collect and share quantitative information from human brain images, including subtle changes in shape and cortical thickness. The latter was the topic of […]

MIT Researchers Develop Neural Networks for Computational Chemistry Using SDSC, PSC Supercomputers

Even though computational chemistry represents a challenging arena for machine learning, a team of researchers from the Massachusetts Institute of Technology (MIT) may have made it easier. Using Comet at the San Diego Supercomputer Center at UC San Diego and Bridges at the Pittsburgh Supercomputing Center, they succeeded in developing an artificial intelligence (AI) approach to detect electron correlation – the interaction between a system’s electrons – which is vital but expensive to calculate in quantum chemistry.

Thomas Sterling Eulogizes Rich Brueckner, Ann Redelfs, Steve Tuecke, Lucy Nowell: 4 Leaders Lost to the HPC Community

At his annual keynote address closing out the ISC 2020 conference, Thomas Sterling, Professor of Intelligent Systems Engineering at Indiana University, eulogized four members of the HPC community who died over the past year. Here are excerpts from his remarks: It’s my sad duty, but certainly a responsibility, to note some of […]

SDSC Makes Comet Supercomputer Available for COVID-19 Research

With the U.S. and many other countries working ‘round the clock to mitigate the devastating effects of COVID-19, SDSC is providing priority access to its high-performance computing systems and other resources to researchers working to develop an effective vaccine in as short a time as possible. “For us, it absolutely crystallizes SDSC’s mission, which is to deliver lasting impact across the greater scientific community by creating innovative end-to-end computational and data solutions to meet the biggest research challenges of our time. That time is here.”

Bursting into the public Cloud: Experiences at large scale for IceCube

Igor Sfiligoi from SDSC gave this talk at the ECSS Symposium. “I have recently helped IceCube expand their resource pool by a few orders of magnitude, first to 380 PFLOP32s for a few hours and later to 170 PFLOP32s for a whole workday. In this session I will explain what was done and how, alongside an overview of why IceCube needs so much compute.”

Supercomputing Ocean Wave Energy

Researchers are using XSEDE supercomputers to help develop ocean waves into a sustainable energy source. “We primarily used our simulation techniques to investigate inertial sea wave energy converters, which are renewable energy devices developed by our collaborators at the Polytechnic University of Turin that convert wave energy from large bodies of water into electrical energy,” explained study co-author Amneet Pal Bhalla from SDSU.

SDSC Expanse Supercomputer from Dell Technologies to Serve 50,000 Users

In this special guest feature, Janet Morss at Dell Technologies writes that the company will soon deploy a new flagship supercomputer at SDSC. “Expanse will deliver the power of 728 dual-socket Dell EMC PowerEdge C6525 servers with 2nd Gen AMD EPYC processors connected with Mellanox HDR InfiniBand. The system will have 93,000 compute cores and is projected to have a peak speed of 5 petaflops. That will almost double the performance of SDSC’s current Comet supercomputer, also from Dell Technologies.”
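The quoted core count can be sanity-checked from the node specs. A minimal sketch, assuming each dual-socket PowerEdge C6525 node uses 64-core 2nd Gen AMD EPYC parts (the top core count for that generation; the exact SKU is not stated in the article):

```python
# Hypothetical back-of-the-envelope check of Expanse's compute-core figure.
nodes = 728            # dual-socket Dell EMC PowerEdge C6525 servers (from the article)
sockets_per_node = 2   # dual-socket
cores_per_socket = 64  # assumption: 64-core 2nd Gen AMD EPYC SKU

total_cores = nodes * sockets_per_node * cores_per_socket
print(total_cores)  # 93,184 -- consistent with the "93,000 compute cores" quoted
```

The peak-speed figure of 5 petaflops depends on clock rates, vector width, and any accelerator nodes in the system, so it is not derivable from the core count alone.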

Second GPU Cloudburst Experiment Paves the Way for Large-scale Cloud Computing

Researchers at SDSC and the Wisconsin IceCube Particle Astrophysics Center have successfully completed a second computational experiment using thousands of GPUs across Amazon Web Services, Microsoft Azure, and the Google Cloud Platform. “We drew several key conclusions from this second demonstration,” said SDSC’s Sfiligoi. “We showed that the cloudburst run can actually be sustained during an entire workday instead of just one or two hours, and have moreover measured the cost of using only the two most cost-effective cloud instances for each cloud provider.”