San Diego Supercomputer Center to Offer Two Summer Institutes

April 7, 2022 — The San Diego Supercomputer Center at UC San Diego has planned summer institutes for June and August, one focused on cyberinfrastructure-enabled machine learning and the other on high-performance computing (HPC) and data science. Application deadlines are April 15 and May 13, respectively. The Cyberinfrastructure-Enabled Machine Learning (CIML) Summer Institute will be held June 27-29 […]

SDSC and Fungible Claim Record HPC Storage Performance

SANTA CLARA, Calif. & SAN DIEGO — Fungible Inc., a data-centric computing company, and the San Diego Supercomputer Center (SDSC) at the University of California San Diego, today announced they have shattered the NVMe over TCP storage initiator performance world record, achieving 10M IOPS and beating the old record of 6.55M IOPS. The tests were administered […]

SDSC Houses Novel Metabolomics Data Repository

September 14, 2021 — How is the “normal” resting heart rate determined? How does the American Diabetes Association establish the “normal” fasting glucose value? Understanding these “normal” ranges for metabolism is complex, especially because the human body may contain tens of thousands of metabolites at any one time; each individual molecule could be tied to […]

SDSC Part of New NSF Project Supporting Transboundary Aquifer Resiliency

August 11, 2021 — Transboundary aquifers, which are deep subsurface water sources shared by multiple countries, have long been a critical source of water for communities along the borders of the U.S. and Mexico. Recent decline in water levels and quality – coupled with increased use – provoked concern regarding long-term sustainability of several transboundary aquifers […]

SDSC Team Awarded Funding for NSF GO FAIR Symposium

Aug. 10, 2021 — The San Diego Supercomputer Center’s (SDSC) Research Data Services (RDS) Chief Strategist Melissa Cragin and Division Director Christine Kirkpatrick were recently awarded a grant by the National Science Foundation to fund a GO FAIR symposium in the next several months. The production of Findable, Accessible, Interoperable and Reusable (FAIR) data and […]

Modeling on SDSC’s Comet Supercomputer Reveals Findings on Pregnancy-related Hypertension

According to the Centers for Disease Control and Prevention (CDC), preeclampsia, or pregnancy-related hypertension, occurs in roughly one in 25 pregnancies in the United States. The causes are unknown and childbirth is the only remedy, which can sometimes lead to adverse perinatal outcomes, such as preterm delivery. To better understand this serious pregnancy complication, which reduces blood supply to the fetus, researchers used Comet at the San Diego Supercomputer Center (SDSC) at UC San Diego to conduct cellular modeling to detail the differences between normal and preeclampsia placental tissue. 

SDSC Names Interim Director

San Diego — Frank Würthwein, the lead of Distributed High-Throughput Computing at the San Diego Supercomputer Center, has been named SDSC’s interim director effective July 1. SDSC, located at UC San Diego, said it is conducting a formal search for a permanent director to replace outgoing Director Michael Norman. “I am pleased to announce that Professor […]

SDSC, Core Scientific in HPC Composable Partnership to Extend ‘Expanse’ Supercomputer

The San Diego Supercomputer Center (SDSC) at UC San Diego has announced a partnership with Core Scientific, an infrastructure and software solutions provider for artificial intelligence and blockchain, to offer HPC capabilities to industrial users. SDSC and Core will integrate the Core Scientific Plexus AI software stack with the Expanse petascale supercomputer, launched by SDSC late last year with […]

SDSC Picks Habana AI Training and Inference Chips for Voyager HPC System

The San Diego Supercomputer Center has selected Habana Labs’ AI training and inference accelerators for SDSC’s Voyager supercomputer, scheduled to be in service this fall. Habana said the HPC system, housed at the University of California, San Diego, will utilize Habana’s interconnectivity technology to scale AI training capacity with 336 Habana Gaudi training processors, which […]

XSEDE-Allocated Supercomputers, Comet and Stampede2, Accelerate Alzheimer’s Research

By Kimberly Mann Bruch, San Diego Supercomputer Center Communications Since 2009, Daniel Tward and his collaborators have analyzed more than 47,000 images of human brains via MRI Cloud — a gateway created to collect and share quantitative information from human brain images, including subtle changes in shape and cortical thickness. The latter was the topic of […]