

Mentorship fosters a Career in STEM

In this special guest feature, Faith Singer-Villalobos from TACC continues her series profiling Careers in STEM. It’s the inspiring story of Je’aime Powell, a TACC System Administrator and XSEDE Extended Collaborative Support Services Consultant. “Options, goals, and hope are what can set you on a path that can change your life,” Powell said.

NSF Sponsors EPiQC ‘expedition’ for Practical Quantum Computing

University of Chicago computer scientists will lead a $10 million “expedition” into the burgeoning field of quantum computing, bringing applications of the nascent technology in computer science, physics, chemistry, and other fields at least a decade closer to practical use. Called EPiQC, the NSF-funded expedition is designed to help scientists realize the potential of quantum computing more rapidly.

Big 3 Cloud Providers join with NSF to Support Data Science

“NSF’s participation with major cloud providers is an innovative approach to combining resources to better support data science research,” said Jim Kurose, assistant director of NSF for Computer and Information Science and Engineering (CISE). “This type of collaboration enables fundamental research and spurs technology development and economic growth in areas of mutual interest to the participants, driving innovation for the long-term benefit of our nation.”

Dark Energy Survey Releases First Three Years of Data

Today scientists from the Dark Energy Survey (DES) released their first three years of data. This first major release of data from the Survey includes information on about 400 million astronomical objects, including distant galaxies billions of light-years away as well as stars in our own galaxy. “There are all kinds of discoveries waiting to be found in the data. While DES scientists are focused on using it to learn about dark energy, we wanted to enable astronomers to explore these images in new ways, to improve our understanding of the universe,” said Dark Energy Survey Data Management Project Scientist Brian Yanny of the U.S. Department of Energy’s Fermi National Accelerator Laboratory.

Comet Supercomputer Assists in Latest LIGO Discovery

This week’s landmark discovery of gravitational and light waves generated by the collision of two neutron stars eons ago was made possible by signal verification and analysis performed on Comet, an advanced supercomputer based at SDSC in San Diego. LIGO researchers have so far consumed more than 2 million hours of computational time on Comet through OSG – including about 630,000 hours each to help verify LIGO’s findings in 2015 and the current neutron star collision – using Comet’s Virtual Clusters for rapid, user-friendly analysis of extreme volumes of data, according to Würthwein.

NSF Announces $17.7 Million Funding for Data Science Projects

Today the National Science Foundation (NSF) announced $17.7 million in funding for 12 Transdisciplinary Research in Principles of Data Science (TRIPODS) projects, which will bring together the statistics, mathematics and theoretical computer science communities to develop the foundations of data science. Conducted at 14 institutions in 11 states, these projects will promote long-term research and training activities in data science that transcend disciplinary boundaries. “Data is accelerating the pace of scientific discovery and innovation,” said Jim Kurose, NSF assistant director for Computer and Information Science and Engineering (CISE). “These new TRIPODS projects will help build the theoretical foundations of data science that will enable continued data-driven discovery and breakthroughs across all fields of science and engineering.”

How Extreme Energy Jets Escape a Black Hole

Researchers are using XSEDE supercomputers to better understand the forces at work at the center of the Milky Way galaxy. The work could reveal how instabilities develop in extreme energy releases from black holes. “While nothing – not even light – can escape a black hole’s interior, the jets somehow manage to draw their energy from the black hole.”

Podcast: A Retrospective on Great Science and the Stampede Supercomputer

TACC will soon deploy Phase 2 of the Stampede II supercomputer. In this podcast, they celebrate by looking back on some of the great science computed on the original Stampede machine. “In 2017, the Stampede supercomputer, funded by the NSF, completed its five-year mission to provide world-class computational resources and support staff to more than 11,000 U.S. users on over 3,000 projects in the open science community. But what made it special? Stampede was like a bridge that moved thousands of researchers off of soon-to-be decommissioned supercomputers, while at the same time building a framework that anticipated the imminent trends that came to dominate advanced computing.”

NCSA Blue Waters Report Shows Economic Benefits of HPC

The importance of supercomputing to local and national economic prosperity has been highlighted by a recent study, which found the Blue Waters project to be worth more than $1.08 billion to the Illinois economy. The study was conducted and published by the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign.

Advanced Clustering Installs New Supercomputer at Clarkson University

This week Advanced Clustering installed a new supercomputer at Clarkson University in New York. “Our project is a small-scale supercomputer with a lot of horsepower for computation ability,” Liu said. “It has many servers, interconnected to look like one big machine. Research involving facial recognition, iris recognition and fingerprint recognition requires a lot of computing power, so we’re investigating how to perfect that capability and make biometrics run faster.”