Fighting the West Nile Virus with HPC & Analytical Ultracentrifugation

Researchers are using new techniques with HPC to learn more about how the West Nile virus replicates inside the brain. Over several years, Demeler has developed analysis software for experiments performed with analytical ultracentrifuges. “The goal is to facilitate the extraction of all of the information possible from the available data. To do this, we developed very high-resolution analysis methods that require high performance computing to access this information,” he said. “We rely on HPC. It’s absolutely critical.”

Podcast: A Retrospective on Great Science and the Stampede Supercomputer

TACC will soon deploy Phase 2 of the Stampede II supercomputer. In this podcast, they celebrate by looking back on some of the great science computed on the original Stampede machine. “In 2017, the Stampede supercomputer, funded by the NSF, completed its five-year mission to provide world-class computational resources and support staff to more than 11,000 U.S. users on over 3,000 projects in the open science community. But what made it special? Stampede was like a bridge that moved thousands of researchers off of soon-to-be decommissioned supercomputers, while at the same time building a framework that anticipated the emerging trends that came to dominate advanced computing.”

Supercomputing DNA Packing in Nuclei at TACC

Aaron Dubrow writes that researchers at the University of Texas Medical Branch are exploring DNA folding and cellular packing with supercomputing power from TACC. “In the field of molecular biology, there’s a wonderful interplay between theory, experiment and simulation,” Pettitt said. “We take parameters of experiments and see if they agree with the simulations and theories. This becomes the scientific method for how we now advance our hypotheses.”

Podcast: Combining Cryo-electron Microscopy with Supercomputer Simulation

Scientists have taken the closest look yet at molecule-sized machinery called the human preinitiation complex, which opens up DNA so that genes can be copied and turned into proteins. The science team, drawn from Northwestern University, Lawrence Berkeley National Laboratory, Georgia State University, and UC Berkeley, combined a cutting-edge technique called cryo-electron microscopy with supercomputer analysis. They published their results in May 2016 in the journal Nature.

Podcast: Supercomputing Better Soybeans

In this TACC podcast, researchers describe how XSEDE supercomputing resources are helping them grow a better soybean through the SoyKB project based at the University of Missouri-Columbia. “The way resequencing is conducted is to chop the genome in many small pieces and see the many, many combinations of small pieces,” said Xu. “The data are huge, millions of fragments mapped to a reference. That’s actually a very time consuming process. Resequencing data analysis takes most of our computing time on XSEDE.”
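To make the resequencing step Xu describes more concrete, here is a minimal, purely illustrative sketch of what “mapping fragments to a reference” means: short sequenced reads are located within a reference genome string. This is not the SoyKB pipeline, which relies on dedicated aligners that handle mismatches and enormous data volumes; the exact-match approach, function name, and toy data below are assumptions for illustration only.

    # Illustrative sketch (hypothetical, not the SoyKB pipeline):
    # map short resequencing fragments ("reads") onto a reference
    # genome by exact string matching.

    def map_reads(reference: str, reads: list[str]) -> dict[str, list[int]]:
        """Return every position in the reference where each read matches exactly."""
        hits: dict[str, list[int]] = {}
        for read in reads:
            positions = []
            start = reference.find(read)
            while start != -1:
                positions.append(start)
                start = reference.find(read, start + 1)
            hits[read] = positions
        return hits

    if __name__ == "__main__":
        reference = "ACGTACGTGGTACCACGT"    # toy reference genome
        reads = ["ACGT", "GGTACC", "TTTT"]  # toy sequencing fragments
        print(map_reads(reference, reads))
        # {'ACGT': [0, 4, 14], 'GGTACC': [8], 'TTTT': []}

At genome scale this step involves millions of reads against a reference of roughly a billion bases, which is why the analysis Xu describes consumes most of the project’s XSEDE computing time.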

Podcast: Supercomputing Better Ways to Produce Gamma Rays

In this podcast, researchers from the University of Texas at Austin discuss how they are using TACC supercomputers to find a new way to make controlled beams of gamma rays. “The simulations done on the Stampede and Lonestar systems at TACC will guide a real experiment later this summer in 2016 with the recently upgraded Texas Petawatt Laser, one of the most powerful in the world. The scientists say the quest for producing gamma rays from non-radioactive materials will advance basic understanding of things like the inside of stars. What’s more, gamma rays are used by hospitals to eradicate cancer, image the brain, and they’re used to scan cargo containers for terrorist materials. Unfortunately no one has yet been able to produce gamma ray beams from non-radioactive sources. These scientists hope to change that.”

Podcast: Supercomputing Gels with Stampede

In this TACC podcast, Jorge Salazar looks at how researchers are using the Stampede supercomputer to shed light on the microscale world of colloidal gels, materials in which a liquid is dispersed in a solid medium to form a gel. “Colloidal gels are actually soft solids, but we can manipulate their structure to produce ‘on-demand’ transitions from liquid-like to solid-like behavior that can be reversed many times,” said Zia, an Assistant Professor of Chemical and Biomolecular Engineering at Cornell University.

Podcast: Simulating how Lasers can Transform Materials

Researchers are using XSEDE compute resources to study how lasers can be used to make useful materials. In this podcast, Dr. Zhigilei discusses the practical applications of zapping surfaces with short laser pulses. Laser ablation, which refers to the ejection of materials from the irradiated target, generates chemical-free nanoparticles that can be used in medical applications, for example.

Video: TACC Powers Stampede Supercomputer with Dell Servers and Intel Omni-Path

In this video, Tommy Minyard from TACC describes how Dell helped develop the 9.6 Petaflop Stampede supercomputer for scientific computing. “The Texas Advanced Computing Center supports the University of Texas System and National Science Foundation researchers with the newest version of their Stampede High Performance Computing cluster. Stampede uses Dell PowerEdge servers, Intel Xeon processors and the new Dell Networking H-Series switches and adapters based on Intel Omni-Path Architecture. These newly implemented technologies greatly reduce latency, optimize traffic flow, and give Stampede a peak performance of 10 petaflops.”

Podcast: Supercomputing Powers Efforts to Save Ocean Coral

What can we do to help ocean coral survive global warming? In this TACC podcast, Jorge Salazar looks at how researchers are using the Stampede supercomputer to investigate how corals can genetically adapt to warmer waters.