
Component Architecture for Scientific HPC

The Common Component Architecture (CCA) provides a means for software developers to manage the complexity of large-scale scientific simulations and to move toward a plug-and-play environment for high-performance computing. In the scientific computing context, component models also promote collaboration using independently developed software, thereby allowing particular individuals or groups to focus on the aspects of greatest interest to them. The CCA supports parallel and distributed computing as well as local high-performance connections between components in a language-independent manner. The design places minimal requirements on components.

Life Sciences with HPC Systems

The term next generation sequencing (NGS) is really a misnomer. NGS implies a single methodology, but the fact is that over the past 10 to 15 years there have been multiple generations, and the end is nowhere in sight. Technological advances in the field continue to emerge at a record-setting pace.

Weather and Climate Forecasting

In the pantheon of HPC grand challenges, weather forecasting and long-term climate simulation rank right up there with the most complex and computationally demanding problems in astrophysics, aeronautics, fusion power, exotic materials, and earthquake prediction, to name just a few. Modern weather prediction requires cooperation in the collection of observed data and the sharing of forecast output among all nations, a collaboration that has been ongoing for decades. This data is used to simulate effects on a range of scales: from events such as the path of a tornado, which changes from minute to minute and moves over distances measured in meters, to the turnover of water layers in the ocean, a process that is measured in decades or even hundreds of years and spans thousands of miles.

Parallel Storage Solutions for Better Performance

Using high performance parallel storage solutions, geologists and researchers can now incorporate larger data sets and execute more seismic and reservoir simulations faster than ever before, enabling higher fidelity geological analysis and significantly reduced exploration risk. Given the high costs of exploration, oil and gas companies are increasingly turning to high performance DDN storage solutions to eliminate I/O bottlenecks and minimize risk and cost, while delivering a larger number of higher fidelity simulations in the same time as traditional storage architectures.

Genomics High Performance Computing Guide

There are times when a convergence of technologies can improve the well-being of a very large number of people. A number of technological innovations are coming together that can greatly enhance recovery from life-threatening illnesses and prolong and improve quality of life. With a combination of faster and more accurate genomic sequencing, faster computer systems, and new algorithms, the effort to discover which medicine will work best for an individual patient has moved from research institutions to bedside doctors. Physicians and other healthcare providers now have better, faster, and more accurate tools and data to determine optimal treatment plans based on more patient data. This is especially true for pediatric cancer patients. These fast-moving technologies have become the center of a national effort to help millions of people overcome certain diseases.

Life Sciences: IBM Computing Solutions

Whether engaged in genome sequencing, drug design, product analysis or risk management, life sciences research teams need high-performance technical environments with the ability to process massive amounts of data and support increasingly sophisticated simulations and analyses. Organizations helping to find causes and cures for diseases need speed, agility and control across the clinical development lifecycle to increase productivity, foster innovation and compete more effectively.

MIPT in Moscow Develops New Method of Calculating Protein Interaction

Biologists and mathematicians from the Moscow Institute of Physics and Technology (MIPT) have accelerated the rate at which a computer can predict the structure of protein complexes in a cell. “The new method enables us to model the interaction of proteins at the genome level. This will give us a better understanding of how our cells function and may enable drug development for diseases caused by ‘incorrect’ protein interactions,” commented Dima Kozakov, a professor at Stony Brook and adjunct professor at MIPT.

Gordon Supercomputer Aids Search for New Antibiotics

Researchers using the Gordon Supercomputer at SDSC have identified a class of possible antibiotics with the potential to disable previously drug-resistant bacteria. In essence, these new agents were found to attack the bacteria along two fronts: the external lipid cell wall and the internal machinery responsible for generating cellular energy in the form of adenosine triphosphate (ATP).

insideHPC Guide to Personalized Medicine & Genomics

The insideHPC Guide to Personalized Medicine and Genomics explains how genomics will accelerate personalized medicine, including several case studies.