Energy Exploration with High Performance Computing

As energy exploration becomes increasingly challenging, oil and gas firms deploy ever more powerful computing and storage solutions to stay ahead.

Component Architecture for Scientific HPC

The Common Component Architecture (CCA) provides a means for software developers to manage the complexity of large-scale scientific simulations and to move toward a plug-and-play environment for high-performance computing. In the scientific computing context, component models also promote collaboration using independently developed software, thereby allowing particular individuals or groups to focus on the aspects of greatest interest to them. The CCA supports parallel and distributed computing as well as local high-performance connections between components in a language-independent manner. The design places minimal requirements on components and thus facilitates the integration of existing code into the CCA environment.
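To make the plug-and-play idea concrete, the sketch below mimics the CCA "provides/uses ports" pattern in plain Python. The class and method names here (Services, add_provides_port, get_port) are illustrative stand-ins, not the actual CCA API: one component publishes an integrator port, and a driver component uses it without knowing which implementation it received.

    # Minimal sketch of a CCA-style ports pattern in Python.
    # Names are illustrative, not the real CCA interfaces.

    class Services:
        """Framework-side registry mediating connections between components."""
        def __init__(self):
            self._provided = {}

        def add_provides_port(self, name, port):
            self._provided[name] = port      # component publishes a capability

        def get_port(self, name):
            return self._provided[name]     # component asks for a peer's capability

    class IntegratorPort:
        """A 'port' is simply an agreed-upon interface between components."""
        def integrate(self, f, a, b, n=1000):
            h = (b - a) / n                 # midpoint-rule quadrature
            return h * sum(f(a + (i + 0.5) * h) for i in range(n))

    class IntegratorComponent:
        def set_services(self, svc):
            # On registration, publish the ports this component provides.
            svc.add_provides_port("integrator", IntegratorPort())

    class DriverComponent:
        def set_services(self, svc):
            self.svc = svc

        def run(self):
            # Plug-and-play: the driver knows only the port interface,
            # not which component implements it.
            quad = self.svc.get_port("integrator")
            print(quad.integrate(lambda x: x * x, 0.0, 1.0))  # ~1/3

    svc = Services()
    IntegratorComponent().set_services(svc)
    driver = DriverComponent()
    driver.set_services(svc)
    driver.run()

Because the driver depends only on the port interface, a different integrator component could be connected in its place without touching the driver code, which is the essence of the component model.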

Life Sciences with HPC Systems

The term next-generation sequencing (NGS) is really a misnomer. NGS implies a single methodology, but in fact there have been multiple generations over the past 10 to 15 years, and the end is nowhere in sight. Technological advances in the field continue to emerge at a record-setting pace.

Parallel Storage Solutions for Better Performance

Using high-performance parallel storage solutions, geologists and researchers can now incorporate larger data sets and execute more seismic and reservoir simulations faster than ever before, enabling higher-fidelity geological analysis and significantly reduced exploration risk. Given the high costs of exploration, oil and gas companies are increasingly turning to high-performance DDN storage solutions to eliminate I/O bottlenecks and minimize risk and cost while delivering a larger number of higher-fidelity simulations in the same time as traditional storage architectures.
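The sketch below illustrates the kind of collective, contention-free write pattern that parallel storage is built to absorb, using MPI-IO through mpi4py. The file name and array size are illustrative assumptions; it requires mpi4py and NumPy and is launched with mpiexec.

    # Minimal sketch of parallel I/O with MPI-IO (via mpi4py): every rank
    # writes its own slice of a toy seismic trace at a disjoint byte offset.

    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    # Each rank owns one contiguous slice of the (toy) data set.
    local = np.full(1_000_000, rank, dtype=np.float32)

    fh = MPI.File.Open(comm, "traces.dat",
                       MPI.MODE_WRONLY | MPI.MODE_CREATE)

    # Collective write: all ranks write simultaneously, letting the storage
    # layer coalesce the requests instead of serializing them through one node.
    offset = rank * local.nbytes
    fh.Write_at_all(offset, local)
    fh.Close()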

Weather and Climate Forecasting

In the pantheon of HPC grand challenges, weather forecasting and long-term climate simulation rank right up there with the most complex and computationally demanding problems in astrophysics, aeronautics, fusion power, exotic materials, and earthquake prediction, to name just a few. Modern weather prediction requires cooperation in the collection of observed data and the sharing of forecast output among all nations, a collaboration that has been ongoing for decades. This data is used to simulate effects on a range of scales: from events such as the path of a tornado, which changes from minute to minute and moves over distances measured in meters, to the turnover of water layers in the ocean, a process that is measured in decades or even hundreds of years and spans thousands of miles.
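As a toy illustration of the time-stepped simulation such models perform, the sketch below advects a tracer across a one-dimensional grid with a first-order upwind scheme. The grid size, wind speed, and time step are assumptions for the example; a real forecast model couples many such kernels across wildly different space and time scales.

    # Toy time-stepped atmospheric kernel: 1D linear advection on a grid.
    # Parameters are illustrative and satisfy the CFL condition u*dt/dx <= 1.

    import numpy as np

    nx, dx, dt, u = 200, 1.0, 0.5, 1.0   # grid points, spacing, step, wind speed
    q = np.exp(-0.01 * (np.arange(nx) - 50.0) ** 2)  # initial tracer blob

    for _ in range(100):
        # First-order upwind update: q_i <- q_i - (u*dt/dx) * (q_i - q_{i-1})
        q[1:] -= u * dt / dx * (q[1:] - q[:-1])

    print(f"tracer peak now near grid point {q.argmax()}")  # carried downwind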

Genomics High Performance Computing Guide

Occasionally, a convergence of technologies arrives that can improve the well-being of a very large number of people. Several technological innovations are now coming together that can greatly enhance recovery from life-threatening illnesses and prolong and improve quality of life. With the combination of faster and more accurate genomic sequencing, faster computer systems, and new algorithms, the effort to discover which medicine will work best for an individual patient has moved from research institutions to bedside doctors. Physicians and other healthcare providers now have better, faster, and more accurate tools and data to determine optimal treatment plans based on more patient data. This is especially true for pediatric cancer patients. These fast-moving technologies have become the center of a national effort to help millions of people overcome certain diseases.

insideHPC Guide to Deep Learning

Deep learning is a method of creating artificial intelligence systems that combine computer-based multi-layer neural networks with intensive training techniques and large data sets to enable analysis and predictive decision making. This insideHPC special report explores the technologies, components and software required for creating successful deep learning environments.
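As a minimal illustration of that definition, the sketch below trains a two-layer neural network with plain NumPy. The layer sizes, learning rate, and XOR toy data set are assumptions for the example, not taken from the report; deep learning frameworks automate exactly this forward/backward/update loop at much larger scale.

    # Minimal sketch of a multi-layer neural network trained by gradient
    # descent. Hyperparameters and the XOR data set are illustrative only.

    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets

    W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)    # hidden layer
    W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)    # output layer

    for _ in range(5000):
        # Forward pass through two layers with a sigmoid nonlinearity.
        h = 1 / (1 + np.exp(-(X @ W1 + b1)))
        out = 1 / (1 + np.exp(-(h @ W2 + b2)))

        # Backward pass: gradients of the squared error through each layer.
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)

        # Gradient-descent update ("training") of all weights and biases.
        W2 -= 0.5 * h.T @ d_out
        b2 -= 0.5 * d_out.sum(axis=0)
        W1 -= 0.5 * X.T @ d_h
        b1 -= 0.5 * d_h.sum(axis=0)

    print(out.round(2).ravel())  # approaches [0, 1, 1, 0]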

Life Sciences: IBM Computing Solutions

Whether engaged in genome sequencing, drug design, product analysis or risk management, life sciences research teams need high-performance technical environments with the ability to process massive amounts of data and support increasingly sophisticated simulations and analyses. Organizations helping to find causes and cures for diseases need speed, agility and control across the clinical development lifecycle to increase productivity, foster innovation and compete more effectively.

insideHPC Guide to Personalized Medicine & Genomics

The insideHPC Guide to Personalized Medicine and Genomics explains how genomics will accelerate personalized medicine and presents several case studies.