
Searching and Researching: DDN Solutions for Life Sciences

Bio and life sciences is the third-largest commercial vertical market segment for the use of HPC, encompassing “biomedical research and development organizations in such areas as pharmaceuticals, medical research, agriculture, environmental engineering, etc.”1 A great deal of additional HPC usage for life sciences occurs at public-sector (academic and government) research labs, or even in other industries, such as an oil company pursuing research in biofuels. To learn more, download this white paper.

Research for New Technology Using Supercomputers

This paper presents our approach to research and development for four applications in which simulations on very-large-scale computing systems are expected to serve useful purposes.

Science and Industry using Supercomputers

This paper is intended for anyone interested in High Performance Computing (HPC) in general, in the performance evolution of HPC systems since their beginnings in the 1970s, and, above all, in HPC applications past, present, and future. Readers do not need to be supercomputer experts.

Weather and Ocean Modeling with Supercomputers

The practical impact of weather, climate and ocean prediction on the world’s population and economy drives the usage of high performance computing (HPC) for earth system modeling. The socioeconomic impacts of improved predictive capabilities are well-recognized by scientists as well as government leaders. The earth’s environment plays an important role in shaping economies and infrastructures, and touches upon nearly every aspect of our daily lives, including recreational activities, food supplies and energy resources.

Energy Exploration with High Performance Computing

As energy exploration becomes increasingly challenging, oil and gas firms deploy ever more powerful computing and storage solutions to stay ahead.

Component Architecture for Scientific HPC

The Common Component Architecture (CCA) provides a means for software developers to manage the complexity of large-scale scientific simulations and to move toward a plug-and-play environment for high-performance computing. In the scientific computing context, component models also promote collaboration using independently developed software, thereby allowing particular individuals or groups to focus on the aspects of greatest interest to them. The CCA supports parallel and distributed computing as well as local high-performance connections between components in a language-independent manner. The design places minimal requirements on components.
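The plug-and-play idea behind component models can be illustrated with a minimal sketch. This is plain Python, not the actual CCA API: the port and class names here are hypothetical, standing in for the "provides/uses port" pattern the CCA formalizes.

```python
from abc import ABC, abstractmethod

# A "port" is an abstract interface a component can provide or use.
class IntegratorPort(ABC):
    @abstractmethod
    def integrate(self, f, lo: float, hi: float) -> float: ...

class MidpointIntegrator(IntegratorPort):
    """One interchangeable component providing the port."""
    def integrate(self, f, lo, hi, n=1000):
        h = (hi - lo) / n
        return sum(f(lo + (i + 0.5) * h) for i in range(n)) * h

class Driver:
    """A component that *uses* the port without knowing which
    implementation it was wired to -- the plug-and-play part."""
    def __init__(self, integrator: IntegratorPort):
        self.integrator = integrator

    def run(self):
        return self.integrator.integrate(lambda x: x * x, 0.0, 1.0)

print(round(Driver(MidpointIntegrator()).run(), 4))  # ≈ 1/3
```

Because `Driver` depends only on the port, an independently developed integrator (say, a parallel one) can replace `MidpointIntegrator` without touching the driver code.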

Life Sciences with HPC Systems

The term next-generation sequencing (NGS) is really a misnomer. NGS implies a single methodology, but in fact there have been multiple generations over the past 10 to 15 years, and the end is nowhere in sight. Technological advances in the field continue to emerge at a record-setting pace.

Weather and Climate Forecasting

In the pantheon of HPC grand challenges, weather forecasting and long-term climate simulation rank right up there with the most complex and computationally demanding problems in astrophysics, aeronautics, fusion power, exotic materials, and earthquake prediction, to name just a few. Modern weather prediction requires cooperation in the collection of observed data and the sharing of forecast output among all nations, a collaboration that has been ongoing for decades. These data are used to simulate effects on a range of scales: from events such as the path of a tornado, which changes from minute to minute and moves over distances measured in meters, to the turnover of ocean water layers, a process that is measured in decades or even centuries and spans thousands of miles.

Parallel Storage Solutions for Better Performance

Using high-performance parallel storage solutions, geologists and researchers can now incorporate larger data sets and execute more seismic and reservoir simulations faster than ever before, enabling higher-fidelity geological analysis and significantly reduced exploration risk. With the high cost of exploration, oil and gas companies are increasingly turning to high-performance DDN storage solutions to eliminate I/O bottlenecks and minimize risk and costs, while delivering a larger number of higher-fidelity simulations in the same time as traditional storage architectures.

insideHPC Guide to Deep Learning

Deep learning is a method of creating artificial intelligence systems that combines computer-based multi-layer neural networks with intensive training techniques and large data sets to enable analysis and predictive decision making. This insideHPC special report explores the technologies, components, and software required for creating successful deep learning environments.
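The "multi-layer" structure such systems are built on can be sketched in a few lines of NumPy. The layer sizes and random weights below are purely illustrative stand-ins for the values that intensive training on a large data set would produce.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Nonlinearity applied between layers
    return np.maximum(0.0, x)

def sigmoid(x):
    # Squashes the final score into a probability in (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative shapes: 4 input features -> 8 hidden units -> 1 output.
# In a trained network these weights encode learned features.
W1 = rng.standard_normal((4, 8))
b1 = np.zeros(8)
W2 = rng.standard_normal((8, 1))
b2 = np.zeros(1)

def predict(x):
    hidden = relu(x @ W1 + b1)        # first layer: feature extraction
    return sigmoid(hidden @ W2 + b2)  # output layer: predictive decision

batch = rng.standard_normal((3, 4))   # three example inputs
probs = predict(batch)
print(probs.shape)                    # prints (3, 1)
```

Deeper networks simply stack more such layers; the "intensive training" the report refers to is the process of adjusting every weight against the large data set.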