Video: The State of Bioinformatics in HPC

“In the last few years, DNA sequencing technologies have become extremely cheap, enabling us to quickly generate terabytes of data for a few thousand dollars. Analysis of this data has become the new bottleneck. This talk presents novel compute-intensive streaming approaches that leverage this data without the time-costly step of genome assembly, and shows how UWA’s Edwards group used these approaches to find new breeding targets in crop species.”
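Assembly-free streaming analysis of this kind typically boils down to counting k-mers directly from raw reads. The following is a minimal sketch of that idea, not the group’s actual pipeline; the file name, k-mer length, and plain-FASTQ assumption are illustrative:

```python
from collections import Counter

def stream_kmers(fastq_path, k=21):
    """Count k-mers directly from raw sequencing reads, skipping genome assembly."""
    counts = Counter()
    with open(fastq_path) as fq:
        for i, line in enumerate(fq):
            if i % 4 == 1:  # the sequence line of each 4-line FASTQ record
                read = line.strip()
                for j in range(len(read) - k + 1):
                    counts[read[j:j + k]] += 1
    return counts

# Example (hypothetical files): k-mers present in one cultivar but absent in
# another can flag candidate breeding targets without assembling either genome.
# counts_a = stream_kmers("cultivar_a.fastq")
# counts_b = stream_kmers("cultivar_b.fastq")
```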

Video: Technical Challenges in Complex Bioinformatics Environments

In this video from the 2017 DDN User Group meeting at ISC, Dean Flanders, Head of Informatics/CIO at the Friedrich Miescher Institute, presents: Technical Challenges in Complex Bioinformatics Environments. “Dean Flanders has been Head of Informatics at the Friedrich Miescher Institute since 2000. He has been involved in many activities to enable researchers by improving IT at the national and international levels.”

Earlham Institute Moves HPC Workloads to Iceland

In this video, Dr Tim Stitt from the Earlham Institute describes why moving their HPC workload to Iceland made economic sense. Through the Verne Global datacenter, the Earlham Institute will have access to one of the world’s most reliable power grids, producing 100% geothermal and hydro-electric renewable energy. As EI’s HPC analysis requirements continue to grow, Verne Global will enable the institute to save up to 70% in energy costs (based on moving from a 14p to a 4p per kWh rate, with no additional power needed for cooling), significantly benefiting the organization in its advanced genomics and bioinformatics research of living systems.
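The headline figure follows directly from the two quoted tariffs; a quick back-of-the-envelope check:

```python
uk_rate, iceland_rate = 0.14, 0.04  # GBP per kWh, as quoted above

saving = 1 - iceland_rate / uk_rate
print(f"{saving:.0%}")  # ~71%, before counting the avoided cooling power
```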

Data Storage Best Practices for Life Science Workflows

“Unchecked data growth and data sprawl are having a profound impact on life science workflows. As data volumes continue to grow, researchers and IT leaders face increasingly difficult decisions about how to manage this data yet keep the storage budget in check. Learn how these challenges can be overcome through active data management and leveraging cloud technology. The concepts will be applied to an example architecture that supports both genomic and bioimaging workflows.”
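“Active data management” in this context usually means policy-driven tiering of inactive data. A minimal sketch of one such policy, assuming a POSIX filesystem and some cloud object store; the paths, age threshold, and archive_to_cloud helper are illustrative, not part of the webinar:

```python
import os
import time

COLD_AFTER_DAYS = 90  # illustrative threshold for treating data as "inactive"

def find_cold_files(root):
    """Yield files not accessed within the threshold, as candidates for cloud tiering."""
    cutoff = time.time() - COLD_AFTER_DAYS * 86400
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.stat(path).st_atime < cutoff:
                yield path

# for path in find_cold_files("/data/genomics"):
#     archive_to_cloud(path)  # hypothetical helper: copy to object storage, stub locally
```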

Panasas & Western Digital to Power Life Science Research with iRODS

“Our solutions ultimately make data readily available for users, applications and analytics, helping to facilitate faster results and better decisions,” said Gary Lyng, senior director of marketing, Data Center Systems at Western Digital. “We are excited to be working with Panasas as the volume, velocity, variety and value of data generated by modern lab equipment along with varying application and workflow requirements make implementing the right solution all the more challenging – and we have the right solution.”

Searching and Researching: DDN Solutions for Life Sciences

Bio and life sciences is the third-largest commercial vertical market segment for the use of HPC, including “biomedical research and development organizations in such areas as pharmaceuticals, medical research, agriculture, environmental engineering, etc.”1 A great deal of additional HPC usage for life sciences occurs at public-sector (academic and government) research labs, or even in other industries, such as an oil company pursuing research in biofuels. To learn more, download this white paper.

Weather and Ocean Modeling with Super Computers

The practical impact of weather, climate and ocean prediction on the world’s population and economy drives the usage of high performance computing (HPC) for earth system modeling. The socioeconomic impacts of improved predictive capabilities are well-recognized by scientists as well as government leaders. The earth’s environment plays an important role in shaping economies and infrastructures, and touches upon nearly every aspect of our daily lives, including recreational activities, food supplies and energy resources.

Research for New Technology Using Supercomputers

This paper presents our approach to research and development in four application areas where simulation on very-large-scale computing systems is expected to prove useful.

Component Architecture for Scientific HPC

The Common Component Architecture (CCA) provides a means for software developers to manage the complexity of large-scale scientific simulations and to move toward a plug-and-play environment for high-performance computing. In the scientific computing context, component models also promote collaboration using independently developed software, thereby allowing particular individuals or groups to focus on the aspects of greatest interest to them. The CCA supports parallel and distributed computing as well as local high-performance connections between components in a language-independent manner. The design places minimal requirements on components.
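The plug-and-play idea rests on components that interact only through declared ports. Below is a minimal sketch of that pattern in Python; the class names and framework wiring are illustrative, not the actual CCA/Babel API:

```python
class IntegratorPort:
    """A 'provides' port: the only surface other components may call."""
    def integrate(self, f, lo, hi, n=1000):
        # Midpoint-rule quadrature as a stand-in numerical service.
        h = (hi - lo) / n
        return sum(f(lo + (i + 0.5) * h) for i in range(n)) * h

class DriverComponent:
    """A component that 'uses' the integrator port without knowing its implementation."""
    def __init__(self):
        self.integrator = None  # wired up by the framework, not by the component itself

    def set_services(self, services):
        # The framework hands the component whatever implements the port it requested.
        self.integrator = services["IntegratorPort"]

    def run(self):
        return self.integrator.integrate(lambda x: x * x, 0.0, 1.0)

# Framework-style wiring: any other IntegratorPort implementation can be swapped in
# without touching the driver, which is the plug-and-play property described above.
driver = DriverComponent()
driver.set_services({"IntegratorPort": IntegratorPort()})
print(driver.run())  # ~0.3333
```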

Energy Exploration with High Performance Computing

As energy exploration becomes increasingly challenging, oil and gas firms deploy ever more powerful computing and storage solutions to stay ahead.