In this video, Dr Tim Stitt from the Earlham Institute describes why moving their HPC workload to Iceland made economic sense. Through the Verne Global data center, the Earlham Institute will have access to one of the world's most reliable power grids, producing 100% renewable energy from geothermal and hydro-electric sources. As EI's HPC analysis requirements continue to grow, Verne Global will enable the institute to save up to 70% in energy costs (based on a drop from 14p to 4p per kWh, with no additional power needed for cooling), significantly benefiting the organization in its advanced genomics and bioinformatics research of living systems.
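The quoted saving follows directly from the two tariffs. A minimal back-of-the-envelope check (the 14p and 4p per-kWh rates come from the figures above; the variable names are illustrative, and cooling overhead is ignored since the Icelandic facility needs no additional cooling power):

```python
# Quick sanity check of the "up to 70%" energy cost saving.
# Rates are those quoted in the article, in pence per kWh.
uk_rate_pence_per_kwh = 14.0
iceland_rate_pence_per_kwh = 4.0

# Fractional saving from switching tariffs (cooling overhead ignored).
saving = 1 - iceland_rate_pence_per_kwh / uk_rate_pence_per_kwh
print(f"Energy cost saving: {saving:.0%}")  # prints "Energy cost saving: 71%"
```

The raw ratio comes out slightly above 70%, consistent with the hedged "up to 70%" in the announcement.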
“Modern bioinformatics is driven by the generation of ever increasing volumes of genomic data requiring large and collaborative computing resources to help process it quickly and at scale. At EI, we have some of the largest computational platforms for the Life Sciences in Europe and the demand for our computing capability is only increasing, putting pressure on the capacity and operational costs of our existing data centers,” said Dr Tim Stitt, Head of Scientific Computing at EI. “We are, therefore, very excited to be partnering with Verne Global in Iceland, who not only can supply medium and high power computing density at significantly lower energy costs, but who can also deliver excellent global network communications and data center security.”
As a leader in the Life Sciences HPC community, EI aims to better understand complex scientific issues and their impact on society by categorizing, processing, and analyzing the DNA of various crops, animals, insects and microbes. This includes its flagship project, bread wheat, one of the most challenging genomes to study: its genome sequence is five times larger, and considerably more complex, than the human genome.
One of EI’s primary goals is to understand crop genomes so that new varieties can be developed to secure the food supply in the face of a growing population and environmental change. Its cutting-edge, high-throughput DNA sequencing instruments generate large amounts of data, from a few hundred gigabytes to several terabytes per run. This output requires significant computational effort, making the storage, processing, analysis and sharing of the data extremely challenging.