CIPRES Offers Gateway to Supercomputing Life Sciences

In this special guest feature, Lance Farrell from Science Node writes that CIPRES is a web-based gateway that allows researchers to easily explore evolutionary relationships between species using NSF supercomputers. “In the six years since it was established, CIPRES has enabled 2,300+ scientific publications, while only occupying about 1% of the NSF supercomputing resources. That’s an impressive return on investment.”

Exascale Architecture Trends and Implications for Programming Systems

John Shalf presented this talk at EASC2016 in Stockholm. “This talk will describe the challenges of programming future computing systems. It will then provide some highlights from the search for durable programming abstractions that more closely track emerging computer technology trends, so that when we convert our codes over, they will last through the next decade.”

Video: The Vital Importance of HPC to U.S. Competitiveness and National Security

In this video, ITIF hosts a hearing on The Vital Importance of High-Performance Computing to U.S. Competitiveness and National Security. Its recently published report urges U.S. policymakers to take decisive steps to ensure the United States continues to be a world leader in high-performance computing.

Job of the Week: Research Computing Facilitator at OSC

Ohio State University is seeking a Research Computing Facilitator in our Job of the Week. “The Ohio Supercomputer Center (OSC) provides high-performance computing (HPC) services for Ohio’s university researchers and industrial clients. The HPC Client Services Group delivers the client experience at OSC through client engagement and administration.”

Podcast: TACC Powers Zika Hackathon to Fight Disease

In this TACC podcast, Ari Kahn from the Texas Advanced Computing Center and Eddie Garcia from Cloudera describe a recent hackathon in Austin designed to tackle data challenges in the fight against the Zika virus. TACC provided time on the Wrangler data-intensive supercomputer as a virtual workspace for the Zika hackers.

Larry Smarr Presents: Using Supercomputers to Reveal your Inner Microbiome

“I have been collecting massive amounts of data from my own body over the last ten years, which reveals detailed examples of the episodic evolution of this coupled immune-microbial system. An elaborate software pipeline, running on high performance computers, reveals the details of the microbial ecology and its genetic components. A variety of data science techniques are used to pull biomedical insights from this large data set. We can look forward to revolutionary changes in medical practice over the next decade.”

Supercomputing Hidden Lava Lakes

Simulations run on the Piz Daint supercomputer revealed a large reservoir of magma right below the tiny South Korean island of Ulleung. No harm to humans is expected, but the origin of the magma pool remains unclear.

Video: Best Practices in HPC Software Development

“Scientific code developers have increasingly been adopting software processes derived from the mainstream (non-scientific) community. Software practices are typically adopted when continuing without them becomes impractical. However, many software best practices need modification and/or customization, partly because the codes are used for research and exploration, and partly because of the combined funding and sociological challenges. This presentation will describe the lifecycle of scientific software and important ways in which it differs from other software development. We will provide a compilation of software engineering best practices that have generally been found to be useful by science communities, and we will provide guidelines for adoption of practices based on the size and the scope of the project.”

Slimming Down Supercomputer Power Bills

Any performance improvements that could be wrung out of supercomputers simply by adding more power have long been exhausted. New machines, such as the Summit supercomputer due to arrive at Oak Ridge National Laboratory in the next couple of years, demand new options that will give scientists a sleek, efficient partner in making discoveries. “If necessity is the mother of invention, we’ll have some inventions happening soon,” says Susan Coghlan, deputy division director of the Argonne Leadership Computing Facility.

Interview: Getting Started with HPC Using the New IBM LSF Platform Suites

Getting started with HPC can be a challenge for SMEs, but managing a cluster doesn’t have to be a struggle. IBM’s Platform Computing group has been helping users stand up and run clusters efficiently for years. Now, with the recently announced IBM Platform LSF Suites for Workgroups and HPC, the company has made it easier than ever to kick the tires on High Performance Computing. “So basically, we would give you all the tools that would allow you to easily migrate from a loose collection of workstations to a small cluster environment. And we would handle the bare metal provisioning and then installing the software that you need really to manage your workload.”
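
For readers curious what day-to-day use of an LSF-managed cluster looks like once it is stood up, here is a minimal sketch of submitting and monitoring work with LSF’s standard bsub and bjobs command-line tools, driven from Python. The executable name, core count, and output file pattern are hypothetical placeholders, and the exact options available depend on how a site configures its queues.

    # Minimal sketch of driving an LSF cluster from Python via the standard
    # command-line tools (bsub/bjobs). Assumes LSF is installed and its
    # environment is sourced; the script name and resource requests below
    # are hypothetical placeholders, not values from the article.
    import subprocess

    def submit_job(command, cores=4, output="job.%J.out"):
        """Submit a command to LSF with bsub and return bsub's reply."""
        result = subprocess.run(
            ["bsub", "-n", str(cores), "-o", output, command],
            capture_output=True, text=True, check=True,
        )
        # bsub typically replies with a line naming the job ID and queue.
        return result.stdout.strip()

    def list_jobs():
        """Show the current user's jobs, equivalent to running bjobs."""
        return subprocess.run(["bjobs"], capture_output=True, text=True).stdout

    if __name__ == "__main__":
        print(submit_job("./my_simulation", cores=8))  # hypothetical executable
        print(list_jobs())

Wrapping the command-line tools rather than any programmatic API keeps the sketch independent of which LSF suite edition is installed; only the scheduler commands themselves are assumed to be present.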