Long Live the King – The Complicated Business of Upgrading Legacy HPC Systems

“Upgrading legacy HPC systems relies as much on the requirements of the user base as it does on the budget of the institution buying the system. There is a gamut of technology and deployment methods to choose from, and the picture is further complicated by infrastructure such as cooling equipment, storage, networking – all of which must fit into the available space. However, in most cases it is the requirements of the codes and applications being run on the system that ultimately define the choice of architecture when upgrading a legacy system. In the most extreme cases, these requirements can restrict the available technology, effectively locking an HPC center into a single technology, or restricting the adoption of new architectures because of the added complexity associated with code modernization, or porting existing codes to new technology platforms.”

In Search Of: A Quantum Leap in Processors

The fastest supercomputers are built with the fastest microprocessor chips, which in turn are built upon the fastest switching technology. But even the best semiconductors are reaching their limits as more is demanded of them. The closing months of this year brought news of several developments that could break through silicon’s performance barrier and herald an age of smaller, faster, lower-power chips. These technologies could become commercially viable within the next few years.

HPC Matters to Aerospace

In this video from the SC15 HPC Matters series, NASA Aerospace Engineer Dr. Shishir Pandya describes how high performance computing helps advance airplane and rocket technologies. “Why does high-performance computing matter? Because science matters! Discovery matters! Human beings are seekers, questers, questioners. And when we get answers, we ask bigger questions. HPC extends our reach, putting more knowledge, more discovery, and more innovation within our grasp. With HPC, the future is ours to create! HPC Matters!”

Video: Prologue O/S – Improving the Odds of Job Success

“When looking to buy a used car, you kick the tires, make sure the radio works, check underneath for leaks, etc. You should be just as careful when deciding which nodes to use to run job scripts. At the NASA Advanced Supercomputing Facility (NAS), our prologue and epilogue have grown almost into an extension of the O/S to make sure resources that are nominally capable of running jobs are, in fact, able to run the jobs. This presentation describes the issues and solutions used by the NAS for this purpose.”
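For context, a prologue check of this kind can be as simple as probing memory and scratch space before the batch system hands a node to a job. The sketch below is not the NAS prologue itself; the thresholds, the scratch path, and the specific checks are illustrative assumptions, and a real prologue would be tailored to the site's scheduler and hardware.

```python
#!/usr/bin/env python3
"""Minimal sketch of a prologue-style node health check (illustrative only)."""
import os
import shutil
import sys

MIN_FREE_MEM_GB = 4        # assumed threshold: reject nodes low on memory
MIN_SCRATCH_FREE_GB = 10   # assumed threshold: reject nodes with full scratch
SCRATCH_PATH = "/tmp"      # placeholder for a site-specific scratch filesystem

def free_memory_gb():
    """Read MemAvailable from /proc/meminfo (Linux only)."""
    with open("/proc/meminfo") as f:
        for line in f:
            if line.startswith("MemAvailable:"):
                return int(line.split()[1]) / (1024 ** 2)  # kB -> GB
    return 0.0

def scratch_free_gb(path):
    """Free space on the scratch filesystem, in GB."""
    return shutil.disk_usage(path).free / (1024 ** 3)

def scratch_writable(path):
    """Verify the scratch filesystem actually accepts writes."""
    probe = os.path.join(path, f".prologue_probe_{os.getpid()}")
    try:
        with open(probe, "w") as f:
            f.write("ok")
        os.remove(probe)
        return True
    except OSError:
        return False

def main():
    failures = []
    if free_memory_gb() < MIN_FREE_MEM_GB:
        failures.append("insufficient free memory")
    if scratch_free_gb(SCRATCH_PATH) < MIN_SCRATCH_FREE_GB:
        failures.append("scratch filesystem nearly full")
    if not scratch_writable(SCRATCH_PATH):
        failures.append("scratch filesystem not writable")
    if failures:
        print("prologue: node unhealthy: " + "; ".join(failures), file=sys.stderr)
        sys.exit(1)  # a non-zero exit typically tells the batch system to reject the node
    sys.exit(0)

if __name__ == "__main__":
    main()
```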

Evolution of NASA Earth Science Data Systems in the Era of Big Data

Christopher Lynnes from NASA presented this talk at the HPC User Forum. “The Earth Observing System Data and Information System is a key core capability in NASA’s Earth Science Data Systems Program. It provides end-to-end capabilities for managing NASA’s Earth science data from various sources—satellites, aircraft, field measurements, and various other programs.”

NASA Charts Sea Level Rise

“Sea level rise is one of the most visible signatures of our changing climate, and rising seas have profound impacts on our nation, our economy and all of humanity,” said Michael Freilich, director of NASA’s Earth Science Division. “By combining space-borne direct measurements of sea level with a host of other measurements from satellites and sensors in the oceans themselves, NASA scientists are not only tracking changes in ocean heights but are also determining the reasons for those changes.”

Video: High-Throughput Processing of Space Debris Data

“Space debris are defunct objects in space, including old space vehicles (such as satellites or rocket stages) and fragments from collisions. Space debris can cause great damage to functioning spacecraft and satellites. Thus, detection of space debris and prediction of their orbital paths are essential for today’s space mission operations. The talk presents the Python-based infrastructures BACARDI, for gathering and storing space debris data from sensors, and Skynet, for high-throughput data processing and orbital collision detection.”
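To illustrate the kind of screening step such a pipeline performs, the sketch below flags pairs of objects that pass within a chosen distance of each other at a single epoch. It is not the BACARDI or Skynet code; the object positions, identifiers, and the 5 km threshold are assumptions for the example, and a real pipeline would first propagate orbits with a perturbation model before screening.

```python
"""Minimal sketch of a pairwise close-approach screen (illustrative only)."""
import numpy as np

SCREENING_DISTANCE_KM = 5.0  # assumed close-approach threshold

def close_approaches(positions_km, ids, threshold_km=SCREENING_DISTANCE_KM):
    """Return object-ID pairs closer than threshold_km at one epoch.

    positions_km: (N, 3) array of Cartesian positions in km.
    ids:          sequence of N object identifiers.
    """
    diff = positions_km[:, None, :] - positions_km[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)                 # (N, N) pairwise distances
    i, j = np.where(np.triu(dist < threshold_km, k=1))   # upper triangle: each pair once
    return [(ids[a], ids[b], float(dist[a, b])) for a, b in zip(i, j)]

# Toy usage: three objects, two of them within 5 km of each other.
if __name__ == "__main__":
    pos = np.array([[7000.0, 0.0, 0.0],
                    [7000.0, 3.0, 0.0],
                    [7200.0, 0.0, 0.0]])
    print(close_approaches(pos, ["deb-A", "deb-B", "sat-C"]))
```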

Maximizing Benefits of HPC with the National Strategic Computing Initiative

“What we’re seeing in President Obama’s Executive Order is a major proof point of the importance of high-end computer technology in bolstering and redefining national competitiveness. In the past, a country’s competitiveness and global power was defined by economic growth and defense capabilities. But now we’re seeing the advent of actionable technological insight—especially derived from the power of big data—becoming a factor of a country’s power.”

Pleiades Supercomputer Moves Up the Ranks with Haswell

NASA reports that its newly upgraded Pleiades supercomputer ranks number 11 on the July 2015 TOP500 list of the most powerful supercomputers. And while the LINPACK computing power of Pleiades jumped nearly 21 percent, its number 5 ranking on the new HPCG benchmark list reflects its ability to tackle real-world applications.

Video: Supercomputing Exoplanets

Astronomers Erika Nesvold (UMBC) and Marc Kuchner (NASA Goddard) essentially created a virtual Beta Pictoris in the computer and watched it evolve over millions of years. It is the first full 3-D model of a debris disk in which scientists can watch the development of asymmetric features formed by planets, such as warps and eccentric rings, while simultaneously tracking collisions among the particles.