HPCG Benchmark Gains Traction for Ranking Supercomputers

Michael Heroux, Sandia National Laboratories


The High Performance Conjugate Gradients (HPCG) benchmark continues to gain traction in the high-performance computing community. The benchmark ranks supercomputers on their ability to solve complex problems rather than on raw speed alone.

More than 60 supercomputers were ranked by the HPCG benchmark in ratings released at SC15 in late November. Eighteen months earlier, only 15 supercomputers were on the list.

“HPCG is designed to complement the traditional High Performance Linpack (HPL) benchmark used as the official metric for ranking the top 500 systems,” said Sandia National Laboratories researcher Mike Heroux, who developed the HPCG program in collaboration with Jack Dongarra and Piotr Luszczek from the University of Tennessee.

The current list contains many of the same entries as the top 50 systems from Linpack’s TOP500 but significantly shuffles the HPL rankings, indicating that HPCG puts different system characteristics through their paces.

This is because the different measures provided by HPCG and HPL act as bookends on the performance spectrum of a given system, said Heroux. “While HPL tests supercomputer speed in solving relatively straightforward problems, HPCG’s more complex criteria test characteristics such as high-performance interconnects, memory systems and fine-grain cooperative threading that are important to a different and broader set of applications.”
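The benchmark’s core workload is the conjugate gradient method for sparse linear systems, whose repeated matrix-vector products stress memory bandwidth rather than peak floating-point rate. The actual HPCG reference code is C++ with MPI and OpenMP; purely as an illustration, the unpreconditioned iteration can be sketched in a few lines of NumPy (dense here for simplicity):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-8, max_iter=1000):
    """Solve A x = b for a symmetric positive-definite matrix A."""
    x = np.zeros_like(b)
    r = b - A @ x          # initial residual
    p = r.copy()           # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p                     # matrix-vector product: the memory-bound kernel
        alpha = rs_old / (p @ Ap)      # optimal step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:      # stop once the residual is small enough
            break
        p = r + (rs_new / rs_old) * p  # new A-conjugate search direction
        rs_old = rs_new
    return x

# Small SPD example system
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

In the real benchmark this kernel runs on a large sparse problem distributed across nodes, with a multigrid preconditioner, which is why interconnect latency, memory systems, and fine-grain threading dominate the score.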

Heroux said only time will tell whether supercomputer manufacturers and users gravitate toward HPCG as a useful test. “All major vendor computing companies have invested heavily in optimizing our benchmark. All participating system owners have dedicated machine time to make runs. These investments are the strongest confirmation that we have developed something useful.

“Many benchmarks have been proposed as complements or even replacements for Linpack,” he said. “We have had more success than previous efforts. But there is still a lot of work to keep the effort going.”
