Hyperion Research HPC Market Update from ISC 2019


In this video from ISC 2019, Hyperion Research hosts its annual HPC Market Update briefing. The company helps IT professionals, business executives, and the investment community make fact-based decisions on technology purchases and business strategy.

Our industry experts are the former IDC high performance computing (HPC) analyst team, which remains intact and continues all of its global activities. The group comprises the world's most respected HPC industry analysts, who have worked together for more than 25 years.

Agenda:

  • Example Hyperion Research Projects
  • Update on the HPC Market
  • Cloud Computing Update and Our New Scorecard Tool
  • Quantum Computing Update
  • The Exascale Race
  • The ISC19 Innovation Award Winners
  • Conclusions and Predictions

The HPC Innovation Excellence Award recognizes achievements by users of high performance computing (HPC) technologies. The program's main goals are to help other users understand the benefits of adopting HPC and justify HPC investments, especially for small and medium-size businesses; to demonstrate the value of HPC to funding bodies; to expand public support for increased HPC investments; and to showcase return on investment and scientific success stories involving HPC.

Bill Gropp accepts HPC Innovation Excellence Award on behalf of NCSA

Award winners and project leaders include:

  • University of Illinois at Urbana-Champaign/NCSA. Project members developed a way to harness computing power within a legal framework that embeds political science, law, AI, and statistics to help quantify partisan gerrymandering. Their algorithms are the first to amalgamate wide and varied interests to identify electoral maps that are acceptable to a broad swathe of society. Using 131,000 processors on the Blue Waters supercomputer at the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign, the team created a scalable algorithm to help tackle this problem, which could save hundreds of millions of dollars now spent on lawsuits and strengthen democratic society by supplying the missing districting information. Project leaders: Wendy K. Tam Cho and Yan Liu.
  • University of California, Davis. Engineers used the Cori supercomputer to develop Project DisCo, the first distributed HPC application of a physics-based, data-driven technique called “local causal states.” The technique offers a principled, theoretically well-motivated alternative to deep learning for scientific discovery from data. The team at UC Davis applied it to climate data from the CAM5.1 global atmospheric model, processing almost 90 terabytes of data in less than 7 minutes on Cori. New data-driven methods are needed that can discover and mathematically describe complex emergent phenomena, uncover the physical and causal mechanisms underlying them, and better predict how they evolve over time. Local causal states have the potential to do exactly this, directly from unlabeled data. Tools like the DisCo implementation of local causal states will allow scientists to ask complicated questions of their data sets without knowing all the technical details of how to properly answer them. Project leader: Adam Rupe.
  • Lawrence Livermore National Laboratory. A team at LLNL used the Trinity (Haswell) computer to seek out successful modes of laser-driven fusion implosions by building an enormous database for supervised training of a machine-learned (ML) surrogate representation of their inertial confinement fusion (ICF) simulation model. During their physics investigation, they found a new mode of physics performance that appeared in only 0.25% of the 60,000 simulations in their production-quality data set; without the aid of their ML tool, they could never have found such an isolated simulation neighborhood (a simplified sketch of the surrogate idea follows this list). In the near term, the project has sparked deep investigations into the new capabilities presented by the merger of HPC, ML, and experimental data. In the long term, the team believes it has put its finger on a way to empower scientists to grapple with increasingly complex and high-volume simulated and experimental data. Project leader: Brian Spears.
  • California Institute of Technology. Team members at Caltech pored through 10 years’ worth of Southern California seismic data to identify nearly two million previously unidentified tiny earthquakes that occurred between 2008 and 2017. The addition of more than 1.8 million events expands the Southern California earthquake catalog for that period by a factor of 10. The team used a technique called “template matching” to accomplish this feat (a simplified sketch also follows this list), and their work will help change the way earthquakes are detected. Project leader: Dr. Zachary Ross.
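
Below is a minimal, hypothetical sketch of the surrogate-model workflow the LLNL entry describes: fit a fast regressor to archived (simulation inputs, simulated outcome) pairs, then use the cheap surrogate to sweep a far larger parameter space and flag the rare settings worth re-running with the full physics code. All data, parameter names, and thresholds here are illustrative assumptions, not LLNL's actual pipeline; scikit-learn is used only for brevity.

```python
# Hypothetical sketch: train a cheap ML surrogate on archived simulation results,
# then use it to flag rare, high-performing regions of parameter space.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Stand-in for an archive of completed simulations: five input parameters per
# run and one scalar outcome (e.g., yield). Real data would come from HPC runs.
X_train = rng.uniform(0.0, 1.0, size=(60_000, 5))
y_train = np.exp(-50.0 * np.sum((X_train - 0.7) ** 2, axis=1))  # rare, sharp peak

# Fit the surrogate to the archived results.
surrogate = RandomForestRegressor(n_estimators=100, n_jobs=-1, random_state=0)
surrogate.fit(X_train, y_train)

# Sweep a much larger set of candidate inputs with the cheap surrogate and keep
# only the small fraction predicted to be unusually high-performing.
X_candidates = rng.uniform(0.0, 1.0, size=(200_000, 5))
pred = surrogate.predict(X_candidates)
threshold = np.quantile(pred, 0.9975)  # keep roughly the top 0.25% of predictions
promising = X_candidates[pred >= threshold]
print(f"{len(promising)} candidate settings flagged for full-physics follow-up")
```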

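Similarly, the template matching behind the Caltech catalog can be illustrated with a toy example: slide a known event waveform (the template) along a continuous seismic record and flag windows whose normalized cross-correlation exceeds a detection threshold. Everything below (sampling rate, waveform shape, threshold) is a synthetic assumption; production earthquake-detection pipelines are far more elaborate.

```python
# Toy template matching: correlate a known waveform against a continuous record.
import numpy as np

rng = np.random.default_rng(1)

fs = 100  # assumed sampling rate in Hz
t = np.arange(0, 1, 1 / fs)
template = np.sin(2 * np.pi * 5 * t) * np.hanning(len(t))  # toy event waveform

# Continuous record: background noise with two buried, scaled copies of the template.
record = 0.1 * rng.standard_normal(60 * fs)
for start, amp in [(12 * fs, 0.4), (41 * fs, 0.25)]:
    record[start:start + len(template)] += amp * template

def normalized_xcorr(data, tmpl):
    """Normalized cross-correlation of the template against every window of data."""
    n = len(tmpl)
    tmpl = (tmpl - tmpl.mean()) / (tmpl.std() * n)
    out = np.empty(len(data) - n + 1)
    for i in range(len(out)):
        win = data[i:i + n]
        out[i] = np.dot(tmpl, (win - win.mean()) / (win.std() + 1e-12))
    return out

cc = normalized_xcorr(record, template)
detections = np.flatnonzero(cc > 0.6)  # threshold is an illustrative choice
print("candidate detections near samples:", detections[:10])
```
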
Hyperion Research welcomes new award entries from around the world. New winners will be announced twice a year, at ISC in June and SC in November. Submissions must contain a clear description of the impact their project had on the world or their industry. The HPC User Forum Steering Committee performs an initial ranking of submissions, after which domain and vertical experts are called on, as needed, to evaluate the submissions.

Check out our insideHPC Events Calendar