
Exascale: ECP’s QMCPACK Project for Predicting and Controlling Materials

In this episode of the Let’s Talk Exascale podcast, produced by DOE’s Exascale Computing Project, the topic is an ECP subproject called QMCPACK, which aims to find, predict, and control materials from first principles with predictive accuracy. This episode offers a conversation with QMCPACK’s principal investigator, Paul Kent, a distinguished R&D staff member at Oak Ridge National Laboratory. The discussion…

@HPCpodcast: Horst Simon on DOE’s Post-Exascale HPC Vision – More Flexibility, More Vendor Diversity; the Start of a New Leadership-class Supercomputing Market?

Following DOE’s next-gen supercomputing Request for Information (RFI) issued last week, we discussed what it all may mean with Dr. Horst Simon, Special Advisor to the Laboratory Director at Lawrence Berkeley National Laboratory and co-editor of the TOP500 list since 2000. He takes us through the current (though fungible) state of DOE’s post-exascale vision, the implications of the Advanced Computing Ecosystem (ACE) outlined in the RFI, and the possible emergence of a new, more vendor-diverse leadership-class systems market.

@HPCpodcast: Google’s Lifelike LaMDA AI Chatbot and Questions of Being or Nothingness

When a tech news story gets talked about on sports radio, you know it’s gone very viral. That’s what happened last week with the story about a Google engineer, Blake Lemoine, who declared that the company’s AI chatbot, LaMDA, is a person with rights. Lemoine promptly got suspended by Google for his trouble, and he says he won’t be surprised if he gets fired. In this episode of the @HPCpodcast, Shahin Khan and insideHPC editor-in-chief Doug Black talk about LaMDA’s amazingly lifelike conversational capability; how it can ingest books and research papers and share insights about them in real time (i.e., during conversations); the deep fake-related ethical questions raised by LaMDA; and the urgency of thoughtful social policies based on ethical and legal frameworks and philosophical issues of sentience, being and nothingness – artificial and otherwise.

@HPCpodcast: Parallel Processing Systems Pioneer Dr. Thomas Sterling on the State of HPC

Following his highly anticipated and always-insightful closing night keynote at the recent ISC conference, we caught up with Prof. Thomas Sterling to discuss the state of HPC.  Dr. Sterling is professor of intelligent systems engineering at Indiana University School of Informatics, Computing, and Engineering, and president and co-founder of Simultac, a technology company focused on […]

@HPCpodcast: On the Scene at ISC 2022 – HPE, AMD Make TOP500 News; Intel Makes News of Its Own

ISC 2022 in Hamburg was notable for a number of reasons – it was not only the first in-person ISC since 2019, it also provided a plethora of major news. This included big changes at the top of the TOP500 list of the world’s most powerful supercomputers, with the Frontier HPC system at Oak Ridge National Lab surpassing the exascale milestone. While AMD, whose chips power Frontier, and HPE, which built Frontier, were the conference’s spotlight vendors, Intel also made some impressive product announcements, as analyzed in this discussion by Shahin Khan. You can find our podcasts at insideHPC’s @HPCpodcast page, on Twitter and at the blog. Here’s the RSS feed.

@HPCpodcast: Google Cloud’s 9 Exaflop AI Supercomputer, Chip Price Hikes, HPC Helps Photograph a Black Hole in Our Galaxy and IBM’s Quantum Ambitions

@HPCpodcast was so pleased with our last episode, an extensive interview with the University of Tennessee’s (and Oak Ridge National Laboratory’s) Jack Dongarra (our followers liked it too), that we decided to let that episode stand as a special, double-length edition. Now Shahin and Doug are back behind our mics talking about a raft of interesting news in the HPC/AI world, including what Google Cloud bills as the fastest AI supercomputer, price hikes on chips from TSMC and Samsung, the role of the Frontera supercomputer in the visualization of a black hole in our Milky Way galaxy and IBM’s ambitious and well-executed quantum computing roadmap.

@HPCpodcast: Oak Ridge Assoc. Director Dr. Jeff Nichols on Frontier, on History-making HPC – and on His Retirement

In this episode of the @HPCpodcast, join us for a rare, behind-the-scenes glimpse at the Frontier exascale supercomputer, how it was built in the middle of a pandemic and how it’s being prepared for full user-readiness. Frontier is a $600 million, 30 MW system comprising 50-60 million parts in more than 100 cabinets, deployed at the Oak Ridge….

@HPCpodcast: The GTC Cornucopia

Last week’s rendition of NVIDIA’s twice-yearly GTC extravaganza unveiled a raft of new HPC/AI announcements, the latest public performance of a company in its prime led by a leather-clad CEO generally regarded as a master marketer. OK, roll your eyes at that gushing statement if you like, but it reflects the sentiment of Wall Street, which pushed NVIDIA stock up 10 percent the day CEO Jensen Huang delivered his GTC keynote, and of most (if not all) in the HPC industry analyst community. In this episode of the @HPCpodcast….

@HPCpodcast: Dan Reed on the Challenges to U.S. Global Supercomputing Competitiveness

In a recently published paper, “Reinventing High Performance Computing: Challenges and Opportunities,” three HPC luminaries have started an important discussion about the future of HPC and its impact on American competitiveness. In this episode of the @HPCpodcast, we talk with one of the authors, Dan Reed of the University of Utah, on the challenges facing the United States as it strives to compete globally in high-end supercomputing.

@HPCpodcast: Argonne’s Rick Stevens on AI for Science (Part 2) – Coming Breakthroughs, Ethics and the Replacement of Scientists by Robots

In part 2 of our not-to-be-missed @HPCpodcast with Argonne National Laboratory Associate Director Rick Stevens, he discusses some of the important advances that had, by 2015, likely ended the cycle of AI-for-science winters. He also delves into the major challenges in AI for science, such as building models that are transparent and unbiased while also robust and secure. And Stevens looks at important upcoming AI-for-science breakthrough use cases, including the welcome news – for researchers beset by mountains of scientific papers – of using large natural language models to ingest and collate existing knowledge of a scientific problem, enabling analysis of the literature that, Stevens said, goes well beyond a Google search….