
RDMA Enabling Storage Technology Revolution

Michael Kagan, CTO Mellanox Technologies

With the explosion of data over the past few years, data storage has become a hot topic among corporate decision makers. It is no longer sufficient to have adequate space for the massive quantities of data that must be stored; it is just as critical that stored data be accessible without any bottlenecks that impede the ability to process and analyze data in real time.

Seagate to Lead Sage Storage Project for Exascale Horizon 2020


“We are excited that the H2020 SAGE Project gives us the opportunity to research and move HPC storage into the Exascale age,” said Ken Claffey, vice president and general manager, Seagate HPC systems business. “Seagate will contribute its unique skills and device technology to address the convergence of Exascale and Big Data, with an excellent selection of participants each bringing their own capabilities together to build the future of storage on an unprecedented scale.”

Stepping up to the performance challenge

David Lecomber, CEO, Allinea

The performance-savvy HPC developer is in high demand today. Leaps in intra-node parallelism and in memory performance and capacity are about to collide head-on with applications that already struggle to exploit existing systems.

Video: On the Role of Flash in Large-Scale Storage Systems


Nathan Rutman from Seagate presented this talk at the LAD’15 Conference. “So why is a spinning disk company talking about Flash? Last year, Seagate acquired Avago LSI’s flash division. We now have an array of flash-based storage. So I have nothing against Flash. This presentation is really on: Where does Flash make sense? I also have a personal agenda because I hate the term “Burst Buffer.” Everyone says “Burst Buffer” instead of saying “Flash.” It drives me crazy. So I’m going to explain what a Burst Buffer is and what it is not.”

HPC in Seismic Processing and Interpretation and Reservoir Modeling

Katie Garrison, Marketing Communications, One Stop Systems

Oil and gas are becoming increasingly hard to find. This article looks at how oil and gas companies are using cutting-edge technology, like HPC servers, compute accelerators and flash storage arrays, for applications such as seismic processing, seismic interpretation and reservoir modeling.

Altair, Intel and Amazon Offer HPC Challenge


For companies looking to test the viability of engineering in the cloud, Altair has teamed with Intel and Amazon Web Services (AWS) to offer an “HPC Challenge” for product design. In a nutshell, the program provides free cycles on AWS for up to 60 days, where users can run compute-intensive jobs for computer-aided engineering (CAE).

Reducing Your Data Center “Water Guilt”


Concerns over data center water usage have become topical both in the industry and even in the general press of late. This is not a bad thing as data center water usage is a legitimate concern. The reality is that the problem is rooted in today’s established approaches to data center cooling.

Video: Argonne’s Pete Beckman Describes the Challenges of Exascale


“Argonne National Laboratory is one of the laboratories helping to lead the exascale push for the nation with the DOE. We lead in a number of areas with software and storage systems and applied math. And we’re really focusing, our expertise is focusing on those new ideas, those novel new things that will allow us to sort of leapfrog the standard slow evolution of technology and get something further out ahead, three years, five years out ahead. And that’s where our research is focused.”

New Intel® Omni-Path White Paper Details Technology Improvements

Rob Farber

The Intel Omni-Path Architecture (Intel® OPA) whitepaper goes through the multitude of improvements that Intel OPA technology provides to the HPC community. In particular, HPC readers will appreciate how collective operations can be optimized based on message size, collective communicator size and topology using the point-to-point send and receive primitives.
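To make the idea of composing a collective from point-to-point primitives concrete, here is a small, self-contained sketch (not Intel OPA code; the function name and structure are illustrative assumptions) of the communication schedule for a binomial-tree broadcast. Each round, every rank that already holds the message forwards it to one more rank, so all P ranks are reached in about log2(P) rounds of point-to-point sends rather than P−1 sequential sends from the root.

```python
def binomial_broadcast(nranks, root=0):
    """Illustrative sketch: list the point-to-point (src, dst) sends
    a binomial-tree broadcast performs to reach `nranks` ranks.
    This is a generic textbook pattern, not the Intel OPA implementation."""
    sends = []
    have = {root}            # ranks that currently hold the message
    step = 1
    while step < nranks:
        # every rank that has the data forwards it `step` ranks away
        for src in sorted(have):
            dst = src + step
            if dst < nranks and dst not in have:
                sends.append((src, dst))
        have.update(d for _, d in sends)
        step *= 2            # doubling: reached ranks double each round
    return sends

sends = binomial_broadcast(8)
print(sends)  # 7 sends total, completed in 3 doubling rounds
```

For 8 ranks this produces 7 sends spread over 3 rounds; a real implementation would pick the tree shape (binomial, pipelined, etc.) based on message size and topology, which is exactly the tuning space the whitepaper discusses.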

Pushing the Boundaries of Combustion Simulation with Mira


“Researchers at the U.S. Department of Energy’s Argonne National Laboratory will be testing the limits of computing horsepower this year with a new simulation project from the Virtual Engine Research Institute and Fuels Initiative (VERIFI) that will harness 60 million computer core hours to dispel those uncertainties and pave the way to more effective engine simulations.”