So, if I had to put this talk in a nutshell: yes, I am very bullish on how Docker (containers) and virtualization are going to change the face of HPC. If you want to understand why, check out what my friends Christian Kniep and Josh Simons have been up to on those fronts.
“The significance of the Coral announcement goes far beyond the specialism of high-performance computing (HPC) into enterprise computing, where the technologies being developed for HPC could transform this much wider and financially more important sector of the economy, according to the members of the winning partnership of IBM, Nvidia, and Mellanox. ‘There are game-changing elements to what we are doing,’ Ken King, general manager of OpenPower Alliances at IBM, told Scientific Computing World.”
In this video from the Nov. 17 Seagate User Group at SC14, Henry Newman from Instrumental presents: Tape vs. Disk 2015-2020. Transcript: So, I’m the last speaker between you and the opening gala event, so I will talk quickly. Ken heard about a study we did for a government agency looking at long-term archival issues […]
In this video, the Radio Free HPC team meets at SC14 in New Orleans to discuss the recent news that Nvidia & IBM will build two Coral 150+ Petaflop Supercomputers in 2017 for Lawrence Livermore and Oak Ridge National Laboratories. The two machines will feature IBM POWER9 processors coupled with Nvidia’s future Volta GPU technology. NVLink will be a critical piece of the architecture as well, along with a system interconnect powered by Mellanox.
“Broadly speaking, Battery invests globally in cutting-edge, category-defining businesses in markets including enterprise IT, software and services, e-commerce, digital media and industrial technologies. And we invest at every stage of a company’s lifecycle from seed to buyout. From there, we are looking for big markets. Entrepreneurs who are building a great team and attacking a huge market opportunity are always of interest to us. We are also watching the tectonic shifts brought on by trends like cloud computing, big data and mobile, and working to identify the companies that are taking advantage of those trends to help define, or in many cases re-define, industries.”
The doors will soon open, the curtains will rise – and what really #HPCMatters will shine in the floodlights of New Orleans. It will be the applications of HPC that define this SC conference – where the life/business/world-impacting results are found. Applications are the sharp end of the mission. But who or what lies behind application successes?
In an 1883 lecture on “The Practical Applications of Electricity”, Scottish physicist Lord Kelvin stated: “… when you can measure what you are speaking about, and express it in numbers, you know something about it …” High Performance Computing (HPC), therefore, inherited a healthy predisposition towards monitoring. Fast-forwarding to the present, monitoring HPC clusters remains topical. And while I expect we can all agree on its ongoing relevance, it is clear that there are very different perspectives as to how monitoring should be modernized. Whereas passive monitoring using meta-toolkits may address needs temporarily, unified solutions that combine monitoring with provisioning and management deliver value on an ongoing, sustainable basis.
This Week in HPC: Exascale Bill Faces Uncertain Future in U.S. Senate, and HPC Gets in the Entrepreneurial Spirit
While many of us old-guard HPC codgers may remember a time when Microsoft was making a big push into HPC some years ago, the company’s current HPC strategy is pretty much non-existent from what I can tell. They do have a solid “Big Compute” strategy, however, and when I came across their Azure site today, I learned a thing or two.