
Video: HP R&D and HPC

Joseph George, Director, HP Apollo Servers, HPC & Big Data Solutions

“The HP Apollo 8000 supercomputing platform approaches HPC from an entirely new perspective: the system is cooled directly with warm water. This is done through a “dry-disconnect” cooling concept implemented with the simple but efficient use of heat pipes. Unlike cooling fans, which are designed for maximum load, the heat pipes can be optimized by administrators. Relative to a strictly air-cooled system, the approach allows significantly greater performance density, cuts energy consumption in half, and creates synergies with other building energy systems.”

Video: NOAA Software Engineering for Novel Architectures (SENA) Project


“NOAA will acquire software engineering support and associated tools to re-architect NOAA’s applications to run efficiently on next-generation fine-grain HPC architectures.” From a recent procurement document: “Fine-grain architecture (FGA) is defined as: a processing unit that supports more than 60 concurrent threads in hardware (e.g. a GPU or a large core-count device).”

European HPC Plans for the Next Decade


Bob Sorensen from IDC presented this talk at the HPC User Forum. In a recent study, IDC assessed the EU’s progress towards its 2012 action plan and made recommendations for funding exascale systems and fostering industrial HPC in the coming decade.

IDC Presents an Update on ROI with HPC

Earl Joseph, IDC

Earl Joseph from IDC presented this talk at the HPC User Forum. “This study investigates how HPC investments can improve economic success and increase scientific innovation. This research is focused on the common good and should be useful to DOE, other government agencies, industry, and academia.”

Video: Leveraging Containers in Elastic Environments


In this video from the Disruptive Technologies session at the 2015 HPC User Forum, Nick Ihli from Adaptive Computing presents: Leveraging Containers in Elastic Environments.

Video: Dell’s New HPC Vision, Strategy, and Plans

Jim Ganthier, Dell

Jim Ganthier from Dell presented this talk at the HPC User Forum. “Dell HPC solutions are deployed across the globe as the computational foundation for industrial, academic and governmental research critical to scientific advancement and economic and global competitiveness. With the richness of the Dell enterprise portfolio, HPC customers are increasingly relying on Dell HPC experts to provide integrated, turnkey solutions and services resulting in enhanced performance, reliability and simplicity.”

ExaNeSt Technology: Targeting Exascale in 2018


Peter Hopton from Iceotope presented this talk at the HPC User Forum. “ExaNeSt will develop, evaluate, and prototype the physical platform and architectural solution for a unified Communication and Storage Interconnect and the physical rack and environmental structures required to deliver European Exascale Systems. The consortium brings technology, skills, and knowledge across the entire value chain from computing IP to packaging and system deployment; and from operating systems, storage, and communication to HPC with big data management, algorithms, applications, and frameworks. Building on a decade of advanced R&D, ExaNeSt will deliver the solution that can support exascale deployment in the follow-up industrial commercialization phases.”

Intelligent Light: Breaking the Disk IO Bottleneck in CFD by Eliminating It


“FieldView products and services from Intelligent Light have been specifically developed to help CFD users get more reliable results in less time from their CFD investments. Post-processing can be the most important step in the CFD process: this is where the “pay off” occurs, where you gain insight and make decisions. Yet it is often overlooked when planning effective CFD workflows.”

User Agency Panel Discussion on the NSCI Initiative


In this video (with transcript) from the 2015 HPC User Forum in Broomfield, Bob Sorensen from IDC moderates a User Agency panel discussion on the NSCI initiative. “You all have seen that usable statement inside the NSCI, and we are all about trying to figure out how to make usable machines. That is a key critical component as far as we’re concerned. But the thing that I think we’re really seeing, we talked about the fact that single-thread performance is not increasing, and so what we’re doing is we’re simply increasing the parallelism and then the physics limitations, if you will, of how you cool and distribute power among the parts that are there. That really is leading to a paradigm shift from something that’s based on how fast you can crunch the numbers to how fast you can feed the chips with data. It’s really that paradigm shift, I think, more than anything else that’s really going to change the way that we have to do our computing.”

Optalysys: Disruptive Optical Processing Technology for HPC


In this video from the Disruptive Technologies Session at the 2015 HPC User Forum, Nick New from Optalysys describes the company’s optical processing technology. “Optalysys technology uses light, rather than electricity, to perform processor-intensive mathematical functions (such as Fourier Transforms) in parallel at incredibly high speeds and resolutions. It has the potential to provide multi-exascale levels of processing, powered from a standard mains supply. The mission is to deliver a solution that requires several orders of magnitude less power than traditional High Performance Computing architectures.”
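For context, the 2D Fourier transform mentioned above is the kind of processor-intensive kernel such an optical system would compute in a single pass. A minimal sketch of the equivalent digital workload, using NumPy's standard FFT routines (the array size is illustrative; this is not Optalysys's API):

```python
import numpy as np

# Illustrative digital baseline: a 2D Fourier transform over a large grid,
# the class of operation an optical processor performs in parallel with light.
grid = np.random.default_rng(0).standard_normal((1024, 1024))

spectrum = np.fft.fft2(grid)        # forward 2D FFT
recovered = np.fft.ifft2(spectrum)  # inverse transform

# The round trip recovers the original field to floating-point accuracy.
assert np.allclose(recovered.real, grid)
```

On a conventional machine this scales as O(N log N) per transform and dominates the runtime of many CFD and imaging pipelines, which is why it is the headline target for optical acceleration.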