Interview: European cHiPSet Event focuses on High-Performance Modeling and Simulation for Big Data Applications

The cHiPSet Annual Plenary Meeting takes place in France next month. To learn more, we caught up with the Vice-Chair for the project, Dr. Horacio González-Vélez, Associate Professor and Head of the Cloud Competency Centre at the National College of Ireland. “The plenary meeting will feature a workshop entitled ‘Accelerating Modeling and Simulation in the Data Deluge Era’. We are expecting keynote presentations and panel discussions on how the forthcoming exascale systems will influence the analysis and interpretation of data, including the simulation of models, to match observation to theory.”

Supercomputing Frontiers Europe Announces Keynote Speakers

The Supercomputing Frontiers conference has announced its full agenda and keynote speakers. The event takes place March 12-15 in Warsaw, Poland. “Supercomputing Frontiers is an annual international conference that provides a platform for thought leaders from both academia and industry to interact and discuss visionary ideas, important trends, and substantial innovations in supercomputing.”

Report: Future Software and Data Ecosystem for Scientific Inquiry

“The tremendous progress that we’re making toward the achievement of exascale systems, both here in the United States and in the European Union and Asia, will be undermined unless we can create a shared distributed computing platform to manage the logistics of massive, multistage data workflows with their sources at the network edge. Backhauling these rivers of data to the supercomputing center or the commercial cloud will not be a viable option for many, if not most, applications.”

VMware Moves Virtualized HPC Forward at SC17

In this video from SC17, Martin Yip and Josh Simons from VMware describe how the company is moving Virtualized HPC forward. “In recent years, virtualization has started making major inroads into the realm of High Performance Computing, an area that was previously considered off-limits. In application areas such as life sciences, electronic design automation, financial services, Big Data, and digital media, people are discovering that there are benefits to running a virtualized infrastructure that are similar to those experienced by enterprise applications, but also unique to HPC.”

Video: PASC18 to Focus on Big Data & Computation

In this video, Florina Ciorba from the University of Basel describes the theme of the upcoming PASC18 conference. With a focus on the convergence of Big Data and Computation, the conference takes place from July 2-4, 2018 in Basel, Switzerland. “PASC18 is the fifth edition of the PASC Conference series, an international platform for the exchange of competences in scientific computing and computational science, with a strong focus on methods, tools, algorithms, application challenges, and novel techniques and usage of high performance computing.”

Quantum Drives High Performance Storage at SC17

In this video from SC17, Molly Presley from Quantum describes how the company’s high performance storage systems power HPC. “So why have an autonomous car at a Supercomputing show? The answer is Big Data. The Autonomous Stuff vehicle in this video is actually a rolling software development platform equipped with sensors that generate a whopping 30 Terabytes of data per day. Now just imagine if there were millions of vehicles on the road generating this kind of data. Only HPC could deal with that problem at scale. Companies like Quantum are stepping up to help solve this big data problem in the vehicle, at the edge, and in the datacenter.”
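
As a rough back-of-envelope illustration of the scale behind that claim, the sketch below multiplies the per-vehicle figure from the quote by a hypothetical fleet size (the one-million-vehicle count is an assumption for illustration, not a figure from Quantum):

    /* Rough scaling sketch: 30 TB/day per vehicle is from the quote above;
       the one-million-vehicle fleet size is a hypothetical assumption. */
    #include <stdio.h>

    int main(void) {
        const double tb_per_vehicle_per_day = 30.0;  /* from the quote */
        const double vehicles = 1.0e6;               /* assumed fleet size */
        const double total_tb = tb_per_vehicle_per_day * vehicles;

        /* 1 EB = 1,000,000 TB in decimal (SI) units */
        printf("Aggregate: %.0f TB/day (about %.0f EB/day)\n",
               total_tb, total_tb / 1.0e6);
        return 0;
    }

A million such vehicles would generate on the order of 30 exabytes per day, which is the kind of volume that motivates HPC-class storage both at the edge and in the datacenter.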

DDN Powers Big Data and Machine Learning at UTC in Tennessee

Last week at SC17, DataDirect Networks announced that The University of Tennessee, Chattanooga (UTC) has selected DDN’s GS14KX parallel file system appliance with 1.1PB of storage to replace its aging big data storage system and to support a diversifying range of data-intensive research projects. The Center of Excellence in Applied Computational Science and Engineering (SimCenter) at UTC needed a big data storage solution that could scale easily to support growing research programs focused on computational fluid dynamics (CFD), machine learning, data analytics, smart cities and molecular biology.

Designing HPC, Big Data, & Deep Learning Middleware for Exascale

DK Panda from Ohio State University presented this talk at the HPC Advisory Council Spain Conference. “This talk will focus on challenges in designing HPC, Big Data, and Deep Learning middleware for Exascale systems with millions of processors and accelerators. For the HPC domain, we will discuss the challenges in designing runtime environments for MPI+X (PGAS OpenSHMEM/UPC/CAF/UPC++, OpenMP, and CUDA) programming models. Features and sample performance numbers from MVAPICH2 libraries will be presented.”
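
For readers unfamiliar with the MPI+X terminology, the sketch below is a minimal hybrid MPI + OpenMP program in C. It is an illustrative example of the general programming model, not code from the talk or from the MVAPICH2 libraries: each MPI rank spawns a team of OpenMP threads, and the MPI library is initialized with an explicit thread-support level so the runtime knows threads are present.

    #include <mpi.h>
    #include <omp.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        int provided, rank, size;

        /* Request funneled thread support: only the master thread calls MPI. */
        MPI_Init_thread(&argc, &argv, MPI_THREAD_FUNNELED, &provided);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Each rank spawns OpenMP threads: the "X" in MPI+X. */
        #pragma omp parallel
        {
            printf("Rank %d of %d, thread %d of %d\n",
                   rank, size, omp_get_thread_num(), omp_get_num_threads());
        }

        MPI_Finalize();
        return 0;
    }

Built with an MPI wrapper compiler and OpenMP enabled (for example, mpicc -fopenmp) and launched with mpirun, the program prints one line per rank/thread pair; middleware such as MVAPICH2 is what has to make this kind of hybrid execution efficient on systems with millions of cores and accelerators.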

Video: Evolution of MATLAB

Cleve Moler from MathWorks gave this talk at the 2017 Argonne Training Program on Extreme-Scale Computing. “MATLAB is a high-performance language for technical computing. It integrates computation, visualization, and programming in an easy-to-use environment where problems and solutions are expressed in familiar mathematical notation. Typical uses include data analysis, exploration, and visualization.”

HPC in Agriculture: NCSA and Syngenta’s Dynamic Partnership

In this video, Jim Mellon from Syngenta describes how the company’s partnership with NCSA is helping the company answer the agricultural challenges of the future. “Together, we’re solving some of the toughest issues in agriculture today, like how to feed our rapidly growing population knowing that the amount of land we have for growing crops is finite. NCSA Industry provides the HPC resources that Syngenta’s scientists need to solve these issues, as well as an industry focus on security, performance, and availability, with the consultancy to better understand how to maximize these resources.”