Challenges for Climate and Weather Prediction in the Era of Heterogeneous Architectures

Beth Wingate from the University of Exeter presented this talk at the PASC16 conference in Switzerland. “For weather or climate models to achieve exascale performance on next-generation heterogeneous computer architectures they will be required to exploit on the order of million- or billion-way parallelism. This degree of parallelism far exceeds anything possible in today’s models even though they are highly optimized. In this talk I will discuss the mathematical issue that leads to the limitations in space- and time-parallelism for climate and weather prediction models – oscillatory stiffness in the PDE.”
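The cost of that oscillatory stiffness can be made concrete with back-of-envelope arithmetic: with an explicit time-stepping scheme, the step size is bounded by the fastest oscillation frequency in the PDE, so the total step count is dictated by the fast waves rather than by the slow climate signal of interest. A minimal sketch, with purely illustrative numbers that are not from the talk:

```python
# Hypothetical illustration of oscillatory stiffness: the fastest wave
# frequency omega bounds the explicit time step, not the slow dynamics.
omega_fast = 1.0e-2         # assumed fast-wave frequency, rad/s (made up)
horizon = 10 * 365 * 86400  # ten model-years of simulated time, in seconds

# An explicit scheme must resolve the fastest oscillation, so the step
# is limited to roughly 1/omega regardless of the signal being studied.
dt_max = 1.0 / omega_fast   # 100 s per step
n_steps = int(horizon / dt_max)  # millions of steps forced by fast waves
```

Time-parallel methods aim to break exactly this serial dependence across the `n_steps` sequential updates.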

Simulations of Hydrogen Ingestion Flashes in Giant Stars

“My team at the University of Minnesota has been collaborating with the team of Falk Herwig at the University of Victoria to simulate brief events in the lives of stars that can greatly affect the heavy elements they synthesize in their interiors and subsequently expel into the interstellar medium. These events are caused by the ingestion of highly combustible hydrogen-rich fuel into the convection zone above a helium burning shell in the deeper interior. Although these events are brief, it can take millions of time steps to simulate the dynamics in sufficient detail to capture subtle aspects of the hydrogen ingestion. To address the computational challenge, we exploit modern multicore and many-core processors and also scale the simulations to run efficiently on over 13,000 nodes of NSF’s Blue Waters machine at NCSA.”

IBM POWER8 System to Advance Genomic Health Research at University of Calgary

Today IBM and the University of Calgary announced a five-year collaboration to accelerate and expand genomic research into common childhood conditions such as autism, congenital diseases and the many unknown causes of illness. As part of the collaboration, IBM will augment the existing research capacity at the Cumming School of Medicine’s Alberta Children’s Hospital Research Institute by installing a POWER8-based computing and storage infrastructure along with advanced analytics and cognitive computing software.

Slidecast: Announcing Mellanox ConnectX-5 100G InfiniBand Adapter

“Today, scalable compute and storage systems suffer from data bottlenecks that limit research and product development, and constrain application services. ConnectX-5 will help unleash business potential with faster, more effective, real-time data processing and analytics. With its smart offloading, ConnectX-5 will enable dramatic increases in CPU, GPU and FPGA performance that will enhance effectiveness and maximize the return on data centers’ investment.”

Learnings from Operating 200 PB of Disk-Based Storage

Gleb Budman from Backblaze presented this talk at the 2016 MSST Conference. “For Q1 2016 we are reporting on 61,590 operational hard drives used to store encrypted customer data in our data center. In Q1 2016, the hard drives in our data center, past and present, totaled over one billion hours in operation to date. That’s nearly 42 million days or 114,155 years’ worth of spinning hard drives. Let’s take a look at what these hard drives have been up to.”
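The unit conversion in the quote checks out; a quick sketch of the arithmetic:

```python
# Convert the quoted one billion drive-hours into days and years
hours = 1_000_000_000
days = hours / 24    # about 41.7 million days
years = days / 365   # about 114,155 years
```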

Video: Speeding Up Code with the Intel Distribution for Python

David Bolton from Slashdot shows how ‘embarrassingly parallel’ code can be sped up by over 2000x (a factor, not a percentage) using Intel tools, including the Intel Distribution for Python and OpenMP. “The Intel Distribution for Python* 2017 Beta program is now available. The Beta product adds new Python packages like scikit-learn, mpi4py, numba, conda, tbb (Python interfaces to Intel Threading Building Blocks) and pyDAAL (Python interfaces to Intel Data Analytics Acceleration Library).”
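The large speedups in the video come from Intel's accelerated stack; the general principle is simply to move hot loops out of the Python interpreter into optimized native code. As a stand-in illustration (the function names here are mine, not from the talk, and accelerated distributions such as Intel's make the NumPy path faster still):

```python
import math
import numpy as np

def slow_norms(xs):
    # Pure-Python loop: one interpreter round trip per element
    return [math.sqrt(x * x + 1.0) for x in xs]

def fast_norms(xs):
    # Same arithmetic, but the whole loop runs in compiled native code;
    # an MKL-backed NumPy accelerates this path further
    a = np.asarray(xs, dtype=np.float64)
    return np.sqrt(a * a + 1.0)
```

Both functions compute the same values; only the second keeps the per-element work outside the interpreter, which is where vectorized and compiled approaches get their speedups.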

Video: Extreme-Scale Multigrid Components within PETSc

In this video from the PASC16 conference, Patrick Sanan from USI Lugano & ETH Zurich presents: Extreme-Scale Multigrid Components within PETSc. “Elliptic partial differential equations (PDEs) frequently arise in continuum descriptions of physical processes relevant to science and engineering. Multilevel preconditioners represent a family of scalable techniques for solving discrete PDEs of this type and thus are the method of choice for high-resolution simulations.”
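PETSc provides production multigrid components at extreme scale; to show the underlying idea only, here is a minimal two-grid cycle for the 1D Poisson problem with a weighted-Jacobi smoother. This is a textbook sketch of my own, not PETSc's API:

```python
import numpy as np

def apply_A(u, h):
    # Matrix-free 1D Poisson operator: (-u[i-1] + 2u[i] - u[i+1]) / h^2
    up = np.concatenate(([0.0], u, [0.0]))  # homogeneous Dirichlet BCs
    return (-up[:-2] + 2.0 * up[1:-1] - up[2:]) / h**2

def jacobi(u, f, h, sweeps=2, omega=2.0 / 3.0):
    # Weighted-Jacobi smoother: damps the oscillatory error components
    for _ in range(sweeps):
        u = u + omega * (h**2 / 2.0) * (f - apply_A(u, h))
    return u

def restrict(r):
    # Full weighting: coarse value = (1/4, 1/2, 1/4) average of fine points
    return 0.25 * r[0:-2:2] + 0.5 * r[1:-1:2] + 0.25 * r[2::2]

def prolong(e):
    # Linear interpolation from the coarse grid back to the fine grid
    ef = np.zeros(2 * len(e) + 1)
    ef[1::2] = e            # coarse points inject directly
    ef[:-1:2] += 0.5 * e    # fine points average their coarse neighbors
    ef[2::2] += 0.5 * e
    return ef

def two_grid(u, f, h):
    # One V-cycle on two levels: smooth, correct from coarse grid, smooth
    u = jacobi(u, f, h)                 # pre-smooth
    rc = restrict(f - apply_A(u, h))    # restrict the residual
    m, H = len(rc), 2.0 * h
    Ac = (2.0 * np.eye(m) - np.eye(m, k=1) - np.eye(m, k=-1)) / H**2
    ec = np.linalg.solve(Ac, rc)        # solve the small coarse problem
    u = u + prolong(ec)                 # coarse-grid correction
    return jacobi(u, f, h)              # post-smooth
```

Real multilevel preconditioners recurse on the coarse solve instead of inverting it directly, which is what makes the cost optimal, i.e. linear in the number of unknowns.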

Chris Johnson Presents: Big Data Visual Analysis

“We live in an era in which the creation of new data is growing exponentially such that every two days we create as much new data as we did from the beginning of mankind until the year 2003. One of the greatest scientific challenges of the 21st century is to effectively understand and make use of the vast amount of information being produced. Visual data analysis will be among our most important tools to understand such large and often complex data. In this talk, I will present state-of-the-art visualization techniques, applied to important Big Data problems in science, engineering, and medicine.”

Marc Snir Presents: Exascale Computing and Beyond

“The US, like the EU and other countries, is engaged in a national initiative that aims to deploy exascale computing platforms early in the next decade. The outlines of such platforms are starting to emerge. We shall survey, in our talk, the current roadmap for exascale computing and the main challenges this roadmap entails. We shall also discuss the likely evolution of HPC beyond exascale, in the “post-Moore” era.”

Video: Adoption Trends for Solid State in Big Data Sites

Bret Weber from DDN presented this talk at the 2016 MSST Conference. “SSDs and all-flash arrays are being marketed as a panacea. This may be true if you’re a small to medium enterprise that simply needs more performance for email servers or wants to speed up just a few hundred VMs. But for enterprise at-scale and high performance computing environments, identifying and removing I/O bottlenecks is much more complex than simply exchanging spinning disk drives for flash devices. Aside from performance, efficiency, scalability, and integration are also critical success factors in larger and non-standard environments. In this domain, selecting a partner with the tools, technology and experience to holistically examine and optimize your entire I/O path can deliver orders of magnitude greater acceleration and competitive advantage to your organization.”