Thomas Sterling Presents: ParalleX Execution Model for Exascale Astrophysics


In this slidecast, Thomas Sterling from Indiana University presents: ParalleX Execution Model for Exascale Astrophysics. The talk was presented at the 2013 Computing in Astrophysics Conference in Ascona, Switzerland.

High performance computing is experiencing a phase change driven by the challenges of programming and managing heterogeneous multicore system architectures and large-scale system configurations. It is estimated that by the end of the next decade, Exaflops computing systems requiring hundreds of millions of cores and demanding multi-billion-way parallelism, within a power budget of 50 Gflops/watt, may emerge. At the same time, there are many scaling-challenged applications that, although taking many weeks to complete, cannot scale even to a thousand cores using conventional distributed programming models. This paper describes an experimental methodology, ParalleX, that addresses these challenges through a change in the fundamental model of parallel computation from communicating sequential processes (e.g., MPI) to an innovative synthesis of concepts involving message-driven work-queue execution in the context of a global address space. The focus of this work is a new runtime system required to test, validate, and evaluate the use of ParalleX concepts for extreme scalability. This paper describes the ParalleX model and the HPX runtime system and discusses how both strategies contribute to the goal of extreme computing through dynamic asynchronous execution. The paper presents the first early experimental results of tests using a proof-of-concept runtime-system implementation. These results are very promising and are guiding future work towards a full-scale parallel programming and runtime environment.

Download the paper (PDF): ParalleX: An Advanced Parallel Execution Model for Scaling-Impaired Applications.
