Report: Future Software and Data Ecosystem for Scientific Inquiry


In this special guest feature, Jack Dongarra from the University of Tennessee writes that a new report summarizes four years of progress of the Big Data and Extreme-scale Computing (BDEC) project.

At a birds-of-a-feather session during the recent SC17 conference, the Big Data and Extreme-scale Computing (BDEC) project issued a report covering the five international workshops it has staged over four years to explore the highly disruptive effects of the Big Data revolution on the development of future cyberinfrastructure for science. Beginning in 2013, as the Big Data revolution was gathering steam, the series included workshops in the US, Japan, the EU, and China, as well as supplemental BoFs and meetings at the annual SC and ISC conferences. These meetings drew participation from all the main institutional branches of the HPC community, including academia, government laboratories, and private industry, and they included leaders from major scientific domains and from all the segments of the research community involved in designing, developing, and deploying cyberinfrastructure for science.

The primary motivation for initiating the BDEC meetings was the widespread recognition of a growing split between the software ecosystems of traditional HPC and emerging high-end data analysis (HDA) and machine learning technologies, where the latter are based largely on open-source products from commercial cloud vendors and designed to run on those infrastructures. At a time when scientific communities are striving to become more international, more interdisciplinary, and more collaborative than ever, the major technical differences between these ecosystems raise a host of conceptual, political, economic, and cultural challenges that threaten to obstruct future cooperation and progress.

As the BDEC workshops progressed, an even deeper problem became apparent: the landscape of scientific computing is being radically reshaped by the explosive growth in the number and power of digital data generators, ranging from major scientific instruments to the Internet of Things (IoT), and by the unprecedented volume and diversity of the data they generate. The most rapidly expanding front of this data tsunami lies in “edge environments” (i.e., across the network from both HPC and commercial cloud machine rooms), where the processing and buffer/storage resources necessary to manage the logistics of such massive data flows are almost entirely lacking.

“These next few years are likely to prove transformative for the scientific computing and cyberinfrastructure communities,” said Jack Dongarra, one of the founders of the BDEC effort. “The tremendous progress that we’re making toward the achievement of exascale systems, both here in the United States and in the European Union and Asia, will be undermined unless we can create a shared distributed computing platform to manage the logistics of massive, multistage data workflows with their sources at the network edge. Backhauling these rivers of data to the supercomputing center or the commercial cloud will not be a viable option for many, if not most, applications.”

In “Pathways to Convergence: Towards a Shaping Strategy for a Future Software and Data Ecosystem for Scientific Inquiry,” the BDEC community report examines the progress toward or potential for convergence at three different levels.

  • Examples of successful convergence in large-scale scientific applications: Several application communities have already shown that HPC and HDA technologies can be combined and integrated in multi-stage application workflows in order to open new frontiers of research; an illustrative sketch of this pattern follows this list.
  • Progress toward convergence on large-scale HPC platforms: The potential for a fusion of simulation-centric and data analysis–centric methods has motivated substantial progress toward the more flexible forms of system management that such workflows will require. The report itemizes and analyzes some of the challenges and opportunities that remain.
  • Absence of community convergence on a next-generation distributed services platform: The erosion of the classic Internet paradigm under the onslaught of Big Data—and the proliferation of competing approaches to creating a new type of distributed services platform (e.g., “edge” or “fog” computing) to replace it—is perhaps the most important challenge the cyberinfrastructure community will confront in the coming decade. To help frame the discussion moving forward, the BDEC report seeks to articulate the most fundamental requirements—common, open, shareable, interoperable—that might reasonably be agreed to as a starting point for any community codesign effort in this area.
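To make the multi-stage workflow pattern in the first bullet concrete, here is a minimal, purely illustrative Python sketch; it is not from the report, which prescribes no particular API, and all function names here are hypothetical. A parallel ensemble of toy simulation runs (standing in for the HPC-style stage) feeds its output into a summary-analytics step (standing in for the HDA-style stage).

```python
# Illustrative only: a toy two-stage workflow in the spirit of the
# HPC + HDA convergence the report describes. All names are hypothetical.
from concurrent.futures import ProcessPoolExecutor
from statistics import mean
import random


def simulate(seed: int, steps: int = 10_000) -> list[float]:
    """Stage 1 (HPC-style): one toy Monte Carlo 'simulation' run."""
    rng = random.Random(seed)
    return [rng.gauss(0.0, 1.0) for _ in range(steps)]


def analyze(samples: list[float]) -> dict[str, float]:
    """Stage 2 (HDA-style): summary analytics over simulation output."""
    return {"mean": mean(samples), "max": max(samples)}


if __name__ == "__main__":
    # Run independent simulation instances in parallel (an ensemble),
    # then feed each result into the data-analysis stage.
    with ProcessPoolExecutor() as pool:
        ensembles = list(pool.map(simulate, range(4)))
    summaries = [analyze(e) for e in ensembles]
    print(summaries)
```

In a real convergent workflow, each stage would of course run on very different infrastructure (a simulation code on an HPC system, analytics on a data platform); the sketch only shows the staging of simulation output into analysis that the bullet refers to.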

Download the report
