In this video, Dan Reed of the University of Iowa describes the era of Exascale Computing and Big Data. In a recent paper co-authored with Jack Dongarra, Reed makes an impassioned plea for tighter hardware and software integration and for cultural convergence between the high-performance computing and big data communities.
“We will describe recent progress and successes obtained in predicting properties of matter by quantum simulations, and discuss algorithmic challenges in connection with the use of evolving high-performance computing architectures. We will also discuss open issues related to the validation of the approximate, first principles theories used in large-scale quantum simulations.”
Daniel Gutierrez, Managing Editor of insideBIGDATA, has put together a terrific Guide to Scientific Research. The goal of the guide is to provide a road map for scientific researchers wishing to capitalize on the rapid growth of big data technology for collecting, transforming, analyzing, and visualizing large scientific data sets.
“We have made substantial progress towards three transformative contributions: (1) we are the first team to formally link high-resolution astrodynamics design and coordination of space assets with their Earth science impacts within a Petascale ‘many-objective’ global optimization framework; (2) we have successfully completed the largest Monte Carlo simulation experiment for evaluating the satellite frequencies and coverage required to maintain acceptable global forecasts of terrestrial hydrology (especially in poorer countries); and (3) we have evaluated the limitations and vulnerabilities of the full suite of current satellite precipitation missions, including the recently approved Global Precipitation Measurement (GPM) mission. This work illustrates the tradeoffs and consequences of a collapse in the current portfolio of rainfall missions.”
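To give a flavor of the kind of Monte Carlo coverage experiment described above, at a vastly reduced scale, here is a minimal sketch in Python. It assumes a toy model, not taken from the study, in which each satellite makes overpasses of a ground cell at random times, and a cell counts as adequately covered if no revisit gap exceeds a threshold; all names, parameters, and numbers are hypothetical.

```python
import random

# Hypothetical toy model (not the study's actual simulation): each satellite
# makes a fixed number of overpasses of a ground cell at uniformly random
# times within a window, and a cell is "adequately covered" if no revisit
# gap exceeds a threshold.

def max_revisit_gap(n_satellites, passes_per_sat, window_hours, rng):
    """Longest gap (hours) between successive overpasses of one ground cell."""
    times = sorted(
        rng.uniform(0.0, window_hours)
        for _ in range(n_satellites * passes_per_sat)
    )
    if not times:
        return window_hours
    gaps = [t2 - t1 for t1, t2 in zip(times, times[1:])]
    # Include the gaps at the start and end of the observation window.
    gaps += [times[0], window_hours - times[-1]]
    return max(gaps)

def coverage_fraction(n_satellites, n_cells=1000, passes_per_sat=4,
                      window_hours=24.0, max_gap_hours=6.0, seed=0):
    """Monte Carlo estimate of the fraction of ground cells whose worst
    revisit gap stays below max_gap_hours."""
    rng = random.Random(seed)
    ok = sum(
        max_revisit_gap(n_satellites, passes_per_sat, window_hours, rng)
        <= max_gap_hours
        for _ in range(n_cells)
    )
    return ok / n_cells

if __name__ == "__main__":
    # Sweep constellation size to see how coverage degrades as missions
    # are removed from the portfolio -- the tradeoff the quote alludes to.
    for n_sats in (1, 2, 4, 8):
        frac = coverage_fraction(n_sats)
        print(f"{n_sats} satellites: {frac:.1%} of cells meet the 6 h revisit requirement")
```

Sweeping the constellation size downward mimics, in miniature, the portfolio-collapse analysis: as satellites drop out, the fraction of cells meeting the revisit requirement falls, quantifying the coverage loss.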