Today ISC Events announced that Prof. Dr. Satoshi Matsuoka of Tokyo Institute of Technology will be the program chairman for ISC 2016. “As the program chair, Dr. Matsuoka will be actively involved in leading the ISC program team to define the ISC 2016 program, especially the focus topics, whilst also working with the steering committee in a multi-year effort to further elevate the value of ISC for the HPC community.”
ISC 2015 will host a number of sessions on Exascale computing next month in Frankfurt. In what looks to be one of the highlights of the conference, Bill Gropp, Georg Hager, and Paul Kelly will discuss Programming Models on the Road to Exascale. To learn more, we caught up with the Session Chair, Dr. Michèle Weiland, who serves as a Project Manager at EPCC, the supercomputing centre at the University of Edinburgh.
“The ability to accurately and efficiently study the absorption spectra of large chemical systems necessitates the development of new algorithms and the use of different architectures. We have developed a highly parallelizable algorithm in order to study excited state properties with ab initio electronic structure theory. This approach has recently been implemented to take advantage of graphics processing units to further improve efficiency.”
“Algorithmic adaptations are required to use anticipated exascale hardware near its potential, since the code base has been engineered to squeeze out flops. Instead, algorithms must now squeeze out synchronizations, memory, and transfers, while extra flops on locally cacheable data represent small costs in time and energy.”
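The trade described above can be made concrete with a toy sketch. The example below is a minimal illustration, not the quoted authors' method: a 1D Jacobi stencil split across two "ranks", where exchanging a deep halo of width `HALO` lets each rank take `HALO` steps between synchronizations, at the cost of redundantly recomputing ghost-zone values with extra, locally cacheable flops. All names and parameters (`N`, `STEPS`, `HALO`) are illustrative assumptions.

```python
import numpy as np

# Toy communication-avoiding stencil: trade extra local flops for fewer syncs.
N, STEPS, HALO = 64, 12, 4            # STEPS must be a multiple of HALO
rng = np.random.default_rng(0)
u_global = rng.random(N)

def sweep(u):
    """One Jacobi sweep; the array endpoints act as fixed boundary values."""
    v = u.copy()
    v[1:-1] = 0.25 * u[:-2] + 0.5 * u[1:-1] + 0.25 * u[2:]
    return v

# Reference: the same sweeps on the undecomposed array.
ref = u_global.copy()
for _ in range(STEPS):
    ref = sweep(ref)

half = N // 2
left = u_global[: half + HALO].copy()    # rank 0: owned cells + deep right ghost zone
right = u_global[half - HALO :].copy()   # rank 1: deep left ghost zone + owned cells

syncs = 0
for _ in range(STEPS // HALO):
    for _ in range(HALO):                # HALO steps with no communication;
        left = sweep(left)               # ghost-zone cells are recomputed
        right = sweep(right)             # redundantly (the "extra flops")
    # One deep-halo exchange replaces HALO per-step exchanges.
    left[half:] = right[HALO : 2 * HALO]
    right[:HALO] = left[half - HALO : half]
    syncs += 1

owned = np.concatenate([left[:half], right[HALO:]])
assert np.allclose(owned, ref)           # same answer, STEPS//HALO syncs instead of STEPS
print(f"{syncs} exchanges instead of {STEPS}")
```

A baseline decomposition would exchange a one-cell halo after every sweep (12 synchronizations here); the deep-halo variant synchronizes only 3 times while producing bitwise-identical owned cells, which is exactly the kind of rebalancing the quote argues exascale hardware will reward.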
“Over the past three decades, supercomputer throughput has increased dramatically, from hundreds of megaflops in 1985 to tens of petaflops today, making U.S. universities pre-eminent in the world in computational S&E, which is a tribute to the dedication and leadership of both NSF and the centers. Looking ahead to the impacts of synaptic computing, high-performance computing coupled with data analysis, and cloud computing coupled with the Internet of Things (IoT), the challenges and opportunities in computational S&E across all sectors of U.S. society are projected to advance dramatically.”
“Exascale computing will enable combustion simulations in parameter regimes relevant to next-generation combustors burning alternative fuels. First-principles direct numerical simulations (DNS) are needed to provide the underlying science base required to develop vastly more accurate predictive combustion models, ultimately used to design fuel-efficient, clean-burning vehicles, planes, and power plants for electricity generation.”
Satoshi Matsuoka from the Tokyo Institute of Technology discusses Big Data at the NCSA Blue Waters Symposium. “The trend towards convergence is not only strategic but inevitable: as Moore’s law ends, sustained growth in data capabilities, not compute, will advance overall capacity, accelerating research and ultimately the industry.”
“SCEC’s multi-disciplinary research team is using NCSA Blue Waters to develop physics-based computational models of earthquake processes. During the past year, we integrated more realistic physics into our computational software to model frequency-dependent attenuation, small-scale heterogeneities, free-surface topography, and non-linear yielding effects, all of which become increasingly important when simulating high-frequency ground motions.”