In the last issue of The Exascale Report, we posted two reader-submitted questions. The editors’ choices for the best responses from the community are listed below.
We also offer this comment from Argonne’s Rick Stevens, not as a specific response but as a higher-level consideration:
“I don’t understand why everyone automatically assumes that existing programming paradigms will not scale. It’s not the programming paradigm that is usually the problem but the algorithm. To say we need new algorithms is of course nearly obvious. In my thinking, scale itself is not the problem we *might* need new programming models for. Our challenge is to address issues relating to managing alternative memory hierarchies, architectural changes for power management, computing embedded in memory, reliability, etc. It is likely that only if we fail to get these right will we need new programming models.”