Component Architecture for Scientific HPC


The Common Component Architecture (CCA) provides a means for software developers to manage the complexity of large-scale scientific simulations and to move toward a plug-and-play environment for high-performance computing. In the scientific computing context, component models also promote collaboration through independently developed software, allowing individuals or groups to focus on the aspects of greatest interest to them. The CCA supports parallel and distributed computing, as well as local high-performance connections between components, in a language-independent manner. The design places minimal requirements on components.

Historically, the principal concerns of software developers for high-performance scientific computing have centered on increasing the scope and fidelity of their simulations, and then increasing the performance and efficiency to address the exceedingly long execution times that can accompany these goals. Initial successes with computational simulations have led to the desire to solve larger, more sophisticated problems and to improve models to reflect greater levels of detail and accuracy. Efforts to address these new demands have necessarily included improvements to scientific methodology, algorithms, and programming models, and virtually every advance has been accompanied by increases in the complexity of the underlying software. At the same time, the computer industry has continued to create ever larger and more complex hardware in an attempt to satisfy the increasing demand for simulation capabilities. These architectures tend to exacerbate the complexity of the software running on them: in nearly all cases, the increased complexity is exposed to the programmer at some level and must be explicitly managed to extract the maximum possible performance. In scientific high-performance computing, relatively little attention has been paid to improving the fundamental software development process and finding ways to manage the ballooning complexity of the software and operating environment.

Simultaneously, in other domains of software development, complexity rather than runtime performance has been a primary concern. For example, in the business area the push has been less for increasing the size of “the problem” than for interconnecting and integrating an ever-increasing number of applications that share and manipulate business-related information, such as word processors, spreadsheets, databases, and web servers. More recently, with the fast pace of the internet boom, the flood of new software technology used in business and commercial applications has increased both the degree of interoperability desired and the number of applications and tools to be integrated. This situation has led to extreme software complexity, which at several levels is not unlike the complexity now being seen in high-performance scientific computing. For example, scientific codes regularly attempt the integration of multiple numerical libraries and/or programming models into a single application. Recently, efforts have increased to couple multiple stand-alone simulations into multi-physics and multi-scale applications for models with better overall physical fidelity.
