“By attending ISC, you will be connecting with 300 expert speakers who can share their perspectives on the latest developments in high performance computing. Featured this year are topics such as the future direction of HPC technologies, life science applications, big data, quantum computing, and the real-world value of HPC.”
“Computers are becoming an increasingly cheap, powerful tool that cannot be ignored by professionals. Computer simulation reproduces the behavior of natural and man-made systems to help us understand, predict, and communicate. In this series kick-off, we will show you how computer simulation is used by LLNL scientists on the world’s fastest computers. We will also show you how you can get started doing your own computer simulations with free, open-source tools for class projects or just for fun.”
Professor Gianluca Iaccarino from Stanford presented this talk at the Stanford HPC Conference. “The Thermal & Fluid Sciences Affiliates Program (TFSA) is the industrial liaison program of the Thermosciences Group and Flow Physics and Computation Group of the Mechanical Engineering Department at Stanford University. The program was started over 45 years ago to establish and maintain close ties between the Stanford faculty and engineers in industry.”
Many initially thought that liquid and servers should never mix. But what if server cooling is done in a completely controlled and secure environment? Liquid submersion cooling has the potential to revolutionize the design, construction, and energy consumption of data centers around the world.
Matt Bidwell from NREL presented this talk at SC13. “NREL’s HPC center is home to the largest HPC system in the world dedicated to advancing renewable energy and energy efficiency technologies. The HPC capabilities of the center propel technology innovation as a research tool by which scientists and engineers find new ways to tackle our nation’s energy challenges—challenges that cannot be addressed through traditional experimentation alone.”
“Systems like Argonne’s Mira, an IBM Blue Gene/Q system with nearly a million cores, can enable breakthroughs in science, but to use them productively requires expertise in computer architectures, parallel programming, mathematical software, data management and analysis, performance analysis tools, software engineering, and so on. Our training program exposes the participants to all those topics and provides hands-on exercises for experimenting with most of them.”