NCSA is now accepting applications for the Blue Waters Graduate Program. This unique program lets graduate students from across the country immerse themselves in a year of focused high-performance computing and data-intensive research using the Blue Waters supercomputer to accelerate their research.
Does your research generate, analyze, and/or visualize data using advanced digital resources? In its recent Call for Participation, the CADENS project is looking for scientific data to visualize or existing data visualizations to weave into larger documentary narratives in a series of fulldome digital films and TV programs aimed at broad public audiences. Visualizations of your work could reach millions of people, amplifying its broader societal impact!
Video: NCSA’s Ed Seidel Testifies on the Networking and Information Technology Research and Development Program
In the video, Ed Seidel from NCSA testifies at a House hearing on the Networking and Information Technology Research and Development (NITRD) Program. NITRD provides a framework in which many Federal agencies come together to coordinate their networking and information technology (IT) research and development (R&D) efforts.
In this video (with transcript) from the 2015 HPC User Forum in Broomfield, Bob Sorenson from IDC moderates a User Agency panel discussion on the NSCI initiative. “You all have seen that usable statement inside the NSCI, and we are all about trying to figure out how to make usable machines. That is a key critical component as far as we’re concerned. But the thing that I think we’re really seeing — we talked about the fact that single-thread performance is not increasing, and so what we’re doing is we’re simply increasing the parallelism, and then there are the physics limitations, if you will, of how you cool and distribute power among the parts that are there. That really is leading to a paradigm shift from something that’s based on how fast you can crunch the numbers to how fast you can feed the chips with data. It’s really that paradigm shift, I think, more than anything else that’s really going to change the way that we have to do our computing.”
Today Cray announced a world record by scaling ANSYS Fluent to 129,000 compute cores. “Less than a year ago, ANSYS announced Fluent had scaled to 36,000 cores with the help of NCSA. While the nearly 4x increase over the previous record is significant, it tells only part of the story. ANSYS has broadened the scope of simulations allowing for applicability to a much broader set of real-world problems and products than any other company offers.”
In this video from the AIAA Aviation Conference 2015, panelists discuss Supercomputing: Roadmap and its Future Role in Aerospace Engineering. “Supercomputing has made significant contributions in aerospace engineering in recent decades, including advances in computational fluid dynamics that have fundamentally altered the way aircraft are designed. And the relentless growth in high-performance computing power holds promise of huge leaps in engine performance and other aerospace technology.”
In this video, attendees discuss highlights from Day 3 of the ISC 2015 conference, including the conference’s industry track, exhibit floor, and vendor parties. The program includes short interviews with Ed Seidel from NCSA, Sverre Jarp from CERN, Horst Simon from LBNL, Jack Dongarra from the University of Tennessee, and Jürgen Kohler from Daimler AG.
A new high-resolution science documentary about the dynamics of the Sun will feature data-driven supercomputer visualizations produced by NCSA. Narrated by Benedict Cumberbatch, Solar Superstorms debuts June 30 at the Louisiana Art & Science Museum in Baton Rouge before heading out to more than a dozen planetariums and science centers around the world.
“We have made substantial progress towards three transformative contributions: (1) we are the first team to formally link high-resolution astrodynamics design and coordination of space assets with their Earth science impacts within a Petascale ‘many-objective’ global optimization framework, (2) we have successfully completed the largest Monte Carlo simulation experiment for evaluating the required satellite frequencies and coverage to maintain acceptable global forecasts of terrestrial hydrology (especially in poorer countries), and (3) we have evaluated the limitations and vulnerabilities of the full suite of current satellite precipitation missions including the recently approved Global Precipitation Measurement (GPM) mission. This work illustrates the tradeoffs and consequences of a collapse in the current portfolio of rainfall missions.”
“Supercomputing has reached a level of maturity and capability where many areas of science and engineering are not only advancing rapidly due to computing power, they cannot progress without it. I will illustrate examples from NCSA’s Blue Waters supercomputer, and from major data-intensive projects including the Large Synoptic Survey Telescope, and give thoughts on what will be needed going forward.”