The NSF has awarded $5 million to a team of Indiana University Bloomington computer scientists working to improve how researchers across the sciences harness big data to solve problems.
“Speed is essential for disaster planning, and the new simulations will take advantage of developments in supercomputing to increase the speed of the simulation, so that converting large databases of weather and topographical data into storm surge predictions can be completed within an hour — half the current time. The project will also be a significant exercise in recasting legacy software for future generations of supercomputers.”
A new computational method makes it possible to detect the genetic changes responsible for the onset and progression of tumors in a simple, fast, and precise way. The SMUFIN (Somatic Mutations Finder) method can analyze the complete genome of a tumor and identify its mutations in a few hours. It can also identify alterations that previously went undetected, even by methods that require weeks of supercomputer time.
In this video from the 2014 Argonne Training Program on Extreme-Scale Computing, James Reinders presents: Computer Architecture and Structured Parallel Programming. “At ATPESC 2014, we captured 67 hours of lectures in 86 videos of presentations by pioneers and elites in the HPC community on topics ranging from programming techniques and numerical algorithms best suited for leading-edge HPC systems to trends in HPC architectures and software most likely to provide performance portability through the next decade and beyond.”
“SURF allows you to set up a light path: a direct, secure, and fast connection between two points, for example between a researcher in the Netherlands and a telescope in China. This opens up unique possibilities worldwide, for example for more efficient research, for sharing facilities or large datasets, or for creating backups.”