“My team at the University of Minnesota has been collaborating with the team of Falk Herwig at the University of Victoria to simulate brief events in the lives of stars that can greatly affect the heavy elements they synthesize in their interiors and subsequently expel into the interstellar medium. These events are caused by the ingestion of highly combustible hydrogen-rich fuel into the convection zone above a helium burning shell in the deeper interior. Although these events are brief, it can take millions of time steps to simulate the dynamics in sufficient detail to capture subtle aspects of the hydrogen ingestion. To address the computational challenge, we exploit modern multicore and many-core processors and also scale the simulations to run efficiently on over 13,000 nodes of NSF’s Blue Waters machine at NCSA.”
Scientists from a wide range of disciplines gathered at the SwissTech Convention Center in Lausanne from 8 to 10 June 2016 for this year’s PASC Conference, bound together by the use of high-performance computing in their research. The venue played host to around 360 scientists from Europe, the USA, Canada, Japan, Russia, Saudi Arabia, South Africa and Singapore, offering plenty of space and scope for them to engage in an interdisciplinary discourse on current issues relating to scientific computing.
“We live in an era in which the creation of new data is growing exponentially such that every two days we create as much new data as we did from the beginning of mankind until the year 2003. One of the greatest scientific challenges of the 21st century is to effectively understand and make use of the vast amount of information being produced. Visual data analysis will be among our most important tools to understand such large and often complex data. In this talk, I will present state-of-the-art visualization techniques, applied to important Big Data problems in science, engineering, and medicine.”
New HPC products and technologies. Compelling demos. Insights from top Intel HPC architects. More than 60 presentations from Intel and industry experts. Additional details about Intel® Scalable System Framework. Intel will have something for everyone at this year’s International Supercomputing Conference in Frankfurt, Germany.
The ISAV2016 Workshop has issued its Call for Participation. Held at SC16 in cooperation with SIGHPC, the In Situ Infrastructures for Enabling Extreme-Scale Analysis and Visualization Workshop takes place Sunday, November 13th, 2016. The considerable interest in the HPC community regarding in situ analysis and visualization is due to several factors. First is an I/O cost […]
Today Vela Software announced that it has acquired Tecplot, a leading provider of fluid dynamics visualization and analysis software for engineers and scientists in the aerospace and oil & gas vertical markets.
Today AMD announced the Multiuser GPU (MxGPU) for blade servers, a new graphics virtualization solution that provides a workstation-class experience. Now available in HPE ProLiant WS460c Gen9 blade servers, the AMD FirePro S7100X GPU is the industry’s first and only hardware-virtualized GPU that is compliant with the SR-IOV (Single Root I/O Virtualization) PCIe virtualization standard. The AMD FirePro S7100X GPU is […]
“With NVIDIA GPU technology on IBM Cloud, we are one step closer to offering supercomputing performance on a pay-as-you-go basis, which makes this new approach to tackling big data problems accessible to customers of all sizes,” says Jerry Gutierrez, HPC leader for SoftLayer, an IBM Company. “We’re at an inflection point in our industry, where GPU technology is opening the door for the next wave of breakthroughs across multiple industries.”
The Piz Daint supercomputer spotted a large reservoir of magma right below the tiny South Korean island of Ulleung. No harm to humans is expected, but the origin of the magma pool remains unclear.
Researchers are using the Magnus supercomputer at the Pawsey Centre to explore the mysteries of two shipwrecks involved in Australia’s greatest naval disaster. “The process of generating 3D models from the photographs we’ve taken is very computationally intensive. The time it would take to process half a million photographs using our conventional techniques, using our standard computers, would take about a thousand years, so we needed to do something to bring that time down to something achievable.”