OpenCL Optimization Case Study: Diagonal Sparse Matrix Vector Multiplication
This article discusses performance optimization for AMD GPUs and CPUs, using a simple yet widely used, computationally intensive kernel as a case study: Diagonal Sparse Matrix Vector Multiplication (DIA SpMV). We look at several topics that come up during OpenCL performance optimization and apply them to our case study.
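For readers unfamiliar with the DIA format, the sketch below shows what a straightforward, untuned baseline kernel might look like before any optimization. It assumes one illustrative layout, not the article's code: the diagonals are stored column-major in data (element (d, row) at data[d * num_rows + row]), offsets[d] gives each diagonal's offset from the main diagonal, and the matrix is square; all names are hypothetical.

__kernel void dia_spmv(const int num_rows,
                       const int num_diags,
                       __global const int   *offsets,
                       __global const float *data,
                       __global const float *x,
                       __global float       *y)
{
    const int row = get_global_id(0);   /* one work-item per matrix row */
    if (row >= num_rows)
        return;

    float sum = 0.0f;
    for (int d = 0; d < num_diags; ++d) {
        const int col = row + offsets[d];
        /* skip padding entries that fall outside the (assumed square) matrix */
        if (col >= 0 && col < num_rows)
            sum += data[d * num_rows + row] * x[col];
    }
    y[row] = sum;
}

With this layout, adjacent work-items read adjacent elements of data, so accesses to the diagonal array coalesce on GPUs; that memory-friendliness is a large part of why DIA is attractive for banded matrices in the first place.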
3-D model of blood flow by supercomputer predicts heart attacks
The EPFL Laboratory of Multiscale Modeling of Materials in Switzerland has developed a 3D model of blood flow through the cardiovascular system that should allow certain heart diseases to be predicted before they become dangerous.
Processor Whispers — About Partnerships and Partner Chips
The IT economy is doing better and better, despite the renewed spitting (and now even cloudy twittering) of the Icelandic volcano Eyjafjallajökull. This positive development could well lead to new partnerships, and almost-forsaken projects like Intel’s Larrabee graphics chip might make a comeback.
CSC makes $317M play for NOAA supercomputer
Now that the 2010 hurricane season is open, there will be plenty of talk of severe weather in the coming months. And whether it is tornadoes, heat waves, or floods, there is little that can be done about the weather itself. But improved forecasting would go a long way toward ameliorating some of its more devastating effects. That’s a big reason why the National Oceanic and Atmospheric Administration signed Computer Sciences Corp. to a nine-year, $317 million contract to build a supercomputer for modeling weather patterns.
SGI advances Linux on the HPC front
Perhaps it is this acceptance of open source that led SGI’s CTO, Dr. Eng Lim Goh, to ditch a fancy conference setting and instead talk with a bunch of technologists at a Greater London Linux User Group (GLLUG) meeting held at University College London (UCL). Goh’s belief in his firm’s technology, and in that of the open source community, was given away by the title of his talk, “Linux is supercomputing”. Underlying that claim is the fact that SGI has managed to get the standard Linux kernel, available from the kernel.org repository, to work on systems with 4,096 cores.
Researchers Aim to Achieve Cleaner Coal Through Computation
It’s not possible to see inside the flue where the gases are interacting, so Wilcox simulates the interactions of these particles using the Ranger supercomputer at the Texas Advanced Computing Center (TACC). Her studies of the dynamics of trace metals inside the flue of a power plant are helping her design and improve the technologies capable of removing heavy metals from the combustion process.
Hutchinson Center Receives $10.1M for HPC Cluster and Datacenter
Fred Hutchinson Cancer Research Center has received two grants totaling $10.1 million from the National Institutes of Health to fund a new high-performance computing cluster and the creation of a campus-based facility to consolidate and safeguard research data.
Fujitsu Supercomputer Achieves World Record in Computational Quantum Chemistry
Fujitsu Limited and Chuo University of Japan today announced that a team of researchers from Chuo University, Kyoto University, Tokyo Institute of Technology, and Japan’s Institute of Physical and Chemical Research (known as Riken) employed the T2K Open Supercomputer, which was delivered by Fujitsu to Kyoto University’s Academic Center for Computing and Media Studies, to successfully solve with high precision, as a world first, an optimization problem that reveals the molecular behavior of ethane (C2H6), ammonia (NH3), and oxygen (O2).