
A conversation with TotalView Technologies

As we move more fully into the pan-petaFLOPS era, the scaling limitations of our current set of application development tools are becoming clearer. And if we are to make effective use of the exascale machines our community will begin deploying by the end of this decade, a dramatic shift in our community’s development environment is needed. During SC09 we talked to TotalView Technologies’ Chris Gottbrath about what the company has in store for developers in 2010, and how they are moving to address the problems of extreme-scale development.

Verari auction in progress

Ever since we broke the news that Verari was out of business earlier this month we’ve kept track of employees — who said the company was not only dead, but the evil dead — and what was left of the management team, who said “the doors are still open.” The doors were clearly not open […]

Open source Ocelot moves CUDA to x86

Randall at VizWorld is pointing to a project hosted at Google Code called ‘Ocelot’ that aims to compile CUDA programs for execution on NVIDIA GPUs and x86 CPUs. Ocelot is a dynamic compilation framework for heterogeneous systems, accomplishing this by providing various backend targets for CUDA programs. Ocelot […]

Add "green IT" to next gen workforce training list

I’ve been reading all of the materials from the three International Exascale Software Project meetings this week (I’m 300 slides in so far, not counting all the background material) — one thing that group is thinking about as it sketches out a billion-way parallel future is education and training. It doesn’t get a lot of attention […]

University of Tasmania Buys SGI ICE for Climate Research

SGI announced today that the Tasmanian Partnership for Advanced Computing [TPAC] at the University of Tasmania’s [UTAS] supercomputing facility has purchased a new SGI ICE cluster. The new machine is destined to crunch Antarctic climate research workloads. The 64-blade ICE cluster has 512 cores and a terabyte of memory. “Katabatic”, as its […]

Plans for world's largest solar farm blocked by senator

This is tangentially related to HPC — some of the folks I know with reasonably big centers are looking into augmenting their commercial power consumption with locally produced (mostly solar) energy. Evidently even in the land of progressive Earth love, renewable energy projects aren’t a lock. Yesterday, California’s Senator Dianne Feinstein introduced legislation that would […]

AMD releases new Stream SDK, support for OpenCL 1.0

AMD announced in its forums on the 21st that it has released version 2.0 of the Stream SDK, with support for OpenCL. You’ll recall that OpenCL is the emerging standard for expressing work destined for both multicore and accelerated (i.e., GPUs today) processors. As Timothy Prickett Morgan at The Register points out in his coverage, Stream […]
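To give a flavor of what such portable OpenCL work looks like, here is a minimal, illustrative OpenCL C kernel — a vector add; the kernel and argument names are hypothetical and not taken from AMD’s SDK samples. A host program would hand this source to the OpenCL runtime, which compiles it at run time for whichever device is selected:

```c
// Illustrative OpenCL C kernel: element-wise vector addition.
// A host program would compile this source with clBuildProgram()
// and launch it with clEnqueueNDRangeKernel() on any OpenCL device.
__kernel void vec_add(__global const float *a,
                      __global const float *b,
                      __global float *c)
{
    int i = get_global_id(0);  /* this work-item's index in the 1-D range */
    c[i] = a[i] + b[i];
}
```

Because kernels are compiled at run time for the selected device, the same source can target a multicore CPU or a GPU without change — which is exactly why SDKs like AMD’s Stream can back the standard on both kinds of hardware.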

InsideTrack: TotalView Technologies sold to Rogue Wave Software [CONFIRMED]

It seems that TotalView Technologies, maker of the eponymous debugger and several other tools to help with development of large scale parallel applications, may have gotten married over the past several days. It turns out that TotalView trades over the counter on the Norwegian stock exchange under the ticker symbol TVTI (click here and scroll […]

InfiniBand workshop in Lugano, Switzerland

I can think of about a million worse places to have an HPC-related workshop than beautiful Lugano, Switzerland. Happily, I don’t have to. The city is on Lake Lugano, close to the border with Italy, and will be hosting the upcoming HPC Advisory Council workshop that will be all about InfiniBand. From the website: The […]

Czech Meteorological Service Deploys Big NEC Vector

NEC has announced that it has deployed a large vector supercomputer at the Czech Hydrometeorological Institute [CHMI]. Beginning in early 2010, the center will run its operational forecast on the new NEC SX-9 vector supercomputer. The goal of the procurement was to purchase a resource that delivered more accurate forecasts in a more timely manner. […]