HIV research aside, not everyone happy with the Blue Gene in the UK


Yesterday IBM and the University of Edinburgh announced they were teaming up in a five-year project to increase the efficacy of HIV drugs.

The project combines powerful computing technology, including IBM’s Blue Gene supercomputer, with new experimental characterization aimed at the infection process itself: designing inhibitors for the part of the virus responsible for allowing its genetic material to enter the human cell.

But not everyone in the UK thinks this use of Blue Gene is a good idea.

Conservative party leader David Cameron promised today that if his party were elected there would be no more IT projects “like Labour’s hubristic NHS supercomputer”.

What he’s unhappy about is that, in his view, Blue Gene is a proprietary system. Cameron is evidently a big open source fan.

“The basic reason for the problems [in government IT programmes] is Labour’s addiction to the mainframe model: large, centralised systems for the management of information.

…“We will follow private sector best practice which is to introduce ‘open standards’ that enables IT contracts to be split up into modular components. So never again could there be projects like Labour’s hubristic NHS supercomputer.”

I guess he’d be happier with a white box cluster built in someone’s garage running Yellow Dog Linux mounted from iPods? It’s fascinating when science and politics mix. In the same way that train wrecks, landslides, and earthquakes are fascinating.

Comments

  1. Rich Hickey says

    Hey John. It’s Edinburgh, not Edinburg. Sheesh… You Colonials. 🙂

    It doesn’t say where the machine will be located, but no one here knows anything about this collaboration. I’m in the Advanced Computing Facility here outside of Edinburgh (notice the h?) where the existing Blue Gene/L is, and I just got a bunch of blank looks when I mentioned this article.

    It’ll be interesting to learn more about this.

  2. Rich – first, apologies. Chalk it up to never having been to Edinburgh. The “h” just doesn’t live for me. 🙂

    How very interesting that you don’t know about the collaboration. The location of the machine was conspicuously absent from everything I found. I wonder if they’ll actually run on resources internal to IBM, like at TJ Watson. I got the sense, though I need to clarify this, that the relationship might be more about code tuning and tweaking than about hardware resources.

  3. I think the problem here is simply that David Cameron doesn’t understand that this isn’t so much an IT project as it is a science project. You’d be daft to say a world-class telescope should be split up into smaller components, since the individual components obviously can’t perform the same tasks as the whole; yet those without any knowledge of HPC don’t seem to understand that the same holds true for these kinds of computational science projects. The whole is, in a sense, greater than the sum of its parts.

    I’d be interested in knowing more about the types of simulations they intend to run, and whether this was pursued by IBM as a counterbalance to the HECToR (Cray XT4) system, which Edinburgh’s EPCC is also involved with.

  4. Brian – I think that’s a great example of why Cameron’s argument doesn’t make sense. I’m going to start using it in my own conversations with these sorts of people.

  5. As HPC becomes more pervasive in mainstream scientific/research culture, I expect we will see more of these stories popping up across the globe. The price tag associated with large compute of any sort is always hard for the politicians [ergo, accountants] to swallow. In many cases they deal exclusively in tangibles, which scientific research doesn’t always produce: hence its being “research”. 🙂