Microsoft unwraps technical computing initiative, leaves most to the imagination


In an email posted on the “executive e-mail blog” earlier this month, Microsoft Server and Tools Business president Bob Muglia announced Microsoft’s Technical Computing Initiative:

Our goal is to unleash the power of pervasive, accurate, real-time modeling to help people and organizations achieve their objectives and realize their potential. We are bringing together some of the brightest minds in the technical computing community across industry, academia and science to discuss trends, challenges and shared opportunities.

They say it’s good to have a vision, and Microsoft is long on vision in the announcement:

One day soon, complicated tasks like building a sophisticated computer model that would typically take a team of advanced software programmers months to build and days to run, will be accomplished in a single afternoon by a scientist, engineer or analyst working at the PC on their desktop. And as technology continues to advance, these models will become more complete and accurate in the way they represent the world. This will speed our ability to test new ideas, improve processes and advance our understanding of systems.

In order to realize this vision, Microsoft says it will be investing in three core areas: cloud tools for technical computing (this is the angle that Microsoft evidently talked up as it was pre-briefing reporters), tools for writing parallel apps, and tools that facilitate technical computing. Since it’s not clear from my shorthand how that last one is different from the second one, here’s what Muglia had to say:

Develop powerful new technical computing tools and applications: We know scientists, engineers and analysts are pushing common tools (i.e., spreadsheets and databases) to the limits with complex, data-intensive models. They need easy access to more computing power and simplified tools to increase the speed of their work. We are building a platform to do this. Our development efforts will yield new, easy-to-use tools and applications that automate data acquisition, modeling, simulation, visualization, workflow and collaboration. This will allow them to spend more time on their work and less time wrestling with complicated technology.

So what’s at the site? The homepage (which takes a long time to load) is a heavy Silverlight app with a bunch of marketing videos about why science and computing are Good. Useful as far as it goes, if you are trying to inspire young people. Nota bene: HPC Rock Star Thomas Sterling is featured, as are HPC luminaries like Tony Hey, Horst Simon, and Dan Reed; HPC Rock Star Bill Kramer is scheduled to debut later, along with Gordon Bell, Jack Dongarra, Burton Smith, and others. That’s part one of the site.

Part two is the “social ecosystem,” which gathers tweets related to technical computing. Two columns of three tweets each that scroll. With the promise of more in the future — things like abstracts from journals. So…that’s part two.

B for effort, for content…not so much

To be honest, the launch left me wanting to know more, and irritated that Microsoft wasn’t putting it out there, especially given that the company launched a new website dedicated specifically to the idea. They obviously put in a lot of effort creating a snazzy site and building all those videos (but, seriously, no copy and paste? Of any text? What the hell?). And kudos for the possible good the videos might do in inspiring someone to pick a science or engineering field.

But with some of the smartest computational people of this and the past two generations on Microsoft’s payroll, I thought surely there was more involved. What’s really going on here?

So yesterday I talked with Microsoft’s Kyril Faenov about the announcement. Kyril leads the Technical Computing Group, which includes the Parallel Computing Platform and Windows HPC Server. One of the first things I asked him was, essentially, “where’s the beef?”

Faenov says that this announcement put a “stake in the ground” for Microsoft (his words), and that it marks the “beginning of a conversation” that Microsoft will be having with the community over the coming months and years as it looks to develop tools to bring more technical computing to more people.

15 million users are a big target

According to Faenov, Microsoft’s own analysis indicates that there are 15 million “technical computing” users out there — domain specialists, analysts, and so on — who could potentially benefit from more powerful tools. As the maker of the most ubiquitous computing platform on the planet, Microsoft wants to be the provider of choice for tools that build an easier-to-use infrastructure and workflow for these users.

But with so much diversity in the types of work that are being lumped together in that 15 million, where is the opportunity for one-size-fits-most software? “We will be helping users with specialized requirements through our partners,” Faenov says, “but there is still a lot of commonality in basic tasks across those user groups, and those are situations in which Microsoft can add value. Areas like visualization, and tools for prototyping mathematical models.” When I started to dig in and look for specific examples, Faenov would only say that they are still working on the strategy and they hope to be able to say more later this fall.

The pain points, and a bottle of Azure salve

“Microsoft sees three top pain points for technical computing users,” explains Faenov. “Skill sets and tools for parallel programming, cost efficiency for computing at large scale, and infrastructure challenges.” And here is where we get to the part of this technical computing initiative where Microsoft has something to offer in the very near term.

According to Faenov, Microsoft will be integrating support for burst computing — grabbing cycles from somewhere else to satisfy a transient computing demand — into this summer’s release of HPC Server. Initially this support will be for scavenging cycles from the other Windows computers on your enterprise’s network, but next year this will be expanded to include integration with Azure, Microsoft’s cloud computing offering. More of an Amazon EC2 kind of gal? Just point your cycle scavenger in that direction and you can use Amazon’s cloud instead of machines scattered around your own network.

Same mistake, second verse

When I first dug into Muglia’s executive letter, I was dramatically underwhelmed — upset, even — by the degree to which Microsoft seemed to have missed the boat. After digging into it with Kyril, I’m at least cautiously optimistic that Microsoft has a plan and is executing against it. But given how much attention this release got (perhaps more than Microsoft intended), in hindsight I’d suggest they should have waited a year to make this push. It really needs a few accomplishments and an articulated strategy behind it in order to be taken seriously.

Microsoft made this same mistake with the first round of their HPC operating system development. Big fanfare, with not much to show for it at the time. Over subsequent years they’ve managed to build out a capability that people are finally ready to take seriously.

It appears they didn’t take any lessons at all from that first experience and the backlash it drew from the HPC community. Once again they are making a big announcement about something they are going to start doing, real soon now.

I find this frustrating, but I still count myself a Microsoft supporter in technical computing. Of all the companies currently in the market, only Microsoft and Intel have the wherewithal to bring forth a complete, mature solution for the technical computing ecosystem that dramatically expands the number of people who do technical and high performance computing. I hope at least one of them succeeds.