

Making the Cloud Easier to Use

Tom Wilkie, Scientific Computing World

HPC is sometimes perceived as difficult to use by scientists and engineers. The same issue affects uptake of cloud computing, as Tom Wilkie from Scientific Computing World notes in his third report from the ISC Cloud conference.

“The cloud is ready for high-performance computing (HPC). Enthusiasm is large, but progress is slow,” Karsten Gaier from the software vendor Nice told the ISC Cloud conference in Heidelberg at the end of September. Why should this be so when, as Thomas Goepel from HP pointed out, the cloud ought to be attractive to small and medium-sized enterprises that cannot afford their own supercomputing clusters?

David Pellerin, Business Development Principal, High Performance Cloud Computing, at Amazon Web Services, assured his audience that cloud providers themselves offer added value to their customers: “We are not offering infrastructure-as-a-service. All cloud companies offer services layered on top, for example remote desktops in the cloud.”

Other companies have been launched specifically to bring the advantages of cloud computing to scientists and engineers. SimScale is a relatively recent spin-out from the Technical University in Munich. Founded as an engineering consultancy company, with a focus on numerical simulation, in 2010, SimScale has now developed its browser-based CAE platform. According to its managing director, David Heiny, it sees itself as a native cloud provider. ‘Our target customers are people completely new to simulation,’ he said.

Part of the reason for the relatively slow uptake in cloud computing by science and engineering companies, according to Oliver Tennert, director of technology management at Transtec, is that it is only comparatively recently that such simulation tools and other advances – for example, those that make remote visualisation possible – have become available.

As in so much of high-performance computing, uptake depends on ISVs developing end-user application software that engineers and scientists find easy to use without having to turn to specialists in parallel programming, and Tennert’s analysis extended this point to technical computing in the cloud. Intel’s Stephan Gillich endorsed the point that ease of use was an issue across the whole of simulation and not unique to the cloud. Thomas Goepel, from HP, remarked that HPC was still not the ‘killer app’ for innovation and economic growth because of its complexity, a shortage of skills, the limited scalability of existing software, and the high cost of hardware. “When we looked at the cloud,” he said, “these issues were the same.”

Despite David Pellerin’s assurances that cloud providers were giving their customers added value, ease of use featured as a major issue in widespread adoption of the cloud. Wim Slagter, from Ansys, pointed out that small and medium-sized companies perceive different barriers from those recognised by the big companies – and ease of use was foremost among them.

It was therefore a good call by the conference organisers to dedicate a specific session to the concerns and interests of the ISVs. If this analysis holds good, then ISVs such as those featured in that session, who are interested in reaching out to end-user scientists and engineers through the cloud, can only accelerate its uptake.

At a fundamental level, there are two clouds. Michael Feldman of Intersect360 Research pointed out that his company’s data indicated that only one third of cloud-related spending in high-performance computing is in the public cloud; some two-thirds is on private clouds. His point was echoed by Transtec’s Tennert: ‘We help customers build up their own private cloud and have not made the leap to public clouds. Moving customers into the public cloud means that the customers have to trust that their data will be safe.’ High-performance computing constitutes a vital part of the data chain for many companies, he continued, which is why letting the data off their premises is an issue for them. Gaier, from Nice, pointed out that there is no standard process for accessing the cloud through company firewalls, so the simple act of connecting is itself difficult. In the words of Ansys’s Wim Slagter: “One size does not fit all,” when it comes to cloud deployment.

However, SimScale’s Heiny thought the issue was more about perception than reality: ‘What we see is the diffuse fear of users about putting the data up there – the fear of the risk.’ But even though, in his opinion, ‘Technically, everything is there for security,’ he conceded that ‘supercritical data will never be transferred to the cloud.’

CST’s Felix Wolfheimer agreed: ‘The cloud will never be for the really business-critical confidential information. What is safer or more secure: my own environment or the cloud environment?’

A foggy future for cloud computing? was the first report from the ISC Cloud conference and The Cloud of Unknowing was the second.
