The University of Washington has won three recent awards from the National Science Foundation related to cloud computing.
…Howe’s project aims to provide that interactivity for tens of thousands of gigabytes of simulation results. He created a tool, GridFields, to visualize the polygonal mesh of climate simulation output, and is now working to redesign GridFields to be efficient in a cloud computing environment.
…[A second grant] will prepare astronomers to deal with the data from telescopes scheduled to come online in the coming years, such as the Large Synoptic Survey Telescope, of which the UW is a founding institution. The telescope’s 27-foot mirror feeds a 3.2-billion-pixel camera that takes a picture every 15 seconds. It is expected to record more than 30,000 gigabytes of data and detect more than 100 million astronomical sources every night.
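A quick back-of-envelope check makes those numbers concrete. The assumptions here (2 bytes per raw pixel, a 10-hour observing night) are mine, not from the article, but they put the nightly raw volume in the same ballpark as the quoted figure once processing and calibration overheads are added:

```python
# Rough estimate of LSST's nightly raw data volume.
# Assumed, not from the article: 2 bytes/pixel and a 10-hour night.
pixels_per_image = 3.2e9       # 3.2-billion-pixel camera
bytes_per_pixel = 2            # assumed raw pixel depth
seconds_per_night = 10 * 3600  # assumed 10 hours of observing
exposure_interval = 15         # one picture every 15 seconds

images_per_night = seconds_per_night / exposure_interval        # 2,400 images
gb_per_image = pixels_per_image * bytes_per_pixel / 1e9         # 6.4 GB each
raw_gb_per_night = images_per_night * gb_per_image              # ~15,000 GB raw

print(f"{images_per_night:.0f} images/night, about {raw_gb_per_night:,.0f} GB raw")
```

Raw pixels alone come to roughly 15,000 GB a night; calibration frames and processed data products plausibly double that, which is consistent with the "more than 30,000 gigabytes" quoted above.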
“Cloud computing enables us to scale to the point where we can actually analyze that sort of data,” Connolly said.
A third grant supports the development of a cloud computing curriculum.
The advantage here is that, for problems where a standard scale-out solution provides acceptable performance (i.e., not tightly coupled computations), it will probably make sense for researchers to rent time rather than buy hardware: no infrastructure problems, no admins, no technology obsolescence. But the existing infrastructures (EC2, etc.) are more complicated than scientists care to deal with. Projects like these, and the proteomics package discussed last week, are bridging this gap.
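To illustrate what "scale-out" means here, the sketch below (my own toy example, not taken from any of the projects above) processes independent data partitions in parallel. Because no partition depends on another, adding workers, or rented cloud instances, adds throughput almost linearly; tightly coupled computations, by contrast, spend their time communicating between workers and rent much less gracefully:

```python
# Toy example of an embarrassingly parallel "scale-out" workload:
# each partition is analyzed independently, so the work maps cleanly
# onto as many workers (or cloud instances) as you care to rent.
from multiprocessing import Pool

def analyze(partition):
    # Stand-in for a real per-partition analysis; here, just a sum.
    return sum(partition)

if __name__ == "__main__":
    # Ten independent partitions of synthetic data.
    partitions = [list(range(i, i + 1000)) for i in range(0, 10000, 1000)]
    with Pool() as pool:
        results = pool.map(analyze, partitions)  # trivially parallel
    print(sum(results))
```

Frameworks like Hadoop (and, today, the managed services built on EC2) exist precisely to run this `map` step across many machines instead of many local processes.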