The UK’s newly formed e-Infrastructure Leadership Council is conducting a survey to help plan future activities and investments. The ELC oversees investment in and coordination of the government-funded UK e-Infrastructure, especially HPC, including a recently announced £158m investment.
This is your chance to directly influence not only how this funding is divvied up, but also how the UK views data-intensive science, business, and digital technologies for years to come.
The survey is aimed at UK businesses, SMEs in particular, but all respondents are welcome. Read the Full Story.
At a White House event yesterday, NSF Director Subra Suresh announced a new Big Data solicitation including a $10 million Expeditions in Computing award and awards in cyberinfrastructure, geosciences, and training.
“Data are motivating a profound transformation in the culture and conduct of scientific research in every field of science and engineering,” Suresh said. “American scientists must rise to the challenges and seize the opportunities afforded by this new, data-driven revolution. The work we do today will lay the groundwork for new enterprises and fortify the foundations for U.S. competitiveness for decades to come.”
NSF released a solicitation, “Core Techniques and Technologies for Advancing Big Data Science & Engineering,” or “Big Data,” jointly with NIH. The program aims to extract and use knowledge from collections of large data sets to accelerate progress in science and engineering research. Specifically, it will fund research to develop and evaluate new algorithms, statistical methods, technologies, and tools for improved data collection and management, data analytics, and e-science collaboration environments.
An Energy Department effort to create an exascale computer will receive $126 million during the coming federal fiscal year under a Senate Appropriations Committee markup of the DOE spending bill. In a report accompanying the bill, Senate appropriators say Energy plans to deploy the first exascale system in 2018, despite challenges.
An exascale system made from today’s technology “would probably cost $100,000,000,000, require $100,000,000 a year to operate, need its own dedicated power plant to power the computing system, and be very unreliable,” the report states. A computer capable of executing an exaflop would be able to conduct one million trillion calculations per second.
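For those counting zeros, that figure is easy to verify:

1 exaflop = 10^18 calculations per second = 10^6 (one million) × 10^12 (one trillion)

so “one million trillion” is exactly the exa- prefix at work.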
After taking some lumps in the trade press recently, the EU is stepping up with a new initiative called PROSPECT, an open European association of leading suppliers and users of supercomputers from industry and academia. As a Technology Platform for HPC, PROSPECT’s aim is to help Europe realize the potential benefits of High Performance Computing for science and industry.
“The European Technology Platform on HPC will substantially contribute to increasing European competitiveness in the development and ownership of novel and independent HPC technologies,” said Francesc Subirada, Chairman of the PROSPECT Executive Board. “PROSPECT acts according to democratic principles and remains open to any organization that might want to join us in supporting European industry, science and society.”
In this video, Austan Goolsbee, Chairman of the Council of Economic Advisers, explains the President’s plan to reform the patent system so that great technology ideas can be turned into jobs. According to Goolsbee, the Patent Office now has a backlog of over 700,000 applications, with an average wait time of over three years.
I believe that these efforts are important to our nation’s ability to compete, and I would also applaud any effort to streamline the patent dispute process, which can drag on for years as well.
And as a side note, I’m wondering who on Earth writes on a whiteboard in a serif font?
Ed Note: Slashdot reports: “The US Senate is congratulating itself for passing a ‘landmark’ piece of patent reform legislation. Some key elements are ‘first to file’ instead of first to invent, and ending fee diversion, which means fees paid to the Patent Office will actually fund the Patent Office. Curiously, this practice has resulted in a backlog of 700,000 patent applications. The House is reportedly working on a similar bill, and soon harmony and rationality will triumph.”
Steve Lohr over at the New York Times writes that performance gains from algorithm improvements often far outpace application speedups from faster processors attributable to Moore’s Law.
There are no such laws in software. But the White House advisory report cited research, including a study of progress over a 15-year span on a benchmark production-planning task. Over that time, the speed of completing the calculations improved by a factor of 43 million. Of the total, a factor of roughly 1,000 was attributable to faster processor speeds, according to the research by Martin Grötschel, a German scientist and mathematician. Yet a factor of 43,000 was due to improvements in the efficiency of software algorithms.
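A quick sanity check shows the two factors multiply out to the reported total:

1,000 (faster processors) × 43,000 (better algorithms) = 43,000,000

which is exactly the overall factor of 43 million.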
The PCAST advisory report mentioned in this story recommends investment in software as well as supercomputing hardware technologies to maintain U.S. competitiveness.
On Friday I awoke to the sound of helicopters flying low over my home in Portland. Things really get shaken up around here when the President comes to town.
In this video, President Obama addresses the importance of education to this nation’s global competitiveness:
Companies like Intel are proving that we can compete – that instead of just being a nation that buys what’s made overseas, we can make things in America and sell them around the globe. Winning this competition depends on the ingenuity and creativity of our private sector – which was on display in my visit today. But it’s also going to depend on what we do as a nation to make America the best place on earth to do business.
Patrick Thibodeau writes that, for the first time, the 2012 budget proposal explicitly references “exascale.” If approved, the DOE will get $126 million for exascale development. The DOE had budgeted just over $24 million in 2011, but that was in the context of “extreme scale” computing.
The DOE has not yet said how the exascale funding will be used, but the supercomputing research community has active research efforts in progress. In the interim, DOE is building 10-petaflop systems, such as the recently announced IBM system planned for Argonne National Laboratory.
In this video, US Department of Energy Secretary Steven Chu holds a media briefing on the Fiscal Year 2012 Budget proposed by President Obama. While far from passage, the budget includes a salary and bonus freeze for National Laboratory site and facility management contractor employees. The money saved will reportedly be reinvested in the Labs.
A tip of the hat goes to Chris O’Neal for pointing us to this story.
It was great to hear the President talking about supercomputing in his recent State of the Union address. He must have gotten an invite to check it out for himself up here in Portland.
Today the White House announced that President Obama will visit Intel Corporation in Hillsboro, Oregon. While there, he will tour one of the world’s largest and most advanced semiconductor manufacturing facilities with Intel CEO Paul Otellini. The President will also learn more about Intel’s STEM (Science, Technology, Engineering and Math) education programs and Intel’s efforts to better prepare the next generation to compete for high-tech jobs and be the minds behind the next great inventions.
According to the Oregonian newspaper, the President will visit on Feb. 18, 2011.
In his State of the Union address this week, President Obama cited work being performed at ORNL as an example of cutting-edge research aimed at solving the energy challenge.
“If they assemble teams of the best minds in their fields, and focus on the hardest problems in clean energy, we’ll fund the Apollo Projects of our time. At the California Institute of Technology, they’re developing a way to turn sunlight and water into fuel for our cars. At Oak Ridge National Laboratory, they’re using supercomputers to get a lot more power out of our nuclear facilities. With more research and incentives, we can break our dependence on oil with biofuels, and become the first country to have 1 million electric vehicles on the road by 2015.”
Bill Gropp from the University of Illinois looks at the recent PCAST report, which calls for “substantial and sustained” investment in a broad range of basic research for HPC. As part of the team that authored the report, Gropp calls on the community to embrace the idea that we need to rethink every aspect of high performance computing:
The list of topics, all areas in which too little is known and in which there is currently too little research, is quite sobering. Breakthroughs in only a few of these would transform computing. The NSF, DARPA, and DOE are taking the first steps to address these, but they need to be able to do more. And the community, in particular the HPC community, needs to be willing to take more risks. The past two decades have seen significant stability in HPC; even with a factor of 10,000 or more increase in scale, the programming models and many of the algorithms have only slowly changed. This is the time to rethink all aspects of computing—the hardware, the software, and the algorithms. Without a sustained investment in basic research into HPC, the historic increase in performance of HPC systems will slow down and eventually end. With such an investment, HPC will continue to provide scientists and engineers with the ability to solve the myriad of challenges that we face.
With help from federal stimulus funds from the National Science Foundation, Colorado State University is getting ready to dedicate a new Cray supercomputer that will be available to all university researchers.
As the anchor system for the Information Science and Technology Center (ISTeC), Colorado State’s new Cray XT6m features 1,248 cores, 1.6 terabytes of main memory, and 32 terabytes of disk storage. The Cray system will support much larger and more complex problems in science and engineering, especially for data-intensive applications; add greater physical fidelity to existing models; facilitate application of computing to new areas of research and discovery; and support training to attract new researchers to computational science, engineering, and mathematics. Full Story
Contrary to what you might have heard, Global Climate Change is not a new idea. Our Video Sunday feature continues with the late Isaac Asimov describing Global Warming as the biggest science story of 1988.
“I have been talking about the greenhouse effect for 20 years at least,” says Asimov in the video. “And there are other people who have talked about it before I did. I didn’t invent it.”
A tip of the hat goes to Dr. Amber Jenkins of NASA’s Jet Propulsion Laboratory for pointing us to this story. Her blog on climate change is very sobering, and I’m afraid that, unless more of us start paying attention, humanity won’t be around to see the wonder of Asimov’s future worlds come to pass.
Microsoft’s Elizabeth L. Grossman reflects on passage of the America COMPETES Reauthorization Act by the U.S. Congress:
One area highlighted in the America COMPETES Reauthorization Act for additional research is cloud computing. The broad potential of this field is demonstrated by the current partnership between Microsoft and the National Science Foundation, through which Microsoft will provide free access to advanced cloud computing resources for select NSF-funded researchers for the next three years.
The message here is all about job creation. The enabling technologies may lie in the Cloud, but the rain dance starts in Washington, D.C.