In this video, D-Wave Systems Founder Eric Ladizinsky presents: The Coming Quantum Computing Revolution. “Despite the incredible power of today’s supercomputers, there are many complex computing problems that can’t be addressed by conventional systems. Our need to better understand everything, from the universe to our own DNA, leads us to seek new approaches to answer the most difficult questions. While we are only at the beginning of this journey, quantum computing has the potential to help solve some of the most complex technical, commercial, scientific, and national defense problems that organizations face.”
Today Cycle Computing announced its continued involvement in optimizing research spearheaded by NASA’s Center for Climate Simulation (NCCS) and the University of Minnesota. Currently, a biomass measurement effort is underway across a coast-to-coast band of Sub-Saharan Africa, covering a region of more than 10 million square kilometers of trees, a swath of acreage bigger than the entirety […]
In this video, Steven Pawson discusses how NASA uses computer models to build up a complete three-dimensional picture of El Niño in the ocean and atmosphere. Pawson is an atmospheric scientist and the chief of the Global Modeling and Assimilation Office at NASA’s Goddard Space Flight Center in Greenbelt, Maryland.
NASA Ames reports that SGI has completed an important upgrade to the Pleiades supercomputer. “As of July 1, 2016, all of the remaining racks of Intel Xeon X5670 (Westmere) processors were removed from Pleiades to make room for an additional 14 Intel Xeon E5-2680v4 (Broadwell) racks, doubling the number of Broadwell nodes to 2,016 and increasing the system’s theoretical peak performance to 7.25 petaflops. Pleiades now has a total of 246,048 CPU cores across 161 racks containing four different Intel Xeon processor types, and provides users with more than 900 terabytes of memory.”
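Theoretical-peak figures like the 7.25 petaflops quoted above come from straightforward arithmetic: cores × clock rate × floating-point operations per cycle. A minimal sketch for the Broadwell portion alone, assuming dual-socket nodes, the E5-2680v4’s 14 cores and 2.4 GHz base clock, and 16 double-precision flops per cycle per core (AVX2 with two FMA units); the full 7.25 PF system figure also includes three older Xeon generations not modeled here:

```python
# Sketch of how a theoretical-peak figure is derived.
# Per-part numbers are assumptions for illustration; only the node
# count (2,016 Broadwell nodes) comes from the article.

nodes = 2016            # Broadwell nodes after the upgrade
sockets_per_node = 2    # assumption: dual-socket nodes
cores_per_socket = 14   # Xeon E5-2680v4 core count
clock_hz = 2.4e9        # base clock; turbo is ignored in peak math
flops_per_cycle = 16    # 2 AVX2 FMA units x 4 doubles x 2 ops (mul+add)

cores = nodes * sockets_per_node * cores_per_socket
peak_flops = cores * clock_hz * flops_per_cycle
print(f"{cores} cores, {peak_flops / 1e15:.2f} petaflops (Broadwell racks only)")
# -> 56448 cores, 2.17 petaflops (Broadwell racks only)
```

The same formula applied to each of the four processor generations, summed, would yield the system-wide peak.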
Over at NICS, Scott Gibson writes that researchers are using XSEDE supercomputing resources to simulate the gaseous outflows from black holes known as astrophysical jets. “These jets can affect galaxy formation and evolution by, for example, heating up the surroundings and suppressing star formation, expelling the surrounding gas and thereby reducing the mass supply to the black hole.”
Altair is making a big investment toward uniting the whole HPC community to accelerate the state of the art (and the state of actual production operations) for HPC scheduling. Altair is joining the OpenHPC project with PBS Pro. The company is focused on longevity: creating a viable, sustainable community around job scheduling software that can truly bridge the gap in the HPC world.
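What a workload manager like PBS Pro does is easiest to see from the user’s side: a batch script annotated with scheduler directives. A minimal sketch, where the job name, resource request, and queue name are illustrative rather than taken from any Altair documentation:

```shell
#!/bin/sh
# Minimal PBS Pro batch script sketch. Lines beginning with #PBS are
# directives read by the scheduler, not by the shell; the values here
# (job name, resources, queue) are hypothetical.
#PBS -N demo_job
#PBS -l select=1:ncpus=4:mem=8gb
#PBS -l walltime=00:10:00
#PBS -q workq

# The body runs on the allocated compute node once the job starts.
echo "Job running on $(hostname)"
```

A script like this would typically be submitted with `qsub`, after which the scheduler queues it until the requested resources are free.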
Today Spectra Logic announced the Spectra TFinity ExaScale Edition, the world’s largest and most richly featured tape storage system. “Since 2008, Spectra Logic has worked with engineers in the NASA Advanced Supercomputing (NAS) Division at NASA’s Ames Research Center, in California’s Silicon Valley, first deploying a Spectra tape library with 22 petabytes of capacity. According to NASA, the Spectra tape library’s capacity has grown to approximately half an exabyte of archival storage today. After extensive testing over the past year, NASA recently deployed a Spectra TFinity ExaScale Edition in their 24×7 production HPC environment.”
“As a research area, quantum computing is highly competitive, but if you want to buy a quantum computer then D-Wave Systems, founded in 1999, is the only game in town. Quantum computing is as promising as it is unproven. Quantum computing goes beyond Moore’s law since every quantum bit (qubit) doubles the computational power, similar to the famous wheat and chessboard problem. So the payoff is huge, even though it is expensive, unproven, and difficult to program.”
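The doubling the quote alludes to compounds very quickly, which is the point of the wheat-and-chessboard comparison. A quick sketch of the arithmetic (the 2ⁿ state-space relationship is the standard one for n qubits, not a D-Wave-specific claim):

```python
# Each additional qubit doubles the number of basis states a quantum
# register can represent, so n qubits span 2**n states -- the same
# geometric growth as the wheat-and-chessboard problem.

def states(n_qubits):
    """Number of basis states representable by n qubits."""
    return 2 ** n_qubits

# Chessboard version: 1 grain on the first square, doubling on each
# of the 64 squares, then summed.
total_grains = sum(2 ** k for k in range(64))  # equals 2**64 - 1

print(states(10))    # -> 1024
print(total_grains)  # -> 18446744073709551615
```

Ten qubits already span 1,024 states; 64 doublings yield a number larger than 18 quintillion, which is why even modest qubit counts are held out as promising.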
“Upgrading legacy HPC systems relies as much on the requirements of the user base as it does on the budget of the institution buying the system. There is a gamut of technology and deployment methods to choose from, and the picture is further complicated by infrastructure such as cooling equipment, storage, and networking, all of which must fit into the available space. However, in most cases it is the requirements of the codes and applications being run on the system that ultimately define the choice of architecture when upgrading a legacy system. In the most extreme cases, these requirements can restrict the available technology, effectively locking an HPC center into a single technology, or limiting the adoption of new architectures because of the added complexity of code modernization, or of porting existing codes to new technology platforms.”
The fastest supercomputers are built with the fastest microprocessor chips, which in turn are built upon the fastest switching technology. But even the best semiconductors are reaching their limits as more is demanded of them. In the closing months of this year came news of several developments that could break through silicon’s performance barrier and herald an age of smaller, faster, lower-power chips, possibly becoming commercially viable within the next few years.