In this special guest feature from Scientific Computing World, Tom Wilkie reports on two US initiatives for future supercomputers, announced at the ISC High Performance conference in Frankfurt in July.
Today the Texas Advanced Computing Center (TACC) announced that its new Chameleon testbed is in full production for researchers across the country. Designed to help investigate and develop the promising future of cloud-based science, the NSF-funded Chameleon is a configurable, large-scale environment for testing and demonstrating new concepts.
Today Intel Corporation and Micron Technology unveiled 3D XPoint technology, a non-volatile memory that has the potential to revolutionize any device, application or service that benefits from fast access to large sets of data. Now in production, 3D XPoint technology is a major breakthrough in memory process technology and the first new memory category since the introduction of NAND flash in 1989.
Today Univa joined Google, IBM, and other world-class companies as founding members of the Cloud Native Computing Foundation (CNCF). The new CNCF organization will accelerate the development of cloud native applications and services by advancing a technology stack for data center containerization and microservices.
Today IBM, along with Nvidia and two U.S. Department of Energy National Laboratories, announced a pair of Centers of Excellence for supercomputing – one at the Lawrence Livermore National Laboratory and the other at the Oak Ridge National Laboratory. The collaborations are in support of IBM’s supercomputing contract with the U.S. Department of Energy. They will enable advanced, large-scale scientific and engineering applications, both in support of DOE missions and for the Summit and Sierra supercomputer systems, to be delivered to Oak Ridge and Lawrence Livermore respectively in 2017 and to be operational in 2018.
In this podcast, the Radio Free HPC team looks at how the KatRisk startup is using GPUs on the Titan supercomputer to calculate global flood maps. “KatRisk develops event-based probabilistic models to quantify portfolio aggregate losses and exceedance probability curves. Their goal is to develop models that fully correlate all sources of flood loss including explicit consideration of tropical cyclone rainfall and storm surge.”
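To make the quoted terminology concrete, here is a minimal sketch of what an event-based exceedance probability (EP) curve calculation looks like. This is not KatRisk's actual model: the Poisson event rate and exponential loss sizes below are invented assumptions purely for illustration, and a production catastrophe model would use physically simulated flood events instead.

```python
import random

# Hedged illustration (NOT KatRisk's code): Monte Carlo estimate of a
# portfolio's annual aggregate loss exceedance probability curve.
# EVENT_RATE and MEAN_LOSS are invented parameters for this sketch.

random.seed(42)

N_YEARS = 10_000    # number of simulated years
EVENT_RATE = 0.8    # mean flood events per year (Poisson assumption)
MEAN_LOSS = 5.0     # mean loss per event in $M (exponential assumption)

def simulate_annual_loss():
    """Aggregate loss for one simulated year: sum of per-event losses."""
    total = 0.0
    # Poisson event count via exponential inter-arrival times within one year
    t = random.expovariate(EVENT_RATE)
    while t < 1.0:
        total += random.expovariate(1.0 / MEAN_LOSS)  # draw one event loss
        t += random.expovariate(EVENT_RATE)
    return total

annual_losses = [simulate_annual_loss() for _ in range(N_YEARS)]

def exceedance_probability(threshold):
    """Estimated P(annual aggregate loss > threshold)."""
    return sum(1 for x in annual_losses if x > threshold) / N_YEARS

for thr in (1, 5, 10, 25):
    print(f"P(annual loss > ${thr}M) = {exceedance_probability(thr):.3f}")
```

Real models of this kind run millions of correlated event simulations across large portfolios, which is why GPU acceleration on a machine like Titan matters.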
Today IBM Research announced that working with alliance partners at SUNY Polytechnic Institute’s Colleges of Nanoscale Science and Engineering it has produced the semiconductor industry’s first 7nm node test chips with functional transistors. According to IBM, the breakthrough underscores the company’s continued leadership and long-term commitment to semiconductor technology research.
Today JISC in the U.K. announced that Rolls-Royce is the first company to join its industrial supercomputing initiative. Designed to break down barriers between industry and academia, the initiative will provide Rolls-Royce with easy access to supercomputing equipment at HPC Midlands, a centre funded by the Engineering and Physical Sciences Research Council (EPSRC).
Daniel Gutierrez, Managing Editor of insideBIGDATA, has put together a terrific Guide to Scientific Research. The goal of this paper is to provide a road map for scientific researchers wishing to capitalize on the rapid growth of big data technology for collecting, transforming, analyzing, and visualizing large scientific data sets.