Does your research generate, analyze, and/or visualize data using advanced digital resources? In its recent Call for Participation, the CADENS project is looking for scientific data to visualize or existing data visualizations to weave into larger documentary narratives in a series of fulldome digital films and TV programs aimed at broad public audiences. Visualizations of your work could reach millions of people, amplifying its broader societal impact!
“We’ve had a great time here in Austin talking about data-centric computing – the ability to use IBM Spectrum Scale and Platform LSF to do Cognitive Computing. Customers, partners, and the world have been talking about how we can really bring together file, object, and even business analytics workloads in amazing ways. It’s been fun.”
At SC15, Intel talked about some transformational high-performance computing technologies and the architecture behind them: Intel® Scalable System Framework (Intel® SSF). Intel describes Intel SSF as “an advanced architectural approach for simplifying the procurement, deployment, and management of HPC systems, while broadening the accessibility of HPC to more industries and workloads.” Intel SSF is designed to eliminate the traditional bottlenecks: the so-called power, memory, storage, and I/O walls that system builders and operators have run into over the years.
SC15 has announced the winners of the Student Cluster Competition, which took place last week in Austin. Team Diablo, made up of undergraduate students from Tsinghua University in China, was named the overall winner. “The competition is a real-time, non-stop, 48-hour challenge in which teams of six undergraduates assemble a small cluster at SC15 and race to complete a real-world workload across a series of scientific applications, demonstrate knowledge of system architecture and application performance, and impress HPC industry judges.”
In this special guest feature from Scientific Computing World, Robert Roe writes that software scalability and portability may be even more important than energy efficiency to the future of HPC. “As the HPC market searches for the optimal strategy to reach exascale, it is clear that the major roadblock to improving the performance of applications will be the scalability of software, rather than the hardware configuration – or even the energy costs associated with running the system.”
In this video from SC15, Larry Jones from Seagate provides an overview of the company’s revamped HPC storage product line, including a new 10,000 RPM ClusterStor hard disk drive tailor-made for the HPC market. “ClusterStor integrates the latest in Big Data technologies to deliver class-leading ingest speeds, massively scalable capacities to more than 100PB and the ability to handle a variety of mixed workloads.”
In this video from SC15, Dr. Eng Lim Goh from SGI describes how the company is embracing new HPC technology trends such as new memory hierarchies. With the convergence of HPC and Big Data as a growing trend, SGI envisions a “Zero Copy Architecture” that would bring together a traditional supercomputer with a Big Data analytics machine in a way that would not require users to move their data between systems.
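SGI did not walk through implementation details in the talk, but the general idea of analyzing data in place rather than copying it between systems can be sketched in a few lines. The snippet below is a minimal illustration of that idea under assumptions of our own, not SGI's design: it assumes a hypothetical shared-storage path (/shared/sim_output.dat) that is visible to both the simulation system and the analytics system, and memory-maps the file on the analytics side so the data is read where it sits instead of being duplicated.

```python
# Minimal sketch of the "analyze in place" idea behind a zero-copy workflow.
# Assumption: /shared/sim_output.dat is a hypothetical float64 file written by
# a simulation onto storage mounted by both the HPC and analytics systems.
import numpy as np

SHARED_FILE = "/shared/sim_output.dat"  # hypothetical shared-storage path

# Simulation side: write results once to shared storage.
results = np.random.rand(1_000_000)
results.tofile(SHARED_FILE)

# Analytics side: memory-map the same file instead of transferring a copy.
# The OS pages data in on demand, so no second full copy of the dataset is made.
data = np.memmap(SHARED_FILE, dtype=np.float64, mode="r")
print("mean of in-place data:", data.mean())
```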
This week at SC15, E4 Computer Engineering from Italy announced its active participation in the OpenPOWER Foundation, an open technical community based on the POWER architecture that enables collaborative development and creates opportunities for member differentiation and industry growth. Visit the OpenPOWER Foundation homepage for more information. E4 Computer Engineering has been developing innovative platforms for a number of years, with a specific focus on solutions for HPC environments. E4 Computer Engineering’s participation in OpenPOWER is a natural next step in the company’s commitment to next-generation technology. POWER architecture-based products enable customers to boost the performance of their infrastructure while increasing efficiency and scalability.