Trio of SGI Systems to Drive Innovation at SKODA AUTO

Today SGI announced that ŠKODA AUTO has deployed an SGI UV and two SGI ICE high performance computing systems to further enhance its computer-aided engineering capabilities. “Customer satisfaction and the highest standard of production are at the very core of our brand and are the driving force behind our innovation processes,” said Petr Rešl, head of IT Services, ŠKODA AUTO. “This latest installation enables us to conduct complex product performance and safety analysis that will in turn help us further our commitment to our customers’ welfare and ownership experience. It helps us develop more innovative vehicles at an excellent value-to-price ratio.”

SGI Update: Zero Copy Architecture (ZCA)

“In high performance computing, data sets are increasing in size and workflows are growing in complexity. Additionally, it is becoming too costly to keep copies of that data and, perhaps more importantly, too time- and energy-intensive to move them. Thus, the novel Zero Copy Architecture (ZCA) was developed, in which each process in a multi-stage workflow writes data locally for performance, yet other stages can access that data globally. The result is accelerated workflows with the ability to perform burst buffer operations and in-situ analytics and visualization without the need to copy or move the data.”
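The description above is conceptual, so a small illustration may help. The following is a minimal, hypothetical Python sketch, not SGI's implementation: one workflow stage writes its output once to an assumed globally visible location, and a later analytics stage memory-maps the same file and reads it in place, so no second copy of the data is staged or moved.

import os
import numpy as np

# Hypothetical globally visible location; here we fall back to a local temp
# directory so the sketch runs anywhere, but in a ZCA-style system this would
# be a shared namespace visible to every stage of the workflow.
SHARED_PATH = os.environ.get("ZCA_SHARED_DIR", "/tmp") + "/stage1_output.npy"

def simulation_stage():
    """Stage 1: write results once; no staging or duplicate copy."""
    results = np.random.rand(1_000_000)         # stand-in for simulation output
    np.save(SHARED_PATH, results)

def analysis_stage():
    """Stage 2: in-situ analytics on the same bytes via memory mapping."""
    data = np.load(SHARED_PATH, mmap_mode="r")  # pages are mapped, not copied
    return float(data.mean())

if __name__ == "__main__":
    simulation_stage()
    print("mean of stage-1 output:", analysis_stage())

The point of the sketch is only that the analysis step operates on the same bytes the simulation wrote; in a real deployment the shared storage fabric would provide the globally visible path.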

Changes Afoot from the HPC Crystal Ball

In this special guest feature from Scientific Computing World, Andrew Jones from NAG looks ahead at what 2016 has in store for HPC and finds people, not technology, to be the most important issue. “A disconcertingly large proportion of the software used in computational science and engineering today was written for friendlier and less complex technology. An explosion of attention is needed to drag software into a state where it can effectively deliver science using future HPC platforms.”

Long Live the King – The Complicated Business of Upgrading Legacy HPC Systems

“Upgrading legacy HPC systems relies as much on the requirements of the user base as it does on the budget of the institution buying the system. There is a gamut of technology and deployment methods to choose from, and the picture is further complicated by infrastructure such as cooling equipment, storage, and networking – all of which must fit into the available space. However, in most cases it is the requirements of the codes and applications being run on the system that ultimately define the choice of architecture when upgrading a legacy system. In the most extreme cases, these requirements can restrict the available technology, effectively locking an HPC center into a single technology, or restricting the adoption of new architectures because of the added complexity associated with code modernization, or porting existing codes to new technology platforms.”

SGI to Deliver Advanced Data Processing for Nagaoka University of Technology

Today SGI Japan announced that the Nagaoka University of Technology has selected SGI UV 300, SGI UV 30EX, and SGI Rackable servers, along with SGI InfiniteStorage 5600, for its next integrated high-performance computing system for education and research. With a tenfold performance increase over the previous system, the new supercomputer will start operation on March 1, 2016.

NCAR Selects SGI Supercomputer for Advanced Modeling and Research

Today the National Center for Atmospheric Research announced that it has selected SGI to build one of the world’s most advanced compute systems, used to develop models for predicting the impact of climate change and severe weather events on both a global and local scale. As part of a new procurement coming online in 2017, an SGI ICE XA system named “Cheyenne” will perform some of the world’s most data-intensive calculations for weather and climate modeling, improving resolution and precision by orders of magnitude. As a result, NCAR’s scientists will provide more actionable projections about the impact of climate change for specific regions and help agencies throughout the world develop more accurate weather predictions on both a local and global scale.

How HPC is Helping Solve Climate and Weather Forecasting Challenges

Data accumulation is just one of the challenges facing today’s weather and climate researchers and scientists. To understand and predict Earth’s weather and climate, they rely on increasingly complex computer models and simulations based on a constantly growing body of data from around the globe. “It turns out that in today’s HPC technology, moving data in and out of the processing units is more demanding in time than the computations performed. To be effective, systems working with weather forecasting and climate modeling require high memory bandwidth and a fast interconnect across the system, as well as a robust parallel file system.”
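To see why memory bandwidth can dominate, consider a rough back-of-the-envelope estimate. The Python sketch below uses purely illustrative figures (the per-point FLOP and byte counts and the hardware numbers are assumptions, not measurements) to compare time spent computing with time spent moving data for a stencil-style atmospheric step.

# Illustrative, assumed hardware: ~1 TFLOP/s sustained compute, ~100 GB/s memory bandwidth.
FLOPS_PER_POINT = 50           # assumed floating-point ops per grid point per step
BYTES_PER_POINT = 8 * 10       # assumed: ten 8-byte values read/written per point
GRID_POINTS = 500 * 500 * 100  # a hypothetical regional grid

compute_time = GRID_POINTS * FLOPS_PER_POINT / 1e12   # seconds at 1 TFLOP/s
memory_time  = GRID_POINTS * BYTES_PER_POINT / 100e9  # seconds at 100 GB/s

print(f"compute-bound estimate: {compute_time * 1e3:.2f} ms per step")
print(f"memory-bound estimate:  {memory_time * 1e3:.2f} ms per step")

With these assumptions the memory term comes out several times larger than the compute term, which is why high memory bandwidth and a fast interconnect matter more than peak FLOP/s for such workloads.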

insideHPC Guide to Personalized Medicine & Genomics

The insideHPC Guide to Personalized Medicine and Genomics explains how genomics will accelerate personalized medicine, and includes several case studies.

Video: SGI Looks to Zero Copy Architecture for HPC and Big Data

In this video from SC15, Dr. Eng Lim Goh from SGI describes how the company is embracing new HPC technology trends such as new memory hierarchies. With the convergence of HPC and Big Data a growing trend, SGI envisions a “Zero Copy Architecture” that would bring together a traditional supercomputer and a Big Data analytics machine in a way that would not require users to move their data between systems.

HPC Helps Drive Climate Change Modeling

Because of the complexity involved, the length of the simulation period, and the amount of data generated, weather prediction and climate modeling on a global basis require some of the most powerful computers in the world. The models incorporate topography, winds, temperatures, radiation, gas emissions, cloud formation, land and sea ice, vegetation, and more. However, although weather prediction and climate modeling make use of common numerical methods, the quantities they compute differ.