In this slidecast, Tony DeVarco from SGI describes how the company delivers Production Supercomputing for SMEs. “As the trusted leader in high performance computing, SGI helps companies find answers to the world’s biggest challenges. Our commitment to innovation is unwavering and focused on delivering market leading solutions in Technical Computing, Big Data Analytics, and Petascale Storage. Our solutions provide unmatched performance, scalability and efficiency for a broad range of customers.”
“We have taken a flexible and adaptive approach to multi-level security that combines recent community advances with DDN-developed Lustre features to deliver full Lustre isolation, while minimizing the performance impact of the system overhead associated with implementing security,” said Robert Triendl, senior vice president, global sales, marketing and field services, DDN. “This approach lets us customize configurations that meet our customers’ security requirements without them having to sacrifice workflow efficiency.”
Today Cray announced that the Department of Defense High Performance Computing Modernization Program (HPCMP) has awarded the company a $26 million contract for a Cray XC40 supercomputer and three Cray Sonexion storage systems.
In this podcast, the Radio Free HPC team previews the ancillary events around SC16 in Salt Lake City. With a full week in store, this could be the best conference yet. After the event roundup, the team shares its predictions for total SC16 attendance numbers.
“Trading firms in the STAC Benchmark Council designed the STAC-M3 suite to represent a range of performance challenges that are common in financial time-series analysis,” said Peter Lankford, Director of STAC. “As data volumes grow and as the query demands from quants, machines, and regulators increase, it is more important than ever for tick database solutions to perform well at scale. Kx’s kdb+, running on Dell EMC DSSD D5 and PowerEdge servers, has established performance records while testing on the largest STAC-M3 data scale so far.”
SC16 returns to Salt Lake City on Nov. 13-18. The six-day supercomputing event features internationally known expert speakers, cutting-edge workshops and sessions, a non-stop student competition, the world’s largest supercomputing exhibition, panel discussions, and much more. “No other annual event better showcases the revolutionary advances and possibilities of high performance computing than the annual ACM/IEEE International Conference for High Performance Computing, Networking, Storage and Analysis. From the impact of HPC on the future of medicine, to its transformative power in developing countries and ‘smart cities,’ SC is the premier venue for presenting leading-edge HPC research.”
“The results of DDN’s annual HPC Trends Survey reflect very accurately what HPC end users tell us and what we are seeing in their data center infrastructures. The use of private and hybrid clouds continues to grow although most HPC organizations are not storing as large a percentage of their data in public clouds as they anticipated even a year ago. Performance remains the top challenge, especially when handling mixed I/O workloads and resolving I/O bottlenecks.”
Today IBM announced new hybrid cloud all-flash storage solutions developed to modernize and transform storage deployments, providing a strong bridge to the development of cognitive applications. These new solutions and software allow clients to store their valuable data where it makes the best business sense.
Today Hewlett Packard Enterprise announced that it has completed its acquisition of SGI, a global leader in high-performance solutions for compute, data analytics, and data management, for $7.75 per share in cash. “This deal combines SGI’s computing strengths with HPE’s global reach,” said Antonio Neri, executive vice president and general manager, Enterprise Group, Hewlett Packard Enterprise. “SGI’s technologies and services will further our position in high-performance computing and give our customers the best of data management capabilities for real-time analytics.”
“Unchecked data growth and data sprawl are having a profound impact on life science workflows. As data volumes continue to grow, researchers and IT leaders face increasingly difficult decisions about how to manage this data yet keep the storage budget in check. Learn how these challenges can be overcome through active data management and leveraging cloud technology. The concepts will be applied to an example architecture that supports both genomic and bioimaging workflows.”