Today Italy’s A3Cube announced the F-730 Family of EXA-Converged parallel systems, built on Dell servers and achieving sub-microsecond latency through bare-metal data access. “A3Cube’s EXA-Converged infrastructure represents the next step in the evolution of converged systems,” said Emilio Billi, A3Cube’s CTO, “while keeping and improving on the scalability and resilience of Hyper-Converged infrastructure. It is engineered to converge all system resources and provide parallel data access and inter-node communication at the bare-metal level, eliminating the need for, and the limits of, traditional Hyper-Converged systems. The system can efficiently use the fastest storage devices currently on the market or planned to come to market, and puts all existing solutions in the rear-view mirror.”
In this TACC Podcast, Jorge Salazar reports that scientists and engineers at the Texas Advanced Computing Center have created Wrangler, a new kind of supercomputer designed to handle Big Data.
“As a research area, quantum computing is highly competitive, but if you want to buy a quantum computer then D-Wave Systems, founded in 1999, is the only game in town. Quantum computing is as promising as it is unproven. Quantum computing goes beyond Moore’s law since every quantum bit (qubit) doubles the computational power, similar to the famous wheat and chessboard problem. So the payoff is huge, even though it is expensive, unproven, and difficult to program.”
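The scaling claim in the quote above can be made concrete with a little arithmetic: each added qubit doubles the number of basis states a quantum computer works with, just as each chessboard square doubles the grains of wheat in the classic problem. A minimal sketch (function names are illustrative, not from any quantum library):

```python
# Sketch of the exponential-scaling analogy from the quote: each qubit
# doubles the state space, each chessboard square doubles the wheat.

def state_space_size(n_qubits: int) -> int:
    """Number of basis states (amplitudes) for n qubits: 2**n."""
    return 2 ** n_qubits

def wheat_on_square(square: int) -> int:
    """Grains of wheat on a given chessboard square (1-indexed): 2**(square-1)."""
    return 2 ** (square - 1)

if __name__ == "__main__":
    for n in (1, 10, 50):
        print(f"{n:>3} qubits -> {state_space_size(n):,} basis states")
    # Total wheat over all 64 squares is 2**64 - 1, the famous "huge payoff".
    total = sum(wheat_on_square(s) for s in range(1, 65))
    print(f"Total grains on 64 squares: {total:,}")
```

At 50 qubits the state space already exceeds 10^15, which is the intuition behind the "huge payoff" in the quote.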
Today Hewlett Packard Enterprise announced HPE Haven OnDemand, an innovative cloud platform that provides advanced machine learning APIs and services that enable developers, startups and enterprises to build data-rich mobile and enterprise applications. Delivered as a service on Microsoft Azure, HPE Haven OnDemand provides more than 60 APIs and services that deliver deep learning analytics on a wide range of data, including text, audio, image, social, web and video.
Registration opened today for the ISC 2016 conference, which takes place June 19-23 in Frankfurt. This year, the ISC 2016 conference program features an increased focus on Cloud, Machine Learning, and Robotics. In fact, insideHPC has learned that the bulk of the topics normally covered at the annual ISC Cloud conference has been absorbed into the ISC High Performance industry track. To learn more, we caught up with Wolfgang Gentzsch, a member of the ISC Steering Committee who has chaired the ISC Cloud event since its beginnings.
“DDN’s selection for seven consecutive years as strategic partner and storage vendor of choice by the overwhelming majority of supercomputer centers on the Top500 list is a testament to the continuous innovation and performance leadership we bring to the HPC space,” said Alex Bouzari, CEO and founder, DDN. “From SSD to Persistent Storage and Archive, File Systems to Object Stores and Burst Buffers, DDN’s comprehensive end-to-end data lifecycle solutions continue to power the most data-intensive workflows in the world – generation after generation.”
“In high performance computing, data sets are increasing in size and workflows are growing in complexity. It is also becoming too costly to keep copies of that data and, perhaps more importantly, too time- and energy-intensive to move it. Thus, the novel Zero Copy Architecture (ZCA) was developed, in which each process in a multi-stage workflow writes data locally for performance, yet other stages can access that data globally. The result is accelerated workflows with the ability to perform burst buffer operations and in-situ analytics and visualization without any data copy or movement.”
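The write-once, read-in-place idea behind ZCA can be illustrated in miniature with memory-mapped files: a "producer" stage writes results to local storage, and a later "analysis" stage reads the same bytes in place without a second copy. This is only an analogy for the principle, not the ZCA implementation itself, and all names below are illustrative:

```python
# Toy zero-copy analogy: one stage writes, a later stage maps and reads
# the same bytes in place -- no duplicate copy of the data is made.
import mmap
import os
import tempfile

def producer_stage(path: str, n_values: int) -> None:
    """First workflow stage: write n_values 8-byte integers to local storage."""
    with open(path, "wb") as f:
        for i in range(n_values):
            f.write(i.to_bytes(8, "little"))

def analysis_stage(path: str) -> int:
    """Later stage: memory-map the file and sum the values without copying it."""
    with open(path, "rb") as f:
        mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
        view = memoryview(mm)  # zero-copy view of the mapped bytes
        try:
            return sum(
                int.from_bytes(view[off:off + 8], "little")
                for off in range(0, len(view), 8)
            )
        finally:
            view.release()
            mm.close()

if __name__ == "__main__":
    path = os.path.join(tempfile.mkdtemp(), "stage_output.bin")
    producer_stage(path, 100)
    print(analysis_stage(path))  # sum of 0..99
```

In a real multi-stage HPC workflow the "map" step would be a parallel file system or burst buffer presenting the same data to every stage, but the payoff is the same: later stages consume the producer's output without moving or duplicating it.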
Today Sony Corporation announced that it has reached an agreement with Altair Semiconductor to acquire the company. The purchase price is $212 million U.S. dollars (approximately 25 billion yen), and Sony expects to complete the acquisition in early February 2016.
“As a result of a new alliance with Intel, HP is offering its HPC Solutions Framework based on HP Apollo servers, which are specialized for HPC and now optimized to support industry-specific software applications from leading independent software vendors. These solutions will dramatically simplify the deployment of HPC for customers in industries such as oil and gas, life sciences and financial services. The HP Apollo product line integrates Intel’s technology innovation from its HPC scalable system framework, which helps to extend the resilience, reliability, power efficiency and price/performance of the HP Apollo solutions.”
“If you think of a data mart as a store of bottled water – cleansed and packaged and structured for easy consumption – the data lake is a large body of water in a more natural state. The contents of the data lake stream in from a source to fill the lake, and various users of the lake can come to examine, dive in, or take samples.” These “data lake” systems will hold massive amounts of data and be accessible through file and web interfaces. Data protection for data lakes will consist of replicas and will not require backup, since the data is not updated. Erasure coding will be used to protect large data sets and enable fast recovery. Open source software will be used to reduce licensing costs, and compute systems will be optimized for MapReduce analytics. Automated tiering will be employed to meet performance and long-term retention requirements. Cold storage – storage that does not require power for long-term retention – will be introduced in the form of tape or optical media.
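The erasure-coding idea mentioned above can be sketched with the simplest possible code: a single XOR parity block over a stripe of data blocks, as in RAID 5. Any one lost block, data or parity, can be rebuilt from the survivors. Production data-lake systems use stronger codes (such as Reed-Solomon) that tolerate multiple simultaneous failures; the function names here are illustrative:

```python
# Minimal erasure-coding sketch: one XOR parity block per stripe.
# Any single missing block can be reconstructed from the remaining ones.
from functools import reduce

def xor_blocks(blocks):
    """Byte-wise XOR of equal-length byte blocks."""
    return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*blocks))

def encode(data_blocks):
    """Return the stripe: the data blocks plus one trailing parity block."""
    return list(data_blocks) + [xor_blocks(data_blocks)]

def recover(stripe, lost_index):
    """Rebuild the block at lost_index by XOR-ing all surviving blocks."""
    survivors = [b for i, b in enumerate(stripe) if i != lost_index]
    return xor_blocks(survivors)

if __name__ == "__main__":
    stripe = encode([b"data-lake-", b"erasure-co", b"ding-demo!"])
    assert recover(stripe, 1) == b"erasure-co"  # lost data block rebuilt
    print("recovered:", recover(stripe, 1))
```

Because XOR is its own inverse, XOR-ing every surviving block yields exactly the missing one, which is why recovery is fast and requires no full backup copy of the data set.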