Today Ellexus in the UK announced the release of Mistral, a “groundbreaking” product for balancing shared storage across a high-performance computing cluster. Developed in collaboration with ARM’s IT department, Mistral monitors application I/O and cluster performance so that jobs exceeding expected I/O thresholds can be automatically identified and slowed down through I/O throttling.
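Ellexus does not describe Mistral’s throttling mechanism in detail; purely as an illustration, a token bucket is one common way to slow a job’s I/O once it exceeds its budget. The class name and rates below are hypothetical and are not Ellexus’s implementation:

```python
import time

class TokenBucketThrottle:
    """Illustrative token-bucket I/O limiter (hypothetical, not Mistral's code).

    A job spends tokens (bytes of I/O budget) as it reads or writes;
    the budget refills at `rate` bytes/sec up to a burst capacity.
    """

    def __init__(self, rate_bytes_per_sec, burst_bytes):
        self.rate = rate_bytes_per_sec
        self.capacity = burst_bytes
        self.tokens = burst_bytes
        self.last = time.monotonic()

    def throttle(self, nbytes):
        """Block until `nbytes` of I/O budget is available, then spend it."""
        while True:
            now = time.monotonic()
            # Refill in proportion to elapsed time, capped at the burst size.
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= nbytes:
                self.tokens -= nbytes
                return
            # Not enough budget yet: sleep until the deficit refills.
            time.sleep((nbytes - self.tokens) / self.rate)
```

Calling `throttle(nbytes)` before each read or write caps a job’s sustained I/O at the configured rate while still permitting short bursts.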
“DDN’s IME14K revolutionizes how information is saved and accessed by compute. IME software allows data to reside next to compute in a very fast, shared pool of non-volatile memory (NVM). This new data adjacency significantly reduces latency by allowing IME software’s revolutionary, fast data communication layer to pass data without the file locking contention inherent in today’s parallel file systems.”
In this video from the 2015 Hot Chips Conference, Mike Hutton from Altera presents: Stratix 10 Altera’s 14nm FPGA Targeting 1GHz Performance. “Stratix 10 FPGAs and SoCs deliver breakthrough advantages in performance, power efficiency, density, and system integration: advantages that are unmatched in the industry. Featuring the revolutionary HyperFlex core fabric architecture and built on the Intel 14 nm Tri-Gate process, Stratix 10 devices deliver 2X core performance gains over previous-generation, high-performance FPGAs with up to 70% lower power.”
ICER at Michigan State is seeking an Information Technologist in our Job of the Week. “As a joint appointment between Michigan State University’s Information Technology Services and the Institute for Cyber-Enabled Research, this position administers computer storage clusters totaling a few nodes, including high-speed Ethernet network interconnects. The position will involve Linux systems administration and working in a team environment with systems administrators, programmers, and research specialists to support the university’s research computing needs; will deploy and test new systems and services; will monitor, diagnose, support, and upgrade existing services (using the technologies described in the ‘Desired Qualifications’ section); will work with staff to document internal and external procedures; will develop, expand, and implement tools and scripts to facilitate administration; and will work with users on how to use object-oriented Ceph-based storage systems.”
IDC has published the agenda for its next HPC User Forum. The event will take place April 11-13 in Tucson, AZ. “Don’t miss the chance to hear top experts on these high-innovation, high-growth areas of the HPC market. At this meeting, you’ll also hear about government initiatives to get ready for future-generation supercomputers, machine learning, and High Performance Data Analytics.”
Today Auburn University unveiled its new $1 million supercomputer that will enhance research across campus, from microscopic gene sequencing to huge engineering tasks. The university is also initiating a plan to purchase a new one every few years as research needs evolve and expand.
“Because the whitefly species look identical, the best way to distinguish them is by examining their genetic differences, so we are deploying a mix of genomics, supercomputing, and evolutionary history. This knowledge will help African farmers and scientists distinguish the harmless whiteflies from the invasive ones, develop management strategies, and breed new whitefly-resistant strains of cassava. The computational challenge for our team is in processing the genomic data the sequencing machines produce.”
“If you think of a data mart as a store of bottled water – cleansed and packaged and structured for easy consumption – the data lake is a large body of water in a more natural state. The contents of the data lake stream in from a source to fill the lake, and various users of the lake can come to examine, dive in, or take samples.” These “data lake” systems will hold massive amounts of data and be accessible through file and web interfaces. Data protection for data lakes will consist of replicas and will not require backup, since the data is not updated. Erasure coding will be used to protect large data sets and enable fast recovery. Open source software will be used to reduce licensing costs, and compute systems will be optimized for MapReduce analytics. Automated tiering will be employed to meet performance and long-term retention requirements. Cold storage, storage that will not require power for long-term retention, will be introduced in the form of tape or optical media.
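Erasure coding, mentioned above for protecting large data sets, comes in many variants (production systems typically use Reed-Solomon codes). As a minimal sketch of the idea only, the single-parity XOR scheme below, the simplest possible erasure code and not what any particular data-lake product uses, lets one lost data block be rebuilt from the survivors plus a parity block:

```python
def xor_parity(blocks):
    """Compute a parity block as the bytewise XOR of equal-sized data blocks."""
    parity = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            parity[i] ^= byte
    return bytes(parity)

def recover_missing(surviving_blocks, parity):
    """Rebuild the single missing block: XOR of all survivors and the parity.

    Works because XOR-ing the parity with every surviving block cancels
    them out, leaving exactly the bytes of the lost block.
    """
    return xor_parity(list(surviving_blocks) + [parity])
```

For example, with three data blocks and one parity block, any single block can be lost and recomputed; real deployments use codes tolerating multiple simultaneous failures at modest storage overhead, which is why the text contrasts erasure coding with full replicas.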
Registration is now open for the inaugural Nimbix Developer Summit. With an impressive lineup of speakers and sponsors from Mellanox, migenius, Xilinx, and more, the event takes place March 15 in Dallas, Texas. “The summit agenda will feature topics such as hardware acceleration, coprocessing, photorealistic rendering, bioinformatics, and high performance analytics. The sessions will conclude with a panel of developers discussing how to overcome challenges of creating and optimizing cloud-based applications.”
Today the Pittsburgh Supercomputing Center (PSC) announced a $1.8-million National Institutes of Health grant to make the next-generation Anton 2 supercomputer developed by D. E. Shaw Research (DESRES) available to the biomedical research community. A specialized system for modeling the function and dynamics of biomolecules, the Anton 2 machine at PSC will be the only one of its kind publicly available to U.S. scientists. The grant also extends the operation of the Anton 1 supercomputer currently at PSC until the new Anton 2 is deployed, expected in the Fall of 2016.