Attaining High-Performance Scalable Storage

This is the second article in an editorial series that explores high-performance storage and the benefits of a Lustre solution for HPC. This week we look at how Lustre enables a scalable storage solution for business.

InsideHPC Guide to Lustre Solutions for Business

The recent release of a commercial version of the Lustre* parallel file system was big news for business data centers facing ever-expanding data analysis and storage demands. Now Lustre, the predominant high-performance file system at supercomputing installations around the world, can be deployed for business customers in a hardened, tested, easy-to-manage, and fully supported distribution.

HPC Virtualization and Workload Agility

Virtualization allows workloads to be compartmentalized in their own VMs to take full advantage of the underlying parallelism of today’s multicore, heterogeneous HPC systems without compromising security. This approach is particularly beneficial for organizations centralizing multiple groups onto a shared cluster, or for teams with strict security requirements – for example, a life sciences environment where access to genomic data needs to be restricted to specific researchers.

Data Center Floor Space Utilization

You would probably not be surprised to learn that liquid submersion cooling of your servers can save you money on power and cooling costs. You might be surprised, however, to read this paper and find that liquid cooling also makes more efficient use of data center floor space.

Webinar: Pushing the Performance Limit of Virtual Wind Tunnels

External aerodynamics analysis plays a key role in modern automotive design. But while performance can be tested in physical wind tunnels, doing so is extremely costly — and often numerous test runs are required to determine the changes needed to improve results. Virtual wind tunnel simulations provide an alternative by allowing design engineers to study aerodynamic loads – reducing the need for physical wind tunnel testing.

HPC Life Sciences Report

For decades, HPC systems have accelerated life sciences research, often by helping to identify and eliminate infeasible targets sooner. That is, it’s easier to find needles in haystacks if you can eliminate the hay. A new white paper from Intersect360 Research focuses on speeding up HPC life sciences research.

insideHPC Launches HPC Events Calendar

Today insideHPC announced the rollout of a comprehensive HPC events calendar, which offers free listings to event planners. This events calendar is a valuable resource for planning your conference travels in 2014.

NASA’s Rupak Biswas Investigates the Weird World of Quantum Computing

In this special feature, John Kirkley talks with Dr. Biswas to learn more about NASA’s fledgling involvement in the weird world of quantum computing.

Sponsored Post: Distributed Content Repository

The Distributed Content Repository from NetApp was designed to extend object-based storage across geographies and provide a repository for unstructured content that is effectively infinite, practically immortal, and highly intelligent. A new ESG Lab report examines the NetApp Distributed Content Repository in the context of transparent connectivity to massively scalable, object-based storage for enterprise data centers.

ESG White Paper: NetApp Open Solution for Hadoop

Hadoop is a significant emerging open-source technology for solving business problems involving large volumes of mostly unstructured data that cannot be analyzed with traditional database tools. The NetApp Open Solution for Hadoop combines the power of the Hadoop framework with the flexible storage, professional support, and services of NetApp and its partners to deliver higher Hadoop cluster availability and efficiency.
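
To give a sense of the programming model behind the Hadoop framework, here is a minimal sketch of the canonical MapReduce word count written against the standard Apache Hadoop Java API. It is a generic illustration only, not part of the NetApp Open Solution for Hadoop, and the input and output paths taken from the command line are placeholders.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map phase: emit (word, 1) for every token in the input split.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {

    private final static IntWritable one = new IntWritable(1);
    private final Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reduce phase: sum the counts collected for each word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {

    private final IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);  // combine locally to cut shuffle traffic
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // placeholder input directory
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // placeholder output directory
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

A job like this would typically be packaged as a JAR and submitted with the hadoop jar command, with cluster storage directories supplied for the input and output paths.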