Data Center Floor Space Utilization

You would probably not be surprised to learn that liquid submersion cooling of your servers can save you money on power and cooling costs. Yet, you might be surprised when you read this paper and find that liquid cooling also provides greater efficiency in terms of data center floor space.

Webinar: Pushing the Performance Limit of Virtual Wind Tunnels

External aerodynamics analysis plays a key role in modern automotive design. But while performance can be tested in physical wind tunnels, doing so is extremely costly, and numerous test runs are often required to determine the changes needed to improve results. Virtual wind tunnel simulations provide an alternative by allowing design engineers to study aerodynamic loads computationally, reducing the need for physical wind tunnel testing.

HPC Life Sciences Report

For decades, HPC systems have accelerated life sciences research, often by helping to identify and eliminate infeasible targets sooner. That is, it's easier to find needles in haystacks if you can eliminate the hay. A new white paper from Intersect360 Research focuses on speeding up HPC life sciences research.

insideHPC Launches HPC Events Calendar

Today insideHPC announced the rollout of a comprehensive HPC events calendar, which offers free listings to event planners. The calendar is a valuable resource for planning your conference travels in 2014.

Sponsored Post: Distributed Content Repository


The Distributed Content Repository from NetApp was designed to extend object-based storage across geographies and provide a repository for unstructured content that is effectively infinite, practically immortal, and highly intelligent. A new ESG Lab report examines the NetApp Distributed Content Repository in the context of transparent connectivity to massively scalable, object-based storage for enterprise data centers.

ESG White Paper: NetApp Open Solution for Hadoop

Hadoop is a significant emerging open source technology for solving business problems around large volumes of mostly unstructured data that cannot be analyzed with traditional database tools. The NetApp Open Solution for Hadoop combines the power of the Hadoop framework with flexible storage and the professional support and services of NetApp and its partners to deliver higher Hadoop cluster availability and efficiency.
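For readers new to the framework, the analysis model Hadoop popularized is MapReduce: a mapper emits key/value pairs from raw records, the framework shuffles them by key, and a reducer aggregates each group. The sketch below simulates those three phases in-process with a word count, the canonical example; on a real cluster the `mapper` and `reducer` functions would run across many nodes (for instance via Hadoop Streaming), and the in-memory driver here stands in for the distributed shuffle.

```python
# Minimal in-process simulation of the MapReduce model used by Hadoop.
# The run_job() driver is a stand-in for the framework's distributed
# map, shuffle, and reduce phases.
from collections import defaultdict

def mapper(line):
    """Map phase: emit (word, 1) for every word in a line of input."""
    for word in line.split():
        yield word.lower(), 1

def reducer(word, counts):
    """Reduce phase: sum the counts emitted for a single word."""
    return word, sum(counts)

def run_job(lines):
    """Driver: run mappers, group output by key (shuffle), then reduce."""
    groups = defaultdict(list)
    for line in lines:
        for word, count in mapper(line):
            groups[word].append(count)
    return dict(reducer(w, c) for w, c in groups.items())

result = run_job(["the quick brown fox", "the lazy dog"])
print(result["the"])  # 2
```

Because the mapper and reducer are pure functions of their inputs, the framework is free to partition the data and run them anywhere in the cluster, which is what makes the model scale to unstructured data volumes that defeat traditional database tools.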

White Paper: Distributed Content Repositories

Content is growing fast, and objects that must be stored are increasing in both size and lifespan. To help meet this challenge, NetApp provides content-repository solutions with data scalability, enterprise reliability, and object management.

Sponsored Post: Flexpod Select for Hadoop

FlexPod Select for Hadoop is an extension of the FlexPod initiative, built on the Cisco Common Platform Architecture (CPA) for Big Data, for deployments that need enterprise-class external storage array features.

Object Storage for Dummies eBook

Object storage is not just about adding new storage functionality. It also fundamentally reduces the complexity of how applications interact with storage. A new eBook offers timely insight into this important technology.
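The simplification comes from the interface: instead of managing directory trees, file handles, and byte offsets, an application PUTs and GETs whole objects by a flat key, with metadata carried alongside the data. The dict-backed class below is a hypothetical illustration of that model (the `put`/`get`/`head` names echo common S3-style REST verbs, and are this sketch's own choices, not any vendor's API).

```python
# Hypothetical in-memory object store illustrating the flat key/object
# model: whole objects addressed by key, with metadata stored alongside.
class ObjectStore:
    def __init__(self):
        self._objects = {}  # key -> (data bytes, metadata dict)

    def put(self, key, data, metadata=None):
        """Store a whole object under a flat key (no directories, no seeks)."""
        self._objects[key] = (bytes(data), dict(metadata or {}))

    def get(self, key):
        """Retrieve the object's data by key."""
        return self._objects[key][0]

    def head(self, key):
        """Retrieve only the metadata, without transferring the data."""
        return self._objects[key][1]

store = ObjectStore()
store.put("reports/2013/q4.pdf", b"%PDF-...",
          {"content-type": "application/pdf"})
print(store.head("reports/2013/q4.pdf")["content-type"])  # application/pdf
```

Note that the slash in the key is just a naming convention: the namespace is flat, which is one reason object stores scale across geographies more easily than hierarchical filesystems.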

Sponsored Post: Balancing High Performance with Cost when Analyzing Big Data

HPC and big data professionals need high performance to ingest and analyze huge amounts of data, while still managing power and cost efficiently.