Panasas launches tiered parity product, new hardware


Today Panasas made several storage announcements. First, they’ve rolled the previously announced tiered parity technology into a shipping product. Then there are enhancements to existing hardware at the high end:

AS6000 – Designed to meet the needs of commercial organizations heavily invested in research and product development (R&D) for design, modeling and visualization, the AS6000 includes 20 GB cache per storage shelf with an integrated 10GigE switch that doubles the throughput performance per storage shelf to over 600 MB/s.

AS4000 – Designed primarily for companies dependent on simulation and analysis, like those in the oil & gas, aerospace and automotive sectors, the AS4000 includes an integrated 10GigE switch for unsurpassed speed, plus improved reliability and data availability with the Panasas Tiered Parity architecture.

The 6000 updates the previous generation AS5000, and the big advancements are the 10 GbE and the 20 GB/shelf cache. Likewise the 4000 replaces the earlier AS3000 generation, and also gets the new 10 GbE and a doubled cache/shelf (though this time to 10 GB, not 20). The company has also introduced what it calls the industry’s first parallel second-tier storage solution:

AS200 – The industry’s first parallel second-tier storage solution, the AS200 configuration includes 104 TB of available storage space, 5 Gigabit Ethernet ports for fast data transfer, and Panasas Tiered Parity data protection. The solution extends parallel storage capabilities to second-tier applications, providing a fully unified parallel storage platform for both primary and secondary storage and delivering improved performance, scalability and manageability.

Pricing ranges from $5/GB for the 6000 down to $1.20/GB for the 200. You’ll see that difference in the performance at the low end: the 200 delivers 104 TB over 5 shelves with an aggregate bandwidth of 350 MB/s, while the higher end products offer 600 MB/s per shelf.
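To put those numbers side by side, here’s a back-of-the-envelope sketch using only the figures quoted above (all values come from the announcement, not from measurement):

```python
# Figures as quoted in the post; nothing here is measured.
as200_total_mb_s = 350       # AS200 aggregate bandwidth across all shelves
as200_shelves = 5            # AS200 shelf count at 104 TB
as6000_per_shelf_mb_s = 600  # AS6000 per-shelf bandwidth

# Per-shelf bandwidth for the AS200
per_shelf_as200 = as200_total_mb_s / as200_shelves

print(f"AS200:  {per_shelf_as200:.0f} MB/s per shelf")
print(f"AS6000: {as6000_per_shelf_mb_s} MB/s per shelf "
      f"({as6000_per_shelf_mb_s / per_shelf_as200:.1f}x the AS200)")
```

So the second-tier AS200 works out to roughly 70 MB/s per shelf, versus 600 MB/s per shelf at the high end, which is what the roughly 4x spread in $/GB is buying.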

HPCwire will be running a feature [UPDATE: LINK ADDED] on this later today/tomorrow (that I wrote), so you’ll be able to read more there.