Panel Discussion: Solving the HPC Data Deluge

As science drives a rapidly growing need for storage, existing environments face increasing pressure to expand capabilities while controlling costs. Many researchers, scientists and engineers find that they are outgrowing their current system, but fear their organizations may be too small to cover the cost and support needed for more storage. Join these experts for a lively discussion on how you can take control and solve the HPC data deluge.

Creating a Better Infrastructure to Manage Big Data

Trev Harmon

Over the course of this talk, Trev Harmon looks back to the utility computing vision of Douglas Parkhill and proposes an application-centric workflow for the future that fulfills that vision across many disciplines of computing.

Using PBS to Schedule MapReduce Jobs Accessing OrangeFS

Using PBS Professional and a customized version of myHadoop, researchers at Clemson University can now submit their own Hadoop MapReduce jobs on the “Palmetto Cluster”, running dedicated Hadoop daemons in a PBS-scheduled environment as needed.
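
As a rough illustration of that workflow, here is a minimal Python sketch that generates a PBS job script and hands it to qsub. The resource requests, paths, and myHadoop-style commands are illustrative assumptions, not the actual Palmetto configuration.

    import subprocess
    import tempfile

    # Illustrative PBS job script: spin up per-job Hadoop daemons on the
    # nodes PBS allocated, run one MapReduce job, then tear them down.
    PBS_SCRIPT = """#!/bin/bash
    #PBS -N hadoop-wordcount
    #PBS -l select=4:ncpus=8:mem=16gb
    #PBS -l walltime=02:00:00

    myhadoop-configure.sh -n 4 -c $TMPDIR/hadoop-conf
    start-all.sh
    hadoop jar hadoop-examples.jar wordcount input/ output/
    stop-all.sh
    myhadoop-cleanup.sh
    """

    def submit_job():
        # Write the script to a temporary file and submit it with qsub,
        # which prints the new job ID on success.
        with tempfile.NamedTemporaryFile("w", suffix=".pbs", delete=False) as f:
            f.write(PBS_SCRIPT)
            script_path = f.name
        job_id = subprocess.check_output(["qsub", script_path], text=True).strip()
        print("Submitted PBS job:", job_id)

    if __name__ == "__main__":
        submit_job()

Because the daemons live and die with the PBS job, Hadoop runs fit into the cluster's normal scheduling policies instead of requiring a permanently dedicated Hadoop partition.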

Sponsored Post: Balancing High Performance with Cost when Analyzing Big Data

HPC and big data professionals need high performance to ingest and analyze huge amounts of data, while still managing power and cost efficiently.

Slidecast: SSRLabs Develops Energy- and Instruction-Efficient HPC

Scalable Systems Research Labs is developing energy- and instruction-efficient coprocessors to tackle the “Big Data” problem by accelerating application execution.

IDC's Perspective on Big Data Outside of HPC

In this video from the IDC Breakfast Briefing at ISC’13, Steve Conway presents: IDC’s Perspective on Big Data Outside of HPC. View the slides or check out more talks from the show at our ISC’13 Video Gallery.

Video: High Performance Computing Trends for 2013

In this video from the HPC Advisory Council European Conference 2013, Addison Snell from Intersect360 Research presents: High Performance Computing Trends for 2013. This presentation is an overview of the current important trends in HPC, based on the latest end-user research studies and market forecasts. Topics include accelerator adoption, the role of HPC in Big […]

Cray Rolls Out Hadoop Cluster Solution

Today Cray announced a new Hadoop solution that combines supercomputing technologies with an “enterprise-strength” approach to Big Data analytics. Available later this month, Cray cluster supercomputers for Hadoop will pair Cray CS300 systems with the Intel Distribution for Apache Hadoop. More and more organizations are expanding their usage of Hadoop software beyond just basic storage […]

Hadoop Meets Lustre – Intel Rolls out Big Data Distribution for the Enterprise

Today Intel announced the “first converged HPC and Big Data platform” with the new Intel Enterprise Edition for Lustre. Paired with Chroma storage management tools from Whamcloud as well as a new adapter for the Intel Distribution for Apache Hadoop, the new offering provides enterprise-class reliability combined with HPC performance for Big Data applications. Enterprise […]
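
Intel's adapter lets Hadoop talk to Lustre directly. Absent that adapter, a common stand-in is to run MapReduce over Hadoop's generic file:// LocalFileSystem on a POSIX Lustre mount, which gives a feel for the converged setup. A minimal sketch of that simpler pattern, with an assumed mount point and the stock wordcount example (this is not Intel's Lustre adapter itself):

    import subprocess

    # Hypothetical Lustre mount point; adjust for the actual cluster.
    LUSTRE_MOUNT = "/mnt/lustre"

    # Point Hadoop's default filesystem at file:// so absolute paths
    # resolve against the POSIX mount instead of HDFS.
    subprocess.run([
        "hadoop", "jar", "hadoop-examples.jar", "wordcount",
        "-D", "fs.defaultFS=file:///",
        f"{LUSTRE_MOUNT}/user/demo/input",
        f"{LUSTRE_MOUNT}/user/demo/output",
    ], check=True)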

Slidecast: Rogue Wave Software for Developing Parallel, Data-intensive Applications

In this slidecast, Scott Lasica from Rogue Wave Software describes how the company helps its customers meet the challenges of programming at extreme scale. “Developing parallel, data-intensive applications is hard. We make it easier.” Rogue Wave works with many scientists performing cutting-edge research and solving Grand Challenge class problems at labs and supercomputer facilities around […]