Nimbix and Mangstor Accelerate High Performance PaaS

Today Nimbix and Mangstor announced a joint technology offering designed to accelerate high performance data analysis for seismic processing, bioinformatics, and other big data analytics use cases in the Nimbix Cloud.

Seagate Combines Cloud, HPC, and Electronic Solutions Groups

Today Seagate announced that it is combining its Cloud Storage, High Performance Computing, and Electronic Solutions groups to further align the company’s full breadth of enterprise storage hardware capabilities.

Petascale Comet Supercomputer Enters Early Operations

“Comet is really all about providing high-performance computing to a much larger research community – what we call ‘HPC for the 99 percent’ – and serving as a gateway to discovery,” said SDSC Director Michael Norman, the project’s principal investigator. “Comet has been specifically configured to meet the needs of researchers in domains that have not traditionally relied on supercomputers to solve their problems.”

SKA & AWS Seeking Proposals for AstroCompute in the Cloud

Today the Square Kilometre Array (SKA) Organization announced that it is teaming up with Amazon Web Services (AWS) to use cloud computing to explore ever-increasing amounts of astronomy data. To kick things off, they have just issued a Call for Proposals for AstroCompute in the Cloud, a grant program to accelerate the development of innovative tools and techniques for processing, storing, and analyzing the global astronomy community’s vast amounts of data.

Video: Understanding Hadoop Performance on Lustre

“In this talk, Seagate presents details on its efforts and achievements around improving Hadoop performance on Lustre, including: a summary of why and how HDFS and Lustre differ, and how those differences affect Hadoop performance on Lustre compared to HDFS; Hadoop ecosystem benchmarks and best practices on HDFS and Lustre; Seagate’s open-source efforts to enhance the performance of Lustre within ‘diskless’ compute nodes, involving core Hadoop source code modification (and the unexpected results); and general takeaways on running Hadoop on Lustre faster.”
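
The talk focuses on benchmarking rather than code, but the basic configuration difference is easy to illustrate: instead of HDFS, Hadoop can be pointed at a POSIX Lustre mount through the local filesystem scheme. The sketch below is a minimal, hypothetical example of that approach; the mount point, jar name, and paths are placeholders, not Seagate's actual setup.

```python
# Hypothetical sketch: submitting a Hadoop job against a Lustre mount instead of HDFS.
# The /mnt/lustre/hadoop paths and the example jar name are assumptions, not from the talk.
import subprocess

cmd = [
    "hadoop", "jar", "hadoop-mapreduce-examples.jar", "wordcount",
    # Generic options: use the local (POSIX) filesystem as the default FS so
    # map and reduce tasks read and write the shared Lustre mount directly.
    "-D", "fs.defaultFS=file:///",
    "-D", "mapreduce.framework.name=yarn",
    "/mnt/lustre/hadoop/input",
    "/mnt/lustre/hadoop/output",
]
subprocess.run(cmd, check=True)
```

Because every node sees the same shared namespace, HDFS-style replication and data-locality scheduling no longer apply, which is one of the differences the talk examines.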

Jeff Layton on How 3D Flash is Changing the Market

“For a period of time, it didn’t look like flash drives were going to decrease in price very much. Flash cell technology is limited to around 20nm because of cost and complexity considerations, but manufacturers have found ways around the limitation. Rather than decrease the feature size, they now store more bits per cell (TLC) and have started to create 3D flash chips. This combination, plus the growth in flash storage sales, has driven down the price per gigabyte.”
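
As a rough illustration of the arithmetic behind that claim, the sketch below compares relative cost per gigabyte under the simplifying assumption that die cost stays roughly fixed while bits per cell and 3D layer count multiply capacity. The numbers are hypothetical and ignore yield, endurance, and process-cost differences.

```python
# Back-of-the-envelope sketch with hypothetical numbers: more bits per cell
# and more stacked layers multiply the capacity of a die, so at a roughly
# fixed cost per die the effective price per gigabyte falls.
def relative_cost_per_gb(bits_per_cell, layers):
    """Cost per GB relative to a planar, single-bit-per-cell baseline."""
    capacity_multiplier = bits_per_cell * layers
    return 1.0 / capacity_multiplier

for label, bits, layers in [
    ("planar SLC (1 bit/cell)", 1, 1),
    ("planar MLC (2 bits/cell)", 2, 1),
    ("planar TLC (3 bits/cell)", 3, 1),
    ("32-layer 3D TLC", 3, 32),
]:
    print(f"{label:>26}: {relative_cost_per_gb(bits, layers):.4f}x baseline cost/GB")
```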

Video: Monitoring a Heterogeneous Lustre Environment with Splunk

“Monitoring a large Lustre site running multiple generations of Lustre filesystems can be a challenge. Some equipment offers vendor-specific monitoring interfaces, while other systems, built on open source Lustre, have minimal monitoring capabilities. This talk will report on our operational experience using a homegrown Python module to collect data from each filesystem. We will discuss in detail how the data is visualized centrally in Splunk and cross-referenced with user workloads to analyze and troubleshoot our environment.”
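
The homegrown module from the talk has not been published, so the following is only a minimal sketch of the general pattern it describes: poll each filesystem with standard Lustre client tools (here lfs df) and forward the parsed records to Splunk's HTTP Event Collector. The endpoint URL, token, and mount point are placeholders.

```python
# Hypothetical sketch of a Lustre-to-Splunk collector; the HEC URL, token,
# and mount point below are placeholders, not the site's actual module.
import json
import subprocess
import time
import urllib.request

SPLUNK_HEC_URL = "https://splunk.example.com:8088/services/collector/event"
SPLUNK_TOKEN = "00000000-0000-0000-0000-000000000000"  # placeholder HEC token

def collect_lfs_df(mount_point):
    """Parse 'lfs df' output into per-target (MDT/OST) usage records."""
    out = subprocess.run(["lfs", "df", mount_point],
                         capture_output=True, text=True, check=True).stdout
    records = []
    for line in out.splitlines():
        fields = line.split()
        # Data lines look like: <UUID> <1K-blocks> <Used> <Available> <Use%> <Mount[target]>
        if len(fields) >= 6 and fields[1].isdigit():
            records.append({
                "filesystem": mount_point,
                "target": fields[0],
                "kbytes_total": int(fields[1]),
                "kbytes_used": int(fields[2]),
                "kbytes_avail": int(fields[3]),
            })
    return records

def send_to_splunk(records, source="lustre_df"):
    """Forward one JSON event per target to Splunk's HTTP Event Collector."""
    for rec in records:
        payload = {"time": time.time(), "source": source, "event": rec}
        req = urllib.request.Request(
            SPLUNK_HEC_URL,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Authorization": f"Splunk {SPLUNK_TOKEN}"},
        )
        urllib.request.urlopen(req)

if __name__ == "__main__":
    send_to_splunk(collect_lfs_df("/mnt/lustre"))
```

In Splunk, the per-target events can then be charted and joined against job-scheduler data to cross-reference filesystem load with user workloads, as described in the talk.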

Deploying Hadoop on Lustre Storage: Lessons Learned and Best Practices

In this video from LUG 2015 in Denver, J. Mario Gallegos from Dell presents: Deploying Hadoop on Lustre Storage: Lessons Learned and Best Practices. “Merging the strengths of both technologies to solve big data problems permits harvesting the power of HPC clusters on very fast storage.”

SGI Powers Earthquake Research in Japan

Today SGI announced that the Earthquake and Volcano Information Center of the Earthquake Research Institute (ERI) at the University of Tokyo has deployed a large-scale parallel computing solution from SGI for leading-edge seismological and volcanological research.

Dell’s GDAP Delivers an Integrated Genomic Processing Infrastructure

Dell has teamed with Intel to create innovative solutions that can accelerate the research, diagnosis, and treatment of diseases through personalized medicine. The combination of leading-edge CPUs from Intel and the systems and storage expertise from Dell creates a state-of-the-art solution that is easy to install, manage, and expand as required.