Video: Data Centric Computing for File and Object Storage from IBM

Doug O'Flaherty, IBM

“We’ve had a great time here in Austin talking about data centric computing – the ability to use IBM Spectrum Scale and Platform LSF to do Cognitive Computing. Customers, partners, and the world have been talking about how we can really bring together file, object, and even business analytics workloads in amazing ways. It’s been fun.”

Video: SGI Looks to Zero Copy Architecture for HPC and Big Data

Dr. Eng Lim Goh, CTO, SGI

In this video from SC15, Dr. Eng Lim Goh from SGI describes how the company is embracing new HPC technology trends such as new memory hierarchies. With the convergence of HPC and Big Data as a growing trend, SGI envisions a “Zero Copy Architecture” that would bring together a traditional supercomputer and a Big Data analytics machine in a way that would not require users to move their data between systems.

Penguin Computing Looks to Cavium ThunderX for ARM HPC Servers


Today Penguin Computing announced first customer shipments of its Tundra Extreme Scale (ES) server based on Cavium’s 48-core, ARMv8-based ThunderX workload-optimized processors. Tundra ES Valkre servers are now available for public order, and a standard 19” rack-mount version will ship in early 2016.

Video: IBM Showcases Data-centric Computing at SC15

Matt Drahzal, IBM

In this video from SC15, Matt Drahzal from IBM describes the company’s comprehensive approach to data-centric computing. “IBM is speeding up innovation through Data Centric Design. Limitations in traditional computing are slowing progress in business and society. A new approach to computer design is needed that will accelerate the pace of innovation to benefit organizations and individuals alike.”

Deep Learning Systems Analyze Periscope Streams on a Supercomputer


Today Dextro, developer of advanced computer vision, machine learning, and data analytics technologies, announced that Orange Silicon Valley, an innovation subsidiary of global telecommunications operator Orange, will use Dextro’s Stream application in a prototype demonstrating the power and performance of Orange Silicon Valley’s Exascale supercomputing platform, built in collaboration with Echostreams and CocoLink. The one-unit Exascale system can do what previously took dozens of servers, making Dextro’s algorithms learn more than 5x faster.

Berkeley Lab to Optimize Spark for HPC


Today LBNL announced that a team of scientists from Berkeley Lab’s Computational Research Division has been awarded a grant by Intel to support their goal of enabling data analytics software stacks—notably Spark—to scale out on next-generation high performance computing systems.

Dell Equips PowerEdge Servers with Bright Cluster Manager for HPC Environments


Today Bright Computing announced that the latest version of its Bright Cluster Manager software is now integrated with Dell’s 13th generation PowerEdge server portfolio. The integration enables systems administrators to easily deploy and configure Dell infrastructure using Bright Cluster Manager.

Podcast: Dell Panels on NSCI and the Convergence of Big Data Coming to SC15


In this podcast, Stephen Sofhauser from Dell describes what’s coming up at the company’s exhibit at SC15 in Austin. With a 50×50 exhibit and two booth theaters, Dell will showcase how customers are using their technology to solve their toughest computational problems. “Our own Rich Brueckner from insideHPC will host a pair of panel discussions in the Dell booth #1009 on Wednesday, Nov. 18.”

Seagate SSDs Boost Analytics on Comet Supercomputer


The San Diego Supercomputer Center is adding 800GB Seagate SAS SSDs to significantly boost the data analytics capability of its Comet supercomputer. To expand its node-local storage capacity for data-intensive workloads, device pairs will be added to all 72 compute nodes in one rack of Comet, alongside the existing SSDs. This will bring the flash storage in a single node to almost 2TB, with total rack capacity at more than 138TB.
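The quoted capacities are easy to sanity-check with a quick back-of-envelope script. Note that the per-node existing-flash figure below is an assumption chosen to be consistent with the announced totals, not a number from the announcement itself:

```python
# Back-of-envelope check of the Comet flash-upgrade figures.
# Assumption (not stated in the announcement): each node already
# holds roughly 320GB of node-local SSD before the upgrade.
new_ssd_gb = 800          # each added Seagate SAS SSD
ssds_added_per_node = 2   # "device pairs will be added"
existing_flash_gb = 320   # assumed pre-upgrade node-local flash
nodes_in_rack = 72        # all compute nodes in one Comet rack

node_flash_gb = existing_flash_gb + ssds_added_per_node * new_ssd_gb
rack_flash_tb = node_flash_gb * nodes_in_rack / 1000

print(f"Per-node flash: {node_flash_gb / 1000:.2f} TB")  # ~1.92 TB, i.e. "almost 2TB"
print(f"Rack total:     {rack_flash_tb:.2f} TB")         # ~138 TB, i.e. "more than 138TB"
```

Under that assumption the numbers line up: two added 800GB drives plus the existing flash put each node just under 2TB, and 72 such nodes give a rack total slightly above 138TB.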

Slidecast: Micron Persistent Memory & NVDIMM


“Micron is delivering on the promise of persistent memory with a solution that gives system architects a new approach for designing systems with better performance, reduced energy usage and improved total cost of ownership,” said Tom Eby, vice president for Micron’s compute and networking business unit. “With NVDIMM, we have a powerful solution that is available today. We’re also leading the way on future persistent memory development by spearheading R&D efforts on promising new technologies such as 3D XPoint memory, which will be available in 2016 and beyond.”