Can IBM Succeed with Neurosynaptic Computing?

Robert Roe

In this Industry Perspective from Scientific Computing World, Robert Roe considers the human challenges that must be overcome if IBM’s new neurosynaptic computing paradigm is to be successful. “IBM’s move reflects a growing understanding by hardware manufacturers that software is critical to the uptake of new chips and new architectures. Creating a new architecture, drastically different from those that preceded it, inexorably creates challenges and increases the complexity of generating code that will work for the new system.”

ISC Big Data Conference Offers Early Bird Discount


ISC has announced an Early Bird discount for its second annual Big Data conference. The event will offer attendees insights into the latest developments in data-intensive computing from both industry players and researchers.

Video: For Big Data and Technical Computing, Infrastructure Matters

Addison Snell, Intersect360 Research

In this video, Intersect360 Research CEO Addison Snell describes the dynamics unfolding in the technical computing market and its convergence with big data analytics. In its customer surveys on big data, Intersect360 finds that end users need to scale their environments from desktops to servers, to clusters of servers, to large supercomputers. Mr. Snell notes that IBM is a leader in HPC and Big Data that is making the investments to extend that leadership.

TPC Council Launches First Vendor-Neutral Big Data Benchmark


Today the Transaction Processing Performance Council (TPC) announced the immediate availability of TPCx-HS, developed to provide verifiable performance, price/performance, availability, and optional energy consumption metrics for big data systems.

Slidecast: Introducing the TPCx-HS Benchmark for Big Data


In this slidecast, Raghu Nambiar from Cisco introduces the new TPCx-HS Benchmark for Big Data. “TPCx-HS is the first vendor-neutral benchmark focused on big data systems – which have become a critical part of the enterprise IT ecosystem.”

Python’s Role in Big Data Analytics: Past, Present, and Future

Travis Oliphant

In this video from EuroPython 2014, Travis Oliphant from Continuum Analytics presents: Python’s Role in Big Data Analytics: Past, Present, and Future.

Supercomputing Technologies for Big Data Challenges

Ferhat Hatay

In this special guest feature, Ferhat Hatay from Fujitsu writes that supercomputing technologies developed for data-intensive scientific computing can be a powerful tool for taking on the challenges of Big Data. We all feel it: data use and growth are explosive. Individuals and businesses are consuming — and generating — more data every day. The […]

TACC Workshop: Leveraging HPC Resources for Managing Large Datasets


This October TACC will host a hands-on workshop on Leveraging High Performance Computing Resources for Managing Large Datasets.

Why Big Data is Really Big Parallelism

Robin Bloor

“Moore’s Law got deflected in 2004, when it became no longer practical to ramp up the clock speed of CPUs to improve performance. So the chip industry improved CPU performance by adding more processors to a chip in concert with miniaturization. This was extra power, but you could not leverage it easily without building parallel software. Virtual machines could use multicore chips for server consolidation of light workloads, but large workloads needed parallel architectures to exploit the power. So, the software industry and the hardware industry moved towards exploiting parallelism in ways they had not previously done. This is the motive force behind Big Data.”
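Bloor’s point — that multicore power is only usable if the software is written to run in parallel — can be illustrated with a minimal sketch. The example below (names like `parallel_sum_of_squares` are illustrative, not from the article) uses Python’s standard-library `multiprocessing.Pool` to split a data-parallel reduction across worker processes, the same divide-map-combine pattern that underlies MapReduce-style Big Data systems:

```python
from multiprocessing import Pool

def partial_sum(chunk):
    # Each worker process reduces its own slice of the data independently.
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, workers=4):
    # Split the data into roughly one chunk per worker, map the reduction
    # over the chunks in parallel, then combine the partial results.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with Pool(processes=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    data = list(range(1_000_000))
    print(parallel_sum_of_squares(data))
```

The serial version (`sum(x * x for x in data)`) cannot use more than one core no matter how many the chip provides; the parallel version scales with `workers`, which is exactly the restructuring of software that Bloor describes.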

Google Mesa: Geo-Replicated, Near Real-Time, Scalable Data Warehousing


Over at GigaOM, Derrick Harris writes that Google has developed a data warehousing system called Mesa that is designed to handle real-time data while maintaining performance even if an entire data center goes offline.