
Interview: Addison Snell of Intersect360 Talks HPC, Cloud and Big Data


“Our annual ‘Budget Map’ report series looks at the relative spending between all of the products, components, and services that make up the HPC market. With six years of end user data, we get a strong grip on where the money is flowing, whether it’s on big items like clusters and storage, or on topical things like power consumption, programming, or compute cycles in public cloud. We also get a sense of future budget outlook and how the market is likely to evolve.”

Marc Hamilton Looks at China HPC

“Like the US, Japan, and Europe, China still has plans to build giant HPC systems like Tianhe. Increasingly, however, these systems are being asked to support commercial HPC workloads, such as machine vision in a cloud environment, in addition to scientific data processing.”

Why Big Data Translates into Big Bucks


“UK-based Maxeler Technologies took the opportunity to press the importance of big data at the highest political level when British Prime Minister David Cameron and German Chancellor Angela Merkel visited its stand at CeBIT this month. Oskar Mencer, Maxeler CEO, spoke to the two world leaders about the design of the company’s dataflow engines and their use in the finance industry, where stock exchanges and banks employ the engines to accelerate risk analytics in real time.”
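To give a flavor of the workload in question, here is a toy Monte Carlo value-at-risk calculation in Python. It is a minimal sketch of what “risk analytics” computes, not Maxeler code; dataflow engines pipeline exactly this kind of arithmetic in hardware, and all of the position figures below are invented for illustration.

import math
import random

def monte_carlo_var(spot, mu, sigma, horizon_days, paths=100_000, confidence=0.99):
    """Estimate value-at-risk by simulating log-normal price moves."""
    dt = horizon_days / 252.0
    losses = []
    for _ in range(paths):
        z = random.gauss(0.0, 1.0)
        terminal = spot * math.exp((mu - 0.5 * sigma ** 2) * dt
                                   + sigma * math.sqrt(dt) * z)
        losses.append(spot - terminal)  # positive values are losses
    losses.sort()
    return losses[int(confidence * paths)]  # loss not exceeded at the confidence level

# Invented example position: spot 100, 5% drift, 20% annual volatility
print(f"99% one-day VaR: {monte_carlo_var(100.0, 0.05, 0.20, horizon_days=1):.2f}")

A production risk engine runs millions of such paths across a whole portfolio, which is why banks push the inner loop onto accelerators.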

Interview: Transtec Solutions for HPC and Big Data Workflows


“One of the hottest topics we see is remote visualization for post-processing simulation results. A big issue in traditional workflows in technical and scientific computing is the transfer of large amounts of data from where they were created to where they are analyzed. Streamlining this workflow by processing the data where they were created in the first place is key to shortening the wall-clock time it takes end users to get final results. At the same time, hardware utilization is greatly improved by using innovative technology for remote 3D visualization. For this, we have a long-standing strategic partnership with NICE.”
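The wall-clock argument is easy to make concrete with a back-of-envelope calculation. The Python sketch below compares shipping one result set across a WAN against streaming rendered pixels from where the data sit; the dataset size and link speeds are illustrative assumptions, not Transtec or NICE figures.

# Back-of-envelope: moving simulation output over a WAN versus
# rendering it remotely where it was produced. All figures here
# are illustrative assumptions.

dataset_tb = 5.0    # assumed size of one simulation result set
wan_gbps = 1.0      # assumed effective WAN throughput (Gbit/s)
pixel_mbps = 50.0   # assumed bandwidth of a compressed pixel stream

dataset_bits = dataset_tb * 1e12 * 8
transfer_hours = dataset_bits / (wan_gbps * 1e9) / 3600

print(f"Moving {dataset_tb} TB at {wan_gbps} Gbit/s takes {transfer_hours:.1f} hours")
print(f"A remote 3D session streams only ~{pixel_mbps:.0f} Mbit/s of pixels, "
      "with no bulk transfer at all.")

With these assumptions the bulk transfer alone costs over eleven hours before analysis can even begin, which is the gap remote visualization closes.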

Interview: GPU Technology Conference Enters 5th Year with Over 100 HPC Sessions


The upcoming GPU Technology Conference is entering its fifth year with developer talks on everything from Numerical Algorithms to Big Data Analytics. “In short, we’ll have a ton of HPC content. There are nearly 100 sessions dedicated to supercomputing and HPC topics. This includes major scientific research enabled by these GPU-accelerated systems – everything from breakthroughs in cancer research and astronomy to HIV research and new big data analytics innovations.”

The Scary Side of AI and Big Data


Douglas Eadline writes that recent big investments in AI technology by IBM and Google show that intelligent systems are the future of big business. The problem is, these advancements could come at the expense of our privacy.

For Financial Services, it’s HPC to the Rescue


“Today, most financial services organizations try to solve all of their big data challenges using either grid or cluster technologies. One popular approach involves the use of Hadoop running on commodity x86-based clusters. At Cray we’re taking a different approach, leveraging our supercomputing technologies to improve I/O performance, disk utilization and efficiency.”
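For context, many of those commodity-cluster deployments use Hadoop Streaming, which lets the mapper and reducer be ordinary scripts that read stdin and write stdout. The sketch below is a minimal, hypothetical example of that model, summing trade values per symbol; the one-record-per-line "symbol,value" input format is invented for illustration and says nothing about Cray's alternative approach.

#!/usr/bin/env python3
# Minimal Hadoop Streaming job: sum trade values per symbol.
# Mapper and reducer read stdin and write tab-separated stdout,
# which is all the streaming interface requires of them.
import sys

def mapper():
    # Turn each "symbol,value" record into a "symbol<TAB>value" pair.
    for line in sys.stdin:
        symbol, value = line.strip().split(",")
        print(f"{symbol}\t{value}")

def reducer():
    # Hadoop sorts mapper output by key, so equal symbols arrive together
    # and can be summed in a single streaming pass.
    current, total = None, 0.0
    for line in sys.stdin:
        symbol, value = line.strip().split("\t")
        if symbol != current:
            if current is not None:
                print(f"{current}\t{total}")
            current, total = symbol, 0.0
        total += float(value)
    if current is not None:
        print(f"{current}\t{total}")

if __name__ == "__main__":
    mapper() if sys.argv[1] == "map" else reducer()

On a cluster this script would be launched via the standard hadoop-streaming jar, passed as both the mapper and the reducer command.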

Slidecast: How Big Workflow Delivers Business Intelligence


In this slidecast, Rob Clyde from Adaptive Computing describes Big Workflow — the convergence of Cloud, Big Data, and HPC in enterprise computing. “The explosion of big data, coupled with the collisions of HPC and cloud, is driving the evolution of big data analytics,” said Rob Clyde, CEO of Adaptive Computing. “A Big Workflow approach to big data not only delivers business intelligence more rapidly, accurately and cost effectively, but also provides a distinct competitive advantage.”

Porting Hadoop to HPC


Ralph H. Castain from Intel presented this talk at the Adaptive Computing booth at SC13. “The solution allows customers to leverage both their HPC and big data investments in a single platform, as opposed to operating them in siloed environments. The convergence between big data and HPC environments will only grow stronger as organizations demand data processing models capable of extracting the results required to make data-driven decisions.”
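One way to picture the single-platform idea is to submit the analytics step through the same batch scheduler that runs simulation jobs, rather than maintaining a standing, siloed Hadoop cluster. The Python sketch below assumes a Slurm site with a helper that can bootstrap Hadoop daemons inside an allocation; the helper scripts and paths are hypothetical, and this illustrates the concept rather than Adaptive Computing's or Intel's actual implementation.

# Submit a Hadoop analytics step through Slurm, the same scheduler
# that runs the site's HPC jobs, so both workloads share one platform.
import subprocess

job_script = """#!/bin/bash
#SBATCH --job-name=hadoop-analytics
#SBATCH --nodes=8
#SBATCH --time=02:00:00

setup-hadoop-on-allocation.sh     # hypothetical: start HDFS/YARN daemons on the allocated nodes
hadoop jar my-analytics.jar /data/input /data/output
teardown-hadoop-on-allocation.sh  # hypothetical: stop the daemons and release the nodes cleanly
"""

with open("hadoop_job.sbatch", "w") as f:
    f.write(job_script)

# Hand the job to Slurm; it queues alongside ordinary HPC workloads.
subprocess.run(["sbatch", "hadoop_job.sbatch"], check=True)

Because the scheduler sees both job types, the same nodes can run a simulation in the morning and an analytics pass in the afternoon instead of sitting idle in separate silos.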

Rich Brueckner Presents: Big Data – What’s It Really About?


In this video from the TCC Conference, Rich Brueckner from insideHPC describes the convergence of Big Data and HPC. “While the term Big Data has become pervasive in Information Technology, many in the industry are still puzzled by how to make money from this phenomenon. In this talk, Brueckner looks at what’s really behind Big Data as an engine for change, with case studies that bring its full potential home.”