The Intelligent Industrial Revolution

“Over the past six weeks, we took NVIDIA’s developer conference on a world tour. The GPU Technology Conference (GTC) was started in 2009 to foster a new approach to high performance computing using massively parallel processing GPUs. GTC has become the epicenter of GPU deep learning — the new computing model that sparked the big bang of modern AI. It’s no secret that AI is spreading like wildfire. The number of GPU deep learning developers has leapt 25 times in just two years.”

Radio Free HPC Previews the SC16 Student Cluster Competition

In this podcast, the Radio Free HPC team previews the SC16 Student Cluster Competition. To get us primed up, Dan gives us his impressions of the 14 teams competing this year. That’s a record number! “The Student Cluster Competition was developed in 2007 to immerse undergraduate and high school students in HPC. Student teams design and build small clusters, with hardware and software vendor partners, learn designated scientific applications, apply optimization techniques for their chosen architectures, and compete in a non-stop, 48-hour challenge.”

HPC: Retrospect & Looking Towards the Next 10 Years

In this video from the HPC Advisory Council Spain Conference, Addison Snell from Intersect360 Research looks back over the past 10 years of HPC and provides predictions for the next 10 years. Intersect360 Research just released their Worldwide HPC 2015 Total Market Model and 2016–2020 Forecast.

Radio Free HPC Looks into the New OpenCAPI Consortium

In this podcast, the Radio Free HPC team looks at the new OpenCAPI interconnect standard. “Released this week by the newly formed OpenCAPI Consortium, OpenCAPI provides an open, high-speed pathway for different types of technology – advanced memory, accelerators, networking and storage – to more tightly integrate their functions within servers. This data-centric approach to server design, which puts the compute power closer to the data, removes inefficiencies in traditional system architectures to help eliminate system bottlenecks and can significantly improve server performance.”

Reader Survey: Is Machine Learning in Your Future?

Will this be the year of artificial intelligence, when the technology comes into its own for mainstream business? There are big pushes for AI in manufacturing, agriculture, healthcare and many other industry sectors. But why now? Please share your insights in our Reader Survey.

Radio Free HPC Looks at Security Concerns for Augmented Reality

In this podcast, the Radio Free HPC team looks at the issue of security for Augmented Reality and IoT. Now that every device in our lives is getting connected to the Internet, how will we be protected from attackers? Henry points out that even our medical devices are not safe anymore.

Is Free Lunch Back? Douglas Eadline Looks at the Epiphany-V Processor

Over at Cluster Monkey, Douglas Eadline writes that the “free lunch” performance boost of Moore’s Law may indeed be back with the 1024-core Epiphany-V chip that will hit the market in the next few months.

insideHPC Readers: Weigh in on Why AI is Taking Off Now

This may indeed be the year of artificial intelligence, when the technology comes into its own for mainstream businesses. “But will other companies understand if AI has value for them? Perhaps a better question is ‘Why now?’ This question centers on both the opportunity and why many companies are scared about missing out.”

Radio Free HPC Looks for the Forever Data Format

In this podcast, the Radio Free HPC team discusses Henry Newman’s recent editorial calling for a self-descriptive data format that will stand the test of time. Henry contends that we are headed for massive data loss unless we act.

Larry Smarr Presents: 50 Years of Supercomputing

Larry Smarr presented this talk as part of NCSA’s 30th Anniversary Celebration. “For the last thirty years, NCSA has played a critical role in bringing computational science and scientific visualization to the national user community. I will embed those three decades in the 50-year period 1975 to 2025, beginning with my solving Einstein’s equations for colliding black holes on the megaflops CDC 6600 and ending with the exascale supercomputer. These 50 years span a period in which we will have seen a one trillion-fold increase in supercomputer speed.”