University of Houston Installs Opuntia Cluster

The University of Houston (UH) is adding a new, state-of-the-art supercomputer to its arsenal of research tools. With 1860 compute cores, the new Opuntia cluster will be used primarily for scientific and engineering work. The acquisition of this new system marks the start of a new era of supercomputing not only for the University of […]

Podcast: Using Allinea Programming Tools to Speed XSEDE Supercomputing Research

“In my humble opinion, I think that debuggers and profiling tools are far too infrequently used. And it’s not because they’re not there. It’s because people just either don’t know about them, don’t do training on them, or don’t know how to use them. We’re in a state where we have less cycles than we’ve ever had per request, right? So being able to take full advantage of those cycles by having optimized code and optimized run patterns is crucial. Otherwise, you’re just not going to be able to get your work done and the science won’t get done.”

Adaptive Computing Rolls Out Moab 8.1

Moab 8.1 systems management software includes a revamped Web-based user interface with bolstered reporting and tracking capabilities that give greater insight into the job states, workloads, and nodes of an HPC system; major performance and scalability gains; and system improvements that enable elastic computing, expanding to additional resources as workloads demand.

Registration Opens for OFS Developer Workshop and User Group

Registration is now open for the 11th annual International OFS Developers’ Workshop and the 2015 OFS User Group (formerly known as IBUG) Workshop. Both events are being held the week of March 15, 2015, at the Marriott Hotel in Monterey, California.

Star Tribune Reflects on the Life of Seymour Cray

Seymour Cray showed off the Cray-1 supercomputer built in 1976 by Cray Research in Chippewa Falls, Wis. Cray revolutionized computing, ramping up productivity and speed with each new design.

Over at the Star Tribune, Curt Brown has posted a brief history on the life of Seymour Cray, the Father of the Supercomputing Industry. “It seems impossible to exaggerate the effect he had on the industry,” said Joel Birnbaum, a Hewlett-Packard executive. “Many of the things that high performance computers now do routinely were at the furthest edge of credibility when Seymour envisioned them. He ranks up there with Edison and Bell.”

Thought Leaders Gather in Barcelona for Exascale Workshop

More than 100 exascale experts will gather in Barcelona this week for the Big Data and Extreme-scale Computing (BDEC) workshop. Along with application leaders confronting diverse big data challenges, attendees will include members from industry, academia, and government with expertise in algorithms, computer system architecture, operating systems, workflow middleware, compilers, libraries, languages, and applications.

HPC People on the Move: Super Bowl Edition

It’s me again, Dr. Lewey Anton. I’ve been commissioned by insideHPC to get the scoop on who’s jumping ship and moving on up in high performance computing. This week, familiar faces like Dr. Richard Juday and Anton Korzh have new gigs.

Radio Free HPC Compares Providers at CloudHarmony

In this podcast, the Radio Free HPC team looks at CloudHarmony, an online service that measures and compares cloud provider uptime. “At CloudHarmony, we simplify the comparison of cloud services by providing reliable and objective performance analysis, reports, commentary, metrics and tools.”

Job of the Week: HPC Cluster Systems Engineer at Intel

Intel in Champaign, Illinois is seeking an HPC Cluster Systems Engineer in our Job of the Week.

University of New Mexico to Repurpose 3000 LANL Cores

The University of New Mexico’s Center for Advanced Research Computing is renovating its principal machine room in order to install a new supercomputer. The new 3000-core system will be used for bioinformatics and bionomic research, combining computer science, statistics, mathematics, and engineering to study and process biological data.