InsideHPC Guide to Technical Computing

Today’s High Performance Computing (HPC) systems offer the ability to model everything from proteins to galaxies. The insights and discoveries offered by these systems are nothing short of astounding. Indeed, the ability to process, move, and store data at unprecedented levels, often reducing job times from weeks to hours, continues to move science and technology forward at an accelerating pace. This article series offers guidance to prospective HPC users and managers on the best way to deploy an HPC solution.

Three Questions to Ensure Your HPC Success

Successful HPC deployment depends on choosing an architecture that addresses both application and institutional needs. In particular, finding a simple path to leading edge HPC and Data Analytics is not difficult if you consider the capabilities and limitations of the various approaches in terms of performance, scaling, ease of use, and time to solution. Careful analysis of the following questions will help lead to a successful and cost-effective HPC solution. Here are three questions to ask to ensure HPC success.

Local or Cloud HPC?

Cloud computing has become another tool for the HPC practitioner. For some organizations, the ability of cloud computing to shift costs from capital to operating expenses is very attractive. Because all cloud solutions require moving data over the Internet, a basic analysis of data origins and destinations is needed. Here’s an overview of when local or cloud HPC makes the most sense.
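
As a rough illustration of that analysis, the minimal Python sketch below (not from the article) compares the time to move a job’s data over an Internet link against the job’s expected runtime; all data sizes, link speeds, and runtimes are hypothetical.

    # A minimal sketch of the data-movement analysis mentioned above:
    # estimate how long it takes to move a job's input and output data
    # over an Internet link, then compare with the expected runtime.
    # All numbers below are hypothetical.

    def transfer_hours(data_gb: float, link_mbps: float) -> float:
        """Hours needed to move data_gb gigabytes over a link_mbps link."""
        gigabits = data_gb * 8
        seconds = gigabits * 1000 / link_mbps   # 1 GB = 8,000 megabits
        return seconds / 3600

    input_gb, output_gb = 500.0, 50.0           # hypothetical job data sizes
    link_mbps = 1000.0                          # hypothetical 1 Gb/s link
    compute_hours = 12.0                        # hypothetical cloud runtime

    move = transfer_hours(input_gb + output_gb, link_mbps)
    print(f"Data movement: {move:.1f} h vs. compute: {compute_hours:.1f} h")
    # If transfer time rivals compute time, local HPC may be the better fit.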

Faster SAS Analytics Using DDN Storage Solutions

Parallel file systems have become the norm for HPC environments. While typically used in high-end simulations, parallel file systems can also greatly affect performance, and thus the customer experience, when running analytics from leading organizations such as SAS. This whitepaper is an excellent summary of how parallel file systems can enhance the workflow and insight that SAS Analytics delivers.
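
To see why a parallel file system helps, here is a minimal back-of-the-envelope sketch in Python (not from the whitepaper); the per-target and client link bandwidths are hypothetical.

    # A minimal sketch of why parallel file systems help analytics
    # workloads: aggregate read bandwidth grows with the number of
    # storage targets a file is striped across. Figures are hypothetical.

    per_target_bw_gbs = 1.5            # hypothetical GB/s per storage target
    client_link_gbs = 12.5             # hypothetical client network limit

    for stripes in (1, 4, 8, 16):
        aggregate = min(stripes * per_target_bw_gbs, client_link_gbs)
        print(f"{stripes:2d} stripes -> ~{aggregate:.1f} GB/s to one client")
    # A single target caps reads at ~1.5 GB/s; striping across many targets
    # lets throughput scale until the client's network link becomes the limit.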

Understanding Your HPC Application Needs

Many HPC applications began as single processor (single core) programs. If these applications take too long on a single core or need more memory than is available, they must be modified so they can run on scalable systems. Fortunately, many of the most important and widely used HPC applications are already available for scalable systems. Some applications achieve effective performance with a modest number of cores, while others are highly scalable. Here is how to better understand your HPC application needs.
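
One way to see why applications differ in scalability is Amdahl’s law, which caps speedup by the fraction of work that stays serial. The Python sketch below is a minimal illustration (not from the article); the serial fractions and core counts are hypothetical.

    # A minimal sketch using Amdahl's law to show why applications differ
    # in how many cores they can use effectively. Serial fractions below
    # are hypothetical.

    def amdahl_speedup(serial_fraction: float, cores: int) -> float:
        """Ideal speedup when serial_fraction of the work cannot be parallelized."""
        return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

    for serial in (0.10, 0.01):                 # 10% vs. 1% serial work
        for cores in (8, 64, 512):
            s = amdahl_speedup(serial, cores)
            print(f"serial={serial:.0%} cores={cores:4d} speedup={s:6.1f}x")
    # With 10% serial work, speedup is capped near 10x no matter how many
    # cores are added; with 1%, hundreds of cores can still pay off.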

Who Is Using HPC (and Why)?

In today’s highly competitive world, High Performance Computing (HPC) is a game changer. Though not as splashy as many other computing trends, the HPC market has continued to show steady growth and success over the last several decades. Market forecaster IDC expects the overall HPC market to hit $31 billion by 2019, riding an 8.3% compound annual growth rate (CAGR). The HPC market cuts across many sectors, including academia, government, and industry. Learn which industries are using HPC and why.

The Lustre Parallel File System—A Landscape of Topics and Insight from the Community

Since its beginnings in 1999 as a project at Carnegie Mellon University, Lustre, the high performance parallel file system, has come a long, long way. Designed from the start with a focus on performance and scalability, it is now part of nearly every High Performance Computing (HPC) cluster on the Top500.org list of the fastest computers in the world, present in 70 percent of the top 100 and nine out of the top ten. That is an achievement for any developer, or community of developers in the case of Lustre, to be proud of. Learn what the HPC community is saying about Lustre.

Empowering Cloud Utilization with Cloud Bursting

Cloud computing has become a strong alternative to in-house data centers for a large percentage of all enterprise needs. Most enterprises are adopting some form of cloud computing, with some estimates suggesting that as many as 90% are putting workloads into a public cloud infrastructure. The whitepaper, Empowering Cloud Utilization with Cloud Bursting, is an excellent summary of the options available to enterprises that are planning to use a public cloud infrastructure.
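
Cloud bursting generally means running work on local resources until they are saturated, then spilling overflow jobs to a public cloud. The Python sketch below is a minimal illustration of such a policy (not from the whitepaper); the capacity threshold and job model are hypothetical.

    # A minimal sketch of a cloud-bursting policy: run jobs locally until
    # the cluster is saturated, then send overflow work to a public cloud.
    # The threshold and job model are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class Job:
        name: str
        cores: int

    LOCAL_CORES = 256          # hypothetical on-premises capacity
    used = 0

    def schedule(job: Job) -> str:
        """Place a job locally if it fits; otherwise burst to the cloud."""
        global used
        if used + job.cores <= LOCAL_CORES:
            used += job.cores
            return "local"
        return "cloud"          # overflow work bursts to the public cloud

    for j in [Job("sim-a", 128), Job("sim-b", 96), Job("sim-c", 64)]:
        print(f"{j.name}: scheduled on {schedule(j)}")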

Seismic Processing Places High Demand on Storage

Oil and gas exploration is always a challenging endeavor, and with today’s large risks and rewards, optimizing the process is of critical importance. A whole range of High Performance Computing (HPC) technologies must be employed for fast and accurate decision making. This Intersect360 Research whitepaper, Seismic Processing Places High Demand on Storage, is an excellent summary of the challenges involved and the storage solutions from Seagate that address them.

How HPC is Helping Solve Climate and Weather Forecasting Challenges

Data accumulation is just one of the challenges facing today’s weather and climate researchers and scientists. To understand and predict Earth’s weather and climate, they rely on increasingly complex computer models and simulations based on a constantly growing body of data from around the globe. “It turns out that in today’s HPC technology, the moving of data in and out of the processing units is more demanding in time than the computations performed. To be effective, systems working with weather forecasting and climate modeling require high memory bandwidth and fast interconnect across the system, as well as a robust parallel file system.”
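
A simple roofline-style estimate makes the quoted point concrete: when a kernel performs few operations per byte moved, memory bandwidth, not peak compute, sets the speed limit. The Python sketch below is a minimal illustration (not from the article), using hypothetical hardware numbers and a hypothetical 7-point stencil kernel.

    # A minimal roofline-style sketch of why such codes are often limited
    # by memory bandwidth rather than compute. All numbers are hypothetical.

    peak_flops = 2.0e12        # hypothetical peak: 2 TFLOP/s per node
    mem_bw = 200.0e9           # hypothetical memory bandwidth: 200 GB/s

    # Assume a 7-point stencil does ~8 flops per grid point while moving
    # roughly 16 bytes per point (one 8-byte load plus one 8-byte store,
    # with neighbor values served from cache).
    flops_per_point = 8.0
    bytes_per_point = 16.0
    intensity = flops_per_point / bytes_per_point   # flops per byte

    attainable = min(peak_flops, intensity * mem_bw)
    print(f"Arithmetic intensity: {intensity:.2f} flop/byte")
    print(f"Attainable: {attainable/1e9:.0f} GFLOP/s "
          f"of {peak_flops/1e9:.0f} GFLOP/s peak")
    # 0.5 flop/byte * 200 GB/s = 100 GFLOP/s, far below the 2 TFLOP/s peak,
    # so the kernel is bandwidth-bound, matching the observation above.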