Exascale Computing: A Race to the Future of HPC

In this week’s Sponsored Post, Nicolas Dube of Hewlett Packard Enterprise outlines the future of HPC and the role and challenges of exascale computing in this evolution. The HPE approach to exascale is geared to breaking the dependencies that come with outdated protocols. Exascale computing will allow users to process data, run systems, and solve problems at a totally new scale, which will become increasingly important as the world’s problems grow ever larger and more complex.

Video: Livermore HPC Takes Aim at Cancer

In this video, Jonathan Allen from LLNL describes how Lawrence Livermore’s supercomputers are playing a crucial role in advancing cancer research and treatment. “A historic partnership between the Department of Energy (DOE) and the National Cancer Institute (NCI) is applying the formidable computing resources at Livermore and other DOE national laboratories to advance cancer research and treatment. Announced in late 2015, the effort will help researchers and physicians better understand the complexity of cancer, choose the best treatment options for every patient, and reveal possible patterns hidden in vast patient and experimental data sets.”

Understanding Cities through Computation, Data Analytics, and Measurement

“For many urban questions, however, new data sources will be required with greater spatial and/or temporal resolution, driving innovation in the use of sensors in mobile devices as well as embedding intelligent sensing infrastructure in the built environment. Collectively, these data sources also hold promise to begin to integrate computational models associated with individual urban sectors such as transportation, building energy use, or climate. Catlett will discuss the work that Argonne National Laboratory and the University of Chicago are doing in partnership with the City of Chicago and other cities through the Urban Center for Computation and Data, focusing in particular on new opportunities related to embedded systems and computational modeling.”

Bull Atos Powers New Genomics Supercomputer at Pirbright Institute

“Atos is determined to solve the technical challenges that arise in life sciences projects, to help scientists focus on making breakthroughs and forget about technicalities. We know that one size doesn’t fit all, and that is why we carefully studied The Pirbright Institute’s challenges to design a customized and unique architecture. It is a pleasure for us to work with Pirbright and to contribute in some way to reducing the impact of viral diseases,” says Natalia Jiménez, WW Life Sciences lead at Atos.

Experts Weigh in on 2017 Artificial Intelligence Predictions

In this presentation from Nvidia, top AI experts from the world’s most influential companies weigh in on predicted advances for AI in 2017. “In 2017, intelligence will trump speed. Over the last several decades, nations have competed on speed, intent on building the world’s fastest supercomputer,” said Ian Buck, VP of Accelerated Computing at Nvidia. “In 2017, the race will shift. Nations of the world will compete on who has the smartest supercomputer, not solely the fastest.”

Call for Papers: International Workshop on High-Performance Big Data Computing (HPBDC)

The 3rd annual International Workshop on High-Performance Big Data Computing (HPBDC) has issued its Call for Papers. Featuring a keynote by Prof. Satoshi Matsuoka from Tokyo Institute of Technology, the event takes place May 29, 2017, in Orlando, FL.

insideBigData Industry Predictions for 2017

In this special guest feature, Daniel Gutierrez from insideBIGDATA offers his roundup of 2017 industry predictions from Big Data thought leaders. “AI, ML, and NLP innovations have really exploded this past year, but despite a lot of hype, most of the tangible applications are still based on specialized AI and not general AI. We will continue to see new use cases of such specialized AI across verticals and key business processes. These use cases would primarily be focused on the evolutionary process improvement side of the digital transformation.”

SC16 Applies Advanced Computing for Social Change

In this video, Dr. Kelly Gaither from TACC describes how 20 students identified by XSEDE’s community engagement team participated in a four-day cohort experience themed around social change at SC16. “The objectives of the program are to engage students in a social change challenge using visualization and data analytics to increase awareness and interest, and ultimately inspire students to continue their path in advanced computing careers; and to increase the participation of students historically underserved in STEM at SC.”

Radio Free HPC Year End Review of 2016 Predictions

In this podcast, the Radio Free HPC team looks at how Shahin Khan fared with his OrionX 2016 Technology Issues and Predictions. “Here at OrionX.net, we are fortunate to work with tech leaders across several industries and geographies, serving markets in Mobile, Social, Cloud, and Big Data (including Analytics, Cognitive Computing, IoT, Machine Learning, Semantic Web, etc.), and focused on pretty much every part of the “stack”, from chips to apps and everything in between. Doing this for several years has given us a privileged perspective. We spent some time to discuss what we are seeing and to capture some of the trends in this blog.”

Building a Platform for Collaborative Scientific Research on AWS

“The pharmaceutical industry trend toward joint ventures and collaborations has created a need for new platforms in which to work together. We’ll dive into architectural decisions for building collaborative systems. Examples include how such a platform allowed Human Longevity, Inc. to accelerate software deployment to production in a fast-paced research environment, and how Celgene uses AWS for research collaboration with outside universities and foundations.”