

PASC18 Panel Discussion: Big Data vs. Fast Computation

“The panelists will discuss the critical challenges facing key HPC application areas in the next 5-10 years, based on a mix of knowledge and speculation. They will explore whether we need to make radical changes to our practices, methods, tools, and techniques to be able to use modern resources and make faster and bigger progress on our scientific problems.”

Julia 1.0 Release Opens the Doors for a Connected World

Today Julia Computing announced the release of Julia 1.0, the first complete, stable, and forward-compatible release of the language, which the company bills as the fastest, simplest, and most productive open-source programming language for scientific, numeric, and mathematical computing. “During the last six and a half years, Julia has reached more than 2 million downloads and early adopters have already put Julia into production to power self-driving cars, robots, 3D printers and applications in precision medicine, augmented reality, genomics, energy trading, machine learning, financial risk management and space mission planning.”

Dr. Eng Lim Goh presents: Prediction – Use Science or History?

Dr. Eng Lim Goh from HPE gave this keynote talk at PASC18. “Traditionally, scientific laws have been applied deductively – from predicting the performance of a pacemaker before implant, downforce of a Formula 1 car, pricing of derivatives in finance or the motion of planets for a trip to Mars. With Artificial Intelligence, we are starting to also use the data-intensive inductive approach, enabled by the re-emergence of Machine Learning which has been fueled by decades of accumulated data.”

David Bader on Real World Challenges for Big Data Analytics

In this video from PASC18, David Bader from Georgia Tech summarizes his keynote talk on Big Data Analytics. “Emerging real-world graph problems include: detecting and preventing disease in human populations; revealing community structure in large social networks; and improving the resilience of the electric power grid. Unlike traditional applications in computational science and engineering, solving these social problems at scale often raises new challenges because of the sparsity and lack of locality in the data, the need for research on scalable algorithms, and development of frameworks for solving these real-world problems on high performance computers.”

Video: Large Scale Training for Model Optimization

Jakub Tomczak from the University of Amsterdam gave this talk at PASC18. “Deep generative models allow us to learn hidden representations of data and generate new examples. There are two major families of models that are exploited in current applications: Generative Adversarial Networks (GANs) and Variational Auto-Encoders (VAEs). We will point out advantages and disadvantages of GANs and VAEs. Some of the most promising applications of deep generative models will be shown.”
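Two standard building blocks of the VAE family Tomczak discusses are the reparameterization trick (which makes sampling from the latent distribution differentiable) and the closed-form KL regularizer against a standard normal prior. The NumPy sketch below is a generic illustration of those two ideas, not code from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, log_var):
    """Sample z ~ N(mu, sigma^2) via z = mu + sigma * eps, with
    eps ~ N(0, 1): the VAE reparameterization trick, which moves
    the randomness out of the parameters so gradients can flow."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def kl_to_standard_normal(mu, log_var):
    """Closed-form KL(N(mu, sigma^2) || N(0, 1)), summed over
    latent dimensions: the regularizer in the VAE objective."""
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var)

# With mu = 0 and log_var = 0 the posterior equals the prior,
# so the KL term vanishes.
mu, log_var = np.zeros(4), np.zeros(4)
z = reparameterize(mu, log_var)            # one latent sample
print(kl_to_standard_normal(mu, log_var))  # prints 0.0
```

GANs, by contrast, dispense with an explicit likelihood and train a generator against a discriminator; the trade-off Tomczak highlights is roughly sample quality (GANs) versus tractable training and inference (VAEs).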

Atomicus Chart Software Brings Easy Analytics to Scientists

Independent software vendor Atomicus is approaching the release of AtomicusChart, an advanced software component developed for scientists researching physical, chemical, and biological phenomena, and for anyone who needs a convenient tool for presenting results in scientific formats. “AtomicusChart is a product from Atomicus which can be re-used and integrated in any software requiring high-speed graphics for large volumes of data (including big data) and dedicated to the needs of analytical applications.”

Massive-Scale Analytics Applied to Real-World Problems

David Bader from Georgia Tech gave this talk at PASC18. “Emerging real-world graph problems include: detecting and preventing disease in human populations; revealing community structure in large social networks; and improving the resilience of the electric power grid. Unlike traditional applications in computational science and engineering, solving these social problems at scale often raises new challenges because of the sparsity and lack of locality in the data, the need for research on scalable algorithms and development of frameworks for solving these real-world problems on high performance computers, and for improved models that capture the noise and bias inherent in the torrential data streams.”
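The community-structure problems Bader describes start from a very sparse graph stored as an adjacency list rather than a dense matrix. As a toy illustration of that representation (emphatically not Bader's framework, which targets massive graphs on HPC systems), the Python sketch below finds connected components with a breadth-first search over an edge list:

```python
from collections import deque

def connected_components(edges):
    """Return the connected components of an undirected graph
    given as an edge list, using BFS over an adjacency list.
    The adjacency list keeps memory proportional to the number
    of edges, which is what makes sparse graphs tractable."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)

    seen, components = set(), []
    for start in adj:
        if start in seen:
            continue
        # BFS flood-fill from an unvisited vertex
        comp, queue = set(), deque([start])
        seen.add(start)
        while queue:
            node = queue.popleft()
            comp.add(node)
            for nbr in adj[node]:
                if nbr not in seen:
                    seen.add(nbr)
                    queue.append(nbr)
        components.append(comp)
    return components

# Two separate "communities" in a toy social graph
edges = [("a", "b"), ("b", "c"), ("d", "e")]
print(connected_components(edges))  # prints [{'a', 'b', 'c'}, {'d', 'e'}]
```

At the scales Bader works with, the same idea requires distributed-memory data structures and algorithms that cope with the poor locality he mentions; the sketch only shows the shape of the problem.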

Video: Kathy Yelick from LBNL Testifies at House Hearing on Big Data Challenges and Advanced Computing

In this video, Kathy Yelick from LBNL describes why the US needs to accelerate its efforts to stay ahead in AI and Big Data Analytics. “Data-driven scientific discovery is poised to deliver breakthroughs across many disciplines, and the U.S. Department of Energy, through its national laboratories, is well positioned to play a leadership role in this revolution. Driven by DOE innovations in instrumentation and computing, however, the scientific data sets being created are becoming increasingly challenging to sift through and manage.”

Evolving Considerations for Data Storage in Life Sciences

The latest lab instruments are driving the need for powerful IT resources in the life sciences, and laboratory technologies are evolving rapidly. Download the new report from Quantum to discover the latest developments in data storage for the life sciences.

CSCS Takes on Big Data from the Paul Scherrer Institute

Today CSCS in Lugano announced plans to archive research data collected by the large-scale research facilities at the Paul Scherrer Institute (PSI) in Villigen. The collaboration between PSI and CSCS enabled major improvements to the data transfer and storage process. “Storing and archiving data is part of everyday life at CSCS. However, transferring huge amounts of data — including ongoing research projects online across Switzerland as well as an entire data archive containing decades of major completed projects — via fibre-optic cable presents a logistical challenge.”