Advancing Progress in Life Sciences

In this special guest feature from Scientific Computing World, Christian Marcazzo, VP and general manager at IDBS, highlights trends in life sciences research and development. “As can be seen across most industries, organizations are increasingly moving systems and services to the cloud. For R&D firms, cloud-based software-as-a-service (SaaS) platforms that integrate all systems are the most effective way of overcoming legacy systems.”

Cortical.io Demonstrates Natural Language Understanding Inspired by Neuroscience

In this video, Cortical.io CEO Francisco Webber demonstrates how the company’s software running on Xilinx FPGAs breaks new ground in the field of natural language understanding (NLU). “Cortical.io delivers AI-based Natural Language Understanding solutions which are quicker and easier to implement and more capable than current approaches. The company’s patented approach enables enterprises to more effectively search, extract, annotate and analyze key information from any kind of unstructured text.”

Heterogeneous Computing: Long Live the CPU

In this guest article, our friends at Intel discuss how the company is investing in heterogeneous computing. Intel recently framed its support for heterogeneous computing with the catchphrase “One Size Does Not Fit All,” referring to software and the company’s commitment to help programmers target diverse hardware through oneAPI.
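For readers who have not looked at oneAPI, the sketch below (not from the article) illustrates the “write once, run on many devices” idea: a minimal SYCL/DPC++ vector add in which the same kernel source can be dispatched to a GPU, FPGA, or CPU fallback. The device selection, buffer sizes, and values are illustrative assumptions, not part of the original piece.

```cpp
#include <sycl/sycl.hpp>
#include <iostream>
#include <vector>

int main() {
  // Let the runtime pick the preferred accelerator; falls back to the CPU if none.
  sycl::queue q{sycl::default_selector_v};
  std::cout << "Running on: "
            << q.get_device().get_info<sycl::info::device::name>() << "\n";

  constexpr size_t n = 1024;
  std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n, 0.0f);

  {
    // Buffers manage data movement between host and whichever device was chosen.
    sycl::buffer<float, 1> ba{a.data(), sycl::range<1>{n}};
    sycl::buffer<float, 1> bb{b.data(), sycl::range<1>{n}};
    sycl::buffer<float, 1> bc{c.data(), sycl::range<1>{n}};

    q.submit([&](sycl::handler& h) {
      sycl::accessor A{ba, h, sycl::read_only};
      sycl::accessor B{bb, h, sycl::read_only};
      sycl::accessor C{bc, h, sycl::write_only, sycl::no_init};
      // The same kernel source runs unchanged on any supported device.
      h.parallel_for(sycl::range<1>{n}, [=](sycl::id<1> i) {
        C[i] = A[i] + B[i];
      });
    });
  } // Buffers go out of scope here, copying results back to the host vectors.

  std::cout << "c[0] = " << c[0] << "\n";  // expect 3
  return 0;
}
```

The point of the example is the portability story rather than the arithmetic: swapping the target device requires no change to the kernel, which is the essence of the “One Size Does Not Fit All” argument for a single heterogeneous programming model.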

Sandia Research Project turns Big Data into real-time, actionable intelligence

Researchers at Sandia National Labs are leading a project to deliver actionable information from streaming data to decision makers. While social media, cameras, sensors and more generate huge amounts of data that can overwhelm analysts, the project looks to provide crucial insight in real time. “Actionable intelligence is the next level of data analysis, where analysis is put into use for near-real-time decision-making. Success in this research will have a strong impact on many time-critical national security applications.”

Supercomputing and the Scientist: How HPC and Analytics are transforming experimental science

In this video from DataTech19, Debbie Bard from NERSC presents: Supercomputing and the scientist: How HPC and large-scale data analytics are transforming experimental science. “Debbie Bard leads the Data Science Engagement Group at NERSC. NERSC is the mission supercomputing center for the US Department of Energy, and supports over 7000 scientists and 700 projects with supercomputing needs.”

UPMEM Puts CPUs Inside Memory to Allow Apps to Run 20 Times Faster

Today UPMEM announced a Processing-in-Memory (PIM) acceleration solution that allows big data and AI applications to run 20 times faster and with 10 times less energy. Instead of moving massive amounts of data to CPUs, the silicon-based technology from UPMEM puts CPUs right in the middle of data, saving time and improving efficiency. By allowing compute to take place directly in the memory chips where data already resides, data-intensive applications can be substantially accelerated.

Video: Big Data is Dead, Long Live Its Replacement

Tom Fisher gave this talk at the Samsung Forum. “Big Data is experiencing a second revolution. This talk will address what’s happened, how it happened and what big data is bridging to. Enterprise companies have to make business critical decisions in the coming years and the marketplace is not clear. The recent changes in the Big Data market will be reviewed as well as the effects on the related ecosystem. The goal of this presentation is to provide insights to engineers, data engineers and data scientists to better navigate a rapidly moving landscape.”

HPE Acquires MapR Business Assets

Today HPE announced it has acquired the business assets of MapR, whose data platform for artificial intelligence and analytics applications is powered by scale-out, multi-cloud and multi-protocol file system technology. This transaction includes MapR’s technology, intellectual property, and domain expertise in artificial intelligence and machine learning (AI/ML) and analytics data management. 

DAOS: Scale-Out Software-Defined Storage for HPC/Big Data/AI Convergence

As an all-new parallel file system, DAOS will be a key component of the Aurora supercomputer coming to Argonne National Laboratory in 2021. “DAOS is an open source software-defined scale-out object store that provides high bandwidth, low latency, and high I/O operations per second (IOPS) storage containers to HPC applications. It enables next-generation data-centric workflows that combine simulation, data analytics, and AI.”

Active Archive Alliance Report: Solving Data Growth Challenges

According to this new report, “Archival data is piling up faster than ever as organizations are quickly learning the value of analyzing vast amounts of previously untapped digital data. The need to securely store, search for, retrieve and analyze massive volumes of archival content is fueling new and more effective advancements in archive solutions.”