UPMEM Puts CPUs Inside Memory to Allow Apps to Run 20 Times Faster

Today UPMEM announced a Processing-in-Memory (PIM) acceleration solution that allows big data and AI applications to run 20 times faster while using a tenth of the energy. Instead of moving massive amounts of data to the CPU, UPMEM's silicon-based technology puts processors right in the middle of the data, saving time and improving efficiency. Because compute happens directly in the memory chips where the data already resides, data-intensive applications can be substantially accelerated.
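For a sense of the programming model, here is a minimal host-side sketch using the names published in UPMEM's SDK (dpu_alloc, dpu_load, dpu_broadcast_to, dpu_launch, dpu_copy_from); the DPU-side kernel, a separate program referred to here as checksum.dpu with symbols buffer and checksum, is a hypothetical stand-in for illustration.

    #include <dpu.h>
    #include <inttypes.h>
    #include <stdint.h>
    #include <stdio.h>

    #define DPU_BINARY "./checksum.dpu"  /* hypothetical DPU-side kernel */
    #define BUFFER_SIZE (64 << 10)

    int main(void) {
        struct dpu_set_t set, dpu;
        uint8_t input[BUFFER_SIZE] = {0};  /* application data would go here */
        uint64_t checksum = 0;

        /* Reserve the in-memory processors and load the kernel onto them. */
        DPU_ASSERT(dpu_alloc(DPU_ALLOCATE_ALL, NULL, &set));
        DPU_ASSERT(dpu_load(set, DPU_BINARY, NULL));

        /* Push the data into DPU memory once... */
        DPU_ASSERT(dpu_broadcast_to(set, "buffer", 0, input, sizeof(input),
                                    DPU_XFER_DEFAULT));

        /* ...and run the computation where the data sits. */
        DPU_ASSERT(dpu_launch(set, DPU_SYNCHRONOUS));

        /* Pull back only the small per-DPU result, not the big buffer. */
        DPU_FOREACH(set, dpu) {
            DPU_ASSERT(dpu_copy_from(dpu, "checksum", 0, &checksum,
                                     sizeof(checksum)));
            printf("partial checksum: %" PRIu64 "\n", checksum);
        }
        DPU_ASSERT(dpu_free(set));
        return 0;
    }

The pattern is the whole point of PIM: the large buffer crosses the host memory bus once (or is already resident in the DIMMs), while only a few bytes of result travel back to the host CPU.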

Video: Big Data is Dead, Long Live Its Replacement

Tom Fisher gave this talk at the Samsung Forum. “Big Data is experiencing a second revolution. This talk will address what’s happened, how it happened, and what big data is bridging to. Enterprise companies have to make business-critical decisions in the coming years, and the marketplace is not clear. The recent changes in the Big Data market will be reviewed, as well as the effects on the related ecosystem. The goal of this presentation is to provide insights to engineers, data engineers, and data scientists to better navigate a rapidly moving landscape.”

HPE Acquires MapR Business Assets

Today HPE announced it has acquired the business assets of MapR, whose data platform for artificial intelligence and analytics applications is powered by scale-out, multi-cloud and multi-protocol file system technology. This transaction includes MapR’s technology, intellectual property, and domain expertise in artificial intelligence and machine learning (AI/ML) and analytics data management. 

DAOS: Scale-Out Software-Defined Storage for HPC/Big Data/AI Convergence

As an all-new parallel file system, DAOS will be a key component of the upcoming Aurora supercomputer, arriving at Argonne National Laboratory in 2021. “DAOS is an open source software-defined scale-out object store that provides high bandwidth, low latency, and high I/O operations per second (IOPS) storage containers to HPC applications. It enables next-generation data-centric workflows that combine simulation, data analytics, and AI.”

Active Archive Alliance Report: Solving Data Growth Challenges

According to this new report, “Archival data is piling up faster than ever as organizations are quickly learning the value of analyzing vast amounts of previously untapped digital data. The need to securely store, search for, retrieve and analyze massive volumes of archival content is fueling new and more effective advancements in archive solutions.”

Accelerate Your Apache Spark with Intel Optane DC Persistent Memory

Piotr Balcer and Cheng Xu from Intel gave this talk at the 2019 Spark+AI Summit. “Intel Optane DC persistent memory breaks the traditional memory/storage hierarchy and scales up the computing server with higher-capacity persistent memory. It also brings higher bandwidth and lower latency than storage such as SSDs or HDDs. Apache Spark, meanwhile, is widely used for analytics such as SQL and machine learning in cloud environments.”
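Spark itself runs on the JVM, but the persistent-memory integrations Intel has built around it (such as the OAP project) reach the hardware through PMDK. The sketch below shows the libpmem primitives that layer relies on, assuming /mnt/pmem0/spark-cache as a hypothetical DAX-mounted path; build with -lpmem.

    #include <libpmem.h>
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        size_t mapped_len;
        int is_pmem;

        /* Map a file on (hypothetical) persistent memory into our address space. */
        char *addr = pmem_map_file("/mnt/pmem0/spark-cache", 4096,
                                   PMEM_FILE_CREATE, 0666, &mapped_len, &is_pmem);
        if (addr == NULL) {
            perror("pmem_map_file");
            return 1;
        }

        const char *record = "cached partition bytes";
        if (is_pmem) {
            /* Real pmem: store plus cache-line flush; no page cache, no msync. */
            pmem_memcpy_persist(addr, record, strlen(record) + 1);
        } else {
            /* Fallback (e.g., an ordinary disk-backed mmap): msync instead. */
            memcpy(addr, record, strlen(record) + 1);
            pmem_msync(addr, strlen(record) + 1);
        }

        printf("persisted %zu-byte mapping (is_pmem=%d)\n", mapped_len, is_pmem);
        pmem_unmap(addr, mapped_len);
        return 0;
    }

When the mapping really is persistent memory, the store-and-flush path bypasses the page cache entirely, which is where the latency advantage over SSD- or HDD-backed files comes from.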

Evolve Project in the EU to Tackle Big Data Processing

The European Commission is planning to tackle the challenges of big data processing with the Evolve Project, part of the Horizon 2020 Research and Innovation program. Evolve aims to take concrete steps toward bringing big data, high-performance computing, and cloud computing technologies together in a testbed that will increase researchers’ ability to extract value from massive and demanding datasets. This could change the way big data applications are built by enabling researchers to process large amounts of data much faster.

Qumulo Delivers New All-Flash Platforms for Unstructured Data

Today Qumulo announced comprehensive software and hardware advancements that will help enterprises to capitalize on dynamic market conditions, including rapidly falling NVMe prices, in order to gain data center efficiencies and benefit from the increased reliability and performance of flash-based platforms. “Qumulo makes it easy for users to incorporate new technology developments both on-prem and in the cloud,” said Molly Presley, director of product marketing. “Our software-defined, hybrid cloud file system allows our users to focus on their data-driven businesses, not on managing their storage.”

High Performance Computing in the World of Artificial Intelligence

In this special guest feature, Thierry Pellegrino from Dell EMC writes that data analytics powered by HPC & AI solutions are delivering new insights for research and the enterprise. “HPC is clearly no longer reserved for large companies or research organizations. It is meant for those who want to achieve more innovation, discoveries, and the elusive competitive edge.”

Agenda Posted: Exacomm 2019 Workshop at ISC High Performance

“The goal of this workshop is to bring together researchers and software/hardware designers from academia, industry and national laboratories who are involved in creating network-based computing solutions for extreme scale architectures. The objectives of this workshop will be to share the experiences of the members of this community and to learn the opportunities and challenges in the design trends for exascale communication architectures.”