
Intel Releases oneAPI 2022 Toolkits

Jan. 4, 2022 — Intel today released oneAPI 2022 toolkits, which the company said have expanded cross-architecture features to provide greater utility and architectural choice to accelerate computing. oneAPI is a cross-industry, open, standards-based unified programming model designed to improve the productivity of code development when building cross-architecture applications. New capabilities in the 2022 toolkits […]
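
To illustrate the cross-architecture model oneAPI promotes, here is a minimal SYCL/DPC++ sketch (ours, not from the announcement, and assuming a oneAPI-compatible compiler such as icpx): a single kernel source runs on whichever device the runtime selects, whether CPU, GPU or another accelerator.

    // Illustrative SYCL sketch, not from Intel's release: one vector-add
    // kernel, compiled once, dispatched to whatever device is available.
    #include <sycl/sycl.hpp>
    #include <iostream>
    #include <vector>

    int main() {
        std::vector<float> a(1024, 1.0f), b(1024, 2.0f), c(1024, 0.0f);
        sycl::queue q;  // default selector: picks any available device
        {
            sycl::buffer<float> A(a.data(), sycl::range<1>(a.size()));
            sycl::buffer<float> B(b.data(), sycl::range<1>(b.size()));
            sycl::buffer<float> C(c.data(), sycl::range<1>(c.size()));
            q.submit([&](sycl::handler& h) {
                sycl::accessor ra(A, h, sycl::read_only);
                sycl::accessor rb(B, h, sycl::read_only);
                sycl::accessor wc(C, h, sycl::write_only, sycl::no_init);
                h.parallel_for(sycl::range<1>(c.size()),
                               [=](sycl::id<1> i) { wc[i] = ra[i] + rb[i]; });
            });
        }  // buffers go out of scope here, copying results back to the host
        std::cout << "c[0] = " << c[0] << "\n";  // prints 3
        return 0;
    }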

Intel’s Infrastructure Processing Unit Targets Hyperscaler Data Centers

Intel has unveiled a programmable networking device for hyperscalers and their massive data center infrastructures. Called the Infrastructure Processing Unit and announced during the Six Five Summit, the networking device is intended to help cloud and communication service providers cut overhead and free up performance for CPUs, to better utilize […]

Video: GigaIO on Optimizing Compute Resources for ML, HPDA and other Advanced Workloads

In this interview, GigaIO CEO Alan Benjamin talks about system performance problems and wasted compute resources when running ML, HPDA and other high-demand workloads that involve large data volumes. At issue, Benjamin explains, is today's rack architecture, which is decades old and unsuited for the combinations of CPUs, GPUs and other accelerators needed for advanced computing strategies. The answer: "composable disaggregated infrastructure."

Video: Heterogeneous Computing at the Large Hadron Collider

In this video, Philip Harris from MIT presents: Heterogeneous Computing at the Large Hadron Collider. “Only a small fraction of the 40 million collisions per second at the Large Hadron Collider are stored and analyzed due to the huge volumes of data and the compute power required to process it. This project proposes a redesign of the algorithms using modern machine learning techniques that can be incorporated into heterogeneous computing systems, allowing more data to be processed and thus larger physics output and potentially foundational discoveries in the field.”