Proposals Open to Research Community for ALCF AI Testbed’s Cerebras and SambaNova Systems


ALCF’s SambaNova DataScale system. (credit: Argonne National Laboratory)

The Argonne Leadership Computing Facility (ALCF) is now accepting proposals for access to its AI Testbed, a collection of advanced artificial intelligence accelerators available for science. Researchers interested in using the AI Testbed's Cerebras CS-2 and SambaNova DataScale systems can submit project proposals via the ALCF's Director's Discretionary program. Access to additional testbed resources, including Graphcore, Groq, and Habana accelerators, will be announced at a later date.

To apply for time, submit a proposal through the Allocation Request Form. For more information about using the AI Testbed, see the AI Testbed User Guides. Contact support@alcf.anl.gov with questions.

The ALCF AI Testbed is designed for researchers to explore machine learning applications and workloads to advance AI for science. The AI platforms will complement the facility’s current and next-generation supercomputers to provide a state-of-the-art environment that supports pioneering research at the intersection of AI, big data, and high-performance computing (HPC).

“It’s clear that AI will have a significant role in the future of scientific computing,” said Michael Papka, director of the ALCF, a U.S. Department of Energy (DOE) Office of Science user facility at Argonne National Laboratory. “With the ALCF AI Testbed, our goal is to understand the role AI accelerators can play in advancing data-driven discoveries, and how these systems can be combined with supercomputers to scale to extremely large and complex science problems.”

The ALCF AI Testbed systems are built to support machine learning and data-centric workloads, making them well suited to address challenges involving the increasingly large amounts of data produced by supercomputers, light sources, and particle accelerators, among other powerful research tools. In addition, the testbed will allow researchers to explore novel workflows that combine AI methods with simulation and experimental science to accelerate the pace of discovery.

“New AI technologies are largely designed for enterprise workloads and applications, such as e-commerce and social networks,” said Venkat Vishwanath, lead for ALCF’s Data Science Group. “By making the latest AI accelerators available to the open science community, we are providing a proving ground for innovative machine learning and HPC-driven research campaigns. We’re really looking forward to seeing how the community employs these accelerators for different types of scientific applications and workflows.”

ALCF’s Cerebras CS-2 system. (credit: Argonne National Laboratory)

Prior to opening the ALCF AI Testbed to the broader scientific community, Argonne researchers led several collaborative efforts to use the AI accelerators for a variety of data-centric studies. Read about some of the early success stories below.

Edge Computing

To keep pace with the growing amount of data produced at DOE light source facilities, researchers are looking to machine learning methods to help with tasks such as data reduction and providing insights to steer future experiments. Using the ALCF's Cerebras system, researchers from Argonne, the University of Chicago, SLAC National Accelerator Laboratory, and Stanford University demonstrated how specialized AI systems can be used to quickly train machine learning models through a geographically distributed workflow. To obtain actionable information in real time, the team trained the models on the remote AI system and then deployed them on edge computing devices near the experimental data source. Their work was recognized with the Best Paper Award at last year's Workshop on Extreme-Scale Experiment-in-the-Loop Computing (XLOOP) at SC21.
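The pattern at the heart of this effort, training on a remote accelerator and then shipping a compact model to hardware near the instrument, can be sketched in a few lines. The following is a minimal, hypothetical illustration in plain PyTorch; the model, loader, and file names are placeholders, and the team's actual Cerebras workflow uses the vendor's own software stack.

```python
# Hypothetical sketch of the remote-train / edge-deploy pattern described
# above. Model, loader, and file names are illustrative placeholders, not
# the team's actual code; the real workflow uses Cerebras' software stack.
import torch
import torch.nn as nn

class EdgeNet(nn.Module):
    """A small CNN compact enough to run on an edge device near the detector."""
    def __init__(self, n_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(16, n_classes)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

def train_remote(model, loader, epochs=1):
    """Runs on the remote AI system; the edge device never trains."""
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    return model

# Train remotely, then serialize the model so the edge node near the
# experiment can load it for low-latency inference.
model = train_remote(EdgeNet(), loader=[])  # empty loader: placeholder data
torch.jit.script(model).save("edgenet_scripted.pt")
```

The key design point is the split: heavyweight training happens where the accelerator lives, while the edge device only runs inference on the serialized artifact.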

COVID-19 Research

Using a combination of AI and supercomputing resources, an Argonne-led team carried out a study of the SARS-CoV-2 replication mechanism that was nominated for the Gordon Bell Special Prize for HPC-Based COVID-19 Research at SC21. The team used data from cryo-electron microscopy to explore the molecular machinery, but static images alone did not provide sufficiently high resolution to capture the inner workings of the process. To get a closer look at the replication mechanism, the team developed an innovative workflow to enhance resolution using a hierarchy of AI methods that continually learn and infer features for maintaining consistency between different types of simulations. The researchers used the Balsam workflow engine to orchestrate AI and simulation campaigns across four of the nation’s top supercomputers and the ALCF’s Cerebras CS-2 system. The method allowed the team to study the SARS-CoV-2 replication transcription process at an unprecedented level of detail, while demonstrating a generalized, multiscale computational toolkit for exploring dynamic biomolecular machines.
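Balsam coordinates jobs across facilities through its own API; the sketch below is not Balsam code, but a self-contained, hypothetical illustration of the orchestration pattern the paragraph describes, in which simulation jobs feed an AI model whose inferences in turn steer the next round of simulations.

```python
# Hypothetical sketch of a multi-site orchestration loop in the spirit of
# the Balsam-driven campaign described above. This is NOT the Balsam API;
# site names, job kinds, and the submit() stub are all placeholders.
from dataclasses import dataclass, field
from queue import Queue

@dataclass
class Job:
    site: str            # e.g., "supercomputer-A" or "cerebras-cs2" (illustrative)
    kind: str            # "simulation" or "ai_inference"
    payload: dict = field(default_factory=dict)

def submit(job: Job) -> dict:
    """Stand-in for dispatching a job to a remote site's scheduler."""
    print(f"[{job.site}] running {job.kind}")
    return {"site": job.site, "kind": job.kind, "result": "ok"}

# Simulations produce structural data; the AI model learns from it and
# its inferences seed the next round of simulations (the learn/infer loop).
pending = Queue()
pending.put(Job("supercomputer-A", "simulation"))

for _ in range(4):  # a few rounds of the campaign
    job = pending.get()
    result = submit(job)
    if job.kind == "simulation":
        pending.put(Job("cerebras-cs2", "ai_inference", {"data": result}))
    else:
        pending.put(Job("supercomputer-A", "simulation", {"seed": result}))
```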

Neutrino Physics

Scientists use liquid argon time projection chambers (LArTPCs) to detect neutrinos, but the resulting images are susceptible to background particles induced by cosmic interactions. To improve the neutrino signal efficiency, scientists use image segmentation to tag each input pixel as one of three classes: cosmic-induced, neutrino-induced, or background noise. Deep learning has been a useful tool for accelerating this classic image segmentation task, but it has been limited by the image size that available GPU-based platforms can efficiently train on. Leveraging the ALCF's SambaNova system, researchers were able to improve this method to establish a new state-of-the-art accuracy level of 90.23% using images at their original resolution without the need to downsample. Their work demonstrates capabilities that can be used to advance model quality for a variety of important and challenging image processing problems.
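The task itself is standard semantic segmentation with three output classes; the advance was training at the detector's native resolution. A minimal PyTorch sketch of the per-pixel classification setup follows, with a toy architecture and an assumed 1024x1024 image size for illustration (the actual LArTPC model and its SambaNova implementation are substantially larger):

```python
# Minimal per-pixel, three-class segmentation sketch in PyTorch. The toy
# architecture and the 1024x1024 image size are illustrative; the actual
# LArTPC model and its SambaNova implementation are substantially larger.
import torch
import torch.nn as nn

class TinySegNet(nn.Module):
    def __init__(self, n_classes: int = 3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, n_classes, 1),  # per-pixel class logits
        )

    def forward(self, x):
        return self.net(x)

model = TinySegNet()
# Each pixel is labeled cosmic-induced (0), neutrino-induced (1), or
# background noise (2); training at native resolution avoids downsampling.
image = torch.randn(1, 1, 1024, 1024)          # one full-size detector image
labels = torch.randint(0, 3, (1, 1024, 1024))  # per-pixel class targets
loss = nn.CrossEntropyLoss()(model(image), labels)
loss.backward()
```

The memory pressure of holding full-resolution activations is what limited GPU training here, and it is exactly what the SambaNova system's architecture relaxed.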

Drug Discovery

A team of Argonne researchers leveraged the ALCF’s Groq system to speed the process of searching through a vast number of small molecules to find promising antiviral drugs to fight COVID-19. With billions upon billions of potential drug candidates to sort through, the scientists needed a way to dramatically speed their search. In tests on a large dataset of molecules, the team found they could achieve 20 million predictions, or inferences, a second, vastly reducing the time needed for each search from days to minutes. Once the best candidates were found, the researchers identified which ones could be obtained commercially and had them tested on human cells at the University of Chicago’s Howard T. Ricketts Laboratory.
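At the reported rate of 20 million inferences per second, a library of one billion candidate molecules can be scored in under a minute, which is what turns a days-long search into a minutes-long one. A hypothetical sketch of the screening loop, batched scoring plus a running top-k of the best candidates, might look like the following; the scoring model and fingerprint features are placeholders, not the team's code:

```python
# Hypothetical sketch of high-throughput screening: run batched inference
# over a molecule library and keep the top-scoring candidates. The scoring
# model and fingerprint features below are illustrative placeholders.
import heapq
import torch

score_model = torch.nn.Sequential(  # stand-in for the trained scoring model
    torch.nn.Linear(1024, 256), torch.nn.ReLU(), torch.nn.Linear(256, 1)
).eval()

def screen(library: torch.Tensor, batch_size: int = 65536, keep: int = 100):
    """Score molecules in large batches; return the `keep` best (score, id) pairs."""
    best = []  # min-heap of (score, molecule_id)
    with torch.no_grad():
        for start in range(0, len(library), batch_size):
            scores = score_model(library[start:start + batch_size]).squeeze(1)
            for i, s in enumerate(scores.tolist()):
                if len(best) < keep:
                    heapq.heappush(best, (s, start + i))
                elif s > best[0][0]:
                    heapq.heapreplace(best, (s, start + i))
    return sorted(best, reverse=True)

fingerprints = torch.randn(200_000, 1024)  # placeholder molecular features
top_hits = screen(fingerprints)
```

Keeping only a fixed-size heap of the best hits holds memory flat no matter how large the library grows, which matters when the candidate pool runs into the billions.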

Source: ALCF