HPC Predictions 2021: Quantum Beyond Qubits; the Rising Composable and Memory-Centric Tide; Neural Network Explosion; 5G’s Slow March


Annual technology predictions are like years: some are better than others. Nonetheless, many provide useful insight and serve as the basis for worthwhile discussion. Over the last few months we received a number of HPC and AI predictions for 2021; here are the most interesting and potentially valid of them. Let’s check back in 12 months and see how many, along with our wish for a speedy pandemic end, come true.

Quantum’s Lack of Clarity
“Although it seems counterintuitive, the progress made in quantum hardware development over the next year may lead to more uncertainty within the quantum industry rather than less. However, this is actually a good thing. Currently, it may seem like one or two types of qubit hardware are ahead of the others in the R&D process. Over the next year, however, we expect to see more researchers working with other hardware types to close the gap. Closing this gap will undoubtedly make it less clear which qubit model (superconducting qubits, NV centers, trapped ions/atoms, mechanical resonators, etc.) will become the industry standard. Still, the continued development of all the hardware types is essential to ensure we end up with the best possible technology.”

“While many of the headlines from past years focused on qubit counts, over the next year we expect attention regarding quantum computers to shift from tracking solely the number of qubits to actual measures of a quantum computer’s effectiveness, such as the ‘quantum volume’ metric introduced by IBM, and other metrics that measure performance vis-à-vis specific algorithms.”

Dr. Itamar Sivan, Co-founder & CEO, Quantum Machines
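
For context on that metric: quantum volume folds qubit count, gate fidelity and connectivity into a single number. In simplified form, the definition published by IBM researchers (Cross et al., 2019) is

$$\log_2 V_Q = \max_m \, \min\bigl(m,\ d(m)\bigr)$$

where $m$ is a circuit width (number of qubits) and $d(m)$ is the largest depth at which random $m$-qubit model circuits still pass a “heavy-output” test. A machine that reliably runs square circuits of width and depth 5, for example, has a quantum volume of $2^5 = 32$.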

Ethical, Domain-Specific Artificial Intelligence
“The industry will shift away from generic horizontal AI platforms, such as IBM Watson and Amazon Lex, towards domain-specific, AI-powered products and managed service models. Generic platforms are not solutions: they start cold, without any training data or data model structure, and building this, then optimizing it in production, is an expert- and resource-intensive task beyond most companies’ capabilities. The move from the early innovator market into mass-market adoption will be driven in 2021 by the adoption of domain-specific AI-powered products that are pre-trained for a specific industry and are proven to work.”

Jake Tyler, Co-founder & CEO, Finn AI

“The world will come together on AI ethics, standards and regulation. In 2020, International partnerships like Global Partnership on AI have moved from ideas to reality. In 2021, they will deliver expertise and alignment on how to ensure that we leverage AI against major global problems, ensure inclusion and diversity and stimulate innovation and economic growth.”

Natalie Cartwright, Co-founder & COO, Finn AI

Composability
“In 2021, the prevalence of remote work — even post-pandemic — will continue accelerating capabilities in the cloud. This will drive unprecedented interest in disaggregated composable systems, so there aren’t wasted resources in an over-provisioned environment. This move toward streamlining resources will be critical in reducing the rising environmental impact of IT. For example, information and communication technology is already predicted to use 20 percent of the world’s electricity by 2030. As companies look to incorporate sustainability into business strategy and reduce OpEx for compute-intensive workloads such as AI and high-performance computing, we’ll see escalating demand for energy-efficient memory and storage.”

“2021 is going to see AI-as-a-service become mainstream, intelligence migrate to the edge and 5G come to life. This is going to propel fundamental changes in the way server systems are architected. Memory will extend into multiple zones—and will become a shared resource. And storage and memory will merge. You’ll no longer think “DRAM for memory and NAND for storage” because faster NAND will create the ability to use it as memory.”

Raj Hazra, SVP of Emerging Products & Corporate Strategy, Micron
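
The “storage and memory will merge” idea has a familiar software analogue today: memory-mapping a file so that NAND-backed bytes are touched with ordinary loads instead of explicit reads. Below is a minimal Python sketch of that pattern, assuming a data file on an NVMe-backed filesystem (the path is a hypothetical placeholder):

```python
import mmap

# Hypothetical path to a large data file on NVMe (NAND) storage.
PATH = "/mnt/nvme/dataset.bin"

with open(PATH, "rb") as f:
    # Map the whole file (length=0) into the process address space:
    # the NAND-backed bytes are now read with ordinary memory loads,
    # paged in on demand by the OS -- storage addressed like memory.
    with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as m:
        header = m[:16]  # a plain slice, no explicit read() call
        print(header.hex())
```

Faster NAND shrinks the latency penalty of each of those demand page-ins, which is what makes the memory/storage boundary start to blur.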

Memory Centrism
“The basic rationale for the many flavors of memory-centric compute—integrating CPUs and GPUs directly into memory devices, high speed interconnects inside 3D chiplets, etc.—is that moving data takes as much or more energy as computing. If you can keep the data where it is and move the compute to the memory, energy cost goes down. Some have shown that 62.7 percent of total system energy gets spent moving data between main memory and compute.

“Other critical benefits—reducing the relative need for internal bandwidth, getting around the problem of limited ‘beachfront’ real estate on chip edges for connections, being able to use energy savings ordinarily consumed in transport for other purposes—flow from the shift in thinking…

“The ideas range from adding processors to memory cards to building customized memory instances with compute capability. We don’t know which will succeed, but as a concept, memory-centric computing is inevitable.”

Rob Aitken, Fellow & Director of Technology, Arm
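
The arithmetic behind that rationale is easy to reproduce. The sketch below uses per-operation energy figures often cited from Mark Horowitz’s ISSCC 2014 keynote (for a 45 nm process); treat them as illustrative assumptions, not measurements of any current part:

```python
# Back-of-envelope: compute vs. data-movement energy when summing one
# million floats streamed from DRAM. Per-op figures are assumed
# ballpark values (Horowitz, ISSCC 2014, 45 nm), not device specs.
FP32_ADD_PJ = 0.9          # ~energy of one 32-bit float add
DRAM_READ_32BIT_PJ = 640   # ~energy to fetch one 32-bit word from DRAM

n = 1_000_000
compute_pj = n * FP32_ADD_PJ
movement_pj = n * DRAM_READ_32BIT_PJ

print(f"compute:  {compute_pj / 1e6:.1f} uJ")
print(f"movement: {movement_pj / 1e6:.1f} uJ")
print(f"movement share: {movement_pj / (compute_pj + movement_pj):.1%}")
```

On those assumptions, DRAM traffic dominates the energy budget by nearly three orders of magnitude, which is exactly why keeping data in place and moving the compute to it pays off.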

Neural Networks
“We will see neural networks continue to replace classical computing methods across a range of industries. What started as fun tech for recognizing cats in pictures will have evolved into a technological juggernaut, completely transforming most industries. Consumer electronics, healthcare, law, communications, the automotive industry—all will be transformed as neural network technology marches forward, achieving things once thought impossible for a machine.

“The insatiable demand for neural network compute is already providing the motivation for a new class of processor optimized specifically for neural networks. New processor architectures with tensor-level operation abstractions will be present in nearly every computing platform, running the majority of computing cycles. This new class of processor will achieve orders-of-magnitude efficiency gains over traditional computing platforms, heralding an industry-wide shift in the computing landscape. And of course, it will be running on Arm.”

Ian Bratt, Fellow and Senior Director of Technology, Arm
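
What a “tensor-level operation abstraction” buys can be seen in a few lines of NumPy (a sketch for illustration, not Arm code): the same dense layer written as scalar loops and as a single tensor expression that a neural network processor can dispatch as one fused operation.

```python
import numpy as np

x = np.random.rand(8, 64).astype(np.float32)   # batch of activations
W = np.random.rand(64, 32).astype(np.float32)  # layer weights
b = np.zeros(32, dtype=np.float32)             # bias

# Scalar view: nested loops, one multiply-accumulate at a time.
y_loops = np.zeros((8, 32), dtype=np.float32)
for i in range(8):
    for j in range(32):
        for k in range(64):
            y_loops[i, j] += x[i, k] * W[k, j]
y_loops += b

# Tensor view: the whole layer as one matmul-plus-bias operation --
# the unit a tensor-oriented processor can schedule and fuse at once.
y_tensor = x @ W + b

assert np.allclose(y_loops, y_tensor, rtol=1e-3)
```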

Software Insatiability
“Our demand for software is outstripping traditional methods for developing it. IDC estimates 500 million apps and services will be developed by 2023 using cloud-native approaches, as many as were developed in the previous 40 years.

“As time goes on, we will shift from developing applications to developing tools that can develop applications on our behalf. Similarly, a growing portion of chip design and verification will have to be performed through AI-powered applications, particularly for low-volume products (i.e., thousands of units) optimized for particular uses.”

Mark Hambleton, Vice President of Software, Arm

5G
“In the next few years, we’ll see 5G enable new and improved telemedicine, tele-learning and virtual entertainment. Already, the pandemic has forced us to embrace these, but it’s clear that the optimal infrastructure isn’t there. As 5G becomes a reality and the cultural shift toward social distancing lingers, we could see it enabling 4K/8K high-resolution video for telemedicine, personalized AI-based teachers in virtual classrooms, and lag-free Zoom meetings. In 2021 and beyond, 5G might power inventive contactless experiences in retail and hospitality, and interactive sporting and entertainment experiences.

“On the flip side, the promise that 5G will revolutionize our lives immediately or even by the end of next year is overhyped. It will take time for disruptive use cases to develop—maybe five years. First, the foundational 5G infrastructure and memory and storage needs to be in place—then those disruptive applications will come.”

Raj Talluri, SVP & GM, Mobile Business Unit, Micron

Storage, Data Management
“In 2021, look for more usage of object stores for storing structured and unstructured data, files, blocks, objects — all in one repository. AI’s large data sets need proximity to where processing is happening. So, rather than serving only as large cold stores, object stores are going to be able to handle AI-type workloads, which means large sets of data can be accessed with high bandwidth and low latency. As a result of the rise of data-intensive AI workloads, we’ll see the need for high-performance NVMe storage also increase, since this high-performing object store resides on flash-based storage, as opposed to traditional cold storage. This could lead to faster adoption of Gen4 NVMe SSDs to enable more powerful object stores for AI.”

Prasad Alluri, Vice President of Corporate Strategy, Micron
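
For a concrete sense of what “AI-type workloads” against an object store look like: training jobs increasingly stream data shards straight from an S3-compatible endpoint rather than staging them to local disk first. A minimal sketch using boto3, where the endpoint, bucket, and key names are hypothetical placeholders:

```python
import boto3

# Hypothetical on-prem, S3-compatible object store holding training shards.
s3 = boto3.client("s3", endpoint_url="https://objects.example.internal")

resp = s3.get_object(Bucket="training-data", Key="shards/shard-0001.tfrecord")
payload = resp["Body"].read()  # streams the object body over HTTP(S)
print(f"fetched {len(payload)} bytes")
```

Whether reads like this are fast enough to keep accelerators busy is exactly where the flash-backed object stores and Gen4 NVMe SSDs mentioned above come in.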

“When it comes to an active archive, some people talk about on-prem, some people talk about cloud or even multi-cloud, and some people talk about hybrid, but what everyone needs is a unified active archive platform. The user doesn’t care where the data is so long as it is quickly accessible when they want it and where they want it. Unified deployment architectures are the antithesis of a piecemeal approach and will see far greater prominence once the pandemic ‘work from home’ edict has died down and ‘work from anywhere (*even from the office!)’ takes off.”

Laura Light, Marketing Manager, Object Matrix

Inclusive AI, Smarter Data Science
“In order to ensure diversity is baked into their AI plans, companies must also commit the time and resources to practice inclusive engineering. This includes, but certainly isn’t limited to, doing whatever it takes to collect and use diverse datasets. This will help companies create an experience that welcomes more people to the field — looking at everything from education to hiring practices.”

“Data scientists will need to speak the language of business in order to translate data insight and predictive modeling into actionable insight for business impact. Technology owners will also have to simplify access to the technology, so that technical and business owners can work together. The emphasis for data scientists will be not just on how quickly they can build things, but on how well they can collaborate with the rest of the business.”

Florian Douetteau, CEO and co-founder of Dataiku