Los Alamos Claims Quantum Machine Learning Breakthrough: Training with Small Amounts of Data


Researchers at Los Alamos National Laboratory today announced a quantum machine learning “proof” they say shows that training a quantum neural network requires only a small amount of data, “[upending] previous assumptions stemming from classical computing’s huge appetite for data in machine learning, or artificial intelligence.”

The lab said the theorem has direct applications, including more efficient compiling for quantum computers and distinguishing phases of matter for materials discovery.

“Many people believe that quantum machine learning will require a lot of data,” said Lukasz Cincio (T-4), a Los Alamos quantum theorist and co-author of the paper containing the proof published Aug. 23 in the journal Nature Communications. “We have rigorously shown that for many relevant problems, this is not the case.”

The paper, “Generalization in quantum machine learning from few training data,” is by Matthias C. Caro, Hsin-Yuan Huang, Marco Cerezo, Kunal Sharma, Andrew Sornborger, Patrick Coles and Lukasz Cincio.

“This provides new hope for quantum machine learning,” Cincio said. “We’re closing the gap between what we have today and what’s needed for quantum advantage, when quantum computers outperform classical computers.”

AI systems need data to train their neural networks to generalize, that is, to handle unseen data in real applications. It had been assumed that the amount of training data required would scale with the size of a mathematical construct called a Hilbert space, which becomes exponentially large for training over large numbers of qubits, Los Alamos said in its announcement. That size rendered this approach nearly impossible computationally.

“The need for large data sets could have been a roadblock to quantum AI, but our work removes this roadblock. While other issues for quantum AI could still exist, at least now we know that the size of the data set is not an issue,” said Coles (T-4), a quantum theorist at the lab and co-author of the paper.

“It is hard to imagine how vast the Hilbert space is: a space of a billion states even when you only have 30 qubits,” Coles said. “The training process for quantum AI happens inside this vast space. You might think that searching through this space would require a billion data points to guide you. But we showed you only need as many data points as the number of parameters in your model. That is often roughly equal to the number of qubits — so only about 30 data points,” Coles said.
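The arithmetic behind Coles’s comparison can be sketched in a few lines. This is an illustrative calculation of the scaling described above, not code from the paper: the Hilbert-space dimension grows exponentially with the number of qubits, while the claimed training-data requirement grows only with the number of model parameters (taken here, per the quote, to be roughly the number of qubits).

```python
# Hilbert-space dimension vs. training-data requirement for 30 qubits.
# The dimension doubles with every added qubit; the data requirement
# (per the result described above) grows only linearly.
n_qubits = 30
hilbert_dim = 2 ** n_qubits      # number of basis states in the Hilbert space
data_points_needed = n_qubits    # roughly equal to the parameter count here

print(hilbert_dim)               # 1073741824, i.e. about a billion states
print(data_points_needed)        # about 30 training data points
```

The gap between the two numbers is the point of the quote: a naive search over the billion-dimensional space is replaced by training on a handful of examples.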

One key aspect of the results, Cincio said, is that they yield efficiency guarantees even for classical algorithms that simulate quantum AI models, so the training data and compilation often can be handled on a classical computer, which simplifies the process. Then the machine-learned model runs on a quantum computer.

“That means we can lower the requirement for the performance quality that we need from the quantum computer, with respect to noise and errors, to perform meaningful quantum simulations, which pushes quantum advantage closer and closer to reality,” Cincio said.

The speedup resulting from the new proof has dramatic practical applications. The team found they could guarantee that a quantum model can be compiled, or prepared for processing on a quantum computer, using far fewer computational gates relative to the amount of training data. Compiling, a crucial application for the quantum computing industry, can shrink a long sequence of operational gates or turn the quantum dynamics of a system into a gate sequence.

“Our theorem will lead to much better compilation tools for quantum computing,” Cincio said. “Especially with today’s noisy, intermediate-scale quantum computers where every gate counts, you want to use as few gates as possible so you don’t pick up too much noise, which causes errors.”

The team also showed that a quantum AI could classify quantum states across a phase transition after training on a very small data set, Los Alamos said.

“Classifying the phases of quantum matter is important to materials science and relevant to the mission of Los Alamos,” said Andrew Sornborger (CCS-3), director of the Quantum Science Center at the Laboratory and co-author of the paper. “These materials are complex, having multiple distinct phases like superconducting and magnetic phases.”

Creating materials with desired traits, such as superconductivity, involves understanding the phase diagram, Sornborger said, which the team proved could be discovered by a machine-learning system with minimal training.

Other potential applications of the new theorem include learning quantum error correcting codes and quantum dynamical simulations.

“The efficiency of the new method exceeded our expectations,” said Marco Cerezo (CCS-3), a Los Alamos expert in quantum machine learning. “We can compile certain, very large quantum operations within minutes with very few training points — something that was not previously possible.”

“For a long time, we could not believe that the method would work so efficiently,” Cincio said. “With the compiler, our numerical analysis shows it’s even better than we can prove. We only have to train on a small number of states out of billions that are possible. We don’t have to check every option, but only a few. This tremendously simplifies the training.”

Funding (Los Alamos co-authors only): ASC Beyond Moore’s Law project at Los Alamos National Laboratory; U.S. Department of Energy Office of Science, Office of Advanced Scientific Computing Research Accelerated Research in Quantum Computing program; Laboratory Directed Research and Development program at Los Alamos National Laboratory; DOE Office of Science, National Quantum Information Science Research Centers, Quantum Science Center; and Department of Defense.