

Two More Steps Towards Quantum Computing

Correlated syndrome single-shot histograms and quantum state tomography of code qubits.

Quantum computing has moved a step closer with two recent announcements demonstrating methods for error correction, alongside a new scalable design for quantum circuits based on a lattice structure.

The research, published earlier this month in the journal Nature Communications, lays out plans for overcoming some of the most significant technological roadblocks preventing quantum computing from reaching its full potential. The work at IBM was funded in part by the US IARPA (Intelligence Advanced Research Projects Activity) multi-qubit-coherent-operations program.

However, IBM is not the only company working to solve these significant quantum computing challenges; another paper, published in Nature last month, also focuses on quantum error detection.

The team of researchers from IBM demonstrated that they could detect and measure the two types of quantum errors (bit-flip and phase-flip) that will occur in any real quantum computer. Until now, it was only possible to address one type of quantum error or the other, but never both at the same time. This is a necessary step toward quantum error correction, which is a critical requirement for building a practical and reliable large-scale quantum computer.
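As a purely illustrative sketch (not IBM's circuit or method), the two error types can be pictured as Pauli operations on a single-qubit state, here represented as a list of the two amplitudes [a, b] for a|0> + b|1>:

```python
# Illustrative sketch of the two quantum error types, assuming a
# single-qubit state a|0> + b|1> stored as the list [a, b].

def bit_flip(state):
    """Pauli X error: swaps the |0> and |1> amplitudes."""
    a, b = state
    return [b, a]

def phase_flip(state):
    """Pauli Z error: flips the sign of the |1> amplitude."""
    a, b = state
    return [a, -b]

plus = [2 ** -0.5, 2 ** -0.5]  # the |+> state, an equal superposition

# A bit-flip leaves |+> unchanged, while a phase-flip turns it into |->.
# A code that watches only one error type is therefore blind to the other,
# which is why detecting both at once matters.
print(bit_flip(plus))
print(phase_flip(plus))
```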

The researchers from IBM found that the design of IBM’s quantum bit circuit, based on a square lattice of four superconducting qubits on a chip roughly one-quarter-inch square, enables both types of quantum errors to be detected at the same time.

In contrast, the paper from Kelly et al. focused on quantum error detection in a linear array of nine qubits, as opposed to the square array in the IBM research project. The team repeatedly performed projective quantum non-demolition (QND) parity measurements to track errors as they occurred in the system.
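A simplified classical analogue (an assumption for illustration, not the paper's actual quantum protocol) conveys the idea of parity checks on a linear array: each check measures the parity of a neighbouring pair, so a flipped bit announces itself through the adjacent checks without the data itself ever being read out directly.

```python
# Simplified classical analogue of syndrome extraction on a linear array.
# Each syndrome is the parity (XOR) of one adjacent pair of data bits.

def syndromes(bits):
    """Parity of each adjacent pair, e.g. 5 data bits -> 4 syndromes."""
    return [bits[i] ^ bits[i + 1] for i in range(len(bits) - 1)]

clean = [0, 0, 0, 0, 0]
error = [0, 0, 1, 0, 0]  # a bit-flip on the middle position

print(syndromes(clean))  # [0, 0, 0, 0]
print(syndromes(error))  # [0, 1, 1, 0] -> the two firing checks bracket the error
```

Repeating this extraction every cycle, as Kelly et al. do with QND measurements, is what lets errors be tracked over time rather than caught only once.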

A particular focus of this paper was to look at error detection across multiple cycles as ‘quantum information must be preserved throughout computation using multiple error-correction cycles.’

The paper reports that relative to a single physical qubit, the team was able to reduce the failure rate in retrieving an input state by a factor of 2.7 when using five of the nine possible qubits and by a factor of 8.5 when using all nine qubits after eight cycles had been completed.
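To make the reported factors concrete, the arithmetic below applies them to a hypothetical single-qubit failure rate (the rate is an assumption; only the factors 2.7 and 8.5 come from the paper):

```python
# Illustrating the reported error suppression. Only the factors 2.7 and
# 8.5 are from the paper; the physical failure rate here is hypothetical.
p_physical = 0.2               # assumed single-qubit failure rate after 8 cycles

p_five = p_physical / 2.7      # five-qubit code
p_nine = p_physical / 8.5      # nine-qubit code
print(round(p_five, 4), round(p_nine, 4))

# Going from five to nine qubits buys roughly a further 3x suppression,
# the kind of scaling behaviour an error-corrected code should show.
print(round(8.5 / 2.7, 2))
```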

Kelly et al. closed their paper with a statement acknowledging that this work could accelerate research into 'the many outstanding challenges that remain, such as the development of two-dimensional qubit arrays with scalable wiring and four-qubit QND parity checks, improving gate and measurement fidelities.'

In addition to the areas of research cited by Kelly et al., another major stumbling block for quantum computing is the scalable design of chips, so that more qubits can be placed across the circuit, increasing the density of these systems and making them significantly more powerful than the prototypes and early products available today.

Arvind Krishna, senior vice president and director of IBM Research, said: “Quantum computing could be potentially transformative, enabling us to solve problems that are impossible or impractical to solve today.”

Krishna continued: “While quantum computers have traditionally been explored for cryptography, one area we find very compelling is the potential for practical quantum systems to solve problems in physics and quantum chemistry that are unsolvable today. This could have enormous potential in materials or drug design, opening up a new realm of applications.”

IBM has been involved in quantum computing research since 1981, when Nobel Prize winner Richard Feynman first laid out the concept of a quantum computer at an MIT conference sponsored by IBM. Over more than 30 years, the company has developed the theory, carried out the experiments and is now building the devices for a true, practical quantum computer.

Jay Gambetta, a manager in the IBM Quantum Computing Group, said: “Up until now, researchers have been able to detect bit-flip or phase-flip quantum errors, but never the two together. Previous work in this area, using linear arrangements, only looked at bit-flip errors offering incomplete information on the quantum state of a system and making them inadequate for a quantum computer. Our four qubit results take us past this hurdle by detecting both types of quantum errors and can be scalable to larger systems, as the qubits are arranged in a square lattice as opposed to a linear array.”

Because these qubits can be designed and manufactured using standard silicon fabrication techniques, IBM anticipates that once a handful of superconducting qubits can be manufactured reliably and repeatedly, and controlled with low error rates, there will be no fundamental obstacle to demonstrating error correction in larger lattices of qubits.

Although much work remains to be done, this research highlights clear iterative development of the current technology, one that offers tremendous potential if it can be harnessed efficiently and scaled to the levels required for HPC.

This story appears here as part of a cross-publishing agreement with Scientific Computing World.
