New protocol for quantum computers tackles noise issue


Quantum computers will now have help tackling the central problem in their performance – noise.

Quantum computing researchers say noise is the central obstacle to building large-scale quantum computers.

They note that quantum systems with sufficiently uncorrelated and weak noise could be used to solve computational problems that are intractable with current digital computers.

Joel Wallman, a researcher at the Institute for Quantum Computing (IQC) and assistant professor of applied mathematics at the University of Waterloo, has developed a protocol that will help deal with the issue of noise in quantum computers so that they can tackle more complex problems.

“The intrinsic noise in quantum computers makes their output unreliable,” says Wallman, also co-founder of Quantum Benchmark, a start-up spun out of IQC.

“So any problem that we know how to solve on a quantum computer can be solved better on conventional computers. To deliver quantum computers that can do something useful, we need to make larger quantum computers and work out how to accurately control them.”

According to market research firm GlobalData, investments in quantum computing so far have been led primarily by the US and a few European countries such as the UK, the Netherlands and France.

It notes that investment in quantum computing is expected to intensify in 2020, as more countries and leading tech firms hop onto the quantum physics bandwagon.

In SA, companies like IBM are driving quantum computing. The computing company forged a partnership with Wits University to accelerate research in the field.

Last year, Wits University became the first African partner on the IBM Q Network.

IBM believes the Square Kilometre Array radio telescope can be one use case where quantum computing can be deployed.

Wallman, together with Robin Harper and Steve Flammia of the University of Sydney, has developed a new protocol that works on large systems – quantum computers running on many qubits (the quantum version of a classical computer’s binary bit) – that lets researchers characterise quantum noise across the qubits reliably and efficiently.

Prior to this work, the researchers ran error assessment protocols that could only detect errors on a small subset of the qubits.

They say the new method returns an estimate of the effective noise and can detect error correlations within arbitrary sets of qubits.
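The idea of detecting error correlations can be illustrated with a toy simulation (this is a hypothetical sketch for intuition, not the authors' actual protocol): if each shot of an experiment yields a per-qubit error indicator, correlated (nonlocal) noise shows up as nonzero covariance between qubits, while independent local noise does not.

```python
import numpy as np

# Hypothetical illustration only: detecting correlated errors from
# repeated per-qubit error records. All rates below are made up.
rng = np.random.default_rng(0)
n_qubits, n_shots = 4, 20000

# Simulate per-shot error indicators: 1 = qubit erred, 0 = no error.
# Qubits 0 and 1 share a common error source (correlated, nonlocal noise);
# qubits 2 and 3 err independently (local noise).
common = rng.random(n_shots) < 0.05          # shared error events
errors = np.zeros((n_shots, n_qubits), dtype=int)
errors[:, 0] = common | (rng.random(n_shots) < 0.02)
errors[:, 1] = common | (rng.random(n_shots) < 0.02)
errors[:, 2] = rng.random(n_shots) < 0.07
errors[:, 3] = rng.random(n_shots) < 0.07

rates = errors.mean(axis=0)                   # per-qubit error rates
cov = np.cov(errors, rowvar=False)            # pairwise error covariance

print("error rates:", np.round(rates, 3))
print("cov(q0,q1):", round(cov[0, 1], 4))     # clearly positive
print("cov(q2,q3):", round(cov[2, 3], 4))     # near zero
```

A large off-diagonal covariance flags a qubit pair whose errors are not independent, which is exactly the kind of nonlocal structure that standard error correction assumes away.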

“The reason this protocol is so important is that if noise in systems doesn’t act locally, existing error correction and mitigation techniques just don’t work,” says Wallman. “And the data we obtained demonstrated that such nonlocal errors exist in real quantum computers.”

Wallman’s research team at the University of Waterloo and Quantum Benchmark is currently furthering the technique to characterise and suppress errors in specific data operations.
