The new method has implications for the future of quantum information science, including quantum computing and quantum sensing.
Many current quantum information applications, such as carrying out an algorithm on a quantum computer, suffer from "decoherence", or a loss of information due to "noise."
Matthew Otten, a Maria Goeppert Mayer Fellow at Argonne, and Stephen Gray, group leader of Theory and Modeling at the Center for Nanoscale Materials, a U.S. Department of Energy Office of Science User Facility, have developed a technique that recovers this lost information by repeating the quantum process or experiment many times, with slightly different noise characteristics, and then analyzing the results.
After gathering results from many runs of the process, carried out in sequence or in parallel, the researchers construct a hypersurface in which one axis represents the result of a measurement and the other two (or more) axes represent different noise parameters. Fitting this hypersurface yields an estimate of the noise-free observable and reveals the effect of each noise rate.
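To make the idea concrete, the sketch below shows one way such a fit and extrapolation could look in practice: a polynomial surface in two noise parameters is fit to the measured observables, and its value at zero noise serves as the noise-free estimate. This is a minimal illustration of the general approach, not the authors' exact implementation; the function names, polynomial order, and simulated data are assumptions made for the example.

```python
import numpy as np

def fit_hypersurface(noise1, noise2, observed, order=2):
    """Fit a polynomial hypersurface observable(noise1, noise2) by least squares.

    noise1, noise2 : noise rates used in each repeated run
    observed       : measured expectation values for those runs
    order          : total polynomial order of the fit
    """
    # Design matrix with terms noise1**i * noise2**j for i + j <= order;
    # the first column (i = j = 0) is the constant term.
    terms = [noise1**i * noise2**j
             for i in range(order + 1)
             for j in range(order + 1 - i)]
    A = np.column_stack(terms)
    coeffs, *_ = np.linalg.lstsq(A, observed, rcond=None)
    return coeffs

def extrapolate_to_zero_noise(coeffs):
    # At zero noise every non-constant term vanishes, so the constant
    # coefficient of the fit is the estimate of the noise-free observable.
    return coeffs[0]

# Illustrative data: 25 runs with slightly different (hypothetical) noise rates.
rng = np.random.default_rng(0)
n1 = rng.uniform(0.01, 0.05, 25)
n2 = rng.uniform(0.01, 0.05, 25)
true_value = 0.8                                   # noise-free observable to recover
measured = (true_value - 2.0 * n1 - 1.5 * n2 + 4.0 * n1 * n2
            + rng.normal(0, 0.002, 25))            # simulated noisy measurements

coeffs = fit_hypersurface(n1, n2, measured, order=2)
print(extrapolate_to_zero_noise(coeffs))           # close to 0.8
```

In this toy setup the noisy measurements depend smoothly on the two noise rates, so a low-order surface captures that dependence and its zero-noise intercept lands near the true value.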
"It's like taking a series of flawed photographs," said Otten. "Each photo has a flaw, but in a different place in the picture. When we compile all the clear pieces from the flawed photos together, we get one clear picture."
Applying this technique effectively reduces quantum noise without the need for additional quantum hardware.
"This is a versatile technique that can be done with separate quantum systems undergoing the same process at the same time," said Otten.
Commenting on the approach, Gray said, "Using our method, one would combine the results on the hypersurface and generate approximate noise-free observables. The results would help extend the usefulness of the quantum devices before decoherence sets in."
Otten and Gray have also developed a related approach that achieves similar noise reduction by correcting one qubit at a time to approximate the result of correcting all qubits simultaneously. A qubit, or quantum bit, is the quantum computing equivalent of the binary digit, or bit, used in classical computing.
"In this approach, we assume that the noise can be reduced on each qubit individually, which, while experimentally challenging, leads to a much simpler data processing problem and results in an estimate of the noise-free result," noted Otten.
Above: An example of a 'hypersurface' fit to many experiments with slightly different noise parameters 1 and 2. Black points are measurements of an observable at different noise rates. The red 'X' is the noise-free result. The blue, orange and green surfaces are first-, third- and fourth-order fits.