(In addition to the standard surface code, Google includes several qubits that handle a phenomenon called “leakage,” in which a qubit ends up in a higher energy state instead of the two lower energy states defined as 0 and 1.)
The key result is that the system’s ability to detect and correct errors more than doubled as the code distance increased from 3 to 5, and doubled again when the distance went from 5 to 7. This shows that the hardware qubits have reached a sufficient quality that adding more qubits to a logical qubit can have an exponential effect on error suppression.
“As we increase the grid from 3×3 to 5×5 to 7×7, the error rate decreases by a factor of 2 each time,” said Google’s Michael Newman. “And that’s the exponential error suppression we want.”
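To see why halving the error rate at each step compounds into exponential suppression, consider the back-of-the-envelope sketch below. It is not Google’s analysis: it simply assumes a fixed suppression factor of roughly 2 per increase in code distance and a hypothetical starting error rate, just to illustrate the scaling Newman describes.

```python
# Back-of-envelope sketch of exponential error suppression.
# Assumptions (not Google's published figures): a hypothetical logical error
# rate at distance 3, and a suppression factor of ~2 each time the code
# distance grows by 2 (grid 3x3 -> 5x5 -> 7x7, i.e. distance 3 -> 5 -> 7).

SUPPRESSION_FACTOR = 2.0   # assumed factor per distance step of 2
P_DISTANCE_3 = 3e-3        # hypothetical logical error rate per cycle at distance 3

def logical_error_rate(distance, p_d3=P_DISTANCE_3, factor=SUPPRESSION_FACTOR):
    """Estimate the logical error rate per cycle for an odd code distance."""
    steps = (distance - 3) // 2          # number of distance increases of 2
    return p_d3 / (factor ** steps)

for d in (3, 5, 7, 9, 11, 13, 15):
    print(f"distance {d:2d}: ~{logical_error_rate(d):.2e} errors per cycle")
```

Under these assumed numbers, going from distance 3 to distance 15 cuts the logical error rate by a factor of 2⁶, or 64, which is the sense in which each extra layer of qubits buys a multiplicative improvement.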
Growing bigger
The second thing they demonstrated is that if you create the largest logical qubit the hardware can support, with a distance of 15, you can retain quantum information for an hour on average. This is surprising, since Google’s previous research had found widespread simultaneous errors occurring in its processors roughly every 10 seconds, which the team attributed to the effects of cosmic rays. (I don’t know if this diagnosis is correct, however, as IBM says it doesn’t see similar symptoms.) At that rate, a logical qubit that survives for an hour rides out hundreds of such events. Whatever their cause, this study shows that a sufficiently large error-correction code can correct them.
However, these qubits do not last indefinitely; two problems remain. One appears to be an increase in localized transient errors. The second, more difficult problem involves widespread spikes in error detection that affect regions containing about 30 qubits. Google has only confirmed six of these events so far, so it’s difficult to really characterize them. “It’s starting to become a little difficult to study because it’s so rare that you have to get a lot of statistics just to actually confirm such an event,” Kelly told Ars.