A considerable piece. I understood that error control was important; how it fits into the overall architecture was less clear to me. Reliability can be seen as the key issue. Diving in more deeply.
Error Control Begins to Shape Quantum Architectures By Chris Edwards
Communications of the ACM, January 2023, Vol. 66 No. 1, Pages 13-15 10.1145/3570518
Illustration: Sound waves on both sides of a quantum computer.
Quantum computing has a crucial weakness that may severely delay, if not kill outright, its chances of becoming a way of running algorithms that classical computers cannot handle: its susceptibility to noise.
Conventional electronic circuits face their own problems of how to deal with random changes to values in memory or circuits caused by cosmic rays and other interference. Codes that exploit just a few redundant data bits allow those random errors to be corrected on the fly.
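As an illustration of that classical principle (my own sketch, not an example from the article), a Hamming(7,4) code uses three redundant parity bits to locate and correct any single flipped bit among four data bits:

    # Hamming(7,4): four data bits protected by three parity bits; the
    # three-bit syndrome points directly at any single flipped bit.
    def encode(d):                       # d = [d1, d2, d3, d4]
        p1 = d[0] ^ d[1] ^ d[3]
        p2 = d[0] ^ d[2] ^ d[3]
        p3 = d[1] ^ d[2] ^ d[3]
        # codeword positions 1..7: p1 p2 d1 p3 d2 d3 d4
        return [p1, p2, d[0], p3, d[1], d[2], d[3]]

    def correct(c):
        c = list(c)
        s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # checks positions 1, 3, 5, 7
        s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # checks positions 2, 3, 6, 7
        s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # checks positions 4, 5, 6, 7
        syndrome = s1 + 2 * s2 + 4 * s3  # binary index of the flipped bit
        if syndrome:
            c[syndrome - 1] ^= 1         # flip it back
        return c

    word = encode([1, 0, 1, 1])
    word[4] ^= 1                         # simulate one random bit flip
    assert correct(word) == encode([1, 0, 1, 1])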
The same core principle works for quantum computers, but with one key difference: the error correction must take account of the subtle changes in state that quantum circuits move through before the final, collapsed state is read out from each quantum bit (qubit). At that point, information that would signal an error is itself destroyed.
While attempts to show practical quantum error correction (QEC) working on actual hardware have come relatively recently, the concept is almost as old as the first algorithm. Less than a year after he presented his seminal algorithm for efficiently factoring large numbers in 1995, Peter Shor, now professor of applied mathematics at the Massachusetts Institute of Technology, developed a code to catch and correct errors in qubits.
Shor showed it is possible to spread the information across multiple data qubits, supplemented by stabilizer qubits that are analogous to the parity bits used in digital error correction. By reading out just the stabilizer qubits, circuitry can analyze the symmetry properties of all the qubits in the code word without disturbing the entanglement of the data qubits.
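A minimal sketch of that idea (my own illustration, using the simple three-qubit bit-flip code rather than Shor's full nine-qubit code): measuring the parity operators Z1Z2 and Z2Z3 reveals which qubit was flipped, while the encoded amplitudes are never read out.

    import numpy as np

    # Three-qubit bit-flip code: a logical qubit a|0> + b|1> is spread
    # over three physical qubits as a|000> + b|111>.  Measuring the
    # parity operators Z1Z2 and Z2Z3 (the stabilizers) pinpoints a
    # flipped qubit without revealing a or b.
    I = np.eye(2)
    X = np.array([[0, 1], [1, 0]])
    Z = np.array([[1, 0], [0, -1]])

    def kron(*ops):
        out = np.array([[1.0]])
        for op in ops:
            out = np.kron(out, op)
        return out

    a, b = 0.6, 0.8                                  # arbitrary amplitudes
    ket000 = np.zeros(8); ket000[0] = 1
    ket111 = np.zeros(8); ket111[7] = 1
    logical = a * ket000 + b * ket111                # encoded state

    Z1Z2 = kron(Z, Z, I)                             # stabilizer checks
    Z2Z3 = kron(I, Z, Z)

    for flipped in range(3):
        error = kron(*[X if q == flipped else I for q in range(3)])
        state = error @ logical
        s1 = state @ Z1Z2 @ state                    # deterministically +1 or -1
        s2 = state @ Z2Z3 @ state
        print(f"X on qubit {flipped}: syndrome ({s1:+.0f}, {s2:+.0f})")
    # The sign pattern (-1,+1), (-1,-1), (+1,-1) identifies the flipped
    # qubit, and the amplitudes a, b are never measured.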
Because they often rely on fragile properties such as electron orbitals or spin states, qubits are far more susceptible to unwanted changes than conventional electronic bits, which leads to poor performance in practical circuits. Production machines need the effective error rate per qubit to be less than one in a quadrillion; today, the error rate for physical qubits is one in 1,000 at best. Such high error rates mean as many as 30 additional qubits may be needed to protect the information held in just one logical qubit.
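A rough back-of-the-envelope calculation (my own assumptions, not figures from the article) shows why such error rates translate into heavy redundancy: with a code of distance d, roughly (d+1)/2 simultaneous physical faults are needed to corrupt the protected information, so the residual error rate falls steeply as d, and with it the qubit count, grows.

    from math import comb

    # Leading-order binomial estimate of the residual (logical) error
    # rate for a distance-d code -- an illustration, not the article's model.
    p = 1e-3                                  # physical error rate (article's figure)
    for d in (3, 5, 7, 9):
        t = (d + 1) // 2                      # faults needed to cause a failure
        p_logical = comb(d, t) * p ** t       # leading-order failure probability
        print(f"d = {d}: ~{p_logical:.1e} per round")

Under these simplistic assumptions, reaching the one-in-a-quadrillion target would take a distance of roughly 11 or more, which is where the multiplying qubit counts come from.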
Even with such high overheads, QEC as it exists today has limitations that increase the difficulty of making quantum computers reliable. Stabilizer codes cannot deal with errors that are generated by the quantum-gate manipulations themselves. That calls for additional flag qubits that alert the control electronics to errors as they occur.
Stabilizer codes also do not work for all the operations needed for universal quantum computing. They can only work for gates in the Clifford group that apply a restricted set of phase and magnitude operations. Gates outside that group, such as the T-gate, need to be protected by other means.
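The distinction can be checked directly (a numpy sketch of my own, not from the article): a Clifford gate maps every Pauli operator to another Pauli under conjugation, which is what lets stabilizer machinery track it cheaply, while the T-gate does not.

    import numpy as np

    # Clifford gates U send every Pauli P to another Pauli (up to phase)
    # under U P U-dagger; the T-gate breaks that rule.
    I = np.eye(2)
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    Y = np.array([[0, -1j], [1j, 0]])
    Z = np.array([[1, 0], [0, -1]], dtype=complex)
    PAULIS = [("I", I), ("X", X), ("Y", Y), ("Z", Z)]

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)         # Clifford
    S = np.array([[1, 0], [0, 1j]])                      # Clifford
    T = np.array([[1, 0], [0, np.exp(1j * np.pi / 4)]])  # non-Clifford

    def conjugate_is_pauli(U, P):
        M = U @ P @ U.conj().T
        return any(np.allclose(M, s * Q)
                   for _, Q in PAULIS for s in (1, -1, 1j, -1j))

    for name, U in [("H", H), ("S", S), ("T", T)]:
        ok = all(conjugate_is_pauli(U, P) for _, P in PAULIS)
        print(f"{name} maps Paulis to Paulis: {ok}")
    # Prints True for H and S, False for T: T X T-dagger = (X + Y)/sqrt(2),
    # which no stabilizer bookkeeping can represent.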
There are proposals to encode qubits in a way that might make it possible to detect the subtle phase errors that afflict non-Clifford gates. Unfortunately, some recent experiments have questioned whether this encoding will support useful error detection in those gates....