
Tuesday, February 28, 2023

Google Reports Progress on Quantum Error Correction

Google reports its progress on quantum error correction, a key measure of how well such computers will work.

Our progress toward quantum error correction

Feb 22, 2023, Sundar Pichai, CEO of Google and Alphabet

Three years ago, our quantum computers were the first to demonstrate a computational task in which they outperformed the fastest supercomputers. It was a significant milestone on our roadmap toward building a large-scale quantum computer, and the “hello world” moment so many of us had been hoping for. Yet in the long arc of scientific progress it was just one step towards making quantum applications meaningful to human progress.

Now, we’re taking another big step forward: For the first time ever, our Quantum AI researchers have experimentally demonstrated that it’s possible to reduce errors by increasing the number of qubits. In quantum computing, a qubit is a basic unit of quantum information that can take on richer states that extend beyond just 0 and 1. Our breakthrough represents a significant shift in how we operate quantum computers. Instead of working on the physical qubits on our quantum processor one by one, we are treating a group of them as one logical qubit. As a result, a logical qubit that we made from 49 physical qubits was able to outperform one we made from 17 qubits. Nature is publishing our research today.
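To make the qubit idea above concrete, here is a minimal sketch (not from the announcement) of how a single qubit state is usually modeled: a two-component complex vector whose amplitudes describe a superposition of 0 and 1, rather than a definite bit.

```python
import numpy as np

# A single qubit |psi> = alpha|0> + beta|1>, with |alpha|^2 + |beta|^2 = 1.
# Here we pick an equal superposition purely for illustration.
alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)
psi = np.array([alpha, beta], dtype=complex)

# Measuring collapses the state; the amplitudes give the outcome probabilities.
p0, p1 = np.abs(psi) ** 2
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")  # 0.50 each
```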

Here’s why this milestone is important: Our quantum computers work by manipulating qubits in an orchestrated fashion that we call quantum algorithms. The challenge is that qubits are so sensitive that even stray light can cause calculation errors — and the problem worsens as quantum computers grow. This has significant consequences, since the best quantum algorithms that we know for running useful applications require the error rates of our qubits to be far lower than we have today. To bridge this gap, we will need quantum error correction.
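A rough back-of-the-envelope calculation (illustrative numbers only, not Google's) shows why the gap matters: if each operation fails independently with probability p, a circuit of n operations finishes error-free with probability (1 - p)^n, which collapses quickly for long computations.

```python
# Hypothetical figures for illustration: today's physical qubits have error
# rates around 1e-3 per operation, while useful algorithms may need millions
# of operations.
def success_probability(p: float, n: int) -> float:
    """Chance that n independent operations all succeed."""
    return (1 - p) ** n

print(success_probability(1e-3, 10**6))  # effectively zero
print(success_probability(1e-9, 10**6))  # ~0.999: why far lower rates are needed
```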

Quantum error correction protects information by encoding it across multiple physical qubits to form a “logical qubit,” and is believed to be the only way to produce a large-scale quantum computer with error rates low enough for useful calculations. Instead of computing on the individual qubits themselves, we will then compute on logical qubits. By encoding larger numbers of physical qubits on our quantum processor into one logical qubit, we hope to reduce the error rates to enable useful quantum algorithms. ...
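The scaling intuition behind this can be seen in the classical repetition code, sketched below: store one logical bit in n physical bits and decode by majority vote, so the logical bit is corrupted only when a majority of copies flip. Assuming independent errors below a threshold, a bigger code means a lower logical error rate. (Google's experiment uses the far more involved surface code; this sketch shows only the underlying principle, not their method.)

```python
from math import comb

def logical_error_rate(p: float, n: int) -> float:
    """Probability that a majority of n independent bit-flips (each with
    probability p) corrupts the majority-vote decoding of a repetition code."""
    k = n // 2 + 1  # smallest number of flips that outvotes the correct value
    return sum(comb(n, j) * p**j * (1 - p) ** (n - j) for j in range(k, n + 1))

# With p = 1% (an assumed value, below the 50% threshold for this code),
# adding physical bits drives the logical error rate down.
for n in (1, 3, 5, 17, 49):
    print(f"n = {n:2d}: logical error rate = {logical_error_rate(0.01, n):.2e}")
```

This is the same qualitative behavior the announcement reports for logical qubits: below an error threshold, increasing the number of physical qubits per logical qubit suppresses errors rather than compounding them.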
