116 sats \ 1 reply \ @freetx 9h \ on: Quantum computing and blockchains: Matching urgency to actual threats - a16z bitcoin
Great writeup.
To solve the decoherence problem, QCs use QEC (Quantum Error Correction). That is, they use multiple qubits in parallel to detect and correct errors.
The result of all these "physical qubits" error-correcting each other is one "logical qubit" that is effectively error-free. (Logical qubits are the things that do the actual work.)
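A toy classical analogy of the idea, assuming a simple repetition code with majority voting (real QEC, like the surface code, is far more involved because it must correct errors without directly measuring the encoded state, but the redundancy principle is the same):

```python
from collections import Counter
import random

# Toy repetition-code sketch: encode one "logical" bit as several
# "physical" copies, flip each copy independently with some error
# probability, then recover the logical value by majority vote.
# Classical analogy only -- not how quantum codes actually measure errors.

def noisy_copies(bit, p_error, n_copies, rng):
    """Return n_copies of bit, each flipped with probability p_error."""
    return [bit ^ (rng.random() < p_error) for _ in range(n_copies)]

def majority_vote(copies):
    """Decode the logical bit as the most common physical value."""
    return Counter(copies).most_common(1)[0][0]

rng = random.Random(42)
p = 0.003          # ~0.3% physical error rate, as in the comment above
trials = 100_000
failures = sum(
    majority_vote(noisy_copies(1, p, 5, rng)) != 1 for _ in range(trials)
)
print(f"logical error rate over {trials} trials: {failures / trials:.2e}")
```

With 5 copies at a 0.3% flip rate, decoding fails only when 3 or more copies flip, so the logical error rate drops to roughly p³-scale, orders of magnitude below the physical rate. That suppression is what you buy with the qubit overhead.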
With current error rates of roughly 0.3%, maintaining a single error-corrected logical qubit requires approximately 10,000 physical qubits.
So when people say "Shor's will take 10,000 qubits" - they are talking about LOGICAL qubits. At that overhead, you'd need something on the order of 100,000,000 physical qubits to do it.
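The back-of-the-envelope arithmetic behind that overhead, using the rough figures from this comment (the 10,000:1 ratio and the 10,000-logical-qubit Shor's estimate are headline numbers, not precise engineering requirements):

```python
# Logical-to-physical qubit overhead, using the rough figures above.
PHYSICAL_PER_LOGICAL = 10_000   # at ~0.3% physical error rates
LOGICAL_FOR_SHORS = 10_000      # common headline estimate for Shor's

physical_needed = PHYSICAL_PER_LOGICAL * LOGICAL_FOR_SHORS
print(f"{physical_needed:,} physical qubits")  # 100,000,000 physical qubits
```

Compare that to the ~1,000-qubit chips that exist today and the scale of the gap becomes obvious.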
But there is a cruel paradox at the heart of this: each physical qubit you bring online introduces more noise and more crosstalk. Quite quickly your error rates start to rise faster than your fidelity improves...so it cascades into a vicious cycle.
IBM hit 1,121 physical qubits with its Condor chip in 2023. It promised 4,000 qubits by 2025 and missed that goal, but put out a press release saying "100,000 qubits by 2033" (notice the bait-and-switch).
At this point most of QC is pie-in-the-sky, snake-oil investor pumping. But so much money has been spent that no one is ever going to admit defeat. So get used to yearly barrages of "we reached 8000 qubits" (fine print: physical).
"Cruel paradox" is a nice line ✍️