
I've been discussing the paper I wrote about yesterday with people at work who are much more knowledgeable about QC (quantum computing) than I am. It has been engaging. I specifically asked: has anyone debunked the author's points? Is he credible? What do people think? For the most part, people are not challenging his core thesis, which I find very interesting. They mostly disagree with his conclusions, primarily things he's said on mailing lists rather than in the paper itself.

The Author

From what I'm hearing, Peter Gutmann is a contrarian voice in the cryptography world, largely because he dismisses QC as a threat to cryptography and argues that it can safely be ignored. I've heard some say that researchers aren't going to "waste their time" refuting this paper. Side note: it's not just Bitcoin where debates get this contentious.
I've pushed back on this attitude as a mistake. Dissident voices, even minority opinions, can be correct. The trouble is that several people couldn't get past disagreeing with his view that QC and post-quantum encryption research can be ignored. To me, that isn't the most important thing. What I'm most interested in is whether what we've been told, and what people are basing their reactions on, is nonsense.
Over the years I have learned that a large number of people are perfectly content to base their opinions on things not directly related to the topic at hand. They will straw-man something all day if it means they can keep their current worldview. That's not how I'm wired. It is rare for someone speaking truth to be without flaws in their arguments, and their conclusions often have flaws too. But a curious mind and a truth seeker are better served by steel-manning those arguments. People in highly technical fields are often not the best communicators. They often take things personally. They often lash out and oversell their case. This doesn't always mean they should be ignored.

Is He Correct?

First off, let’s be clear: the paper does not say or attempt to prove that QC will never work or never be a threat to modern encryption. The author may personally hold that view, but it’s not the focus of the paper. So far, not a single person has told me his thesis is wrong. As a reminder, this is the core point of the paper:
This paper presents implementations that match and, where possible, exceed current quantum factorisation records using a VIC-20 8-bit home computer from 1981, an abacus, and a dog.
Keep that distinction in mind: critiquing the quality of the evidence is not the same as claiming QC can never work, and the paper confines itself to the former.

The Primary Disagreement

The majority of responses I've received, and there are a lot, are about what should be done. Almost no one believes QC should be ignored. Even those who agree with my opinion that most of what we've seen is hype still think researchers should be working on quantum-resistant encryption. The main reason is that so much encrypted data is hoovered up today by state actors like the NSA and its counterparts in other countries, the "harvest now, decrypt later" threat. If or when that encryption is broken in the future, the consequences could be catastrophic.
I’ve repeated this throughout the thread: I’m here to learn. I have my instincts and opinions, but I don’t mind being wrong. I welcome being proven wrong. I have no problem with cryptographers working on stronger encryption. It’s none of my business how they spend their time. Clearly, a lot of money is being spent on QC research, and I think, after more discussion and learning, that ignoring it is likely a mistake.
That said, I don't think we should be rushing or acting like the sky is falling. I'm thinking about Bitcoin here. If the demonstrations we've been shown so far are this weak and flawed, I don't believe QC is a threat that's just around the corner. It could be another 25 years, or more. It may never work. On the other hand, it's not like changing Bitcoin is an easy thing to do. I'm not ready to call anyone proposing post-quantum changes a spook. Not yet, at least.
This has been an interesting rabbit hole for me. I hope y’all have found it interesting as well.
142 sats \ 0 replies \ @freetx 1h
Another lens to view QC is with regard to development of modern CPUs.
A very very brief history:
  • 1947 - First Solid-State Transistor produced at Bell Labs
  • 1954 - Texas Instruments releases first commercially available solid-state transistor
  • 1971 - Intel 4004 becomes first commercially available single-chip CPU (microprocessor)
Relating to QC tech, we would be somewhere around 1951 or so... that is, we have some lab-produced qubits but haven't yet gotten the qubit to a commercial state.
This is really just the start of the race. After "commercial qubits" we still need to figure out how to make them smaller and smaller so that they can be dense enough (1M qubits?) to perform useful work.
As Hossenfelder says: "We need 1M qubits and we are still about 1M qubits away...."
Now there is another aspect (one that is hardly ever talked about), and that is software. Do we actually have algorithms that work on QC? So far, each of the proof-of-concept algos has been shown to be matchable by classical computers... and the one example they've produced of an actual QC algo was a pointless result: while it was strictly, technically true that the way in which it performed the task was faster than a classical CPU, you could in fact perform the task much faster on a classical CPU just by using a different procedure.
As an analogy, suppose I said I invented a house-painting robot that could automatically paint a house in 148 hours, and I pointed out that there was no other automated process that could achieve that. That might strictly be true, but it fails to acknowledge that a single workman with a roller can probably paint the house in a fraction of the time for a fraction of the money.
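To put toy numbers on that analogy (my own illustration, not from the comment above): Grover's algorithm genuinely beats naive classical brute-force search, but if the problem has structure a classical algorithm can exploit, the quantum "win" evaporates. A minimal Python sketch, assuming a search space of 2^32 items:

```python
import math

# Toy comparison of query counts for finding one marked item among N.
# Grover beats the *naive* classical baseline, but a structured classical
# approach (binary search over sorted/indexed data) beats both.
N = 2**32  # illustrative search-space size

naive_classical = N                      # check every item
grover_quantum = math.isqrt(N)           # Grover: ~sqrt(N) oracle queries
binary_search = math.ceil(math.log2(N))  # classical, if data can be indexed

print(f"naive classical:  {naive_classical:>13,} queries")
print(f"Grover (quantum): {grover_quantum:>13,} queries")
print(f"binary search:    {binary_search:>13,} queries")
```

The caveat is that Grover targets genuinely unstructured search; the point, as with the house-painting robot, is that "faster than the naive baseline" is not the same as "the fastest available method."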
reply
168 sats \ 1 reply \ @Scoresby 4h
This has been an excellent balance to the Chaincode paper on quantum resistance (which was my other main touch point with the issue).
The points I found relevant from that paper are:
  1. In all proposed solutions, migration takes months or, more likely, years. There is a reasonable argument that goes: "since we can't make this change quickly, we should be hypersensitive to developments in quantum computing."
  2. All the proposed solutions involve larger signature sizes, so there is a real cost to needlessly adopting quantum resistance; see the rough size comparison sketched below. (I don't believe the Chaincode paper's authors draw this conclusion themselves.)
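To make that cost concrete, here is a back-of-the-envelope comparison. This is my own illustration, not from the Chaincode paper, using commonly cited sizes for Bitcoin's current Schnorr signatures and two NIST post-quantum signature standards:

```python
# Commonly cited signature sizes in bytes; exact figures depend on the
# parameter set chosen, and none of this comes from the Chaincode paper.
sizes = {
    "Schnorr (BIP340, Bitcoin today)": 64,
    "ML-DSA-44 (FIPS 204, lattice)": 2420,
    "SLH-DSA-128s (FIPS 205, hash)": 7856,
}

baseline = sizes["Schnorr (BIP340, Bitcoin today)"]
for name, nbytes in sizes.items():
    print(f"{name}: {nbytes:>5} bytes (~{nbytes / baseline:.0f}x today)")
```

Even the smallest standardized post-quantum signature is tens of times larger than today's 64 bytes, which is what the "real cost" in point 2 means for block space and fees.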
reply
102 sats \ 0 replies \ @kepford OP 4h
One person's response to my thread was that mathematically we already know the encryption can be broken (this is Shor's algorithm); there just isn't a machine that can actually run it at scale yet. I can see that, if that's the case, it's just a matter of time. The real question is how much time, and I don't think we can know that as outsiders. I'm not really sure insiders know either.
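For context on "mathematically we already know": Shor's 1994 algorithm reduces factoring N to finding the period (order) of a^x mod N, and that period-finding step is the only part a quantum computer speeds up. Here is a classical toy sketch of the reduction, my own illustration, run on N = 15, roughly the size of number real quantum demos have factored:

```python
from math import gcd

def find_order(a: int, n: int) -> int:
    """Smallest r with a^r = 1 (mod n), found by brute force. This is the
    step Shor's algorithm hands to a quantum computer for large n."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_reduction(n: int, a: int):
    """Classically run the reduction at the heart of Shor's algorithm."""
    g = gcd(a, n)
    if g != 1:
        return g, n // g  # lucky: a already shares a factor with n
    r = find_order(a, n)
    if r % 2 == 1 or pow(a, r // 2, n) == n - 1:
        return None       # unlucky choice of a; Shor retries with another
    p = gcd(pow(a, r // 2, n) - 1, n)
    return p, n // p

print(shor_reduction(15, 7))  # -> (3, 5)
```

The math has been settled for three decades; the open question is whether hardware can ever run the order-finding step on 2048-bit numbers, which is exactly where Gutmann's skepticism is aimed.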
reply
Interesting.
If I'm understanding correctly, his claim is that quantum factorization algorithms have only been demonstrated on toy cases that were specifically engineered to be doable in a lab, and haven't been shown to work on general cases where the researcher can't control the inputs and doesn't already know the correct output.
Is that right?
reply
Yep, that is my understanding, and Steve Gibson's summary as well. The paper is pretty narrowly focused on that point, and honestly it's interesting to read responses from people who haven't read the paper but seem, even if reluctantly, to acknowledge that he is correct.
reply
Who else remembers this meme? It's been around forever and I don't recall it ever being debunked.
What QC chicken littles want you to believe, while grifting on an immense gravy train of opaque funding, is that they have a magic contraption that can just assume these numbers all at once by... making things really cold?
The much more likely explanation (because it's inherently superpositional whether QC is a hoax or not, right?) is that "coherence" is just new scammer nomenclature for brute-forcing things, and therefore not possible.
reply