
What surprises me is how non-linear this argument is. For a classical attack on, for example, RSA, it is very easy to factor an 8-bit composite. It is a bit harder to factor a 64-bit composite. For a 256-bit composite you need some tricky math, etc. And people did all of that. People didn't start out speculating that you can factor a 1024-bit composite and then one day, out of the blue, somebody did it.
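To make the 8-bit case concrete, here is a minimal sketch in Python: plain trial division, with the number 221 picked purely for illustration.

    def trial_division(n):
        # Naive trial division: instant for an 8-bit composite,
        # hopeless long before you reach 1024-bit RSA moduli.
        factors = []
        d = 2
        while d * d <= n:
            while n % d == 0:
                factors.append(d)
                n //= d
            d += 1
        if n > 1:
            factors.append(n)
        return factors

    print(trial_division(221))  # 8-bit composite -> [13, 17]

A 64-bit composite already wants Pollard rho or ECM, and a 1024-bit modulus needs the number field sieve and serious compute: classically the difficulty ramps up smoothly, which is exactly the point.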

The weird thing we have right now is that quantum computers are absolutely hopeless at doing anything with RSA and, as far as I know, nobody has even tried EC. And that state of the art has not moved much in the last decade.

And then suddenly, in a few years there will be a quantum computer that can break all of the classical public key crypto that we have.

This kind of stuff might happen in a completely new field. But people have been working on quantum computers for quite a while now.

If this is easy enough that in a few years you can have a quantum computer that can break everything, then people should be able to build something in a lab that breaks RSA-256. I'd like to see that before jumping to conclusions on how well this works.



See https://bas.westerbaan.name/notes/2026/04/02/factoring.html and https://scottaaronson.blog/?p=9665#comment-2029013 which are linked to in the first section of the article.

> Sure, papers about an abacus and a dog are funny and can make you look smart and contrarian on forums. But that’s not the job, and those arguments betray a lack of expertise. As Scott Aaronson said:

> Once you understand quantum fault-tolerance, asking “so when are you going to factor 35 with Shor’s algorithm?” becomes sort of like asking the Manhattan Project physicists in 1943, “so when are you going to produce at least a small nuclear explosion?”

To summarize, the hard part of scalable quantum computation is error correction. Without it, you can't factor essentially anything. Once you get any practical error correction, the distance between 32-bit RSA and 2048-bit RSA is small. Similarly, the hard part is causing a self-sustaining fission chain reaction; once you do, making the bomb bigger is not the hard part.

This is what the experts know, and why they give the timelines they do. We'd do better not to dismiss them by being smug about our layperson's understanding of their progress curve.


I’ve worked with Bas. I respect him, but he is definitely a QC maximalist in a way. At the very least, he believes that caution requires the public to err on the side of believing we will build them.

The actual challenge is that we still don’t know if we can build QC circuits that factor faster than classical ones, both because the number of qubits required has gone from ridiculously impossible to probably still impossible AND because we still don’t know how to build circuits with enough qubits to break classical algorithms at larger sizes, or faster, than classical computers. If you only pay attention to the breathless reporting, you’d have a very skewed perception of where we’re at.

It’s also easy to deride your critics as just being contrarian on forums, but that complaint distracts from the actual lack of real forward progress towards building a QC. We’ve made progress on all kinds of different things except actually building a QC that can scale to solve non-trivial problems. It’s the same critique as with fusion energy, with the sole difference being that we actually understand how to build a fusion reactor, just not one that’s commercially viable yet, and fusion energy would be far more beneficial than a QC, at least today.

There’s also the added challenge that quantum computers only have one real application currently, which is as a weapon to break crypto. Other use cases are generally hand-waved as “possible” but it’s unclear whether they actually are (i.e. you can’t just take any NP problem and make it faster even if you had a QC; even traveling salesman is not known to be faster, and even if it is, it’s likely still not economical on a QC).

Speaking of experts, Bas is a cryptography expert with a specialty in QC algorithms, not an expert in building quantum computers. Scott Aaronson is also well respected, but he isn’t building QC machines either; he’s a computer scientist who understands the computational theory, and that doesn’t make him a better prognosticator if the entire field is off on a fool’s errand. It just means he’s better able to parse and explain the actual news coming from the field in context.


Don't recognise you from your username, but thanks for the respect. (Update: ah, Vitali! Nice to hear from you.)

If you look back at my writing from 2025 and earlier, I'm on the conservative end of Q-day estimates: 2035 or later. My primary concern then was that migrations take a lot of time: even 2035 is tight.

I'm certainly not an expert on building quantum computers, but what I hear from those who are worries me. Certainly there are open challenges for each approach, but that list is much shorter now than it was a few years ago. We're one breakthrough away from a CRQC.


For me it's the presumption that Q-day will happen, which is why I categorize that more as the maximalist camp, the same way people who believe AGI is inevitable are AI maximalists. I could also be misremembering our conversation, but I thought you had said something like 2029 or 2030 in our 2020 conversation :)?

My concern is that there's so much human and financial capital behind quantum computing that the "experts" have lots of reason to try to convince you that it's going to happen any day now. The cryptographic community is rightly scared by the potential because we don't have any theoretical basis showing that QC speedups are physically impossible, but we also don't have any proof (existence or theoretical) that they are actually possible.

The diagrams showing physical qubits per year, or the physical qubits necessary to crack some algorithm, are the same ones powering funding pitches, and that's very dangerous to me - it's very possible this is a tail-wagging-the-dog situation.

The negative evidence here, for me, is that all the QC supremacy claims to date have consistently evaporated as faster classical algorithms have been developed. That means the score is currently 0/N for a faster-than-classical QC. The other challenge is that we don't know where BQP fits, or whether it even exists as a distinct class rather than just being a name we gave to a theoretical class of problems that isn't actually distinct. And that doesn't get into the practical reality that layering on more and more error correction doesn't matter much when the entire system still decoheres at any size relevant to solving non-trivial problems.

Should we prepare for QC on the cryptography side? I don't know, but I still put it at less than a 10% chance that a CRQC happens in the next 20 years. I also look at the other scenario: if a CRQC never happens, we're paying a meaningful cost, both in terms of human capital spent hardening systems against it and ongoing in terms of slowing down worldwide communications, to protect against a harm that never materializes (not to mention all the funding burned chasing building the QC). The problem I'm concerned about is that there's no meaningful funding spent trying to work out whether BQP actually exists as a distinct class and what that complexity class actually looks like.


> I could also be misremembering our conversation, but I thought you had said something like 2029 or 2030 in our 2020 conversation

Think that must've been around 2022. It'd have been me mentioning 2030 regulatory deadlines. So far progress in PQC adoption has been mostly driven by (expected) compliance. Now it'll shift to a security issue again.

> My concern is that there's so much human and financial capital behind quantum computing that the "experts" have lots of reason to try to convince you that it's going to happen any day now.

There've been alarmist publications for years. If it were just some physicists again, I'd have been sceptical. This is the security folks at Google pulling the alarm (among others).

> [B]ut we also don't have any proof (existence or theoretical) that they are actually possible.

The theoretical foundation is pretty basic quantum mechanics. It'd be a big surprise if there were a blocker there. What's left is the engineering. The problem is that definitive proof means an actual quantum computer... which means it's already too late.

> The other challenge is we don't know where BQP fits

This is philosophy. Even P=NP doesn't imply cryptography is hopeless. If the concrete gap between the cost of using and the cost of breaking is large enough (even if it isn't asymptotic), we can have perfectly secure systems. But this is quite a tangent.

> Should we prepare for QC on the cryptography side?

A 10% chance it happens by 2030 means we'll need to migrate by 2029.
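(For the arithmetic behind that, this is essentially Mosca's inequality. A minimal sketch below, where the shelf-life and migration figures are assumptions for illustration, not numbers from this thread:)

    # Mosca's inequality: if shelf_life + migration_time > time_to_crqc,
    # data encrypted today can be harvested now and decrypted later.
    shelf_life_years = 10          # assumption: data must stay confidential ~10 years
    migration_years = 5            # assumption: a large deployment takes ~5 years to migrate
    years_to_crqc = 2030 - 2026    # the "10% by 2030" scenario, counted from 2026

    if shelf_life_years + migration_years > years_to_crqc:
        print("Too late for that data: the migration has to start now.")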

> it and ongoing in terms of slowing down worldwide communications

We've been working hard to make the impact negligible. For key agreement the impact is very small. And with Merkle Tree Certificates we also make the overhead for authentication negligible.


The thing is, producing the right isotope of uranium is mostly a linear process. It goes faster as you scale up, of course, but each day an enrichment plant produces a given amount. If you double the number of plants, you produce twice as much, etc.

There is no such equivalent for qubits or error correction. You can't say: we produce this much extra error correction per day, so we will hit the target on such-and-such a date.

There is also something weird in the graph in https://bas.westerbaan.name/notes/2026/04/02/factoring.html. That graph suggests that even with the best error correction in the graph, it is impossible to factor RSA-4 with fewer than 10^4 qubits, which seems very odd. At the same time, Scott Aaronson wrote: "you actually can now factor 6- or 7-digit numbers with a QC", which according to the graph would suggest that the error rate must already be very low, or that quantum computers with an insane number of qubits exist.

Something doesn't add up here.


We are stretching the metaphor thin, but surely the progress towards an atomic bomb was not measured only in uranium production, in the same way that the progress towards a QC is not measured only in construction time of the machine.

At the theory level, there were only theories, then a few breakthroughs, then some linear production time, then a big boom.

> Something doesn't add up here.

Please consider it might be your (and my) lack of expertise in the specific sub-field. (I do realize I am saying this on Hacker News.)


Not only, but a huge challenge was manufacturing enough fuel, and that was the real limiting part. They were working out hard science and engineering, but more fuel definitely == bigger bomb in a very real way, and it is quite linear because E=mc^2. It was in many ways the bottleneck for the bombs - it literally guided how big they made the first bomb, and the US manufactured enough for three: one test, two to drop.


> That graph suggests that even with the best error correction in the graph, it is impossible to factor RSA-4 with fewer than 10^4 qubits, which seems very odd.

It's because the plot is assuming the use of error correction even for the smallest cases. Error correction has minimum quantity and quality bars that you must clear in order for it to work at all, and most of the cost of breaking RSA4 is just clearing those bars. (You happen to be able to do RSA4 without error correction, as was done in 2001 [0], but it's kind of irrelevant because you need error correction to scale so results without it are on the wrong trendline. That's even more true for the annealing stuff Scott mentioned, which has absolutely no chance of scaling.)

You say you don't see the uranium piling up. Okay. Consider the historically reported lifetimes of classical bits stored using repetition codes on the UCSB->Google machines [1]. In 2014 the stored bit lived less than a second. In 2015 it lived less than a second. 2016? Less than a second. 2017? 2018? 2019? 2020? 2021? 2022? Yeah, less than a second. And this may not surprise you but yes, in 2023, it also lived less than a second. Then, in 2024... kaboom! It's living for hours [4].

You don't see the decreasing gate error rates [2]? The increasing capabilities [3]? The ever larger error correcting code demonstrations [4]? The front-loaded costs and exponential returns inherent to fault tolerance? TFA is absolutely correct: the time to start transitioning to PQC is now.
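For a rough feel of those exponential returns: the usual back-of-the-envelope surface-code estimate is p_logical ~ A * (p / p_threshold)^((d+1)/2), while the physical-qubit cost only grows quadratically in the code distance d. A minimal sketch, with A, p and p_threshold chosen purely for illustration (not values from the papers below):

    # Rough surface-code scaling; constants are illustrative assumptions.
    A, p, p_th = 0.1, 1e-3, 1e-2   # assumed physical error rate well below threshold

    for d in (3, 7, 11, 15, 21):
        p_logical = A * (p / p_th) ** ((d + 1) / 2)
        physical_qubits = 2 * d * d   # order-of-magnitude qubit cost per logical qubit
        print(f"d={d:2d}  ~{physical_qubits:4d} physical qubits  p_logical ~ {p_logical:.0e}")

Below threshold, every +2 in distance buys roughly the same multiplicative suppression of the logical error rate: front-loaded cost, exponential return, which is why the capability curve looks flat for years and then jumps.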

[0]: https://www.nature.com/articles/414883a

[1]: https://algassert.com/assets/2025-12-24-qec-foom/plot-half-l... (from https://algassert.com/post/2503 )

[2]: https://arxiv.org/abs/2510.17286

[3]: https://www.nature.com/articles/s41586-025-09596-6

[4]: https://www.nature.com/articles/s41586-024-08449-y


You can already factor a 6-digit number with a QC, but not with an algorithm that scales polynomially. The graph linked is for optimized variants of Shor's algorithm.


So today you have 1 gram. No bomb. Tomorrow you have 2 grams. Still no bomb.

...

365 days later, you have 365 grams after spending ungodly amounts of energy to separate isotopes. AND STILL NO BOMB! Not even a small one. These scientists are just some bullshit artists.

52kg later: BOOM!


Not a very good analogy, because by the time you get to 26 kg, I still have 71 years before you get the bomb.
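(The arithmetic, assuming the parent's one gram per day and the 52 kg target:)

    target_g, have_g, grams_per_day = 52_000, 26_000, 1   # halfway to 52 kg
    years_left = (target_g - have_g) / grams_per_day / 365
    print(round(years_left, 1))   # ~71.2 years still to go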


But you know beforehand how much you need. We can measure and make predictions with accuracy.


> Similarly, the hard part is causing a self-sustaining fission chain reaction; once you do, making the bomb bigger is not the hard part.

I don't like this analogy very much, because in practice making a self-sustaining nuclear reaction is much, much easier than making a nuclear bomb. You don't need any kind of enrichment or anything, just a big enough pile of natural uranium and graphite [1].

Making a bomb, on the other hand, required an insane amount of engineering: from isotope separation to enrich U-235 to an absurd level (and/or extracting plutonium from the waste of a nuclear reactor) to designing a way to concentrate a beyond-critical mass of fissile material.

The Manhattan project isn't famous without reason, it was an unprecedented concerted effort that wouldn't have happened remotely as quickly in peacetime.

[1]: https://en.wikipedia.org/wiki/Chicago_Pile-1


> produce at least a small nuclear explosion

The Manhattan Project scientists actually did this before anybody broke ground at Los Alamos. It was called the Chicago Pile. And if the control rods were removed and the SCRAM disabled, it absolutely would have created a "small nuclear explosion" in the middle of a major university campus.

Given the level of hype and how long it's been going on, I think it's totally reasonable for the wider world to ask the quantum crypto-breaking people to build a Chicago Pile first.

https://en.wikipedia.org/wiki/Chicago_Pile-1


TIL about the Chicago Pile! (I don't know enough about the physics to tell if it could have indeed exploded.)

> On 2 December 1942

https://en.wikipedia.org/wiki/Chicago_Pile-1

> on July 16, 1945

https://en.wikipedia.org/wiki/Trinity_(nuclear_test)

Two and a half years. This is still a good metaphor for "once you can make a small one, the large one is not far at all."


A meltdown is not a nuclear explosion. It's not even what happens if you fail to make a nuke go off properly.


In truth, the Chicago Pile crowd were all about power generation and didn't think it was feasible to make a nuclear bomb.

(Not impossible, but more strictly "beyond reach" economically and processing-wise, operating on overestimates of the effort and approach required.)

They ignored letters from Albert Einstein on the topic, they ignored or otherwise disregarded several letters from the Canadian/British MAUD Committee / Tube Alloys group, and it took a personal visit from an Australian for them to sit up and take note that such a thing was actually within reach... although it'd take some manpower and a few challenges along the way.

* https://en.wikipedia.org/wiki/MAUD_Committee is one place to start on all that.


What? No. No matter what anybody did with the Chicago Pile, it would never have produced a small version of a nuclear detonation.


> And that state of the art has not moved much in the last decade

This is far from true. On the experimental side, gate fidelities and physical qubit counts have improved significantly (a couple of orders of magnitude). On the theory side, error correction techniques have improved astronomically -- the overhead of error correction has dropped by many orders of magnitude, with progress especially feverish over the last 4 years.


IIRC the largest number factored still remains 21


Yeah that's treating D-Wave "breaking" RSA-2048 as the fraud that it is. They didn't factor anything, they computed a square root.

I'm still dubious about the accelerated timeline given that quite a bit of what is presented as progress in the field is fraud, or borderline fraud, when inspected closely (e.g. some of the recent Majorana claims by Microsoft are at best overhyped, at worst fraud).


His article specifically mentions that the threat is with the public key exchange, not the encryption that happens after the key exchange.



