Fair question. So to clarify: in the early phase of the QC technology S-curve, I would expect exponential improvements, because a lot of the technology hitchhikes on semiconductor scaling. You might even see Grover-like square-rooting of keys when they're sufficiently short (although, even then, not exactly, because you're going to be taxed by noise to some degree, depending on the competence of your error correction scheme). But at some point, you can't shove any more data into a physical system, because the Universe is discrete at its roots. From an information perspective, it's a giant connected graph incapable of supporting unlimited precision. Therefore, you have to hit a wall at some point.
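To put rough numbers on that square-rooting, here's a sketch (my own, not from any source linked below) comparing classical brute-force query counts against ideal, noise-free Grover query counts for a few key lengths; the noise tax mentioned above would only erode the quantum side further:

```python
import math

def classical_queries(bits):
    # Brute force tries half of the 2^bits keys on average.
    return 2 ** (bits - 1)

def grover_queries(bits):
    # Ideal Grover search: about (pi/4) * sqrt(2^bits) oracle
    # calls, assuming perfect, noise-free error correction.
    return math.ceil((math.pi / 4) * math.sqrt(2 ** bits))

for bits in (32, 64, 128):
    print(f"{bits}-bit key: {classical_queries(bits):.2e} classical "
          f"vs {grover_queries(bits):.2e} Grover queries")
```

Even in this idealized accounting, doubling the key length squares the classical work but only squares-roots over to a doubling on the Grover side, which is why the question of where the Universe runs out of room matters at all.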
This is what space probably looks like as it evolves, below comparatively macroscopic QM:
Here’s the comprehensive metatheory from first principles:
https://wolframphysics.org/technical-introduction
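For flavor, here's a minimal toy in Python of the hypergraph-rewriting idea behind that introduction. The single rule used here (replace each edge {x, y} with {x, y} plus {y, z}, where z is a freshly created node) is my own simplification for brevity, not one of Wolfram's actual rules:

```python
def step(edges, next_node):
    # Apply the toy rule {x, y} -> {x, y}, {y, z} to every edge,
    # where z is a brand-new node, so "space" grows through
    # purely local rewrites.
    new_edges = []
    for x, y in edges:
        new_edges.append((x, y))
        new_edges.append((y, next_node))
        next_node += 1
    return new_edges, next_node

edges, next_node = [(0, 1)], 2
for generation in range(5):
    edges, next_node = step(edges, next_node)
    print(f"generation {generation + 1}: {len(edges)} edges, "
          f"{next_node} nodes")
```

The edge count doubles every generation, which is the sort of combinatorial growth from simple local rules that the technical introduction explores in far more generality.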
Again, this doesn’t describe our Universe specifically, but based on their simulations, it seems likely that one of the possible parameterizations of the metatheory would reproduce QM. Here’s some analysis of what they’ve found so far:
More abstractly, here’s a good layman’s intro to the ramifications of spacetime being a quasicrystal, in other words, a structurally complex 4D projection of, in this case, an 8D base reality. Sorry, it’s cringeworthy in bits, but it gets the point across:
https://www.youtube.com/watch?v=w0ztlIAYTCU
While I couldn’t find the part where Klee Irwin discusses the processing limits of the QSN, here’s a decent presentation of the concept of nonergodicity, which might be better termed “probabilistic dehomogenization”: the notion that, at sufficiently small scales, not every allowed state actually gets visited by a quantum system because, well, it runs out of entropy. This happens for two reasons: (1) the QSN has only so much memory to offer, and, far more significantly, (2) there’s a feedback loop connecting macroscale structure to microscale phenomena which corrupts the otherwise unbiased state search of a QC operating in the real world, completely apart from ordinary quantum noise. QCs are highly structured artificial machines subject to this nonergodic self-influence, which is anathema to efficient state exploration. It’s quite unclear that good engineering practices will allow us to escape this strange-attractor issue.
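To make “probabilistic dehomogenization” concrete with a purely classical toy (my own illustration, not from the linked talk): if a sampler’s distribution over states is even mildly biased, a finite run leaves a measurable fraction of the nominally allowed states unvisited, while an unbiased sampler on the same budget covers essentially everything.

```python
import random

random.seed(0)
N = 2 ** 16      # nominal number of allowed states
DRAWS = 8 * N    # eight draws per state, on average

def unvisited_fraction(sampler):
    visited = {sampler() for _ in range(DRAWS)}
    return 1 - len(visited) / N

def uniform():
    # Unbiased state exploration.
    return random.randrange(N)

def biased():
    # Squaring a uniform variate piles probability onto
    # low-index states and starves the high-index ones.
    return int(random.random() ** 2 * N)

u_frac = unvisited_fraction(uniform)
b_frac = unvisited_fraction(biased)
print(f"uniform sampler misses {u_frac:.4%} of states")
print(f"biased sampler misses {b_frac:.4%} of states")
```

Every state still has nonzero probability, so an infinite run would eventually visit all of them; the point is what happens on any finite budget.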
https://www.youtube.com/watch?v=lTMGiDi7ld4
The channel itself is worth some exploration, although it’s more of a collection of lectures than a tutorial:
https://www.youtube.com/c/QuantumGravityResearch/videos
Now, you might say “so what” because the information density limits of physical systems are astronomical by any measure. Even at one bit per atom, we’re talking Avogadro’s number of bits in a handful of sand. But remember: the bill adds up quickly when we’re asking a QC to cut a key space down to something solvable in reasonable serial time; every additional bit you want shaved off the space doubles the number of pathways you’re asking the Universe to explore. The Universe isn’t analog, so at some point it has to resist by not actually trying all possible pathways, but merely a subset of them. But you wouldn’t necessarily notice, because the QC seems to output a different candidate key every time you turn the crank. And indeed, if you could turn the crank an infinite number of times, you’d see every possible output, although some much more often than others. But that doesn’t mean the computation is ergodic on any given attempt; it just means it’s ergodic in the classical sense of serialized exploration. Therefore, unfortunately, it’s not the machine of Shor or Grover that was featured in the brochure.
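Back-of-the-envelope, in Python, for the scales involved. The one-bit-per-atom figure and the 10-gram “handful” of SiO2 are stand-in assumptions for illustration only:

```python
import math

AVOGADRO = 6.022e23

# Assumed: a 10 g handful of SiO2 (molar mass ~60 g/mol,
# 3 atoms per formula unit).
atoms = (10 / 60) * 3 * AVOGADRO
bits_in_sand = atoms  # at the assumed one bit per atom

# Ideal Grover iterations to search a 128-bit key space.
grover_ops_128 = (math.pi / 4) * math.sqrt(2 ** 128)

print(f"{bits_in_sand:.1e} bits in a handful of sand")
print(f"{grover_ops_128:.1e} ideal Grover iterations for a 128-bit key")
```

The storage side does look comfortable; the catch, as the paragraph above argues, is whether the Universe will actually honor on the order of 10^19 serial, perfectly ergodic iterations.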