There are a few important things to consider about the likely emergence of quantum computers (QCs) suitable for use in cracking encryption of any sort:

First of all, the speedup is likely to be (ultimately) limited to some constant number of bits, so your 100-bit key becomes a 70-bit key, and your 130-bit key becomes a 100-bit key, for instance. There’s an oft-repeated rule of thumb that because Grover’s algorithm can search an unsorted database in time proportional to the square root of its number of records, a brute-force key search should therefore take about as long as a classical one-by-one search of a key space with half as many bits. So that 100-bit key is really just a 50-bit key.

There are several problems with this assumption, but the overriding issue is that the quasicrystalline spin network (QSN) has finite spatial information density. This isn’t comprehended by mainstream quantum mechanics (QM), but it’s implicitly baked into the branchial space metatheory of Stephen Wolfram (a metatheory because it’s a theoretical framework for discovering the parameterization which produces the theory of QM as we know it) and the empire wave quantum gravity theory of Klee Irwin et al. For his part, Irwin has gone on record (sorry, didn’t keep the link) explaining just this point and how it derives from QSN lattice topology. (He hasn’t determined the density limit, however.)
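For concreteness, here’s the rule-of-thumb arithmetic sketched out. This is only the textbook Grover scaling (the very claim disputed above), not a statement about any real machine:

```python
import math

def grover_iterations(key_bits: int) -> float:
    # Textbook Grover scaling: searching N = 2^n items takes roughly
    # (pi/4) * sqrt(N) oracle calls, i.e. on the order of 2^(n/2).
    # This is the source of the "half the bits" rule of thumb.
    return (math.pi / 4) * math.sqrt(2.0 ** key_bits)

# A 100-bit key space costs on the order of 2^50 (~10^15) quantum
# iterations under this rule, versus ~2^100 classical guesses.
print(grover_iterations(100))
```

That factor-of-two-in-the-exponent reduction is exactly what the QSN density argument above says won’t survive contact with physical hardware.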

On the other hand, we don’t know what this “quantum coupon” really saves the attacker in practice. Probably at least 20 bits, based on preliminary quantum supremacy demos in other combinatorial problems. I don’t know where the ceiling lies (it’s an engineering problem in the short term and a physical limit in the long term), but suffice it to say that you should watch the progress of QC development in order to assess the likely quantum coupon as it evolves over time. In the near term, the answer mostly depends on the effectiveness of quantum error correction, which keeps the computation stable enough for long enough to be useful.
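The coupon arithmetic itself is simple subtraction in the exponent. The 20-bit floor here is the rough guess from above, not a measured value:

```python
def effective_key_bits(key_bits: int, coupon_bits: int) -> int:
    # If the "quantum coupon" is worth c bits, an n-bit key behaves
    # like an (n - c)-bit key against that attacker.
    return key_bits - coupon_bits

# With a 30-bit coupon, a 100-bit key leaves a 2^70 residual search;
# with only the ~20-bit floor guessed above, 2^80 remains.
print(effective_key_bits(100, 30))
print(effective_key_bits(100, 20))
```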

Secondly, you still need to pay for the remaining key bits after applying the quantum coupon. Let’s say you’re left with 70 bits. It’s entirely reasonable to assume that classical computers will be able to do 2^70 computations in humanly relevant time and cost at some point in the coming years. (It’s already possible, in theory, with lots of money.) So you can just log onto Amazon Web Services in 2060, pay a few bucks, and away you go! But… no. This doesn’t work because the quantum coupon cuts the key space by a *factor* of, say, 2^30, from 2^100 to 2^70 possible keys. That doesn’t mean that it *gives you* 30 of the key bits. Moreover, you don’t know which 2^70 candidates remain, because it’s impossible to export them from quantum superposition. Therefore, guess what: you need to crank that QC 2^70 times to recover the actual key! Considering that a single run of the algorithm might take milliseconds, or optimistically even nanoseconds, this is just not gonna happen. (Yes, you could in theory connect several QCs together if your error correction holds up, but that’s expensive classical parallelization subject to economic intractability.) The only practical hope the cracker has, assuming that your password was chosen at random (it was!) and you’re not running flawed software (uh… hmm…), is to find a QC, none of which yet exists, offering a sufficiently hefty quantum coupon. Such a machine might emerge, but if it does, you’ll likely have some idea of its capability, because the same basic hardware will likely be applied to other optimization problems, such as drug discovery.
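The “not gonna happen” claim is easy to check on the back of an envelope. Assuming (very generously) one quantum run per nanosecond on a single machine:

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # ~3.16e7 seconds

def years_for_runs(run_bits: int, seconds_per_run: float) -> float:
    # Wall-clock years to perform 2^run_bits sequential runs,
    # each taking seconds_per_run seconds.
    return (2.0 ** run_bits) * seconds_per_run / SECONDS_PER_YEAR

# 2^70 runs at 1 ns each comes out to roughly 37,000 years.
print(years_for_runs(70, 1e-9))
```

At a more realistic millisecond per run the figure grows by another factor of a million, which is why the paragraph dismisses the residual search outright.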

I’m not dismissing the threat. QCs are a force to be reckoned with, especially given the breakthrough just this year in quantum error correction efficiency. But this isn’t the death of cryptography – even if you’re “only” using “long” keys with prequantum ciphers.