I’m just curious about your thoughts on quantum computing’s effect on asymmetric encryption.

Should we assume such computing power exists as we speak? Are there alternative encryption methods in the process of replacing it, in case a supercomputer able to calculate private keys emerges?

Let’s be “science-fictiony” while keeping it real. Here’s the HAL article I stumbled upon.

“Post-quantum cryptography” is the keyword. As far as I know, there are algorithms in development. Also, IIRC, not all cryptography is equally affected by quantum computing; I guess that mostly concerns symmetric algorithms. This raises the question of post-quantum key negotiation and distribution. You can probably find some info by researching the above-mentioned keyword.

Been thinking about that lately as well, but from what I understood, the quantum processor needs to have roughly as many qubits as the key is long, which is probably still a long way off. AFAIK my initial information source was this video on media.ccc

Symmetric crypto is safe: quantum computing shortens a brute-force attack (N classical attempts) to about sqrt(N) quantum operations (Grover’s algorithm). So unless you are using something marginally secure like AES-64, you are safe.
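A quick sketch of what the sqrt(N) speedup means in practice: since sqrt(2^n) = 2^(n/2), Grover’s algorithm effectively halves the key length in bits, which is why doubling symmetric key sizes is the usual countermeasure.

```python
# Grover's algorithm reduces brute-force search over N = 2^n keys
# to roughly sqrt(N) = 2^(n/2) quantum operations, i.e. it halves
# the effective key length in bits.
def effective_quantum_bits(key_bits: int) -> int:
    return key_bits // 2

for bits in (64, 128, 256):
    print(f"{bits}-bit key: ~2^{effective_quantum_bits(bits)} Grover iterations")
```

So AES-128 drops to ~2^64 effective security against a quantum attacker, while AES-256 still leaves a comfortable ~2^128.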

With asymmetric crypto it is a completely different story: with a powerful enough, fully functional quantum machine (~2000 qubits), all popular algorithms like RSA, DSA, and elliptic curves (EC) will be dead. EC will be the first to die because of its shorter keys, so a smaller number of qubits will be necessary (~1000, but this depends on implementation and algorithm details).
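For a back-of-the-envelope feel for those qubit counts: one published circuit (Beauregard, 2003) factors an n-bit RSA modulus with about 2n + 3 logical qubits. The numbers above (~2000, ~1000) are rough, and these are idealized error-free logical qubits; real machines will need far more physical qubits for error correction.

```python
# Illustrative estimate only: Beauregard's circuit factors an n-bit
# RSA modulus with roughly 2n + 3 logical (error-free) qubits.
def shor_logical_qubits(modulus_bits: int) -> int:
    return 2 * modulus_bits + 3

for n in (1024, 2048):
    print(f"RSA-{n}: ~{shor_logical_qubits(n)} logical qubits")
```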

So what can be done? There are MANY post-quantum algorithms: hundreds of them already. Most of them have two shortcomings:

They are not well tested. Cryptography likes mature algorithms and implementations. Many of the proposed algorithms turned out to be vulnerable to common classical (non-quantum) attacks.

Post-quantum algorithms require very long keys (the sum of public and private key lengths is often about 1 MB; consider how huge keychains will become). There are many practical problems with all these implementations.
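To put the key-size complaint in perspective, here is a rough comparison. The byte counts below come from published parameter sets but are illustrative only; exact sizes vary with parameters and encoding, and lattice-based schemes are far more compact than code-based ones.

```python
# Rough public-key sizes in bytes (illustrative figures from published
# parameter sets; exact numbers depend on parameters and encoding):
public_key_bytes = {
    "Curve25519 (classical ECDH)": 32,
    "RSA-2048 (classical, modulus only)": 256,
    "Kyber-768 (lattice-based KEM)": 1184,
    "Classic McEliece 348864 (code-based)": 261120,  # ~255 KiB
}

for name, size in public_key_bytes.items():
    print(f"{name}: {size} bytes (~{size / 1024:.1f} KiB)")
```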

Get in the practice of saying, “this tech is moving along quicker than anybody thought.” This news broke just a couple of days ago, allowing for potentially thousands of qubits in a single machine.

The genius was making a control chip operate at near-0 K temperatures alongside the qubits. It’s sort of like jumping from I2C (where the peripheral is expected to be dumb and interrogated for logic to happen) to PCI(-e), where the end device is expected to perform lots of logic and the data channel is just for message passing (e.g. the GPU crunches far more calculations than the more abstract commands from the CPU). There are far fewer worries about decoherence when everything operates within the millikelvin range, allowing us to pack more qubits together for calculations.

Of course, once stable room-temperature qubits or quantum optical computing (exploiting light for quantum behavior instead of electrons) advances dramatically, all bets are off. I believe these parts of the human tech tree are achievable.

This is indeed a milestone engineering breakthrough, though it is not the only limiter to the 1000-qubit scale for electronic-circuit-based gates. Error growth will still happen in large-scale systems, so there are many challenges to work on.

But once that’s done, say goodbye to all RSA and EC crypto: to everything based on the factorization and discrete-logarithm problems. So for common people it is better to prepare in advance. What can we do? Develop software, or use it and report feedback.

Aside from the aforementioned codecrypt, there is the Open Quantum Safe project:

The core project is liboqs with post-quantum crypto and there are forks of popular projects like openssh and openssl:

Beware that post-quantum crypto is still immature, and specific algorithms or implementations may be found vulnerable in the future to traditional attacks, quantum attacks, or both. So use hybrid cryptography (traditional + post-quantum). All the forks above provide this ability.
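The idea behind hybrid mode can be sketched in a few lines. The function name and context label below are illustrative, not any library’s API: you derive the session key from both a classical shared secret (e.g. from ECDH) and a post-quantum shared secret (e.g. from a KEM), so the key stays safe as long as EITHER scheme remains unbroken.

```python
import hashlib

# Hybrid key derivation sketch (illustrative names): hash the classical
# shared secret together with the post-quantum shared secret, so an
# attacker must break BOTH schemes to recover the session key.
def derive_hybrid_key(classical_secret: bytes, pq_secret: bytes,
                      context: bytes = b"hybrid-handshake") -> bytes:
    return hashlib.sha256(context + classical_secret + pq_secret).digest()

session_key = derive_hybrid_key(b"\x01" * 32, b"\x02" * 32)
print(session_key.hex())
```

Real protocols use a proper KDF (e.g. HKDF with handshake transcript context) rather than a bare hash, but the principle of concatenate-then-derive is the same.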