Quantum Cracking Counter Measures Reprise

(Note: This is a slight re-direct from the thread here: Quantum Cracking Counter Measures?)

Starting assumptions:

A) If quantum cracking ever becomes real, the first to get it won’t tell anyone.

B) If quantum cracking isn’t happening now, it will be happening at some point.

Needed functionality: Shared access to sensitive data around the world by a close group of parties.

With those starting assumptions, let’s go back to potential countermeasures:

#1: Data Access Control: No complete central repository of any single file in the cloud.
Potential solutions: file sharding and distributed storage, with no central repository of a complete file. Perhaps real-time P2P file sharing between actors with no central repository (but transit collection now seems to be the hole).

#2: Quantum Proof Ciphers.
Thanks, Skalman, for the notes on the one-time pad & Caesar cipher. I’m looking for practical tools for this.

Qs for security experts:

Will these countermeasures work? Do you see any holes in them?

How to close the data transit hole? A sharded file, sent to multiple nodes, then reclaimed & re-assembled on the other end? Will that close the hole?

Sharded file storage: I’m looking at Sia & MaidSafe now. Any other solutions I should be looking at?

Software implementations of the one-time pad & Caesar cipher: are there any practical, open-source implementations that have been audited?

Thanks

Forget the Caesar cipher; it’s about as secure as writing in plain text.

As for an implementation of one-time pad, there’s not much to audit. A simple search result: https://www.megabeets.net/xor-files-python/

The difficulty is in keeping the pad secret and delivering it to the receiving party.

1 Like

I once thought about a communication protocol using Diffie-Hellman key negotiation (or a successor) to negotiate a key byte by byte and then applying it as an OTP to the payload byte by byte, or maybe block-wise. But such a system would depend on the key exchange, and Diffie-Hellman relies on the discrete logarithm problem, which is threatened by quantum computing. There is an elliptic curve variant; I don’t know if that is threatened too.

A web search on post-quantum key exchange gives some results. I don’t have the time to read the stuff ATM. Alternatives to key exchange would be key distribution and key negotiation.

Local file encryption however does not need key exchange.

With OTP you always have at least double the amount of data to store or transfer. Even for local usage you must store the key somewhere. If an attacker gets access to both the ciphertext and the key file, it’s easy to reverse the OTP. You might encrypt the key again with a password-based key derivation function, but in your scenario that has to be safe against quantum attacks, too.

Maybe AES is more efficient.

I think it would be good to specify the problem that should be solved: data transfer or storage? Moderate amounts of data, or big amounts like terabytes up to whole multi-terabyte storage devices? Static or dynamic data?

1 Like

I wrote these programs a long time ago; I am sure they can be improved. I compiled the first one and named it ‘cyp’, and the second one ‘dcyp’. The first one shifts the ASCII values of your message upwards using your key; the second shifts the values back down.

If nothing else it is something to look at for an encryption idea :slight_smile:

If your key is as long as the message itself, and you only use it once, it is basically a one-time pad.

// cyp.c

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main (int argc, const char * argv[])
{
    char cyp[8000] = { 0 };
    char cmdout[256] = { 0 };
    
    if (argc < 2) {
        printf("usage: file, key\n");
        return 0;
    } else if (argc == 2) {
        printf("Key: ");
        scanf("%7999s", cyp);           /* bounded read to avoid overflow */
    } else {
        snprintf(cyp, sizeof cyp, "%s", argv[2]);
    }
    
    int cypLng = strlen(cyp);
    
    FILE *fpIn;
    FILE *fpOut;
    int charIn;                         /* int, not char: EOF is not a char */
    int strPos = 0;
    
    fpOut = fopen("enc.out", "w");
    fpIn = fopen(argv[1], "r");
    if (fpIn == NULL || fpOut == NULL) {
        printf("cannot open files\n");
        return 1;
    }
    
    while (( charIn = fgetc( fpIn )) != EOF) {
        int charOut = charIn + cyp[strPos++];
        if (charOut > 127) charOut -= 128;  /* wrap within 7-bit ASCII */
        fputc(charOut, fpOut);
        if (strPos >= cypLng) strPos = 0;
    }
    
    fclose(fpIn);
    fclose(fpOut);
    
    snprintf(cmdout, sizeof cmdout, "mv enc.out %s", argv[1]);
    system(cmdout);
    
    return 0;
}
// dcyp.c

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main (int argc, const char * argv[])
{
    char dcyp[8000] = { 0 };
    char cmdout[256] = { 0 };
    
    if (argc < 2) {
        printf("usage: file, key\n");
        return 0;
    } else if (argc == 2) {
        printf("Key: ");
        scanf("%7999s", dcyp);          /* bounded read to avoid overflow */
    } else {
        snprintf(dcyp, sizeof dcyp, "%s", argv[2]);
    }
    
    int dcypLng = strlen(dcyp);
    
    FILE *fpIn;
    FILE *fpOut;
    int charIn;                         /* int, not char: EOF is not a char */
    int strPos = 0;
    
    fpOut = fopen("enc.out", "w");
    fpIn = fopen(argv[1], "r");
    if (fpIn == NULL || fpOut == NULL) {
        printf("cannot open files\n");
        return 1;
    }
    
    while (( charIn = fgetc( fpIn )) != EOF) {
        int charOut = charIn - dcyp[strPos++];
        if (charOut < 0) charOut += 128;    /* wrap within 7-bit ASCII */
        fputc(charOut, fpOut);
        if (strPos >= dcypLng) strPos = 0;
    }
    
    fclose(fpIn);
    fclose(fpOut);
    
    snprintf(cmdout, sizeof cmdout, "mv enc.out %s", argv[1]);
    system(cmdout);
    
    return 0;
}
1 Like

In mathematical terms, this isn’t different from the XOR algorithm, and is just as secure. In practical terms, as a one-time pad it should error out when it runs out of pad, not reuse it :stuck_out_tongue: Reuse is the greatest enemy of something being one-time.

3 Likes

Yes, in these programs the key will repeat itself. Specifically this line

if (strPos >= dcypLng) strPos = 0;

To create a one-time pad, take that line out and add a check at the beginning to make sure the key is at least as long as the message itself. Without that check, the programs will basically crash as they overrun the cyp (first program) or dcyp (second program) character arrays.

Anything that creates a pattern in encryption is easy for computers to break :slight_smile:

1 Like

Since the problem with OTP is how you distribute it, can you distribute the next OTP inside the current block? Something like this:

Suppose Alice wants to send an encrypted file to Bob with n-bit security:

1. Alice first sends a 2n-bit OTP to Bob out-of-channel (or they negotiate it with a safe post-quantum key negotiation algorithm).
2. Then Alice splits the file into n-bit blocks. Alice also generates a random n-bit number and appends it to each block. The final encrypted block is the result of the XOR operation between each block and the current OTP.
3. The next OTP is defined as a 2n-bit hash function applied to the n-bit random number previously generated by Alice.

1 Like

I think you’re reinventing cipher block chaining.

1 Like

I think this is where it stops being OTP: you’re doubling the bits, so you have to reuse some information to generate half of the new ones.

Indeed you can’t send more bits than the pad length when using OTP. So either you send your message or a new pad, but not both. There’s no safe way to extend the pad length over an insecure channel.

1 Like

Hash functions can still be quantum-secure. Sure, applying an n->2n hash function means the set of possible 2n-bit OTPs used by this algorithm is actually of size 2^n, so some security is lost there. But still, your only option would be to brute-force it, no?

Maybe some quantum algorithm can take advantage of that n->2n extension over a large number of messages. I’m not really knowledgeable in that stuff.

1 Like

To include a new OTP in an already encrypted message, the second OTP must be shorter than the first. Example: below is your first message (before encryption); note that this message is 59 characters long:

This is the top secret message. OTP: ABCD1234SoOnAndFurther

But the OTP embedded in the message is only 22 characters long. So the next message you send, using that OTP, could only be 22 characters, and if you insert another OTP in that message, the message itself would have to be even shorter.

But, say, if in the first encrypted file you had a short message, and a very long OTP embedded for the next one, and you kept every message short, you might be able to keep that up for a while.

It is just that the OTP contained in each message would have to be progressively shorter. An interesting idea, though… now I’m working through whether embedding a new, shorter OTP in another encrypted message would create a pattern that a computer could crack.

1 Like

The easiest way to counter quantum cracking is quantum cryptography. There are things like quantum communication and quantum key distribution, where the security of the encryption itself rests on quantum effects. There are also symmetric and lattice-based post-quantum (i.e. quantum-computer-safe) algorithms in development that should be available around 2022.
However, the only ones interested in building giant quantum computers for decryption are states, which even now have such sophisticated methods that it is really hard to avoid getting cracked.
When quantum computers become a thing on the commercial market, you could buy some for yourself and the people you want to communicate with and use them for that.

1 Like

You never know; nobody, except maybe the top few crypto researchers, knows. That’s why crypto is said to be hard to implement right.

As for quantum breakage of RSA and friends, this piece of reading implies that we are safe for the coming years. Google adopted a brute-force scheme, where a massive, one-million-qubit machine could execute Shor’s algorithm on only a 1000-bit-long number. The overhead in qubits of 1000:1 comes from their prospective quantum error correcting scheme.
IBM plans for less overhead in error correction, more like 25:1.
Google says their time to achieve their goal is 10 years. IBM is not so bold.

So, for now, use GPG with maximum-length keys. That’s 4096 bits, IIRC. I bet my last shirt that this won’t be in reach of quantum machines for the next 30 years. I’ve set the expiration date on my keys accordingly.

In the meantime, I’ll just watch the crypto gurus come up with something quantum-resistant. I can’t figure this out myself, obviously. And the one-time pad is just not practical.

2 Likes

Keep in mind that the way GPG is usually used, once keys are broken, all past communication will be uncovered.

3 Likes

By that time, I’ll be 80-ish or 90-ish, probably dead. Not really my problem then. You youngsters are in for more trouble, I suppose.

Funny how this is a follow-up to the last topic (but the question and use case are completely the same). It is generally about some symmetric encryption with not even a theoretical attack vector for decryption, and this was not even addressed in the last topic, as I already mentioned…

I’m curious to see if any of you would be willing to test the following:

Generate a 4096-bit RSA key pair on the Librem 5. See which method is faster and how MANY you can recycle before you exhaust the battery. Next, post your results.

this should be good sport for the algorithm to pick up on :wink: :joy:

There are a few important things to consider about the likely emergence of quantum computers (QCs) suitable for use in cracking encryption of any sort:

First of all, the speedup is likely to be (ultimately) limited to some constant number of bits, so your 100-bit key becomes a 70-bit key, and your 130-bit key becomes a 100-bit key, for instance. There’s an oft-repeated rule of thumb that because Grover’s algorithm can search an unsorted database in time proportional to the square root of its number of records, a brute-force key search would therefore take time proportional to a classical one-by-one search of a key space with half as many bits. So that 100-bit key is really just a 50-bit key.

There are several problems with this assumption, but the overriding issue is that the quasicrystalline spin network (QSN) has finite spatial information density. This isn’t comprehended by mainstream quantum mechanics (QM), but it’s implicitly baked into the branchial space metatheory of Stephen Wolfram (a metatheory because it’s a theoretical framework for discovering the parameterization which produces the theory of QM as we know it) and the empire wave quantum gravity theory of Klee Irwin et al. For his part, Irwin has gone on record (sorry, didn’t keep the link) explaining just this point and how it derives from QSN lattice topology. (He hasn’t determined the density limit, however.)

On the other hand, we don’t know what this “quantum coupon” really saves us in practice. Probably at least 20 bits, based on preliminary quantum supremacy demos in other combinatorial problems. I don’t know where the ceiling lies, which is an engineering problem in the short term and a physical limit in the long term, but suffice to say that you should watch the progress of QC development in order to assess the likely quantum coupon as it evolves over time. In the near term, the answer mostly depends on the effectiveness of quantum error correction, which allows the computation to remain stable enough for long enough to be useful.

Secondly, you still need to pay for the remaining key bits after applying the quantum coupon. Let’s say you’re left with 70 bits. It’s entirely reasonable to assume that classical computers will be able to do 2^70 computations in humanly relevant time and cost at some point in the coming years. (It’s already possible, in theory, with lots of money.) So you can just log onto Amazon Web Services in 2060, pay a few bucks, and away you go! But… no. This doesn’t work because the quantum coupon cuts the key space by a factor of, say, 2^30, from 2^100 to 2^70 possible keys. That doesn’t mean that it gives you 30 of the key bits. Moreover, you don’t know which 2^70 states are left because it’s impossible to export them from quantum superposition. Therefore – guess what – you need to crank that QC 2^70 times to give you the actual key! Considering that a single run of the algorithm might take milliseconds – say even nanoseconds – this is just not gonna happen. (Yes, you could in theory connect several QCs together if your error correction holds up, but that’s expensive classical parallelization subject to economic intractability.) The only practical hope that the cracker has, assuming that your password was chosen at random (it was!) and you’re not running flawed software (uh… hmm…), is to find a nonexistent QC offering a sufficiently hefty quantum coupon. That might emerge, but if it does, you’ll likely have some idea of its capability because the same basic hardware will likely be applied in other optimization problems, such as drug discovery.

I’m not dismissing the threat. QCs are a force to be reckoned with, especially given the breakthrough just this year in quantum error correction efficiency. But this isn’t the death of cryptography – even if you’re “only” using “long” keys with prequantum ciphers.

3 Likes

Do you have any sources for what you say? You make a lot of claims, but there’s literally 0 support for them in your post, and they don’t jibe with what I know about quantum algorithms. It’s especially interesting why you would say that quantum speedup is only “cutting off bits”.

2 Likes