Combining more than one cryptographic algorithm

I am considering the following: I have a data stream that I would like to protect as securely as possible. Does it make sense to use, let's say, AES with some IV, then Blowfish with some IV, and finally AES again with some IV?

The encryption/decryption process will be hidden (even protected from debugging), so it is not easy to guess which crypto methods and which IVs were used (however, I know that the strength of this crypto chain cannot depend on this fact, since every protection against debugging is defeated over time).

I have the computing power for this (the amount of data is not that big), so the question is only whether it is worth implementing. For example, TripleDES works well enough using three IVs and an encrypt/decrypt/encrypt scheme, so this is probably not complete nonsense. Another question: how much do I reduce security if I use the same IV for the 1st and 3rd passes, or even the same IV for all three?

I welcome any hints on this subject.

+4
14 answers

I'm not sure about this particular combination, but it is usually a bad idea to mix such things unless the particular combination has been thoroughly researched. The mathematical transformations might actually counteract each other, and the final result could be easier to crack. A single pass of AES or Blowfish should be more than sufficient.

UPDATE: From my comment below ...

Using TripleDES as an example: think about how much time and effort it took the world's best cryptographers to create that combination (note that DoubleDES had a vulnerability), and the best they could achieve is 112 bits of security despite a 168-bit key (192 bits counting parity).

UPDATE 2: I have to agree with Diomidis that AES is extremely unlikely to be a weak link in your system. Almost every other aspect of your system is more likely to be compromised than AES.

UPDATE 3: Depending on what you are doing with the stream, you could simply use TLS (the successor to SSL). I also recommend Practical Cryptography; it does a pretty good job of addressing many of the problems you will need to solve. Among other things, it discusses stream ciphers, which may or may not be more suitable than AES (since AES is a block cipher, and you specifically mentioned that you have a data stream to encrypt).
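On the stream/block distinction: a block cipher like AES is routinely turned into a stream cipher by running it in counter (CTR) mode. Below is a minimal sketch of the CTR structure in Python; SHA-256 stands in for the block transform purely to keep the example self-contained (an assumption for illustration only; a real implementation would use AES from a vetted library, never a hash-based stand-in):

```python
import hashlib
from itertools import count

def keystream(key: bytes, iv: bytes):
    """Yield keystream bytes by running a keyed function over a counter.

    This is the CTR-mode construction; SHA-256 is a stand-in for the
    AES block transform (an assumption made for self-containment).
    """
    for block_no in count():
        yield from hashlib.sha256(key + iv + block_no.to_bytes(8, "big")).digest()

def ctr_crypt(key: bytes, iv: bytes, data: bytes) -> bytes:
    # XOR data with the keystream; encryption and decryption are identical.
    return bytes(b ^ k for b, k in zip(data, keystream(key, iv)))

msg = b"a data stream of arbitrary length"
ciphertext = ctr_crypt(b"secret key", b"unique-iv", msg)
assert ctr_crypt(b"secret key", b"unique-iv", ciphertext) == msg
```

Note that CTR mode, like any stream construction, is catastrophically weakened by IV reuse under the same key, which bears directly on the question about sharing IVs between passes.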

+11

I don't think you have anything to lose by using one encryption algorithm on top of another one that is very different from the first. However, I would be wary of running a second round of the same algorithm on top of the first, even with a different algorithm in between. The interaction between the two runs could open a vulnerability.

Having said that, I think you are agonizing too much over the encryption part. Most data breaches do not happen by breaking a standard encryption algorithm such as AES, but through other flaws in the system. I would advise spending more time on key management, the handling of unencrypted data, weaknesses in the algorithm's implementation (possible leakage of data or keys), and wider system issues, for example what you do with data backups.

+4

An attacker will always go after the weakest element in the chain, so it helps little to make a strong element even stronger. Breaking AES with a 128-bit key is not feasible, and the same goes for Blowfish. Choosing even longer key lengths makes it harder still, but in fact 128 bits has not been cracked yet (and probably will not be for the next 10 or 20 years). So the encryption is probably not the weakest element; why make it stronger? It is already strong.
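A back-of-envelope calculation illustrates why brute-forcing a 128-bit key is considered infeasible (the keys-per-second rate below is an assumed, deliberately generous figure):

```python
keys = 2 ** 128                    # size of a 128-bit keyspace
rate = 10 ** 12                    # assumed keys tested per second (generous)
seconds_per_year = 3600 * 24 * 365
# On average an attacker searches half the keyspace before hitting the key.
years = keys / 2 / rate / seconds_per_year
print(f"expected search time: {years:.2e} years")  # on the order of 10**18 years
```

Even at a trillion keys per second, the expected search time dwarfs the age of the universe, which is why the answers below focus on everything except the cipher itself.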

Think about what else could be the weakest element. The IV? In fact, I would not spend too much time choosing a fancy IV or hiding it; the weakest link is usually the encryption key. E.g. if you encrypt data stored on disk, but that data must be read by your application, then the application must know the IV and the encryption key, so both have to be inside the binary. This is the weakest element. Even if you chain 20 encryption methods over your data, the IVs and encryption keys of all 20 must be in the binary, and if an attacker can extract them, the fact that you used 20 methods instead of 1 gains you no security.

Since I still don't know what the whole process is (what encrypts the data, what decrypts it, where the data is stored, how it is transferred, who needs to know the encryption keys, etc.), it is very hard to say what the weakest element is, but I doubt that the AES or Blowfish encryption itself is it.

+4

Who are you trying to protect your data from? Your brother, your competitor, your government, or aliens?

Each of these implies a different level at which you could consider the data "as safe as possible" within a meaningful budget (time/money).

+1

Encrypting twice is safer than encrypting once, although the reason may not be obvious at first.

Intuitively, it seems that encrypting twice with the same algorithm provides no additional protection, because an attacker could find a key that decrypts all the way from the final ciphertext back to the plaintext... but this is not so.

E.g. I start with plaintext A and encrypt it with key K1 to get B. Then I encrypt B with key K2 to get C.

Intuitively, it seems reasonable to assume that there might be a key K3 that I could use to encrypt A and get C directly. If so, a brute-force attacker would eventually stumble on K3 and be able to decrypt C, so the additional encryption step would add no security.

However, it is very unlikely that such a key exists (for any modern encryption scheme). (When I say "very unlikely" here, I mean what a normal person would express with the word "impossible".)

Why?
Consider the keys as functions that map plaintexts to ciphertexts.
If our keys are all KL bits long, then there are 2^KL such mappings.
However, if I use 2 keys of KL bits each, that gives me (2^KL)^2 possible mappings.
Not all of these can be equivalent to a single-pass encryption.
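A toy counterexample shows what would go wrong if such a K3 always existed: for a cipher whose keys compose into other keys (plain XOR, used here purely as an illustration), double encryption collapses to single encryption. Real block ciphers are believed not to behave this way; for DES this was explicitly proven (DES is not a group).

```python
def xor_encrypt(key: int, block: int) -> int:
    # A deliberately weak toy "cipher": XOR the block with an 8-bit key.
    return block ^ key

p, k1, k2 = 0b10110010, 0x5A, 0x21
c = xor_encrypt(k2, xor_encrypt(k1, p))  # encrypt twice with two keys

# A single key K3 reproduces both passes at once, so the second pass
# added nothing: a brute-force search over single keys still succeeds.
k3 = k1 ^ k2
assert xor_encrypt(k3, p) == c
```

For this cipher the (2^KL)^2 two-key mappings collapse back onto the 2^KL single-key mappings, which is exactly the collapse the counting argument says a good cipher must avoid.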

Another advantage of encrypting twice with two different algorithms is that if a vulnerability is found in one of them, the other still provides some security.

As others have noted, brute-forcing the key is usually the last resort. An attacker will often try to break the process at some other point (for example, using social engineering to obtain a passphrase).

Another way to increase security is to simply use a longer key with one encryption algorithm.

... Feel free to correct my math!

+1

Also, do not waste time obfuscating the algorithm: apply Kerckhoffs's principle, and remember that AES itself is used (and is openly acknowledged to be used) in a huge number of places where data must be secure.

0

Damien: you're right, I should have written it more clearly. I'm talking about a competitor; this is for commercial use. So there is a significant budget, but I don't want to implement this without knowing why I'm doing it :)

Hank: yes, that's what I'm afraid of too. The main inspiration for this idea was TripleDES. On the other hand, when I use one algorithm to encrypt some data and then apply another one, it would be very strange if the "strength" of the combined encryption were less than using a single algorithm alone. But that does not mean it cannot be merely equal... That's why I'm asking for hints; this is not my area of expertise...
0

Diomidis: that is basically my point of view, but my colleague is trying to convince me that it really improves security. My suggestion was to use a stronger encryption key instead of stacking one algorithm after another without any thought or deep knowledge of what I am doing.

0

I would not obscure the algorithms you use. That kind of "security through obscurity" does not work for long. Decompiling the code is one way of identifying the cryptography you use, and in any case people rarely keep such secrets for long. That is why we have secret-key/public-key cryptography in the first place.

0

@Miro Kropacek - your colleague is trying to add security through voodoo. Instead, try building something simple that you can analyze for flaws, for example by simply using AES.

I assume it was also he (she?) who proposed adding security through anti-debugging protection...

0

In fact, you cannot make things less secure if you encrypt several times using different IVs and keys, but the security gain can be much smaller than you expect: in the 2DES example, the meet-in-the-middle attack means it is only about twice as hard to break, rather than squaring the difficulty.

In general, however, it is much safer to stick to one well-known algorithm and increase the key length if you need additional security. Leave the design of cryptosystems to the experts (and I do not count myself among them).
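To see why the meet-in-the-middle attack makes double encryption only about twice as hard to break, here is a sketch against an invented, deliberately weak 16-bit Feistel toy cipher with 8-bit keys (every name and constant here is made up for illustration). Instead of trying all 256 * 256 key pairs, the attacker tabulates all encryptions of P under k1, then decrypts C under each k2 and looks for a matching middle value, roughly 2 * 256 cipher operations in total:

```python
def f(key: int, half: int) -> int:
    return (half * 7919 + key) & 0xFF  # toy round function

def toy_encrypt(key: int, block: int) -> int:
    # 4-round Feistel network on a 16-bit block with an 8-bit key.
    l, r = block >> 8, block & 0xFF
    for _ in range(4):
        l, r = r, l ^ f(key, r)
    return (l << 8) | r

def toy_decrypt(key: int, block: int) -> int:
    # Inverse of one Feistel round, applied the same number of times.
    l, r = block >> 8, block & 0xFF
    for _ in range(4):
        l, r = r ^ f(key, l), l
    return (l << 8) | r

# One known plaintext/ciphertext pair under double encryption.
p, k1_true, k2_true = 0x1234, 0x3A, 0xC5
c = toy_encrypt(k2_true, toy_encrypt(k1_true, p))

# Meet in the middle: 256 encryptions + 256 decryptions, not 65536 pairs.
forward = {}
for k1 in range(256):
    forward.setdefault(toy_encrypt(k1, p), []).append(k1)

candidates = [(k1, k2) for k2 in range(256)
              for k1 in forward.get(toy_decrypt(k2, c), [])]
assert (k1_true, k2_true) in candidates
```

A second plaintext/ciphertext pair would normally be used to weed out false candidates; the point is that the work grows like 2 * 2^k, not 2^(2k), at the cost of memory for the table.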

0

Yes, it can be useful, but it is probably overkill in most situations. Also, as Hank mentioned, some combinations can actually weaken your encryption.

TrueCrypt offers a number of cascaded encryption algorithms such as AES-Twofish-Serpent. Of course, there is a performance penalty when using them.

0

Changing the algorithm does not improve quality (unless you expect the algorithm to be broken); it only affects the key/block length and gives some benefit through obfuscation. Encrypting several times is interesting because even if the first key leaks, the resulting data is still indistinguishable from random data. Some block sizes are also handled better on a given platform (matching the register size, for example).

Good encryption algorithms can be attacked only by brute force, so their strength depends on the processing power an attacker can spend. This means that in the end you can only increase the expected time someone needs to decrypt the data.

If the data has real value, attackers would do better to go after the key holder than after the data itself...

0

I agree with what has been said above. Multiple stages of encryption will not buy you much. If you use a "secure" algorithm, it is practically impossible to break; use AES in a standard streaming mode. See http://csrc.nist.gov/groups/ST/toolkit/index.html for approved ciphers and modes. Anything recommended on that site should be safe enough when used properly. If you want extra margin, use AES-256, although 128 bits should still be sufficient. The greatest risk is not an attack on the algorithm itself but an attack on key management or a side-channel attack (which may or may not be a risk depending on the application and its use). If the application is vulnerable to key-management or side-channel attacks, then it really does not matter how many levels of encryption you use. This is where I would focus my efforts.

-1

Source: https://habr.com/ru/post/1277061/

