# Debunking a Bogus Encryption Statement?

#### Cliff posted more than 7 years ago | from the this-and-other-gems dept.

deviantphil asks: *"Recently, a coworker tried to assert that encrypting a file twice with a 64 bit algorithm is equivalent to encrypting it once with a 128 bit algorithm. I know enough about encryption to know that isn't true, but I am having difficulties explaining why and how. Doesn't each pass of the encryption create a separate file header which makes this assertion untrue? Can anyone point me to references that would better help me explain this?"* What other laughable claims have you heard attributed to encryption, and how were you able to properly lay them to rest?

## Meet in the middle attack (5, Informative)

## dtfinch (661405) | more than 7 years ago | (#15959969)

That's why we have triple-DES instead of double-DES.

## Mod parent up (0)

## Anonymous Coward | more than 7 years ago | (#15960090)

## Re:Meet in the middle attack (0)

## Anonymous Coward | more than 7 years ago | (#15960112)

## Re:Meet in the middle attack (2, Informative)

## TheSHAD0W (258774) | more than 7 years ago | (#15960399)

The block size is still 64-bit, which means 2^64 possible combinations versus the 2^128 for ideal 128-bit. I say ideal, because typically attackers find ways of reducing the effective number of possible combinations for an algorithm. Original DES has been reduced significantly, so triple-DES was designed to improve it, and do so using the same encryption hardware. But while triple-DES may be closer to the ideal 64-bit, even a non-ideal 128-bit algorithm should exceed that number of effective bits.

## Re:Meet in the middle attack (4, Insightful)

## swillden (191260) | more than 7 years ago | (#15960470)

*"The block size is still 64-bit, which means 2^64 possible combinations versus the 2^128 for ideal 128-bit."*

When talking about block ciphers, block size and key size are separate things. Generally, if you say algorithm X is an n-bit cipher, you're talking about key size, not block size. Given use of good block chaining modes, ciphers with larger block sizes don't really offer significant additional security.

*"typically attackers find ways of reducing the effective number of possible combinations for an algorithm. Original DES has been reduced significantly, so triple-DES was designed to improve it"*

Original DES hasn't really been reduced significantly. There are attacks which reduce the computational complexity to, IIRC, ~41 bits, but at the expense of requiring impractical space. All in all, DES has stood up extremely well and its practical strength hasn't been significantly reduced by 30+ years of scrutiny.

The problem 3DES was created to address was a deliberate limitation of the design: The small keyspace. The strongest variant of Lucifer (the original IBM cipher which became DES), used a 128-bit key, but the NSA deliberately had it reduced to a 56-bit key, presumably because they thought they could break the cipher with the smaller key size, but it was still secure enough against others. Advances in computation technology, of course, have put a brute force search of a 56-bit keyspace within reach of run-of-the-mill desktop computers. 3DES addresses this by increasing the effective keyspace to 112 bits.

## Re:Meet in the middle attack (4, Informative)

## eddeye (85134) | more than 7 years ago | (#15960552)

It's even worse if your cipher's permutations form a group. In that case, for any two keys k1 and k2, there exists a third key k3 such that ENC_k2(ENC_k1(x)) = ENC_k3(x). In other words, you can find a third key that produces the same permutation as the composition of keys 1 and 2. This means that breaking a double encryption (or triple, quadruple, etc.) is no harder than breaking a single encryption: the resulting permutation can always be described by a single key.

That's why 3DES uses EDE instead of EEE. While DES doesn't form a group, it does have some group-like structures which reduce the workload quite a bit. This doesn't apply to all ciphers btw; there are many more possible 64-bit permutations than 64-bit keys, so compositions can fall well outside those covered by keyspace.

I think this is covered in chapter 7 of the Handbook of Applied Cryptography [uwaterloo.ca].
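To make that concrete, here is a minimal runnable sketch (an illustration for this thread, not anything from the HAC), assuming a toy byte-wise additive cipher whose permutations really do form a group:

```python
# Toy cipher whose permutations form a group: "add k mod 256" per byte.
# Composing two keys is always equivalent to some single third key, so
# an attacker only ever has to find one key, never the pair.

def enc(data: bytes, k: int) -> bytes:
    return bytes((b + k) % 256 for b in data)

plaintext = b"attack at dawn"
k1, k2 = 9, 12
k3 = (k1 + k2) % 256                      # the equivalent single key
assert enc(enc(plaintext, k1), k2) == enc(plaintext, k3)
print(f"ENC_{k2}(ENC_{k1}(x)) == ENC_{k3}(x)")
```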

## Your keyspace wouldn't be that much bigger (5, Insightful)

## Profane MuthaFucka (574406) | more than 7 years ago | (#15959997)

Plus, you have to realize that the amount of time needed to brute force a single key goes up exponentially with key size: each additional bit doubles the work. If you just double the number of times you encrypt, you're pretty much just increasing the brute-force effort linearly.

## Re:Your keyspace wouldn't be that much bigger (5, Funny)

## sr180 (700526) | more than 7 years ago | (#15960008)

## Re:Your keyspace wouldn't be that much bigger (0)

## Anonymous Coward | more than 7 years ago | (#15960050)

## Re:Your keyspace wouldn't be that much bigger (1)

## SCPRedMage (838040) | more than 7 years ago | (#15960290)

## Re:Your keyspace wouldn't be that much bigger (2, Informative)

## igb (28052) | more than 7 years ago | (#15960524)

ian

## Re:Your keyspace wouldn't be that much bigger (2, Insightful)

## Jeff DeMaagd (2015) | more than 7 years ago | (#15960080)

## Re:Your keyspace wouldn't be that much bigger (1, Funny)

## Anonymous Coward | more than 7 years ago | (#15960140)

## Re:Your keyspace wouldn't be that much bigger (3, Informative)

## TCM (130219) | more than 7 years ago | (#15960128)

Wouldn't you have to do the "inner" 64bit bruteforce procedure for each possible key of the "outer" 64bit keyspace, thus making it 128bit again? I'm feeling the effective key strength is somewhere between 64bit and 128bit, certainly more than 65bit, but not really 128bit.

IANAC(ryptologist).

## Re:Your keyspace wouldn't be that much bigger (5, Insightful)

## Kadin2048 (468275) | more than 7 years ago | (#15960297)

If you take a plaintext file and encrypt it into a file which has headers ("BEGIN ENCRYPTED CONTENT---"), and then encrypt the result again, assuming the attacker knows how you did it and that the intermediate file has plaintext headers, then they'll know the moment they broke the first 64-bit encryption layer. So in this example, you're basically at 65 bits.

Now if you don't include any headers, so that there's no terribly good way to determine whether you've gotten the right key or not, as you're brute-forcing the first layer, then I think you're right -- the strength of the overall system is somewhere in a grey area between 65 and 128 bits.

If someone was just thinking that they could use a file-encryption utility twice (which produces output files that have plaintext headers) and double the keyspace, they are dead wrong.
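A minimal sketch of that header oracle, assuming a toy one-byte XOR "cipher" and 8-bit keys so the searches finish instantly; the header string is the one from the example above:

```python
# Toy model: each layer XORs every byte with an 8-bit key. The inner
# layer's output carries a known header, so the attacker can crack the
# outer key alone (2^8 tries), then the inner key alone (2^8 tries):
# 2^8 + 2^8 work, the "n+1 bits" situation, instead of 2^16.

HEADER = b"BEGIN ENCRYPTED CONTENT---"

def xor(data: bytes, key: int) -> bytes:
    return bytes(b ^ key for b in data)

secret = b"attack at dawn"
inner = HEADER + xor(secret, 0x5A)   # layer-1 output carries a header
outer = xor(inner, 0xC3)             # layer 2 hides it again

# The header is an oracle: the attacker knows the instant the outer
# key is right, so the two layers fall separately.
outer_key = next(k for k in range(256)
                 if xor(outer, k).startswith(HEADER))
body = xor(outer, outer_key)[len(HEADER):]
inner_key = next(k for k in range(256)
                 if xor(body, k).startswith(b"attack"))  # known-plaintext test
print(f"outer key {outer_key:#x}, inner key {inner_key:#x}")  # 0xc3, 0x5a
```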

## Re:Your keyspace wouldn't be that much bigger (5, Informative)

## swillden (191260) | more than 7 years ago | (#15960505)

*"Wouldn't you have to do the "inner" 64bit bruteforce procedure for each possible key of the "outer" 64bit keyspace, thus making it 128bit again?"*

No, you do a meet-in-the-middle attack, which is basically 2^64 in complexity if you're using two 64-bit keys.

There are some optimizations that can be done, but the basic idea is this: You start with one ciphertext block and its corresponding known plaintext. Then you encrypt the plaintext with all 2^64 possible keys and store the results (with some indexes to make it easy to search for a particular block). Then you decrypt the ciphertext with all 2^64 possible keys, looking each result up. When you find a match, you have found the pair of 64-bit keys that turns that plaintext into that ciphertext. So to search the entire space, you have to do 2^64+2^64 = 2^65 trial operations. On average, you only have to search half of the second keyspace, so the complexity of the search is 2^64+2^63 ≈ 2^64 (plus the huge storage cost).
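Here is a runnable toy version of that attack, assuming a throwaway 4-round Feistel cipher on 16-bit blocks with 8-bit keys so the table fits in memory; the real thing is the same procedure with a 2^64-entry table:

```python
import hashlib

def f(half: int, key: int, rnd: int) -> int:
    # Round function for a throwaway 4-round Feistel toy cipher.
    return hashlib.sha256(bytes([half, key, rnd])).digest()[0]

def enc_block(block: int, key: int) -> int:
    l, r = block >> 8, block & 0xFF
    for rnd in range(4):
        l, r = r, l ^ f(r, key, rnd)
    return (l << 8) | r

def dec_block(block: int, key: int) -> int:
    l, r = block >> 8, block & 0xFF
    for rnd in reversed(range(4)):
        l, r = r ^ f(l, key, rnd), l
    return (l << 8) | r

P, (k1, k2) = 0x1234, (0x3D, 0xA7)                 # known plaintext, secret keys
C = enc_block(enc_block(P, k1), k2)                # double encryption

# Meet in the middle: 2 * 2^8 cipher operations instead of 2^16.
middle = {enc_block(P, a): a for a in range(256)}  # encrypt under every inner key
pairs = [(middle[m], b) for b in range(256)
         if (m := dec_block(C, b)) in middle]      # decrypt under every outer key
print((k1, k2) in pairs)  # True; any false alarms die on a second known block
```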

Triple encryption is also weakened by a meet-in-the-middle attack. Using three 64-bit keys, it would be nice to think you have a 192-bit key. But a meet-in-the-middle attack requires encrypting with all 2^64 single keys for the first step, and decrypting with all 2^128 key pairs for the second step, giving an effective strength of 2^64+2^127 ≈ 2^127 (plus the huge storage cost).

Finally, keep in mind that DES keys aren't 64 bits. They're 56 bits. So 3DES has a brute force search computational complexity of 111 bits, on average.

## Intuition doesn't work well in crypto (3, Insightful)

## billstewart (78916) | more than 7 years ago | (#15960258)

Seems like any claim of the form "encrypting twice is equivalent to..." is likely to be doomed to bogosity, unless there's a later clause in it like "but that's not what really happens." Crypto not only depends on a lot of deep math, it also depends on a lot of cryptographers spending time following bits around to see where they go and how to break them.

Sometimes things do what your mathematical intuition tells you, if you're mature enough in the field to have a solid intuition, but often they don't. Problems can be very hard if looked at from one direction and very easy (or at least less hard) when looked at from another direction, and a cryptographer's job is to make sure they've checked out _all_ the directions, because it only takes one weak one to break something. NP-complete problems are especially that way: they're potentially useful for crypto because there's one easy direction if you know the key, but many problems can't be transformed in a way that lets you use the easy path while the attacker stays stuck on the hard paths.

But even the bit-twiddly algorithms, like most hash functions or the S-box building blocks inside Feistel-structured algorithms, can often be cracked by people examining them closely enough to find conditions under which they can deduce bits. For instance, MD5 is pretty much busted these days.

And both mathematical crypto and bit-twiddly crypto have to worry about Moore's Law and brute force - some algorithms scale well, so you can double the strength of an N-bit key by adding 1 bit or maybe logN bits, while others don't, or they form groups so that encrypting a message multiple times with an N-bit key still only gives you N bits of strength (leaving aside pathological cases like rot13).

## No it's not -- if there's no known plain text (1)

## Nicolas MONNET (4727) | more than 7 years ago | (#15960592)

In effect, if there's no fixed header, and no way to quickly distinguish a putative cleartext from the random rubbish added in between the two encryption phases, then a brute force attack would take 2^64 * 2^64 steps, that is to say 2^128.

## Easy (3, Insightful)

## suricatta (617778) | more than 7 years ago | (#15959999)

## Re:Easy (1)

## flyboy974 (624054) | more than 7 years ago | (#15960599)

If I were to encrypt a file once with 64-bit, I would have 2^64. Now, if the intruding party did not know that I encrypted the file a second time, they would then have to try to brute force again with an additional 64 bits. Therefore, I would really have 2^64 * 2^64, or 2^128. The file is absolutely useless unless both sets are compromised.

We are not talking about a single additional bit here. We are talking about two separate uncompromised keys in a file format which is completely unknown! So it wouldn't just double the encryption cost, because frankly no encryption algorithm works such that if you encode a file once, and then encode it again, you can decode it by adding a single bit to the key. Duh. You have to completely decrypt the outer layer back into the first encrypted form. Thus there is 64 bits of primary encryption, and 64 bits of additional encryption.

So, even if you brute forced the first 64 bits, you get to do it again, thus you are really trying to brute force 2^128. Although that assumes that both keys are at the extreme ends of the brute force.

## more is better (-1, Redundant)

## Anonymous Coward | more than 7 years ago | (#15960004)

## While you're at it... (3, Funny)

## unitron (5733) | more than 7 years ago | (#15960006)

## Re:While you're at it... (3, Funny)

## QuantumG (50515) | more than 7 years ago | (#15960015)

## Re:While you're at it... (2, Funny)

## RuBLed (995686) | more than 7 years ago | (#15960056)

Chuck Norris can roundhouse kick that compressed data back into its 'raw' format.

## Re:While you're at it... (4, Funny)

## QuantumG (50515) | more than 7 years ago | (#15960156)

## Re:While you're at it... (1)

## Psiven (302490) | more than 7 years ago | (#15960308)

## Re:While you're at it... (1)

## Patrik_AKA_RedX (624423) | more than 7 years ago | (#15960565)

Here I got some interesting downloads for you:

My whole collection of the latest movies: '1'

All microsoft programs: '0'

A working beta of Duke4Ever: '0'

ISO's of the 20 best games of 2005 and 2006: '1'

Every MP3 you could find on the net: '1'

pron: 12.250.000 images, 512.054 movies: '0'

I'll upload the decompress util as soon as I get that last bug out of it. But you could finish the downloads in the meantime.

## Re:While you're at it... (1)

## megaditto (982598) | more than 7 years ago | (#15960590)

1) Find an OTP that XORs your data to all 1's

2) XOR data and the OTP

3) Compress the resulting file

4) Report the resulting compression size on slashdot

__

Tis not enough to succeed. Others must fail.

## It's like this.. (1)

## BigZaphod (12942) | more than 7 years ago | (#15960011)

Of course this is just an analogy from a non-expert... Use at your own risk.

## Re:It's like this.. (1)

## AP2k (991160) | more than 7 years ago | (#15960099)

## Re:It's like this.. (1)

## BigZaphod (12942) | more than 7 years ago | (#15960176)

*"At some point so many strips of duct tape are going to hold down the box lid due to sheer mass."*

True, but in the end you could just pry the whole mess off with a crowbar or something. It's only as strong as the glue on the bottom piece.

## Debunking this claim (4, Informative)

## jordandeamattson (261036) | more than 7 years ago | (#15960012)

If you encrypt twice with a 64-bit key you will just be taking a plain text file and encrypting it and then taking your encrypted file and encrypting it once again. To break encryption on this file, getting access to the plain text, you would need to break 64-bit encryption twice.

While this would be more of a challenge than breaking 64-bit standalone encryption, it wouldn't be equivalent to the security of a 128-bit key. The difficulty of breaking 128-bit encryption isn't 2 times that of 64-bit encryption. It is actually greater - I can't quantify it without additional research; hopefully someone who can recall it off the top of their head can chime in with the details.

Now, breaking a doubly encrypted file would present some interesting challenges. The most obvious is that your plaintext for the first layer of encryption is anything but plaintext. Confirming that one set of encrypted text is the right one vs. some other set of encrypted text would definitely be a challenge. It would make it difficult to determine if you had found the right key to decrypt your first layer of encryption.

As an aside, I do remember reading about code systems where double encryption actually made the resulting encryption less secure. I don't remember the details, but now my brain is itching and I will have to do some research. Thanks!

Yours,

Jordan

## Re:Debunking this claim (1)

## ydra2 (821713) | more than 7 years ago | (#15960055)

I've had good results double encrypting with the XOR method. ROT13 is another good candidate for double encryption.... Doggonit! Somebody got my root password again! I'm not settling for anything less than quadruple encryption from now on.

## Re:Debunking this claim (1)

## budgenator (254554) | more than 7 years ago | (#15960147)

## Re:Debunking this claim (1)

## jordandeamattson (261036) | more than 7 years ago | (#15960175)

I agree that XOR and ROT13 are encryption systems which end up decrypting the cipher text when re-encrypted. But I will be honest that I wasn't thinking of them, but rather of "real" encryption systems (i.e. more robust), when I thought about systems where double encryption could lead to the cipher text being less secure than it was with one level of encryption.

I think the mistake which spawned this discussion is that people tend to think of encryption as "lock boxes" for information. In the real world, if I put something in a safe and then put that safe in another safe, my stuff seems twice as secure. But the analogy breaks down with encryption: I am not twice as safe.

Yours,

Jordan

## Re:Debunking this claim (-1, Troll)

## kjs3 (601225) | more than 7 years ago | (#15960059)

## Re:Debunking this claim (1)

## pizpot (622748) | more than 7 years ago | (#15960145)

## Re:Debunking this claim (1)

## kjs3 (601225) | more than 7 years ago | (#15960222)

*"Answering in a forum is something to do only when you know the answer."*

Replying to an answer in a forum is something to do only when you know the answer and the other guy doesn't. I actually do know the answer, but didn't feel the need to spoonfeed the D&H article to educate you and Jordan. You could go look it up, and find out what a Meet-in-the-Middle attack is, maybe look at subsequent research, understand how the math actually works, and understand why Jordan doesn't know what he's talking about. Hell, if you actually did your research, you could even have a snappy comeback ("yeah, but the hash table you'd need to assemble for the attack to work in practice is impractically large! Got you, you smarmy bastard!"). If you two can't do your homework, why should I do it for you?

*"But don't dis Jordan, he did make 2 new good points."*

No, he didn't. See above.

## Re:Debunking this claim (1)

## ScrappyLaptop (733753) | more than 7 years ago | (#15960395)

## Re:Debunking this claim (1)

## Capt'n Hector (650760) | more than 7 years ago | (#15960068)

*"As an aside, I do remember reading about code systems where double encryption actually made the resulting encryption less secure. I don't remember the details, but now my brain is itching and I will have to do some research. Thanks!"*

Rot-13 comes to mind...

## Re:Debunking this claim (-1, Troll)

## Anonymous Coward | more than 7 years ago | (#15960126)

## Re:Debunking this claim (4, Insightful)

## Coryoth (254751) | more than 7 years ago | (#15960212)

It isn't very common for the resulting encryption to be *less* secure, but "no appreciable improvement" is certainly very possible. With block ciphers you can run into the issue that if the cipher scheme forms a group (as some do) then even if you use 2 different keys for the two encryption rounds, there will be a single third key that provides identical encryption: an attacker never needs to break your two keys, they can simply find the single third key that is equivalent to your double encipherment. Despite encrypting twice with two different keys, the result is no more secure than encrypting once. For a simple example of this, consider a ROT encryption scheme - if you encrypt a message with ROT9 then ROT12 that is just equivalent to ROT21, so instead of trying to find the two keys 9 and 12, the attacker can just solve the ROT21 problem. (The worst case, of course, is if your combination of keys gives the identity element, such as ROT9 and ROT17, or ROT13 twice, in which case you end up with an unencrypted document.)

Even if that's not the case you can still get caught with a meet-in-the-middle approach if the attacker has some known plaintext (that is, they have some plaintext with the corresponding encrypted text and are attempting to extract your key) which does pretty much what it says, encrypting one way and decrypting the other to meet in the middle. Using such a scheme you can extract the keys with only marginally more effort than for single encryption.

## Re:Debunking this claim (1)

## snowgirl (978879) | more than 7 years ago | (#15960384)

*"As an aside, I do remember reading about code systems where double encryption actually made the resulting encryption less secure. I don't remember the details, but now my brain is itching and I will have to do some research. Thanks!"*

The trivial example of this is ROT-13. When performing the encryption a second time, it will actually decrypt the text.

If the encryption is a simple XOR with a key of any length, then a double encryption will actually not increase the complexity of breaking it at all: if you had a plaintext value P, XORed it with K_1, then XORed that with K_2, you have simply XORed it with (K_1 xor K_2). So definitely don't use any sort of mathematical operation that is commutative.

## Re:Debunking this claim (2, Informative)

## xsonofagunx (902794) | more than 7 years ago | (#15960391)

By contrast, a 64-bit encryption scheme leaves you with 18,446,744,073,709,551,616 keys. Done twice, that means 36,893,488,147,419,103,232, which is still far less than 128-bit encryption. I may be doing my calculations wrong (big numbers are scary...) but I think the difference between brute-force cracking a 128-bit key and two 64-bit keys is 340,282,366,920,938,463,426,481,119,284,349,108,224.

The thing that some people don't realize here is that (unless the encryption system has a known weak spot) you would have to brute-force 2^64 level-1 keys, and then (this is the scary part) you would have to try each of the 2^64 level-2 keys on each of the outputs from the 2^64 level-1-key trials. So maybe it's not as far from a 128-bit key as one would think, but ONLY provided that the only way of attacking it was brute force.

## Re:Debunking this claim (0)

## Anonymous Coward | more than 7 years ago | (#15960459)

If double-encrypting with 64 bits is twice as hard to break (since you have to break it twice), and if that's equivalent to a 65 bit key (assuming brute force and the worst possible luck on the attacker's part - having to walk through all the different possible keys) then it's phenomenally harder to break a 128-bit key

1 more bit gives a difficulty of (2^1) * (2^64) which equals 2 * (2^64) or 2^65

then isn't brute-forcing 128-bit encryption 2^64 times more difficult - i.e. (2^64) * (2^64) or 2^128?

These are big numbers and big differences.

Got my prize money ready? King Carlos can keep the medal...

## Re:Debunking this claim (1)

## swillden (191260) | more than 7 years ago | (#15960518)

*"The difficulty of breaking 128-bit encryption isn't 2 times 64-bit encryption. It is actually greater - I can't quantify it without additional research, hopefully someone who can recall it off the top of their head can chime in with the details."*

The difficulty of a brute force search is directly proportional to the number of keys to be searched. So 2^128 is how many times bigger than 2^64? It's 2^64 times bigger, so a 128-bit keyspace is 2^64 times harder to search than a 64-bit keyspace.

## no (1)

## crankshot999 (975406) | more than 7 years ago | (#15960019)

## Think about it as number of possibilities (5, Informative)

## Phantombrain (964010) | more than 7 years ago | (#15960023)

If you have 64 bits, that is 1.84467441 × 10^19 (2^64) possible keys, meaning at most that many tries to break the first layer of encryption. The second layer is the same number, meaning breaking both would take a maximum of 3.68934881 × 10^19 attempts.

With 128 bit encryption, there are 3.40282367 × 10^38 (2^128) different possibilities, MUCH more than the double 64 bit encryption.

Obviously people don't usually brute force encrypted files, but you can see there are many more possibilities for 128 bit encryption than with double 64 bit.

## Re:Think about it as number of possibilities (1)

## tmjr3353 (925558) | more than 7 years ago | (#15960092)

## Re:Think about it as number of possibilities (1)

## budgenator (254554) | more than 7 years ago | (#15960220)

## Re:Think about it as number of possibilities (1)

## jchernia (590097) | more than 7 years ago | (#15960289)

g(x) will be admittedly more complex than f(x), but it will still require only 2^64 brute force attempts.

If you don't know the algorithm, then how are you brute forcing in the first place?

## Re:Think about it as number of possibilities (3, Interesting)

## tkw954 (709413) | more than 7 years ago | (#15960142)

This assumes that you can determine when you break the first layer of encryption, i.e. it won't work if the encrypted string is not distinguishable from noise. If this is not true, you must try each possible 64 bit second key for each 64 bit first key, for a maximum of 2^64*2^64=2^128 different keys, which would be equivalent to brute-forcing one 128 bit key.
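A sketch of that nested search, with hypothetical 4-bit keys and a toy xor-then-add cipher standing in for a real one; since nothing distinguishes the intermediate layer from noise, the only test available is a full two-layer decryption:

```python
from itertools import product

def enc(data: bytes, k: int) -> bytes:     # toy stand-in cipher
    return bytes(((b ^ k) + k) & 0xFF for b in data)

def dec(data: bytes, k: int) -> bytes:
    return bytes((((b - k) & 0xFF) ^ k) for b in data)

P = b"attack at dawn"
C = enc(enc(P, 0x9), 0x3)                  # two secret 4-bit keys

# The intermediate layer looks like noise, so every (outer, inner)
# pair must be tried: 2^4 * 2^4 = 2^8 trials here, and 2^64 * 2^64 =
# 2^128 in the scenario above.
hits = [(k1, k2) for k2, k1 in product(range(16), repeat=2)
        if dec(dec(C, k2), k1) == P]
print(hits)  # [(9, 3)] -- the true key pair, found only by pair search
```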

## Bad math (5, Funny)

## Dan East (318230) | more than 7 years ago | (#15960043)

In case anyone is wondering, I'm hoping some DRM coder will gain valuable insight from this post.

Dan East

## Re:Bad math (4, Insightful)

## thefirelane (586885) | more than 7 years ago | (#15960072)

If two 64 bit encryptions equals one 128 bit, then 128 one-bit encryptions should also.

This means the file would have a password of either 1 or 0

After thinking about it that way, it becomes quite clear: 128 bit encryption obviously requires more than 128 guesses to get the right key. As another poster pointed out, two 64 bit passes equals 65 bit encryption.

## Re:Bad math (1)

## G-funk (22712) | more than 7 years ago | (#15960134)

## Re:Bad math (2, Funny)

## Miffe (592354) | more than 7 years ago | (#15960189)

## Now, that's a really BAD MATH (3, Informative)

## Gadzinka (256729) | more than 7 years ago | (#15960493)

And how do you know that you guessed right at every step? No, really. Good crypto doesn't let you distinguish a decryption with the good key from one with a bad key, other than by the content of the plaintext. And since the "plaintext" in this case is the encrypted text from another step, you have no chance of finding out if it's right. So in reality, you have to check keys 0 and 1 in the first step, then 0 and 1 for the 0 from the first step and 0 and 1 for the 1 from the first step, etc...

Do the math, wise ass: the complexity of brute-forcing 128 1-bit passes is the same as the complexity of brute-forcing a single 128-bit pass -- 2^128.

Robert

## Re:Bad math (1, Insightful)

## Anonymous Coward | more than 7 years ago | (#15960550)

So, in your above scenario, you enter 0 and get, say, 'g6bSyfS5d...' while 1 gives '3SEgvu6yt...'. Which one is correct and to be decoded with your second (1-bit) key, and which one is garbage? If you don't know, you'd have to try both, giving you 4 possible results after your second key, all of which look like encrypted garble awaiting your third key...

So you actually can get a 128-bit key. Coming up with a strong 128-step 1-bit encryption algorithm is a different matter, but 2-pass 64-bit encryption algorithms are not uncommon (some programs use different algorithms for each step, so if a major flaw is discovered in one, the file still has the protection of the other).

## Yeah, right, "D. East" (1)

## NotQuiteReal (608241) | more than 7 years ago | (#15960261)

*"5 letter name!"*

Well, we all know that 2^5 = 32.

So, if we take your slashdot userid of 318230 and multiply by 32 we get 10,183,360 (you can stop checking the math after this).

So, take 10,183,360 and re-write it as 10 18 33 60 and then translate those four characters into "WEST" and it is perfectly clear that *he* is one of *them*! I don't have to spell it out! Mr. East is leading you the wrong way, because you are getting too close!

## no (1)

## robo_mojo (997193) | more than 7 years ago | (#15960047)

## Re:no (1)

## robo_mojo (997193) | more than 7 years ago | (#15960101)

## Get a book on cryptography (5, Informative)

## cunkel (111089) | more than 7 years ago | (#15960048)

*Applied Cryptography: Protocols, Algorithms, and Source Code in C* by Bruce Schneier

*Handbook of Applied Cryptography* by Alfred J. Menezes et al.

## Snake oil FAQ (4, Interesting)

## Chuck Chunder (21021) | more than 7 years ago | (#15960058)

I used to work at a small company where a guy came in and was trying to get us involved with some dodgy "crypto" product. Management were quite taken with it (he could talk and even had a patent!) but technically it was a load of horse shit with some pretty multicolour diagrams. That FAQ helped me crystallise my objections and the guy was given the heave-ho before the company did anything embarrassing.

## Re:Snake oil FAQ (0)

## Anonymous Coward | more than 7 years ago | (#15960466)

Well, at least it wasn't plaintext like it was in the previous versions.

## same or different keys? (1)

## robo_mojo (997193) | more than 7 years ago | (#15960060)

## Simple counterexample for your co-worker (2, Insightful)

## metamatic (202216) | more than 7 years ago | (#15960075)

Take your file, encrypt it with a 16 bit number.

Take the result, and encrypt it again with a different 16 bit number.

Now, how many possible keys do you have to go through to get back the original data?

(Hint: Try single decryption using key3 = key1 XOR key2.)

This counterexample invalidates the claim that encrypting twice with an N-bit key is equivalent to encrypting with a 2N-bit key. It also demonstrates the worst possible case, that encrypting twice may be no more secure than encrypting once.
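A sketch of the counterexample, assuming the "encryption" is a repeating 16-bit XOR (which is what makes the hint work):

```python
def xor16(data: bytes, key: int) -> bytes:
    # Repeating-key XOR with a 16-bit key.
    ks = (key.to_bytes(2, "big") * (len(data) // 2 + 1))[:len(data)]
    return bytes(b ^ k for b, k in zip(data, ks))

P = b"attack at dawn"
k1, k2 = 0xBEEF, 0x1234
# Two passes collapse to one: the keystreams XOR together.
assert xor16(xor16(P, k1), k2) == xor16(P, k1 ^ k2)
print(f"single key {k1 ^ k2:#06x} undoes both passes")
```

So the attacker searches 2^16 keys once, not 2^32 pairs: the second pass bought nothing.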

## Re:Simple counterexample for your co-worker (1)

## moyix (412254) | more than 7 years ago | (#15960107)

Any crypto experts want to weigh in?

## Re:Simple counterexample for your co-worker (4, Informative)

## pclminion (145572) | more than 7 years ago | (#15960164)

*"This actually raises a question to which I don't know the answer: if you take a fairly standard symmetric cypher, say DES, and two keys K1 and K2, does there always exist a key K3 such that E_K1(E_K2(Message)) == E_K3(Message)?"*

No, this is not always the case. For DES it is certainly NOT the case. The terminology you are looking for is "idempotent cipher." A cipher is idempotent if it obeys the equation you stated -- i.e., for any key pair K1,K2, there is a K3 such that E_K2(E_K1(x)) = E_K3(x). DES is NOT an idempotent cipher.

These questions only seem deep to non-crypto-experts (no offense to you intended) -- and it was certainly accounted for in the design of DES (along with features which harden it against differential cryptanalysis, a technique that wasn't even publicly known outside the NSA at the time -- the people who work on these things are far from stupid).

## Re:Simple counterexample for your co-worker (1)

## moyix (412254) | more than 7 years ago | (#15960173)

## Re:Simple counterexample for your co-worker (0)

## Anonymous Coward | more than 7 years ago | (#15960377)

"idempotent cipher."On the outside chance that anyone cares, idempotent means "same strength".

## Re:Simple counterexample for your co-worker (0)

## Anonymous Coward | more than 7 years ago | (#15960136)

It is also untrue that in the general case your double-encrypted document only takes twice as long to break, because there is no way of verifying that the first decryption was the correct one. Unless you have access to a ciphertext and corresponding plaintext, the general solution is at best O(2^(2n)) time where n is the key length. In the particular case in which an attacker has a ciphertext-plaintext pair to work with, it takes only O(2^(n+1)) time to find the keys. Many people have said this holds in the general case, which is simply not true.

## XOR (1)

## jours (663228) | more than 7 years ago | (#15960081)

Well, after all these years I'm still shocked at how often I hear someone refer to "XOR encryption". As for laying such claims to rest, though, you really shouldn't have to. I can come up with thousands of ways to mangle data and claim it's encrypted... but the burden would be on me to prove that any of the methods were reasonable encryption. In my opinion you're probably best off just avoiding conversations like this, because you're certainly not going to make any headway with someone who knows so little about the field.

## Re:XOR (0)

## Anonymous Coward | more than 7 years ago | (#15960210)

## Re:XOR (1)

## jd (1658) | more than 7 years ago | (#15960560)

It is arguable that if you had a one-time pad system (which is provably totally secure so long as the key is secure and used no more than once), where each bit/byte of the message was XORed with a totally random bit/byte from start to finish, and where (since it's a one-time pad) the total input stream for the message was no longer than the random pad it was XORed with, then you could refer to this as "encryption by means of XOR". It is also entirely reasonable, particularly if the phrase is an accepted shorthand amongst a particular group, for this to be shortened to "XOR encryption", although that is not strictly correct. Shorthand rarely is.

Newcomers to encryption may misunderstand such shorthand, or may misunderstand the OTP methodology, or may have examined very common encryption algorithms on the Internet and developed the term themselves on the basis of what they see. Such newcomers deserve commendation for being observant enough to get to that point in the first place, then being forwarded to the information needed to complete their understanding. DON'T tell me you've never misused terminology in a subject you were new to, and DON'T tell me you wouldn't profit in such cases by getting forwarded to the necessary information.

So we've got experts using shortcuts and confused newcomers; are there other categories? Well, yes, there are the idiots. There are a lot of these, as can be seen from the very large number of methods marked as fatally flawed on the block cipher lounge and hash function lounge. And those only list the better algorithms. The number of unlisted (even if open source) algorithms is an order of magnitude or two greater, but nobody bothers to look at them beyond the most trivial analysis because it's so obvious that the approach is utterly flawed from beginning to end.

Unfortunately, a lot of idiots get to produce professional crypto products (the Quartz public-key system is one example) so it's not even possible to go by appearances.

## Like matrices? (0)

## Anonymous Coward | more than 7 years ago | (#15960085)

I do not know, but I would suspect that a double encryption with 64-bit keys could be decrypted by a single key different from each of the original keys.

## Encryption not like matrices... (1)

## slew (2918) | more than 7 years ago | (#15960440)

Of course a cipher, being a mapping function that maps its input set to an equivalently sized output set, will generate high-dimensional permutations, but that's subtly (yet significantly) different from those permutations forming a group, or from behaving like a matrix...

## Meet in the middle attack (1)

## dido (9125) | more than 7 years ago | (#15960098)

There's a reason why triple encryption is used rather than double encryption. There's a technique known as the Meet-in-the-middle attack [wikipedia.org] which, by trading off storage space for time, lets you break a double encryption with two independent keys using only twice the time needed for single encryption, meaning that using two independent 64-bit keys would only require 2^65 work, rather than the 2^128 one might expect. That's the reason why you never hear of "double DES", but rather Triple DES with independent keys.

## easy solution (1)

## ceejayoz (567949) | more than 7 years ago | (#15960104)

## Simple for this - hard in general (3, Informative)

## Bender0x7D1 (536254) | more than 7 years ago | (#15960113)

However, in the general case of debunking bogus encryption statements, it can be quite difficult. Why? Because encryption is hard, hard, hard, hard stuff. I've taken a graduate course in cryptography and all it did was reinforce how easy it is to make a mistake.

The only encryption scheme I know of that is provably unbreakable is the one-time pad. However, proper implementation of that is a pain. Is AES secure? We don't know. We think it is, but there are researchers looking for weaknesses because we don't know for sure. It is possible (however unlikely) that someone will develop an attack tomorrow, sending everyone scrambling for a new encryption algorithm.

## Re:Simple for this - hard in general (1)

## SanityInAnarchy (655584) | more than 7 years ago | (#15960317)

That's why we have other crypto algorithms ready. I use Blowfish for my VPN connections.

## The only thing I can think of... (1)

## Duhavid (677874) | more than 7 years ago | (#15960152)

Is exactly equivalent to one round of the same, with a different key, and no harder to crack. In fact, if your key is the same length as your alphabet, it would be the plaintext again.

## Group theory (4, Informative)

## coyote-san (38515) | more than 7 years ago | (#15960159)

A single encryption key defines a one-to-one mapping between each plaintext block and ciphertext block. (In ECB (electronic codebook) mode, but the chained block encoding follows a similar analysis.) It has to be bidirectional so that the decryption is unique.

This means that, AT A MAXIMUM, your meaningful keyspace can be no larger than 2^(number of bits in block). In practice it takes a lot of hard work to ensure that you have 2^(number of bits in key) distinct and non-trivial mappings, and ideally for any arbitrary plaintext and ciphertext block you can guarantee that there is a unique key that will produce that result. However it's easy to screw up and only have, e.g., 2^40 actual encryptions even though you have a keysize of 64 or even 128 bits.

Double encryption also defines a one-to-one mapping between each plaintext block and ciphertext block... but you have the same blocksize. If the keys form a "group", then any double encryption will be equivalent to a single encryption with a different key. In fact any N encryptions will be equivalent to encryption with a single different key. Some combinations will leave you with no encryption. (XOR encryption is a trivial example of this -- you'll get no encryption whenever you have an even number of encryptions with the same key.)

Worse, your keys may not form a group because there's some property that gets cancelled out in a double encryption. E.g., maybe something drops out each time both keys have a '1' in the same spot. In this case double encryption would be significantly weaker than single encryption.

Triple-DES encryption is an oddball since it isn't (or wasn't) known whether DES keys form a group. Remember that a DES key is only 56 bits, but a DES cipherblock is 64 bits -- you know you can't get every possible mapping between plaintext and ciphertext by running through the keyspace.

Hence 3DES. However 3DES isn't simply encryption three times, e.g., the form we studied in my grad class used two (not three) 56-bit keys, in an EDE pattern. That is, you encrypt with the first key, then DECRYPT with the second key, then encrypt with the first key again. I didn't follow all of the reasoning but I seem to recall it was because of some property of the S and P boxes.

## Re:Group theory (2, Interesting)

## Beryllium Sphere(tm) (193358) | more than 7 years ago | (#15960435)

Wasn't, although the result came very late in the history of DES:

K.W. Campbell and M.J. Wiener, DES is not a group, Advances in Cryptology - Crypto '92, Springer-Verlag (1993), 512-520.

Beep! Retrieval complete.

## Re:Group theory (1)

## swillden (191260) | more than 7 years ago | (#15960526)

*"However 3DES isn't simply encryption three times, e.g., the form we studied in my grad class used two (not three) 56-bit keys, in an EDE pattern. That is, you encrypt with the first key, then DECRYPT with the second key, then encrypt with the first key again. I didn't follow all of the reasoning but I seem to recall it was because of some property of the S and P boxes."*

The reason I learned is much more prosaic: With the EDE pattern, you can implement single DES simply by using the same key for both halves. That way you can build 3DES hardware that can also do 1DES operations, just by feeding it a 'symmetrical' key. No need for separate 1DES hardware, or for a 3DES/1DES switch.
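A quick sketch of that degeneration with a stand-in toy cipher (any cipher with a matching enc/dec pair behaves the same way; the cipher here is purely illustrative):

```python
def enc(data: bytes, k: int) -> bytes:     # toy stand-in cipher
    return bytes(((b ^ k) + k) & 0xFF for b in data)

def dec(data: bytes, k: int) -> bytes:
    return bytes((((b - k) & 0xFF) ^ k) for b in data)

def ede(data: bytes, k1: int, k2: int) -> bytes:
    # Encrypt-Decrypt-Encrypt, the 3DES keying pattern.
    return enc(dec(enc(data, k1), k2), k1)

P = b"attack at dawn"
# Feeding the same key to both halves degenerates to single encryption,
# which is exactly the backward-compatibility trick described above.
assert ede(P, 0x42, 0x42) == enc(P, 0x42)
print("EDE with k1 == k2 is plain single encryption")
```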

## This is just so wrong (1)

## eyal (774028) | more than 7 years ago | (#15960581)

Out of those factorial(2^N) possible permutations, only a very very very very very few are actually used, since the key size is nowhere near that large. In AES, for example, the block size is 128 bits, but the largest key size is 256 bits, even though the number of possible permutations is factorial(2^128). This means that given a random plaintext/ciphertext block pair, it's actually very improbable that you'd find a key to generate that result.

The only reason that double encryption doesn't provide "double security" is that a meet-in-the-middle attack is possible (see other comments for more details), and that kind of attack only takes twice the time (on average) of a brute force attack on a single key. 3DES provides more security than regular DES, since a meet-in-the-middle attack on 3DES makes its security roughly equivalent to using a 112-bit key (56*2), and that's still too large to be practically broken by brute force alone.

## Wrong. (1)

## cr0z01d (670262) | more than 7 years ago | (#15960582)

For example, the maximum meaningful key size of a 128-bit block cipher is log2((2^128)!), which is about 4 × 10^40 bits.

## Possibly not as bunk as you think... (2, Insightful)

## inio (26835) | more than 7 years ago | (#15960166)

A good encryption scheme doesn't have any header, for exactly this reason. It reads in n words and writes out n words and that's that. So if your coworker is doing:

C = E_K2(E_K1(P))

there are indeed 128 bits of information required to decrypt the document. Given a plaintext+cyphertext pair a meet-in-the-middle attack might be possible, but the search space is still on the order of 2^128 (cryptanalysis could probably reduce this some). Given plaintext+intermediate+cyphertext, it's down to 2^64, because each half could be cracked separately.

With a good encryption algorithm the first pass ( E_K1(P) ) generates something approaching line noise - close enough so that there's no statistical way to tell real random bits from encrypted data (many RNGs now use encryption routines to boost incoming entropy). There is no structure, there is no header, and there is no way to verify that it's an encrypted anything. It might as well be a bunch of random bits. Given the cyphertext, your total search space to find the plaintext is still 2^64 * 2^64 = 2^128 sets of keys because there's no way to tell that the first round was performed correctly - it will always just be random bits.

Now, if there IS a header added you're back to n*2^64. For example, the software compresses (and thus adds a header to) the data before encrypting it. Given the cyphertext you can search for keys that decrypt to have a valid header, and then search for keys that produce reasonable english text from those outputs.

## Re:Possibly not as bunk as you think... (2)

## Coryoth (254751) | more than 7 years ago | (#15960386)

Mounting a known-plaintext attack (plaintext + ciphertext pair) using meet-in-the-middle on something encrypted twice with two different 64 bit keys requires only 2^65 trials. It's a time memory trade off, so it is a little memory expensive, but it's well short of the 2^128 trials you claim.

What you need is two ciphertexts (potentially just two blocks from an encrypted message) C1 and C2, plus two corresponding plaintexts P1 and P2 such that C1 = E_K1(E_K2(P1)) and C2 = E_K1(E_K2(P2)). For each possible key K you compute E_K(P1) and store all the results in memory. Then you compute D_K(C1) for each possible K, and look up to see if it matches any of your stored results. If it's in there then try encrypting P2 with the two keys you've found. If that gives you C2 then you've (with very high probability, but technically some uncertainty) got the right keys. All up that takes at most 2 searches through the key space, or 2*2^64 = 2^65.

## Somebody rode the short yellow bus... (1)

## gru3hunt3r (782984) | more than 7 years ago | (#15960192)

Okay, so a 64-bit key has 18,446,744,073,709,551,616 possible combinations (2^64).

A 128-bit key (2^128) has roughly 3.4 × 10^38 combinations.

Encrypting a file TWICE with a 64-bit key means you get 2^64 * 2 = 36,893,488,147,419,103,232 combinations to search.

Effectively -- 65-bit encryption.

Albeit this is dramatically oversimplifying the types of attacks, etc. I feel dumber for even reading this question.

## Re:Somebody rode the short yellow bus... (1)

## QuasiEvil (74356) | more than 7 years ago | (#15960372)

Ummm, I think the answer is rather obvious... I'm posting on

## Complications... (1)

## The Grey Ghost (884000) | more than 7 years ago | (#15960284)

Once you've established that bits aren't the only answer, start poking holes. For example, a guessable header produced by the first round of encryption would be known plaintext to work against the second round. As mentioned by others, the cipher itself may be vulnerable to an algebraic attack. Certain types of ciphers operate in such a way that the second round only changes the key needed to obtain the original plaintext: C(k2, C(k1, P)) = C(k3, P).

Start poking holes in their knowledge of cryptanalysis.. that's the key to the argument.

## Look at the HAC (3, Informative)

## petard (117521) | more than 7 years ago | (#15960311)

In particular, the part that answers your question is found in Chapter 7 (block ciphers) [uwaterloo.ca]. The justification for fact 7.33 explains why, if you can store 2^64 blocks of ciphertext, you can break double 64-bit encryption in 2^64 operations.

If your coworker is not capable of understanding the math behind this reasoning, he really should not be making decisions about encryption.

(Note: I'm not withholding the justification in this post simply to be obtuse. It's too much of a pain to format it reasonably in this input box... you'll just have to refer to the PDF.)

## Well, duh, so... (0)

## Anonymous Coward | more than 7 years ago | (#15960332)

"Doesn't that mean that we don't have to protect the keys?"

This in reference to a 3DES application...

## I'm right and I know it (2, Interesting)

## SanityInAnarchy (655584) | more than 7 years ago | (#15960337)

The other method of going over their head is to cite someone -- not always even a specific person -- who is smarter than both of us. So for crypto algorithms, I can ask them, "If it was really that easy, don't you think someone else would've thought of it by now? Why do you think we invent wholly new algorithms for 128 bits?"

And if both of those fail, the final method of going over their head is to cite specific sources and papers, and invite them to actually disprove what's said there (without really having to understand it yourself) if they really think they're right.

All of these, of course, are for when I know I'm right. Quite often, I really don't know, and I stay humble there. But when I know I'm right, it's usually much more important to hit them over the head with one of the above than to try to explain why I'm right. After all, you don't have to know why a car is safer than open air in a lightning storm, you just have to get in the car. It's not important that someone understand why not to just double-encrypt with 64 bits, it's just important that they not do it.

## Example: One-time pad (1)

## gweihir (88907) | more than 7 years ago | (#15960345)

E_KC(plaintext) = E_KB(E_KA(plaintext))

If you just encrypt 64 bits with a one-time pad (trivial example), then this is obvious:

C = A XOR B

I.e. with one-time pad you gain exactly no security at all.

For many ciphers, such a key C may not exist, but even if it does not exist, an attack on E_KB(E_KA(plaintext)) may be just as easy as an attack on a single key. What is worse is that for some ciphers (e.g. DES) it is not known whether double encryption with two different keys is more secure, and if so by how much. Until this is known for a cipher, it has to be assumed that encrypting each block with two keys is no more secure than single encryption.

However there are ways to make a cipher more secure if you have more key material. For example you can use one part as the key and one part as the IV for CBC. Or you can split the message, and encrypt each part with a different key. That gives the attacker a bit less material to work with. Or you can use the keys with different ciphers and stack those on top of each other. If the ciphers are used in stream mode, then the result is provably as secure as the strongest one.

You can also mix in any extra key material as additional data blocks, e.g. in CBC. This again makes a ciphertext-only attack much harder.

## An analogy... (1)

## spongeboy (681073) | more than 7 years ago | (#15960350)

You can guess the letters one at a time (i.e. "is the first letter 'a'?").

Your coworker has to guess the letters two at a time (i.e. "is the phrase 'aa'?")

Generally speaking, who wins?

## Re:An analogy... (1)

## amliebsch (724858) | more than 7 years ago | (#15960598)

*"You can guess the letters one at a time (i.e. "is the first letter 'a'?"). Your coworker has to guess the letters two at a time (i.e. "is the phrase 'aa'?") Generally speaking, who wins?"*

To make this analogy work, the answer to the first person would have to be "MAYBE" every time they asked, because there's no intrinsic way to know when you've guessed the key to decrypt pseudorandom ciphertext - it looks just like all the other pseudorandom data that invalid keys would generate.

## And encrypting 8 times with a 8 bits key ? (1)

## file-exists-p (681756) | more than 7 years ago | (#15960438)

See the subject.

## The most simple way of explaining this. (1)

## AlphaWolf_HK (692722) | more than 7 years ago | (#15960489)

For those who aren't mathematically inclined: what you effectively have is 2 as your base number (each bit is a switch with two positions, on or off) and the total number of switches as your exponent (remember from 8th grade math, 2^3 = 2*2*2, as if you had three switches). If you are still lost, put it this way: 64-bit encryption is 2^64 different possibilities, and 128-bit encryption is 2^128 different possibilities. Thus 128-bit encryption is 18,446,744,073,709,551,616 times stronger than 64-bit encryption.

If you encrypt in 64-bit, and then encrypt it in 64-bit again, you only double the strength, as you are basically adding 2^64 plus 2^64, which equals 2^65, or 65-bit encryption effectively. 2^128, on the other hand, is actually 2^64 times 2^64. So if your friend wanted to achieve 128-bit encryption this way, he would effectively have to encrypt that file 2^64 times, not just two times.
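A quick sanity check of that arithmetic with Python's big integers, using the naive model above where each extra pass adds another 2^64 of brute-force work:

```python
assert 2**64 + 2**64 == 2**65            # two passes buy one extra bit
assert 2**64 * 2**64 == 2**128           # what 128-bit actually means
passes = (2**64 * 2**64) // 2**64        # passes needed to reach 2^128
print(passes == 2**64)                   # True: 2^64 passes, not 64
```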

Now 2^64 may sound like a pretty big number, but with computers being as fast as they are today, we can search encryption keys (or in layman's terms, "brute force" them) at a pretty fast rate, so that number isn't really that large in the grand scheme of things. However, if it is 2^64 times that amount, then that number suddenly becomes much, much larger, and thus takes exponentially longer for an everyday home computer or even a cluster of computers to break.

To cryptography know-it-alls: I know, this is an oversimplified explanation, but it answers the full scope of his question, so spare me the "but you forgot this" or "you can break some encryption without brute force" replies.

## linear vs. non-linear (1)

## Shirloki (563610) | more than 7 years ago | (#15960575)

Typically, with encryption, the time required for a brute-force attack is exponential in the number of bits. So:

let n = the number of bits in the encryption key

let t = time required to crack said encryption

and a = an arbitrary constant

typically t = a * 2^n.

So, when you encrypt with a 128-bit key, t = a * 2^128

But if you use a 64-bit key twice, then t = 2a * 2^64 = a * 2^65 since all you have to do is crack a 64-bit key, then crack another.

As you can see, using two passes with a 64-bit key is 2^63 times worse than just using a 128-bit key once.