Debunking a Bogus Encryption Statement?

Cliff posted about 8 years ago | from the this-and-other-gems dept.

deviantphil asks: "Recently, a coworker tried to assert that encrypting a file twice with a 64-bit algorithm is equivalent to encrypting it once with a 128-bit algorithm. I know enough about encryption to know that isn't true, but I am having difficulty explaining why and how. Doesn't each pass of the encryption create a separate file header, which makes this assertion untrue? Can anyone point me to references that would better help me explain this?" What other laughable claims have you heard attributed to encryption, and how were you able to properly lay them to rest?

Meet in the middle attack (5, Informative)

dtfinch (661405) | about 8 years ago | (#15959969)

http://en.wikipedia.org/wiki/Meet-in-the-middle_attack [wikipedia.org]

That's why we have triple-DES instead of double-DES.

Mod parent up (0)

Anonymous Coward | about 8 years ago | (#15960090)

This is right on the money, and far more correct than pretty much everything else I've seen posted here. It's even a first post! What's not to love?

Re:Meet in the middle attack (0)

Anonymous Coward | about 8 years ago | (#15960112)

Thanks for that. If I met you in a bar, I would give you a drink.

Re:Meet in the middle attack (2, Informative)

TheSHAD0W (258774) | about 8 years ago | (#15960399)

Even if DES is tripled rather than doubled for their "128-bit crypto", I still don't like it much. The block size is still 64-bit, which means 2^64 possible combinations versus the 2^128 for ideal 128-bit. I say ideal, because typically attackers find ways of reducing the effective number of possible combinations for an algorithm. Original DES has been reduced significantly, so triple-DES was designed to improve it, and do so using the same encryption hardware. But while triple-DES may be closer to the ideal 64-bit, even a non-ideal 128-bit algorithm should exceed that number of effective bits.

Re:Meet in the middle attack (4, Insightful)

swillden (191260) | about 8 years ago | (#15960470)

The block size is still 64-bit, which means 2^64 possible combinations versus the 2^128 for ideal 128-bit.

When talking about block ciphers, block size and key size are separate things. Generally, if you say algorithm X is an n-bit cipher, you're talking about key size, not block size. Given use of good block chaining modes, ciphers with larger block sizes don't really offer significant additional security.

typically attackers find ways of reducing the effective number of possible combinations for an algorithm. Original DES has been reduced significantly, so triple-DES was designed to improve it

Original DES hasn't really been reduced significantly. There are attacks which reduce the computational complexity to, IIRC, ~41 bits, but at the expense of requiring impractical space. All in all, DES has stood up extremely well, and its practical strength hasn't been significantly reduced by 30+ years of scrutiny.

The problem 3DES was created to address was a deliberate limitation of the design: the small keyspace. The strongest variant of Lucifer (the original IBM cipher which became DES) used a 128-bit key, but the NSA deliberately had it reduced to a 56-bit key, presumably because they thought they could break the cipher with the smaller key size while it remained secure enough against others. Advances in computing technology, of course, have put a brute-force search of a 56-bit keyspace within reach of run-of-the-mill desktop computers. 3DES addresses this by increasing the effective keyspace to 112 bits.

Re:Meet in the middle attack (4, Informative)

eddeye (85134) | about 8 years ago | (#15960552)

It's even worse if the set of permutations generated by your cipher forms a group. In that case, for any two keys k1 and k2, there exists a third key k3 such that ENC_k2(ENC_k1(x)) = ENC_k3(x). In other words, you can find a third key that produces the same permutation as the composition of keys 1 and 2. This means that breaking a double encryption (or triple, quadruple, etc.) is no harder than breaking a single encryption: the resulting permutation can always be described by a single key.

That's why 3DES uses EDE instead of EEE. While DES doesn't form a group, it does have some group-like structures which reduce the workload quite a bit. This doesn't apply to all ciphers, btw; there are many more possible 64-bit permutations than 64-bit keys, so compositions can fall well outside those covered by the keyspace.

I think this is covered in chapter 7 of the Handbook of Applied Cryptography [uwaterloo.ca].

Your keyspace wouldn't be that much bigger (5, Insightful)

Profane MuthaFucka (574406) | about 8 years ago | (#15959997)

Seems like encrypting twice with a 64-bit key just means that instead of breaking a 64-bit key once, you have to do it twice. That's like using a 65-bit key, instead of a 64-bit key. Every bit doubles your keyspace.

Plus, you have to realize that the amount of time needed to brute force a single key of a certain size goes up non-linearly with each additional bit. If you just double the number of times you encrypt, you're pretty much just increasing the effort to brute force the thing linearly.

Re:Your keyspace wouldn't be that much bigger (5, Funny)

sr180 (700526) | about 8 years ago | (#15960008)

Think of it this way: if you encrypt something with ROT13 twice, does it become any more secure?

Re:Your keyspace wouldn't be that much bigger (0)

Anonymous Coward | about 8 years ago | (#15960050)

Rot-13 is a transformation. It hardly qualifies as encryption.

Re:Your keyspace wouldn't be that much bigger (1)

SCPRedMage (838040) | about 8 years ago | (#15960290)

Mod parent "-1, missed the joke".

Re:Your keyspace wouldn't be that much bigger (2, Informative)

igb (28052) | about 8 years ago | (#15960524)

Yes, but rot-N forms a group under composition (i.e., there exists a Z such that rot-X followed by rot-Y is equivalent to rot-Z). DES, for example, has been shown not to form a group, so there is no single key equivalent to the use of two distinct keys.

ian

Re:Your keyspace wouldn't be that much bigger (2, Insightful)

Jeff DeMaagd (2015) | about 8 years ago | (#15960080)

A better illustration is to ask them if they want two ten-dollar bills or one one-hundred-dollar bill. Both will get you two zeros, but obviously where those zeros are makes a big difference in the value of the number.

Re:Your keyspace wouldn't be that much bigger (1, Funny)

Anonymous Coward | about 8 years ago | (#15960140)

Yeah genius, how about asking them to give *you* either two tens or a hundred. Trust me, it works out better that way.

Re:Your keyspace wouldn't be that much bigger (3, Informative)

TCM (130219) | about 8 years ago | (#15960128)

Seems like encrypting twice with a 64-bit key just means that instead of breaking a 64-bit key once, you have to do it twice.

I think there's more to this problem than you realize at first. Brute force relies on the ability to identify when you have successfully broken the ciphertext. If you encrypt random data and then try to brute-force it, how will you know at the first layer that you have successfully broken it? It's not like some magic bit is telling you whether the key was correct or not. If you decrypt with the wrong key, you just get (truly random?) garbled data.

Wouldn't you have to do the "inner" 64-bit brute-force procedure for each possible key of the "outer" 64-bit keyspace, thus making it 128-bit again? I'm feeling the effective key strength is somewhere between 64 bits and 128 bits, certainly more than 65 bits, but not really 128 bits.

IANAC(ryptologist).

Re:Your keyspace wouldn't be that much bigger (5, Insightful)

Kadin2048 (468275) | about 8 years ago | (#15960297)

I think a big part of that comes down to the file headers and how you actually implement the cryptographic algorithms in a system.

If you take a plaintext file and encrypt it into a file which has headers ("BEGIN ENCRYPTED CONTENT---"), and then encrypt the result again, then, assuming the attacker knows how you did it and that the intermediate file has plaintext headers, they'll know the moment they've broken the outer 64-bit encryption layer. So in this example, you're basically at 65 bits.

Now, if you don't include any headers, so that there's no terribly good way to determine whether you've gotten the right key as you're brute-forcing the outer layer, then I think you're right -- the strength of the overall system is somewhere in a grey area between 65 and 128 bits.

If someone was just thinking that they could use a file-encryption utility twice (one which produces output files that have plaintext headers) and double the keyspace, they are dead wrong.

Re:Your keyspace wouldn't be that much bigger (5, Informative)

swillden (191260) | about 8 years ago | (#15960505)

Wouldn't you have to do the "inner" 64bit bruteforce procedure for each possible key of the "outer" 64bit keyspace, thus making it 128bit again?

No, you do a meet-in-the-middle attack, which is basically 2^64 in complexity if you're using two 64-bit keys.

There are some optimizations that can be done, but the basic idea is this: You start with one ciphertext block and its corresponding known plaintext. Then you encrypt the plaintext with all 2^64 possible keys and store the results (with some indexes to make it easy to search for a particular block). Then you decrypt the ciphertext with all 2^64 possible keys, looking each result up. When you find a match, you have found the pair of 64-bit keys that turns that plaintext into that ciphertext. So to search the entire space, you have to do 2^64 + 2^64 = 2^65 trial operations. On average, you only have to search half of the second keyspace, so the expected cost is 2^64 + 2^63 ≈ 1.5 × 2^64 trial operations (plus the huge storage cost).

Triple encryption is also weakened by a meet-in-the-middle attack. Using three 64-bit keys, it would be nice to think you have a 192-bit key. But a meet-in-the-middle attack requires encrypting with all 64-bit keys for the first step, and decrypting with all 128-bit key pairs for the second step, giving an effective strength of 2^64 + 2^127 ≈ 2^127 (plus the huge storage cost).

Finally, keep in mind that DES keys aren't 64 bits. They're 56 bits. So 3DES has a brute-force search computational complexity of 111 bits, on average.
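
To make the attack concrete, here is a minimal Python sketch of a meet-in-the-middle search against a toy 8-bit cipher (my own illustration, not anything from the posters above; the cipher, constants, and keys are invented for demonstration). The same two-step structure gives roughly 2 * 2^64 trials against double 64-bit encryption instead of the naive 2^128.

MUL, MUL_INV = 167, 23  # 167 * 23 == 1 (mod 256), so the toy cipher is invertible

def enc(block, key):
    # Toy 8-bit block "cipher": XOR with the key, then multiply by an odd constant.
    return ((block ^ key) * MUL) % 256

def dec(block, key):
    # Exact inverse of enc.
    return ((block * MUL_INV) % 256) ^ key

k1, k2 = 0x3A, 0xC5        # the two secret keys
p = 0x42                   # one known-plaintext block
c = enc(enc(p, k1), k2)    # the double encryption the attacker sees

# Step 1: encrypt the known plaintext under every possible inner key (2^8 trials).
middle = {enc(p, k): k for k in range(256)}

# Step 2: decrypt the ciphertext under every possible outer key, looking for a match.
candidates = [(middle[dec(c, k)], k) for k in range(256) if dec(c, k) in middle]

print((k1, k2) in candidates)  # True: found with ~2 * 2^8 work instead of 2^16

In practice you use a second known plaintext/ciphertext pair to weed out the false-positive key pairs that collide by chance.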

Intuition doesn't work well in crypto (3, Insightful)

billstewart (78916) | about 8 years ago | (#15960258)

Any sentence beginning with "seems like encrypting twice..." is likely to be doomed to bogosity, unless there's a later clause in it like "but that's not what really happens". Crypto not only depends on a lot of deep math, it also depends on a lot of cryptographers spending time following bits around to see where they go and how to break them.

Sometimes things do what your mathematical intuition tells you, if you're mature enough in the field to have a solid intuition, but often they don't. Problems can be very hard if looked at from one direction and very easy (or at least less hard) when looked at from another direction, and a cryptographer's job is to make sure they've checked out _all_ the directions because it only takes one weak one to break something. NP-complete problems are especially that way - they're potentially useful for crypto because there's one easy direction if you know the key, but many problems can't be transformed in a way that you can use the easy path and the attacker's stuck on the hard paths.

But even the bit-twiddly algorithms, like most hash functions or the S-box building blocks inside Feistel-structured algorithms, can often be cracked by people examining them closely enough to find conditions under which they can deduce bits. For instance, MD5 is pretty much busted these days.

And both mathematical crypto and bit-twiddly crypto have to worry about Moore's Law and brute force - some algorithms scale well, so you can double the strength of an N-bit key by adding 1 bit or maybe logN bits, while others don't, or they form groups so that encrypting a message multiple times with an N-bit key still only gives you N bits of strength (leaving aside pathological cases like rot13).

No it's not -- if there's no known plain text (1)

Nicolas MONNET (4727) | about 8 years ago | (#15960592)

If there is no known plain text in the second pass of the encryption, how do you know you've broken the first pass?

If there's no fixed header and no way to quickly distinguish a putative cleartext from rubbish added in between the two encryption phases, then in effect a brute-force attack would take 2^64 * 2^64 steps, that is to say 2^128...

Easy (3, Insightful)

suricatta (617778) | about 8 years ago | (#15959999)

Each bit doubles the strength

Re:Easy (1)

flyboy974 (624054) | about 8 years ago | (#15960599)

While each bit does double the protection, the biggest protection is the attacker not knowing what algorithm somebody used.

If I were to encrypt a file once with 64 bits, I would have 2^64. Now, if the intruding party did not know that I encrypted the file a second time, they would then have to try to brute-force again with an additional 64 bits. Therefore, I would really have 2^64 * 2^64, or 2^128. The file is absolutely useless unless both sets are compromised.

We are not talking about a single additional bit here. We are talking about two separate uncompromised keys in a file format which is completely unknown! So it wouldn't just double the encryption cost, because with virtually any encryption algorithm, encoding a file once and then encoding it again doesn't mean you can decode it by adding a single bit to the key. Duh. You have to completely decrypt the outer layer before you can even attack the inner one. Thus there are 64 bits of primary encryption, and 64 bits of additional encryption.

So, even if you brute-forced the first 64 bits, you get to do it again, thus you are really trying to brute-force 2^128. Although that assumes that both keys are at the extreme ends of the brute-force search.

more is better (-1, Redundant)

Anonymous Coward | about 8 years ago | (#15960004)

I always use ROT13 twice.

While you're at it... (3, Funny)

unitron (5733) | about 8 years ago | (#15960006)

While you're double super secret encrypting that file why not keep compressing and re-compressing it until you have it down to 1 bit as well? :-)

Re:While you're at it... (3, Funny)

QuantumG (50515) | about 8 years ago | (#15960015)

Bruce Schneier can compress arbitrary random data... with his fists.

Re:While you're at it... (2, Funny)

RuBLed (995686) | about 8 years ago | (#15960056)

but Chuck Norris can roundhouse kick that compressed data back into its raw format.

Re:While you're at it... (4, Funny)

QuantumG (50515) | about 8 years ago | (#15960156)

Geologists recently discovered that "earthquakes" are nothing more than Bruce Schneier and Chuck Norris communicating via a roundhouse kick-based cryptosystem.

Re:While you're at it... (1)

Psiven (302490) | about 8 years ago | (#15960308)

Rofl

Re:While you're at it... (1)

Patrik_AKA_RedX (624423) | about 8 years ago | (#15960565)

I use that technique all the time.
Here are some interesting downloads for you:
My whole collection of the latest movies: '1'
All Microsoft programs: '0'
A working beta of Duke4Ever: '0'
ISOs of the 20 best games of 2005 and 2006: '1'
Every MP3 you could find on the net: '1'
pron: 12,250,000 images, 512,054 movies: '0'

I'll upload the decompress util as soon as I get that last bug out of it. But you could finish the downloads in the meantime.

Re:While you're at it... (1)

megaditto (982598) | about 8 years ago | (#15960590)

This works like so:

1) Find an OTP that XORs your data to all 1's
2) XOR the data and the OTP
3) Compress the resulting file
4) Report the resulting compression size on Slashdot
__

Tis not enough to succeed. Others must fail.

It's like this.. (1)

BigZaphod (12942) | about 8 years ago | (#15960011)

Encrypting a single document multiple times in that way is like using a piece of duct-tape to keep a box closed. Putting another strip of duct-tape directly over the top of the previous strip does not make the box any harder to open.

Of course this is just an analogy from a non-expert... Use at your own risk. :-)

Re:It's like this.. (1)

AP2k (991160) | about 8 years ago | (#15960099)

At some point so many strips of duct tape are going to hold down the box lid due to sheer mass.

Re:It's like this.. (1)

BigZaphod (12942) | about 8 years ago | (#15960176)

At some point so many strips of duct tape are going to hold down the box lid due to sheer mass.

True, but in the end you could just pry the whole mess off with a crowbar or something. It's only as strong as the glue on the bottom piece.

Debunking this claim (4, Informative)

jordandeamattson (261036) | about 8 years ago | (#15960012)

The length of the key (64 bits, 128 bits) at a simple level describes the complexity or "hardness" of the encryption. Encrypting with a 64-bit key will give you one level of "hardness" and 128-bit will give you another, greater, level of hardness.

If you encrypt twice with a 64-bit key, you will just be taking a plaintext file and encrypting it, and then taking your encrypted file and encrypting it once again. To break the encryption on this file, getting access to the plaintext, you would need to break 64-bit encryption twice.

While this would be more of a challenge than breaking standalone 64-bit encryption, it wouldn't be equivalent to the security of a 128-bit key. The difficulty of breaking 128-bit encryption isn't 2 times that of 64-bit encryption. It is actually greater - I can't quantify it without additional research; hopefully someone who can recall it off the top of their head can chime in with the details.

Now, breaking a doubly encrypted file would present some interesting challenges. The most obvious is that your plaintext for the first layer of encryption is anything but plaintext. Confirming that one set of encrypted text is the right one vs. some other set of encrypted text would definitely be a challenge. It would make it difficult to determine if you had found the right key to decrypt your first layer of encryption.

As an aside, I do remember reading about code systems where double encryption actually made the resulting encryption less secure. I don't remember the details, but now my brain is itching and I will have to do some research. Thanks!

Yours,

Jordan

Re:Debunking this claim (1)

ydra2 (821713) | about 8 years ago | (#15960055)

"As an aside, I do remember reading about code systems where double encryption acutally made the result encryption less secure. I don't remember the details, but now my brain is itching and I will have to do some research."

I've had good results double encrytpting with the XOR method. ROT13 is another good candidate for double encrytpion.... Doggonit! Somebody got my root password again! I'm not settling for anything less than quadruple encrytpion from now on.

Re:Debunking this claim (1)

budgenator (254554) | about 8 years ago | (#15960147)

I remember that there was a system that was faulty, and with some keys double encryption would result in the file being decrypted, but I can't remember which system it was.

Re:Debunking this claim (1)

jordandeamattson (261036) | about 8 years ago | (#15960175)

Hey Ydra -

I agree that XOR and ROT13 are encryption systems which end up decrypting the ciphertext when re-encrypted. But I will be honest that I wasn't thinking of them, but rather of "real" (i.e. more robust) encryption systems, when I thought about systems where double encryption could leave the ciphertext less secure than it was with one level of encryption.

I think the mistake which spawned this discussion is that people tend to think of encryption as "lock boxes" for information. In the real world, if I put something in a safe and then put that safe in another safe, my stuff seems twice as secure. But with encryption the analogy breaks down, and I am not twice as safe.

Yours,

Jordan

Re:Debunking this claim (-1, Troll)

kjs3 (601225) | about 8 years ago | (#15960059)

In other words, you don't know what you are talking about, tragically think you do, have the arrogance to speak with faux authority, and graced us with 5 paragraphs to prove it. Thanks. Clue: W. Diffie & M. Hellman to start, gain clue, show your math. Yours, Kjs3

Re:Debunking this claim (1)

pizpot (622748) | about 8 years ago | (#15960145)

Answering in a forum is something to do only when you know the answer. But don't diss Jordan; he did make two good new points.

Re:Debunking this claim (1)

kjs3 (601225) | about 8 years ago | (#15960222)

Answering in a forum is something to do only when you know the answer.

Replying to an answer in a forum is something to do only when you know the answer and the other guy doesn't. I actually do know the answer, but didn't feel the need to spoonfeed the D&H article to educate you and Jordan. You could go look it up, find out what a meet-in-the-middle attack is, maybe look at subsequent research, understand how the math actually works, and understand why Jordan doesn't know what he's talking about. Hell, if you actually did your research, you could even have a snappy comeback ("Yeah, but the hash table you'd need to assemble for the attack to work in practice is impractically large! Got you, you smarmy bastard!"). If you two can't do your homework, why should I do it for you?

But don't dis Jordan, he did make 2 new good points.

No, he didn't. See above.

Re:Debunking this claim (1)

ScrappyLaptop (733753) | about 8 years ago | (#15960395)

Rebutting the reply to an answer in a forum is something to do only when... oh, never mind.

Re:Debunking this claim (1)

Capt'n Hector (650760) | about 8 years ago | (#15960068)

As an aside, I do remember reading about code systems where double encryption actually made the resulting encryption less secure. I don't remember the details, but now my brain is itching and I will have to do some research. Thanks!

Rot-13 comes to mind...

Re:Debunking this claim (-1, Troll)

Anonymous Coward | about 8 years ago | (#15960126)

Dude, for someone who is clearly not an authority, you sure talk a lot. I hope your day job is selling carts, not programming computers.

Re:Debunking this claim (4, Insightful)

Coryoth (254751) | about 8 years ago | (#15960212)

As an aside, I do remember reading about code systems where double encryption actually made the resulting encryption less secure. I don't remember the details, but now my brain is itching and I will have to do some research.


It isn't very common for the resulting encryption to be less secure, but "no appreciable improvement" is certainly very possible.

With block ciphers you can run into the issue that if the cipher scheme forms a group (as some do), then even if you use 2 different keys for the two encryption rounds, there will be a single third key that provides identical encryption: an attacker never needs to break your two keys, they can simply find the single third key that is equivalent to your double encipherment. Despite encrypting twice with two different keys, the result is no more secure than encrypting once. For a simple example of this, consider a ROT encryption scheme - if you encrypt a message with ROT9 then ROT12, that is just equivalent to ROT21, so instead of trying to find the two keys 9 and 12, the attacker can just solve the ROT21 problem (the worst case, of course, is if your combination of keys gives the identity element, such as ROT9 and ROT17, or ROT13 twice, in which case you end up with an unencrypted document).

Even if that's not the case you can still get caught with a meet-in-the-middle approach if the attacker has some known plaintext (that is, they have some plaintext with the corresponding encrypted text and are attempting to extract your key) which does pretty much what it says, encrypting one way and decrypting the other to meet in the middle. Using such a scheme you can extract the keys with only marginally more effort than for single encryption.
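
To see Coryoth's ROT example in code, here is a tiny Python sketch (my own illustration; it assumes lowercase-only messages):

def rot(text, n):
    # ROT-n over the lowercase alphabet: shift each letter by n places.
    return "".join(chr((ord(c) - ord('a') + n) % 26 + ord('a')) for c in text)

msg = "attackatdawn"
print(rot(rot(msg, 9), 12) == rot(msg, 21))  # True: ROT9 then ROT12 is just ROT21
print(rot(rot(msg, 9), 17) == msg)           # True: ROT9 then ROT17 is the identity

Because the ROT keys form a group under composition, the double encryption collapses to a single key the attacker can search for directly.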

Re:Debunking this claim (1)

snowgirl (978879) | about 8 years ago | (#15960384)

As an aside, I do remember reading about code systems where double encryption actually made the resulting encryption less secure. I don't remember the details, but now my brain is itching and I will have to do some research. Thanks!

The trivial example of this is ROT13. When you perform the encryption a second time, it actually decrypts the text.

If the encryption is a simple XOR with a key of any length, then a double encryption will actually not increase the complexity of breaking it at all: if you have a plaintext value P, XOR it with K_1, and then XOR the result with K_2, you have simply XORed P with (K_1 XOR K_2). So definitely don't use any sort of operation where two encryptions compose into a single equivalent one.

Re:Debunking this claim (2, Informative)

xsonofagunx (902794) | about 8 years ago | (#15960391)

Going purely by the numbers, I believe that the possible combinations for a 128-bit security system would be 3.4028236692093846346337460743177 * 10^38.

By contrast, a 64-bit encryption scheme leaves you with 18,446,744,073,709,551,616. Done twice, that means 36,893,488,147,419,103,232, which is still far less than 128-bit encryption. I may be doing my calculations wrong (big numbers are scary...) but I think the difference between brute-force cracking a 128-bit key and two 64-bit keys is 340,282,366,920,938,463,426,481,119,284,349,108,224 - so a 128-bit key has that many more possible keys than even using two separate keys and encrypting twice with 64-bit encryption.

The thing that some people don't realize here is that (unless the encryption system has a known weak spot) you would have to brute-force 2^64 level-1 keys, and then (this is the scary part) you would have to try each of the 2^64 level-2 keys on each of the outputs from the 2^64 level-1 trials. So maybe it's not as far from a 128-bit key as one would think, but ONLY provided that the only way of attacking it was brute force.

Re:Debunking this claim (0)

Anonymous Coward | about 8 years ago | (#15960459)

I dunno if I'm being unduly thick here, but it seems simple to me:

If double-encrypting with 64 bits is twice as hard to break (since you have to break it twice), and if that's equivalent to a 65-bit key (assuming brute force and the worst possible luck on the attacker's part - having to walk through all the different possible keys), then it's phenomenally harder to break a 128-bit key:

1 more bit gives a difficulty of (2^1) * (2^64) which equals 2 * (2^64) or 2^65

then isn't brute-forcing 128-bit encryption 2^64 times more difficult - i.e. (2^64) * (2^64) or 2^128?

These are big numbers and big differences.

Got my prize money ready? King Carlos can keep the medal...

Re:Debunking this claim (1)

swillden (191260) | about 8 years ago | (#15960518)

The difficulty of breaking 128-bit encryption isn't 2 times that of 64-bit encryption. It is actually greater - I can't quantify it without additional research; hopefully someone who can recall it off the top of their head can chime in with the details.

The difficulty of a brute force search is directly proportional to the number of keys to be searched. So 2^128 is how many times bigger than 2^64? It's 2^64 times bigger, so a 128-bit keyspace is 2^64 times harder to search than a 64-bit keyspace.

no (1)

crankshot999 (975406) | about 8 years ago | (#15960019)

If you encrypt a file and then encrypt the encrypted file, all that has to be done is crack two layers of the same type of encryption. Try it on a file; I've tried it before.

Think about it as number of possibilities (5, Informative)

Phantombrain (964010) | about 8 years ago | (#15960023)

I'm going to think of it as if you were trying to brute-force it.
If you have 64 bits, that is 1.84467441 × 10^19 (2^64) keys, meaning at most that many tries to break the first layer of encryption. The second layer is the same number, meaning breaking both would take a maximum of 3.68934881 × 10^19 attempts.
With 128-bit encryption, there are 3.40282367 × 10^38 (2^128) different possibilities, MUCH more than the double 64-bit encryption.

Obviously people don't usually brute-force encrypted files, but you can see there are many more possibilities with 128-bit encryption than with double 64-bit.
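
A quick Python sanity check of that arithmetic (my own addition; note it assumes each layer can be verified independently, which other posters dispute elsewhere in this thread):

double_64 = 2**64 + 2**64       # break each 64-bit layer separately
single_128 = 2**128             # one 128-bit keyspace
print(double_64)                # 36893488147419103232, i.e. 2^65
print(single_128 // double_64)  # 9223372036854775808: the 128-bit space is 2^63 times bigger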

Re:Think about it as number of possibilities (1)

tmjr3353 (925558) | about 8 years ago | (#15960092)

Just thinking about a problem an above poster supposed (since we're talking about brute-forcing here) -- how can I discern when I've properly gotten from the second ciphertext back to the first? The deciphered text will still appear as gibberish. The only way I can know*** that I've correctly gotten the keys is to go both levels down to the original plaintext. Now, think about the numbers you've already brought up -- if I need to have both keys to know I've gotten the solution, then I am left with a total of 3.40282367 × 10^38 possible combinations -- the same number of possibilities present in a 128-bit key.

***It's entirely possible that I'm wrong and there is some way to know that I've gotten the proper second ciphertext. If so, I'd appreciate it if someone would be kind enough to enlighten me. :)

Re:Think about it as number of possibilities (1)

budgenator (254554) | about 8 years ago | (#15960220)

I'm no cryptographer, but it seems to me that there are infinitely many possible combinations of encryption systems, keys, and key lengths, so any time you double encrypt a file, there should be an equivalent system and key that would encrypt the file the same way in one pass; therefore you should be able to break the encryption in one pass without the intermediate step. I would expect that double encrypting at 64-bit strength would be equivalent to single encryption at 65-bit strength, and these strengths are only suitable for diversions these days.

Re:Think about it as number of possibilities (1)

jchernia (590097) | about 8 years ago | (#15960289)

If you know the algorithm, then you can view the double encryption as a single function: f(f(x)) -> g(x).
g(x) will admittedly be more complex than f(x), but it will still require only 2^64 brute-force attempts.

If you don't know the algorithm, then how are you brute forcing in the first place?

 

Re:Think about it as number of possibilities (3, Interesting)

tkw954 (709413) | about 8 years ago | (#15960142)

If you have 64 bits, that is 1.84467441 × 10^19 (2^64), meaning maximum that many tries to break the first layer of encryption. The second layer is the same number, meaning to break it would mean a maximum of 3.68934881 × 10^19 attempts.

This assumes that you can determine when you break the first layer of encryption, i.e. it won't work if the encrypted string is not distinguishable from noise. If this is not true, you must try each possible 64 bit second key for each 64 bit first key, for a maximum of 2^64*2^64=2^128 different keys, which would be equivalent to brute-forcing one 128 bit key.

Bad math (5, Funny)

Dan East (318230) | about 8 years ago | (#15960043)

You're both wrong. Encrypting twice at 64 bits is like 64*64 = 4096-bit encryption. Personally, I like to encrypt 7 times at 2-bit encryption for 128-bit strength. If it's something I really want to keep secret, I encrypt four times at 128 bits for 268435456-bit encryption. However, don't make the mistake I made once of encrypting 128 times at 1 bit. Because that is just 1^128, which yields nothing more than 1-bit encryption.

In case anyone is wondering, I'm hoping some DRM coder will gain valuable insight from this post.

Dan East

Re:Bad math (4, Insightful)

thefirelane (586885) | about 8 years ago | (#15960072)

In case anyone missed it, though, there was a good insight in that humor: a clear way of expressing the answer to the posed question:

If two 64-bit encryptions equal one 128-bit encryption, then 128 one-bit encryptions should as well.

This means the file would have a password of either 1 or 0... you would be prompted for this password, and if you got it wrong, you could try again. Obviously you'd get it on the second try. You could repeat this 128 times, and that would be the total of all the protection. Naturally, this means the maximum number of guesses you'd need to make is 256.

After thinking about it that way, it becomes quite clear: 128-bit encryption obviously requires far more than 256 guesses to get the right key. As another poster pointed out, two 64-bit passes equal 65-bit encryption.

Re:Bad math (1)

G-funk (22712) | about 8 years ago | (#15960134)

Well, two 64-bit runs is 128-bit encryption in that there are 2^128 different possible keys. But it's only as useful as 65-bit encryption, because you only need to guess one half at a time. It's just so patently stupid that it really took some work for me to wrap my head around a "for-dummies" explanation... which I was promptly beaten to, of course :)

Re:Bad math (2, Funny)

Miffe (592354) | about 8 years ago | (#15960189)

Naturally, this means the maximum number of guesses you'd need to make is 256.
Well, you only have to guess 255 times, since the last one won't really be a guess. =)

Now, that's a really BAD MATH (3, Informative)

Gadzinka (256729) | about 8 years ago | (#15960493)

If two 64-bit encryptions equal one 128-bit encryption, then 128 one-bit encryptions should as well. This means the file would have a password of either 1 or 0... you would be prompted for this password, and if you got it wrong, you could try again. Obviously you'd get it on the second try. You could repeat this 128 times, and that would be the total of all the protection. Naturally, this means the maximum number of guesses you'd need to make is 256.

And how do you know that you guessed right at every step? No, really. Good crypto doesn't let you distinguish a decryption with a good key from one with a bad key, other than by the content of the plaintext. And since the plaintext in this case is the encrypted text for another step, you have no chance of finding out if it's right. So in reality, you have to check keys 0 and 1 in the first step, then 0 and 1 for the 0 from the first step and 0 and 1 for the 1 from the first step, etc...

Do the math, wise ass: the complexity of brute-forcing 128 one-bit passes is the same as the complexity of brute-forcing a single 128-bit pass -- 2^128.

Robert
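
For concreteness, a tiny Python illustration of the point above (my own addition): the brute-force cost hinges entirely on whether each layer's output can be verified on its own.

n = 128
print(2 * n)   # 256: total guesses if every 1-bit layer were independently verifiable
print(2 ** n)  # 340282366920938463463374607431768211456: guesses when it is not

With no way to recognize a correct intermediate result, the attacker is forced to search all 2^128 combinations of the 128 one-bit keys.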

Re:Bad math (1, Insightful)

Anonymous Coward | about 8 years ago | (#15960550)

Unless said one-bit decrypter doesn't tell/know whether it's right or not.

So, in your above scenario, you enter 0 and get, say, 'g6bSyfS5d...' while 1 gives '3SEgvu6yt...'. Which one is correct and to be decoded with your second (1-bit) key, and which one is garbage? If you don't know, you'd have to try both, giving you 4 possible results after your second key, all of which look like encrypted garble awaiting your third key...

So you actually can get a 128-bit key. Coming up with a strong 128-step 1-bit encryption algorithm is a different matter, but 2-pass 64-bit encryption algorithms are not uncommon (some programs use different algorithms for each step, so if a major flaw is discovered in one, the file still has the protection of the other).

Yeah, right, "D. East" (1)

NotQuiteReal (608241) | about 8 years ago | (#15960261)

Sure, just take "Dan East" and shorten to the more common "DEAST" and you get a simple 5 letter name!

Well, we all know that 2^5 = 32.

So, if we take your slashdot userid of 318230 and multiply by 32 we get 10,183,360 (you can stop checking the math after this).

So, take 10,183,360 and re-write as 10 18 33 60 and then translate those four characters into "WEST" and it is perfectly clear that he is one of them!

I don't have to spell it out! Mr. East is leading you the wrong way, because you are getting too close!

no (1)

robo_mojo (997193) | about 8 years ago | (#15960047)

Encrypting a message twice with 64-bit encryption only makes the message twice as hard to crack. Encrypting a message with 128-bit encryption instead makes it 2^64 times as hard to crack. So your friend's conclusion is off by a large factor.

Re:no (1)

robo_mojo (997193) | about 8 years ago | (#15960101)

On second thought, encrypting a message more than once may not make it any more secure than encrypting it once. Depending on the encryption scheme, you may just be adding the keys together.

Get a book on cryptography (5, Informative)

cunkel (111089) | about 8 years ago | (#15960048)

Well, one common way to dispel a myth is to find an authoritative source on the subject that explains why the myth is untrue. Perhaps a book on cryptography would be helpful, for example:
  • Applied Cryptography: Protocols, Algorithms, and Source Code in C by Bruce Schneier
  • Handbook of Applied Cryptography by Alfred J. Menezes et al.
Either one will explain such things as why double-DES is not twice as strong as DES and common pitfalls in thought and design of cryptographic systems.

Snake oil FAQ (4, Interesting)

Chuck Chunder (21021) | about 8 years ago | (#15960058)

Is a good place to start [interhack.net] if you want to protect yourself against people hawking dodgy crypto.

I used to work at a small company where a guy came in and was trying to get us involved with some dodgy "crypto" product. Management were quite taken with it (he could talk, and he even had a patent!), but technically it was a load of horse shit with some pretty multicolour diagrams. That FAQ helped me crystallise my objections, and the guy was given the heave-ho before the company did anything embarrassing.

Re:Snake oil FAQ (0)

Anonymous Coward | about 8 years ago | (#15960466)

I used to work for a company that shall remain nameless [mcafee.com] that encrypted the administrative password for their enterprise products with Base64 encoding.

Well, at least it wasn't plaintext like it was in the previous versions.

same or different keys? (1)

robo_mojo (997193) | about 8 years ago | (#15960060)

I wonder if he goes through the trouble of making up two different keys or just uses the same key twice.

Simple counterexample for your co-worker (2, Insightful)

metamatic (202216) | about 8 years ago | (#15960075)

Consider the (naive and insecure) XOR encryption.

Take your file, encrypt it with a 16-bit number.

Take the result, and encrypt it again with a different 16-bit number.

Now, how many possible keys do you have to go through to get back the original data?

(Hint: Try a single decryption using key3 = key1 XOR key2.)

This counterexample invalidates the claim that encrypting twice with an N-bit key is equivalent to encrypting with a 2N-bit key. It also demonstrates the worst possible case, that encrypting twice may be no more secure than encrypting once.
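
Here is a short Python sketch of that counterexample (my own illustration): two repeating-key XOR passes with 16-bit keys collapse into one pass with key1 XOR key2.

def xor16(data, key):
    # "Encrypt" by XORing with a repeating 16-bit key (naive and insecure).
    ks = key.to_bytes(2, "big")
    return bytes(b ^ ks[i % 2] for i, b in enumerate(data))

plaintext = b"attack at dawn"
key1, key2 = 0xBEEF, 0x1234

twice = xor16(xor16(plaintext, key1), key2)
once = xor16(plaintext, key1 ^ key2)
print(twice == once)  # True: the double encryption equals one single encryption

So an attacker brute-forcing the doubly encrypted file only ever has to search the 2^16 single keys, not 2^32 pairs.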

Re:Simple counterexample for your co-worker (1)

moyix (412254) | about 8 years ago | (#15960107)

This actually raises a question to which I don't know the answer: if you take a fairly standard symmetric cipher, say DES, and two keys K1 and K2, does there always exist a key K3 such that E_K1(E_K2(Message)) == E_K3(Message)? This is not actually an obvious thing to prove, and I have a feeling it may vary from cipher to cipher.

Any crypto experts want to weigh in?

Re:Simple counterexample for your co-worker (4, Informative)

pclminion (145572) | about 8 years ago | (#15960164)

This actually raises a question to which I don't know the answer: if you take a fairly standard symmetric cipher, say DES, and two keys K1 and K2, does there always exist a key K3 such that E_K1(E_K2(Message)) == E_K3(Message)?

No, this is not always the case. For DES it is certainly NOT the case. The terminology you are looking for is "idempotent cipher." A cipher is idempotent if it obeys the equation you stated -- i.e., for any key pair K1, K2, there is a K3 such that E_K2(E_K1(x)) = E_K3(x). DES is NOT an idempotent cipher.

These questions only seem deep to non-crypto-experts (no offense to you intended) -- and it was certainly accounted for in the design of DES (along with features which harden it against differential cryptanalysis, a technique that wasn't even publicly known outside the NSA at the time -- the people who work on these things are far from stupid).

Re:Simple counterexample for your co-worker (1)

moyix (412254) | about 8 years ago | (#15960173)

No offense taken! I've only had a semester's worth of crypto education, roughly enough to make sure I know that the subject is subtle enough that I should consult someone actually skilled in the area before talking out of my ass :) Thanks for the reply.

Re:Simple counterexample for your co-worker (0)

Anonymous Coward | about 8 years ago | (#15960377)

"idempotent cipher."

On the outside chance that anyone cares, idempotent means "same strength".

Re:Simple counterexample for your co-worker (0)

Anonymous Coward | about 8 years ago | (#15960136)

This doesn't invalidate the general claim. For a general encryption function, applying two keys one after another is not all that different from applying a key twice as long with the sole exception of the meet-in-the-middle attack which FP mentioned.

It is also untrue that in the general case your double-encrypted document only takes twice as long to break, because there is no way of verifying that the first decryption was the correct one. Unless you have access to a ciphertext and corresponding plaintext, the general solution is at best O(2^(2n)) time, where n is the key length. In the particular case in which an attacker has a ciphertext-plaintext pair to work with, it takes only O(2^(n+1)) time to find the keys. Many people have said this holds in the general case, which is simply not true.

XOR (1)

jours (663228) | about 8 years ago | (#15960081)

> What other laughable claims have you heard attributed to encryption, and how were you able to properly lay them to rest?

Well, after all these years I'm still shocked at how often I hear someone refer to "XOR encryption". As for laying such claims to rest, though, you really shouldn't have to. I can come up with thousands of ways to mangle data and claim it's encrypted... but the burden would be on me to prove that any of the methods were reasonable encryption. In my opinion you're probably best off just avoiding conversations like this, because you're certainly not going to make any headway with someone who knows so little about the field.

Re:XOR (0)

Anonymous Coward | about 8 years ago | (#15960210)

XOR is a valid encryption method. The only caveat is that you need to be using a one-time pad, with all the difficulties that entails.

Re:XOR (1)

jd (1658) | about 8 years ago | (#15960560)

A lot of encryption schemes use XOR, and some use XOR exclusively (which makes sense, as it is an exclusive or, after all), but you are correct that XOR is not an encryption method per se. It is a mathematical operation that can be used to encrypt, but that is all.


It is arguable that if you had a one-time pad system (which is provably totally secure so long as the key is secure and used no more than once), where each bit/byte of the message was XORed with a totally random bit/byte from start to finish, and where (since it's a one-time pad) the total input stream for the message was less than or equal to the length of the random pad it was XORed with, then you could refer to this as "encryption by means of XOR". It is also entirely reasonable, particularly if the phrase is an accepted shorthand amongst a particular group, for this to be shortened to "XOR encryption", although that is not strictly correct. Shorthand rarely is.


Newcomers to encryption may misunderstand such shorthand, or may misunderstand the OTP methodology, or may have examined very common encryption algorithms on the Internet and developed the term themselves on the basis of what they see. Such newcomers deserve commendation for being observant enough to get to that point in the first place, then should be forwarded to the information needed to complete their understanding. DON'T tell me you've never misused terminology in a subject you were new to, and DON'T tell me you wouldn't profit in such cases by being forwarded to the necessary information.


So we've got experts using shortcuts and confused newcomers; are there other categories? Well, yes, there are the idiots. There are a lot of these, as can be seen from the very large number of methods marked as fatally flawed on the block crypto lounge and hash function lounge. And those only list the better algorithms. The number of unlisted (even if open source) algorithms is an order of magnitude or two greater, but nobody bothers to look at them beyond the most trivial analysis, because it's so obvious that the approach is utterly flawed from beginning to end.


Unfortunately, a lot of idiots get to produce professional crypto products (the Quartz public-key system is one example) so it's not even possible to go by appearances.

Like matrices? (0)

Anonymous Coward | about 8 years ago | (#15960085)

Wouldn't this be like matrix multiplication? One can multiply any set of same-dimensional matrices and end up with a single matrix. Thus you can apply translations and rotations to a large set of points using this final matrix instead of applying each matrix to each point.

I do not know, but I would suspect that a double encryption with 64-bit keys could be decrypted by a single key different from each of the original keys.

Encryption not like matrices... (1)

slew (2918) | about 8 years ago | (#15960440)

Actually, people who develop ciphers go out of their way to make sure the scrambling algorithms they use are not groups, fields, or other easy-to-analyze abstract algebraic constructs like matrices. If a cipher were a group or a matrix, then repeated operations wouldn't be very secure (e.g., 3DES would be pretty much just DES with an alternate set of keys), just like a sequence of matrix operations can be replaced by a composite matrix.

Of course a cipher, being a mapping function that maps its input set to an equivalently sized output set, will generate a high-dimensional permutation group, but that's subtly (yet significantly) different from being a group or matrix...

Meet in the middle attack (1)

dido (9125) | about 8 years ago | (#15960098)

There's a reason why triple encryption is used rather than double encryption. There's a technique known as the Meet-in-the-middle attack [wikipedia.org] which, by trading off storage space for time, lets you break a double encryption with two independent keys using only twice the time needed for single encryption, meaning that using two independent 64-bit keys would only require 2^65 work, rather than 2^128 as one might expect. That's the reason why you never hear of "double DES" but rather of Triple DES with independent keys.

easy solution (1)

ceejayoz (567949) | about 8 years ago | (#15960104)

Ask him if encrypting something with a 1 bit key 128 times is equally secure.

Simple for this - hard in general (3, Informative)

Bender0x7D1 (536254) | about 8 years ago | (#15960113)

In this case, you can simply refer to the meet-in-the-middle attack as mentioned by dtfinch.

However, in the general case of debunking bogus encryption statements, it can be quite difficult. Why? Because encryption is hard, hard, hard, hard stuff. I've taken a graduate course in cryptography and all it did was reinforce how easy it is to make a mistake.

The only encryption scheme I know of that is provably unbreakable is the one-time pad. However, proper implementation of that is a pain. Is AES secure? We don't know. We think it is, but there are researchers looking for weaknesses because we don't know for sure. It is possible (however unlikely) that someone will develop an attack tomorrow, sending everyone scrambling for a new encryption algorithm.

Re:Simple for this - hard in general (1)

SanityInAnarchy (655584) | about 8 years ago | (#15960317)

It is possible, (however unlikely), that someone will develop an attack tomorrow sending everyone scrambling for a new encryption algorithm.

That's why we have other crypto algorithms ready. I use Blowfish for my VPN connections.

The only thing I can think of... (1)

Duhavid (677874) | about 8 years ago | (#15960152)

Is to show how two rounds of the Caesar Shift [wikipedia.org] are exactly equivalent to one round of the same with a different key, and no harder to crack. In fact, if the combined shift is the same length as your alphabet, it would be the plaintext again.

Group theory (4, Informative)

coyote-san (38515) | about 8 years ago | (#15960159)

Let me see if I can explain this in a few paragraphs....

A single encryption key defines a one-to-one mapping between each plaintext block and ciphertext block. (In ECB (electronic codebook) mode, but the chained block encoding follows a similar analysis.) It has to be bidirectional so that the decryption is unique.

This means that, AT A MAXIMUM, your meaningful keyspace can be no larger than 2^(number of bits in block). In practice it takes a lot of hard work to ensure that you have 2^(number of bits in key) distinct and non-trivial mappings, and ideally for any arbitrary plaintext and ciphertext block you can guarantee that there is a unique key that will produce that result. However it's easy to screw up and only have, e.g., 2^40 actual encryptions even though you have a keysize of 64 or even 128 bits.

Double encryption also defines a one-to-one mapping between each plaintext block and ciphertext block... but you have the same blocksize. If the keys form a "group", then any double encryption will be equivalent to a single encryption with a different key. In fact any N encryptions will be equivalent to encryption with a single different key. Some combinations will leave you with no encryption at all. (XOR encryption is a trivial example of this -- you'll get no encryption whenever you encrypt an even number of times with the same key.)

Worse, your keys may not form a group because there's some property that gets cancelled out in a double encryption. E.g., maybe something drops out each time both keys have a '1' in the same spot. In this case double encryption would be significantly weaker than single encryption.

Triple-DES encryption is an oddball since it isn't (or wasn't) known whether DES keys form a group. Remember that a DES key is only 56 bits, but a DES cipher block is 64 bits -- you know you can't get every possible mapping between plaintext and ciphertext by running through the keyspace.

Hence 3DES. However 3DES isn't simply encryption three times, e.g., the form we studied in my grad class used two (not three) 56-bit keys, in an EDE pattern. That is, you encrypt with the first key, then DECRYPT with the second key, then encrypt with the first key again. I didn't follow all of the reasoning but I seem to recall it was because of some property of the S and P boxes.

Re:Group theory (2, Interesting)

Beryllium Sphere(tm) (193358) | about 8 years ago | (#15960435)

>it isn't (or wasn't) known whether DES keys form a group.

Wasn't, although the result came very late in the history of DES:

K.W. Campbell and M.J. Wiener, DES is not a group, Advances in Cryptology - Crypto '92, Springer-Verlag (1993), 512-520.

Beep! Retrieval complete.

Re:Group theory (1)

swillden (191260) | about 8 years ago | (#15960526)

However 3DES isn't simply encryption three times, e.g., the form we studied in my grad class used two (not three) 56-bit keys, in an EDE pattern. That is, you encrypt with the first key, then DECRYPT with the second key, then encrypt with the first key again. I didn't follow all of the reasoning but I seem to recall it was because of some property of the S and P boxes.

The reason I learned is much more prosaic: With the EDE pattern, you can implement single DES simply by using the same key for both halves. That way you can build 3DES hardware that can also do 1DES operations, just by feeding it a 'symmetrical' key. No need for separate 1DES hardware, or for a 3DES/1DES switch.
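
A toy Python sketch of that design choice (my own illustration; the 'cipher' below is a stand-in for DES chosen only to make the EDE plumbing visible -- unlike DES, this toy actually forms a group):

M = 2**64

def enc(block, key):
    # Stand-in for DES encryption: any invertible keyed function shows the idea.
    return (block + key) % M

def dec(block, key):
    # Exact inverse of enc.
    return (block - key) % M

def ede(block, k1, k2):
    # 3DES-style EDE: encrypt with k1, decrypt with k2, encrypt with k1 again.
    return enc(dec(enc(block, k1), k2), k1)

p, k = 0xDEADBEEF, 0x123456789A
print(ede(p, k, k) == enc(p, k))  # True: equal keys collapse EDE to single encryption

Feed EDE hardware a 'symmetrical' key and it behaves exactly like single DES, which is the backward compatibility described above.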

This is just so wrong (1)

eyal (774028) | about 8 years ago | (#15960581)

If you have a block size of N bits, then there are 2^N possible plaintext/ciphertext blocks, but there are factorial(2^N) possible permutations. Think of each key as selecting a permutation of the blocks (which is actually what it is).

Out of those factorial(2^N) permutations, only a very very very very very few are actually used, since the key size is nowhere near that large. In AES, for example, the block size is 128 bits, but the largest key size is 256 bits, even though the number of possible permutations is factorial(2^128). This means that given a random plaintext/ciphertext block pair, it's actually very improbable that you'd find a key to generate that result.

The only reason that double encryption doesn't provide "double security" is that a meet-in-the-middle attack is possible (see other comments for more details), and that kind of attack only takes twice the time (on average) that a brute-force attack on a single key does. 3DES provides more security than regular DES, since a meet-in-the-middle attack on 3DES makes its security roughly equivalent to using a 112-bit key (56*2), and that's still too large to be practically broken by brute force alone.
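
A quick Python check of the "far more permutations than keys" point, using a toy 4-bit block so the numbers stay printable (my own addition):

import math

N = 4                         # toy block size in bits
perms = math.factorial(2**N)  # every possible permutation of the 16 blocks
print(perms)                  # 20922789888000
print(math.log2(perms))       # ~44.25: even a 4-bit block has ~44 "bits" of permutations

For a 128-bit block, the analogous figure dwarfs any practical key size, which is why a random plaintext/ciphertext pair almost never has a matching key.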

Wrong. (1)

cr0z01d (670262) | about 8 years ago | (#15960582)

A block cipher uses a key to map plaintext blocks to ciphertext blocks. Given 2^N plaintext blocks and 2^N ciphertext blocks, the number of possible encryptions are 2^N!, not 2^N. This is a staggering difference in magnitude.

For example, the maximum keysize of a 128 bit block cipher is about 10^22 bits.

Possibly not as bunk as you think... (2, Insightful)

inio (26835) | about 8 years ago | (#15960166)

... as long as the keys for each pass are different and there's no header.

A good encryption scheme doesn't have any header, for exactly this reason. It reads in n words and writes out n words and that's that. So if your coworker is doing:

C = E_K2(E_K1(P))

there are indeed 128 bits of information required to decrypt the document. Given a plaintext+cyphertext pair a meet-in-the-middle attack might be possible, but the search space is still on the order of 2^128 (cryptanalysis could probably reduce this some). Given plaintext+intermediate+cyphertext, it's down to 2^64, because each half could be cracked separately.

With a good encryption algorithm the first pass ( E_K1(P) ) generates something approaching line noise - close enough so that there's no statistical way to tell real random bits from encrypted data (many RNGs now use encryption routines to boost incoming entropy). There is no structure, there is no header, and there is no way to verify that it's an encrypted anything. It might as well be a bunch of random bits. Given the cyphertext, your total search space to find the plaintext is still 2^64 * 2^64 = 2^128 sets of keys because there's no way to tell that the first round was performed correctly - it will always just be random bits.

Now, if there IS a header added you're back to n*2^64. For example, the software compresses (and thus adds a header to) the data before encrypting it. Given the cyphertext you can search for keys that decrypt to have a valid header, and then search for keys that produce reasonable english text from those outputs.

Re:Possibly not as bunk as you think... (2)

Coryoth (254751) | about 8 years ago | (#15960386)

Given a plaintext+cyphertext pair a meet-in-the-middle attack might be possible, but the search space is still on the order of 2^128 (cryptanalysis could probably reduce this some).


Mounting a known-plaintext attack (plaintext + ciphertext pair) using meet-in-the-middle on something encrypted twice with two different 64-bit keys requires only 2^65 trials. It's a time-memory trade-off, so it is a little memory expensive, but it's well short of the 2^128 trials you claim.

What you need is two ciphertexts (potentially just two blocks from an encrypted message) C1 and C2, plus two corresponding plaintexts P1 and P2, such that C1 = E_K1(E_K2(P1)) and C2 = E_K1(E_K2(P2)). For each possible key K you compute E_K(P1) and store all the results in memory. Then you compute D_K(C1) for each possible K, and look up whether it matches any of your stored results. If it's in there, then try encrypting P2 with the two keys you've found. If that gives you C2, then you've (with very high probability, but technically some uncertainty) got the right keys. All up, that takes at most 2 searches through the keyspace, or 2*2^64 = 2^65.

Somebody rode the short yellow bus... (1)

gru3hunt3r (782984) | about 8 years ago | (#15960192)

Is everybody here retarded?

Okay, so a 64 bit key has 18446744073709551616 possible combinations (2^64).
A 128 bit key (2^128) has 340,282,366,920,938,463,463,374,607,431,768,211,456
combinations.

Encrypting a 64 bit file TWICE means you get
2^64 * 2 = 2^65, or 36893488147419103232 combinations.
Effectively -- 65 bit encryption.

Albeit this is dramatically oversimplifying the types of attacks, etc. I feel dumber for even reading this question.

Re:Somebody rode the short yellow bus... (1)

QuasiEvil (74356) | about 8 years ago | (#15960372)

>Is everybody here retarded?

Ummm, I think the answer is rather obvious... I'm posting on /., aren't I?

Complications... (1)

The Grey Ghost (884000) | about 8 years ago | (#15960284)

As with any simple question, there's no truly correct straightforward answer. First off, encryption strength is not just about the number of bits. A larger keyspace is preferable to ward off brute force attacks, but no system has eliminated the risk of a skilled and determined cryptanalyst.

Once you've established that bits aren't the only answer, start poking holes. For example, a guessable header produced by the first round of encryption would be known plaintext to work against the second round. As mentioned by others, the cipher itself may be vulnerable to an algebraic attack. Certain types of ciphers operate in such a way that the second round only changes the key needed to obtain the original plaintext: C(k2, C(k1, P)) = C(k3, P).

Start poking holes in their knowledge of cryptanalysis... that's the key to the argument.
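That last property -- the cipher forming a group under composition -- is easiest to see with a deliberately weak example. A sketch using the Caesar cipher, where two passes always collapse into one:

    # Two Caesar encryptions with shifts k1 and k2 are exactly one encryption
    # with shift (k1 + k2) % 26 -- the second pass adds nothing.
    def caesar(text, k):
        return "".join(chr((ord(c) - 65 + k) % 26 + 65) for c in text)

    msg = "ATTACKATDAWN"
    assert caesar(caesar(msg, 3), 9) == caesar(msg, 12)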

Look at the HAC (3, Informative)

petard (117521) | about 8 years ago | (#15960311)

The Handbook of Applied Cryptography is an excellent book and is now available online for free [uwaterloo.ca]. If you passed a couple college calculus courses, the material should be very accessible to you. In general, it's an excellent source of facts and analysis to debunk incorrect statements about cryptography.

In particular, the part that answers your question is found in Chapter 7 (block ciphers) [uwaterloo.ca]. The justification for Fact 7.33 explains why, if you can store 2^64 blocks of ciphertext, you can break double 64-bit encryption in about 2^64 operations.

If your coworker is not capable of understanding the math behind this reasoning, he really should not be making decisions about encryption :-).

(Note: I'm not withholding the justification in this post simply to be obtuse. It's too much of a pain to format it reasonably in this input box... you'll just have to refer to the PDF.)

Well, duh, so... (0)

Anonymous Coward | about 8 years ago | (#15960332)

"So the algorithm's Open Source, right?
Doesn't that mean that we don't have to protect the keys?"

This in reference to a 3DES application...

I'm right and I know it (2, Interesting)

SanityInAnarchy (655584) | about 8 years ago | (#15960337)

This is how I usually end this kind of conversation -- I go over their head. If I know more than they do about the subject, I speak jargon until their eyes glaze over, and then say something to the effect of "Do you really want to know why you're wrong, or will you take my word for it?" I'm a good enough teacher to back that up if they really do want to know. Or sometimes I start from the other end -- if I have a good enough analogy ready, I can start with that -- the money one is a good analogy for the crypto (Would you rather give me two $10 bills or one $100 bill? Would you let me settle a $100 debt with you by giving you two $10 bills?)

The other method of going over their head is to cite someone -- not always even a specific person -- who is smarter than both of us. So for crypto algorithms, I can ask them, "If it was really that easy, don't you think someone else would've thought of it by now? Why do you think we invent wholly new algorithms for 128 bits?"

And if both of those fail, the final method of going over their head is to cite specific sources and papers, and invite them to actually disprove what's said there (without really having to understand it yourself) if they really think they're right.

All of these, of course, are for when I know I'm right. Quite often, I really don't know, and I stay humble there. But when I know I'm right, it's usually much more important to hit them over the head with one of the above than to try to explain why I'm right. After all, you don't have to know why a car is safer than open air in a lightning storm, you just have to get in. It's not important that someone understand why not to just double-encrypt with 64 bits, it's just important that they not do it.

Example: One-time pad (1)

gweihir (88907) | about 8 years ago | (#15960345)

The basic problem is that if you do E_KB(E_KA(plaintext)), there may be a key KC with

  E_KC(plaintext) = E_KB(E_KA(plaintext))

If you just encrypt 64 bits with a one-time pad (trivial example), then this is obvious:

  KC = KA XOR KB

I.e. with a one-time pad the second pass gains you exactly no security at all.
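A minimal sketch of that collapse in code (Python; the 16-byte pads are chosen arbitrarily):

    import secrets

    def xor(x, y):
        return bytes(p ^ q for p, q in zip(x, y))

    msg = secrets.token_bytes(16)   # plaintext block
    ka = secrets.token_bytes(16)    # pad KA
    kb = secrets.token_bytes(16)    # pad KB

    double = xor(xor(msg, ka), kb)  # E_KB(E_KA(msg))
    kc = xor(ka, kb)                # the equivalent single key KC
    assert double == xor(msg, kc)   # one pass with KC yields the same ciphertext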

For many ciphers such a key KC may not exist, but even if it does not, an attack on E_KB(E_KA(plaintext)) may be just as easy as an attack on a single key. Worse, for some ciphers it is not known whether double encryption with two different keys is more secure at all, or by how much. Until this is known for a cipher, it has to be assumed that encrypting each block with two keys is no more secure than single encryption.

However, there are ways to make a cipher more secure if you have more key material. For example, you can use one part as the key and one part as the IV for CBC. Or you can split the message and encrypt each part with a different key; that gives the attacker a bit less material to work with. Or you can use the keys with different ciphers and stack those on top of each other. If the ciphers are used in stream mode, then the result is provably as secure as the strongest one.
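A sketch of that last idea -- stacking two stream ciphers -- assuming a toy keystream derived from SHA-256 in counter mode (invented here for illustration, not a vetted construction):

    import hashlib

    def keystream(key, n):
        # Toy counter-mode keystream: hash(key || counter), concatenated.
        out = b""
        ctr = 0
        while len(out) < n:
            out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
            ctr += 1
        return out[:n]

    msg = b"attack at dawn"
    s1 = keystream(b"first key", len(msg))
    s2 = keystream(b"second key", len(msg))

    # XOR both keystreams in: stripping either layer still leaves the other,
    # so the stack is at least as strong as the stronger of the two.
    ct = bytes(m ^ a ^ b for m, a, b in zip(msg, s1, s2))
    pt = bytes(c ^ a ^ b for c, a, b in zip(ct, s1, s2))
    assert pt == msg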

You can also mix in any extra key material as additional data blocks, e.g. in CBC. This again makes a ciphertext-only attack much harder.

An analogy... (1)

spongeboy (681073) | about 8 years ago | (#15960350)

Okay, let's play a game. Get your boss to write down two letters. You and your coworker have to guess the two letters.

You can guess the letters one at a time (i.e. "is the first letter 'a'?").

Your coworker has to guess the letters two at a time (i.e. "is the phrase 'aa'?").

Generally speaking, who wins?

Re:An analogy... (1)

amliebsch (724858) | about 8 years ago | (#15960598)

You can guess the letters one at a time (i.e. "is the first letter 'a'?"). Your coworker has to guess the letters two at a time (i.e. "is the phrase 'aa'?"). Generally speaking, who wins?

To make this analogy work, the answer to the first person would have to be "MAYBE" every time they asked, because there's no intrinsic way to know when you've guessed the key that decrypts pseudorandom ciphertext - it looks just like all the other pseudorandom data that invalid keys would generate.
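A sketch of that point, using a toy stream cipher built from SHA-256 (invented here for illustration): when the plaintext has no structure, decrypting with any key -- right or wrong -- yields equally random-looking bytes:

    import hashlib, secrets

    def toy_cipher(key, data):
        # XOR with a hash-derived pad; encrypting and decrypting are the same op.
        pad = hashlib.sha256(key).digest()[:len(data)]
        return bytes(a ^ b for a, b in zip(data, pad))

    ciphertext = toy_cipher(b"right key", secrets.token_bytes(16))
    for guess in (b"right key", b"wrong key", b"another guess"):
        # Nothing marks the correct decryption out from the wrong ones.
        print(guess.decode(), "->", toy_cipher(guess, ciphertext).hex())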

And encrypting 8 times with an 8 bit key? (1)

file-exists-p (681756) | about 8 years ago | (#15960438)

See the subject.

The most simple way of explaining this. (1)

AlphaWolf_HK (692722) | about 8 years ago | (#15960489)

Think of encryption as having a bunch of switches which can be off or on lined up in a sequence. In order to break the encryption, you have to somehow be able to guess the exact position of each and every one of these switches. In the case of 64-bit encryption, you have 64 of those on or off switches. In the case of 128-bit encryption, you have 128.

For those who aren't mathematically inclined, what you effectively have is 2 as your base number (on is one, off is another one, thus two total positions for each switch) and the total number of switches is your exponent (remember from 8th grade math, 2^3 = 2*2*2, if you had three switches that is.) If you are still lost, put it this way: 64-bit encryption is 2^64 different possibilities, and 128-bit encryption is 2^128 different possibilities. Thus 128-bit encryption is 18,446,744,073,709,551,616 times stronger than 64-bit encryption.

If you encrypt in 64-bit, and then encrypt it in 64-bit again, you only double the strength: you are basically adding 2^64 plus 2^64, which equals 2^65, or effectively 65-bit encryption. 2^128, on the other hand, is 2^64 times 2^64. So if your friend wanted to achieve 128-bit strength this way, he would effectively have to encrypt that file 2^64 times, not just twice.
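A quick sanity check of that arithmetic (Python integers are exact at any size):

    assert 2**64 + 2**64 == 2**65     # two sequential 64-bit searches
    assert 2**64 * 2**64 == 2**128    # one joint 128-bit search
    print(2**128 // 2**65)            # 2^63: the gap between the two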

Now 2^64 may sound like a pretty big number, but with computers being as fast as they are today, we can search the key space (or in layman's terms, "brute force" it) to figure out what that switch sequence is at a pretty fast rate, so that number isn't really that large in the grand scheme of things. However, if it is 2^64 times that amount, then that number suddenly becomes much, much larger, and thus takes exponentially longer for an everyday home computer or even a cluster of computers to break.

To cryptography know-it-alls: I know, this is an oversimplified explanation, but it answers the full scope of his question, so spare me the "but you forgot this" or "you can break some encryption without factoring" replies.

linear vs. non-linear (1)

Shirloki (563610) | about 8 years ago | (#15960575)

I'm a casual observer in this field, so someone correct me if need be.

Typically, with encryption, the time required to brute-force attack is exponential to the number of bits. So:

let n = the number of bits in the encryption key

let t = time required to crack said encryption

and a = an arbitrary constant

typically t is proportional to 2^n.

So, when you encrypt with a 128-bit key, t = a * 2^128

But if you use a 64-bit key twice, then t = 2a * 2^64 = a * 2^65, since (via a meet-in-the-middle attack) all you have to do is crack one 64-bit key, then crack another.

As you can see, using two passes with a 64-bit key is 2^63 times worse than just using a 128-bit key once.