einhverfr's Journal: Why The Encryption Back Door Proposals are Bad (Technically)

Permission is hereby granted to distribute modified or unmodified copies of this content far and wide. I, the author, request (though do not require) that the link to the New York Times story be preserved in any redistribution.

(Copyright (c) 2010, Chris Travers)

The New York Times has reported today that the Obama Administration is seeking legislation to require backdoors into encryption software that could be used for wiretapping. I believe this is deeply problematic for both technical and social reasons, but the technical reasons are probably the worst. Because this area is not well covered in the existing articles, I figure it's worth giving a quick primer here.

Types of Encryption

The simplest form of encryption is what's called symmetric encryption. It comes in various forms, some simpler than others, but the basic process is conceptually simple. Two parties share a secret. One party takes the message and encodes that message with the shared secret, and the other party decodes it using that same shared secret. This encryption is reversible and the key is the same on both sides.

A trivial example might include what we think of as ROT-13 (used for obfuscation) where every letter is rotated 13 places forward. So "this is a sample message" becomes "guvf vf n fnzcyr zrffntr." Of course such a cypher is easily broken, but there are very good quality symmetric cyphers available, such as AES.
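
To make the contrast concrete, here is a quick Python sketch of my own (not from the original article) showing the ROT-13 example above alongside a real shared-secret cipher, using the Fernet recipe from the third-party "cryptography" package; the message text and variable names are just placeholders.

    import codecs
    from cryptography.fernet import Fernet  # third-party: pip install cryptography

    # ROT-13: trivially reversible obfuscation; the "shared secret" is just the number 13
    print(codecs.encode("this is a sample message", "rot_13"))  # guvf vf n fnzcyr zrffntr

    # A real symmetric cipher (AES under the hood): both sides must hold the same key
    key = Fernet.generate_key()              # the shared secret
    cipher = Fernet(key)
    token = cipher.encrypt(b"this is a sample message")
    assert cipher.decrypt(token) == b"this is a sample message"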

The real problem with symmetric cyphers is that they require that both sides know the same key before encrypted communication begins. If you are communicating with a lot of third parties, you would find you'd either have to publish the key (meaning everyone else could decrypt the same messages!) or find some way of getting the keys to the other parties in advance. This obviously renders this form of encryption useless for initiating secure communications with individuals one has never met.

To solve this problem, public key encryption was designed. Public key encryption uses two keys, called a public key and a private key. Knowledge of the public key is not sufficient to derive the private key through any feasible process, and these keys are usually very long (AES keys top out at 256 bits, but public/private key pairs are often 1024, 2048, or 4096 bits long per key), making brute force even harder, which matters since the public key is expected to be publicly available.
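
As a rough illustration of the sizes involved, here is a small sketch (mine, not the article's) that generates a 2048-bit RSA key pair with the third-party "cryptography" package and prints the public half, which is the part that can be handed out freely.

    from cryptography.hazmat.primitives.asymmetric import rsa
    from cryptography.hazmat.primitives import serialization

    # Generate a 2048-bit key pair; the private half never needs to leave this machine
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    # The public half can be published; knowing it is not enough to derive the private key
    print(public_key.public_bytes(
        encoding=serialization.Encoding.PEM,
        format=serialization.PublicFormat.SubjectPublicKeyInfo,
    ).decode())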

The public key is then published and the private key is retained. A user can then look up a public key, encrypt a message with it, and only the holder of the private key can decrypt it. Similarly, a private key holder can sign a cryptographic hash of a message, and anyone with the public key can validate this "digital signature." (A cryptographic hash is a related one-way operation, used in document validation, tamper-proofing, and password checking.)
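
Both directions can be shown in a few lines of Python (again my own sketch, with a placeholder message, using the same "cryptography" package): encrypt to the public key and decrypt with the private key, then sign with the private key and verify with the public key.

    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()
    message = b"meet me at noon"          # placeholder message
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    # Anyone holding the public key can encrypt; only the private key holder can decrypt
    ciphertext = public_key.encrypt(message, oaep)
    assert private_key.decrypt(ciphertext, oaep) == message

    # The reverse direction: sign a hash with the private key, verify with the public key
    pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                      salt_length=padding.PSS.MAX_LENGTH)
    signature = private_key.sign(message, pss, hashes.SHA256())
    public_key.verify(signature, message, pss, hashes.SHA256())  # raises if tampered with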

Public key encryption depends on the idea that ONLY the appropriate party has the private key. When you make a secure purchase on, say, Amazon.com, Amazon sends you its public key, and the two of you use it to negotiate a symmetric cypher (probably AES or RC4). In this way you know the key was properly exchanged, and eavesdropping on the sale by criminals is not possible: the credit card data you enter is not intercepted. Protection of the private key is very, very important to this process, but even knowing the private key does not by itself enable you to eavesdrop on a conversation in progress, since that's done with a symmetric cypher.

SSL, PGP, IPSec Opportunistic Encryption, and related technologies all use asymmetric encryption, but the differences tend to be in how keys are published and who is vouching for them. SSL is designed so that you know who you are talking to because a third party (like Verisign) is vouching for the identity of the server.
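
You can watch both halves of that arrangement from Python's standard library (a sketch of my own; the hostname is just an example site): the server presents a certificate vouched for by a CA, and the handshake settles on a symmetric cipher for the actual traffic.

    import socket
    import ssl

    hostname = "www.example.com"            # example host, substitute any HTTPS site
    context = ssl.create_default_context()  # trusts the usual certificate authorities

    with socket.create_connection((hostname, 443)) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            # The CA-signed certificate tells us who we are talking to...
            print(tls.getpeercert()["subject"])
            # ...and the handshake picks a symmetric cipher for the session itself
            print(tls.cipher())  # e.g. ('TLS_AES_256_GCM_SHA384', 'TLSv1.3', 256)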

Problems with Backdoors in Public Key Encryption

To effectively wiretap public-key-based communications, you have to have access to the private key, or you have to tap them post-decryption. Tapping post-decryption works fine in some contexts, such as what you are purchasing at Amazon.com. However, it does not work well when trying to capture the content of encrypted emails, since these are usually encrypted with the recipient's public key and can only be decoded with the recipient's private key. Communications encrypted in this way are not generally vulnerable to interception in the middle. Moreover, a communication could include encrypted files as attachments, which can be handled entirely outside the flow of the program (I can encrypt a file and then attach it, and my email program doesn't care whether it is encrypted).
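
The attachment point can be made concrete with a short hybrid-encryption sketch (my own illustration, with made-up file contents, using the same "cryptography" package): the file is locked to the recipient's public key before the mail client ever sees it, so there is nothing left in the mail flow to tap.

    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    # Stand-in for the recipient's published key pair
    recipient_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    recipient_public = recipient_private.public_key()
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    # Encrypt the file with a fresh symmetric key, then wrap that key with the
    # recipient's public key. All of this happens before the result is handed to
    # the mail program as an ordinary, opaque attachment.
    attachment = b"...contents of some report..."   # placeholder file contents
    file_key = Fernet.generate_key()
    encrypted_attachment = Fernet(file_key).encrypt(attachment)
    wrapped_key = recipient_public.encrypt(file_key, oaep)

    # Only the recipient's private key can unwrap the file key and read the attachment
    recovered_key = recipient_private.decrypt(wrapped_key, oaep)
    assert Fernet(recovered_key).decrypt(encrypted_attachment) == attachment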

There isn't a real way to retrofit peer-to-peer communications programs to allow this sort of interception without compromising the core of how encryption works. A company may maintain its own certificate authority and use it to publish keys for internal company communications. A person taking a company laptop home may then use those certificates to encrypt emails. There is no way to intercept the content of these communications without requiring that the company keep copies of all private keys, thus compromising its own security. Similarly, if I email out an OpenPGP key or an OpenSSH public key, these are not sufficient to wiretap the communications that would be encrypted using those keys. The only way out would be to require the makers of the software to include a facility that sends the private key to some sort of escrow service, which could then provide the key to law enforcement; but this compromises the basic integrity of the software, and any such requirement on open source programs could be easily circumvented.
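
For contrast, the escrow approach amounts to forcing every compliant program to do something like the following (a hypothetical sketch; send_to_escrow_service is an imaginary call): serialize the private key and hand a copy to a third party, which is precisely the step that breaks the security model and which a rebuilt open source client would simply leave out.

    from cryptography.hazmat.primitives.asymmetric import rsa
    from cryptography.hazmat.primitives import serialization

    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    # The step an escrow mandate would require: exporting the raw private key
    # so that a third party can hold a copy of it.
    escrow_copy = private_key.private_bytes(
        encoding=serialization.Encoding.PEM,
        format=serialization.PrivateFormat.PKCS8,
        encryption_algorithm=serialization.NoEncryption(),
    )
    # send_to_escrow_service(escrow_copy)  # hypothetical call; whoever holds this copy
    #                                      # can decrypt everything the key protects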

Consequently, this doesn't actually affect the sorts of technologies an organized crime ring is likely to use. Instead it makes each of us more vulnerable to government spying, and it makes key data, such as credit card data, far more accessible to criminals.

Such a law would thus benefit organized crime at the expense of the average consumer. It's an unbelievably bad idea no matter how you look at it.

Comments:
  • You make good points about the technical faults with the proposed measures. Another thing to keep in mind is that it won't be possible for the US gov't to meaningfully enforce this. They may be able to get the big providers, such as Skype and Microsoft's BitLocker (assuming there aren't already backdoors in that), to comply, but there is plenty of FOSS encryption software which will easily be able to get around any attempt at regulation of this magnitude. The feds try to get the developer to rewrite the app

    • Agreed. Again, it doesn't prevent criminals from encrypting their communications. It just prevents average citizens from doing so securely. The next steps are obviously to require private key access and to ban wiretap circumvention tools (like the FOSS versions). Again, not enforceable, but it means that if you don't allow wiretapping, then that's evidence of illegal intent.

      Of course, there might be 4th or 5th Amendment challenges to such measures.

      But there is a rise in attempts to run around the Constitution.