
Did NIST Cripple SHA-3?

An anonymous reader writes "In the process of standardizing Keccak, the winning algorithm of the SHA-3 competition, the National Institute of Standards and Technology (NIST) may have lowered the bar for attacks, which might be useful to, or even initiated by, the NSA. 'NIST is proposing a huge reduction in the internal strength of Keccak below what went into final SHA-3 comp,' writes cryptographer Marsh Ray on Twitter. In August, John Kelsey, working at NIST, described the changes to the algorithm (slides 44-48), including a reduction of the security strength from the 224-, 256-, 384- and 512-bit modes down to 128- and 256-bit modes."
  • by Anonymous Coward on Saturday September 28, 2013 @06:59AM (#44978193)

    I say we just use the algorithms Schneier has invented and nothing else. Why do we even go to these standards approvers in the first place? The open source community should get together and hold their own competition and forget anyone who's in any way associated with any org starting with N*. Can someone please make an open source "Schneier Suite" of cryptography written in C for the world to make use of already!?

    -- stoops

    • by philip.paradis ( 2580427 ) on Saturday September 28, 2013 @07:05AM (#44978207)

      I do most of my work in Perl, and I happen to heavily utilize Blowfish and Twofish. Perhaps you should think about what your application pipeline requirements actually need in terms of crypto and then look into the various modules that interoperate under the umbrella of Crypt::CBC [cpan.org].
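
      A rough Python analogue of that Blowfish-in-CBC setup, for anyone experimenting outside Perl (a minimal sketch using PyCryptodome in place of Crypt::CBC; the key and message are made up for illustration):

      ```python
      # Blowfish in CBC mode via PyCryptodome (pip install pycryptodome).
      from Crypto.Cipher import Blowfish
      from Crypto.Util.Padding import pad, unpad

      key = b"illustrative-16B"                   # Blowfish accepts 4..56-byte keys
      enc = Blowfish.new(key, Blowfish.MODE_CBC)  # a random IV is generated for us
      ct = enc.iv + enc.encrypt(pad(b"secret message", Blowfish.block_size))

      # Decrypt: split off the 8-byte IV that was prepended above.
      iv, body = ct[:8], ct[8:]
      dec = Blowfish.new(key, Blowfish.MODE_CBC, iv)
      assert unpad(dec.decrypt(body), Blowfish.block_size) == b"secret message"
      ```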

    • I say we just trust Schneier unconditionally, because he's the good guy.

      ALL HAIL CRYPTOTOAD!

    • I think we can get a volunteer to do almost that. But they are insisting on calling the suite of routines the "New Schneier Algorithms" for some reason.

      Seriously, one of the major problems to be surmounted is not just availability, but getting it accepted as a standard. The NSA is going to have Microsoft distributing their brand of protection: Microsoft is organized in the US, and will use the US national standard.

      But there are other countries out there. China, while a big producer of goods, is goi

      • by bytesex ( 112972 )

        IP was standardized, right? I mean, you don't have to have clearance, or be a government rep, to visit the IETF? Well, maybe IP is a bad example as such, but nowadays, there are many networking protocols that come out of the public domain. Why couldn't it be the same for cryptography?

      • But it doesn't have to be a NIST standard. It could be an ISO or ANSI standard (encryption may be used at least as much for communication as for storage, so that might make sense), for instance. ISO probably makes more sense anyway, as NIST is a purely US standards organization.

        Then we can be in the weird position where only the NSA uses the NSA-weakened algorithms...

        • ISO can be bought, as shown so well by Microsoft. They've lost any trust they ever had.

          • "ISO can be bought, as shown so well by Microsoft. They've lost any trust they ever had."

            Perhaps that's true. But the fact that NIST has been the instrument of Government interference with cryptography has been known since the early 90s, with the Skipjack/Clipper debacle.

            • I agree completely. I'm just advocating the use of a standards agency that isn't US government controlled and can't be bought. ISO fails on at least one of those requirements.

          • I should add:

            This is a war that the government has LOST. More than once. I really have to wonder why they keep trying.

            Maybe the intelligence community is taking that word "intelligence" a bit too literally. They are NOT smarter than everybody else. And in fact that's why they try to be sneaky.
      • Back doors are built into an implementation, not a standard.
    • by pla ( 258480 ) on Saturday September 28, 2013 @08:03AM (#44978307) Journal
      I say we just use the algorithms Schneier has invented and nothing else. Why do we even go to these standards approvers in the first place?

      Two reasons.
      1) Because having a standard means that everyone using SHA-3 will get the same result, instead of every implementation coming out with a different answer of totally unknown integrity. With a standard, I can verify the integrity of program-X's hashing simply by comparing it to a small sample of known plaintexts and hash values; a short sketch follows below this list.
      2) Because most software houses dream of someday getting a government contract - Maybe military, but don't forget about the 14% of Americans that in some way work for the government. Any software they use needs to adhere to the standards issued by the government, or no dice.

      And really, simple as that.
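
      A minimal sketch of the point-1 check, assuming Python's hashlib as the trusted reference; `program_x_hash` is a hypothetical stand-in for whatever implementation is being audited:

      ```python
      import hashlib

      def program_x_hash(data: bytes) -> str:
          # Hypothetical: in practice, call out to the implementation under test.
          return hashlib.sha3_256(data).hexdigest()

      # A small sample of known plaintexts; any disagreement flags the implementation.
      for msg in (b"", b"abc", b"known plaintext", bytes(range(256))):
          assert program_x_hash(msg) == hashlib.sha3_256(msg).hexdigest(), msg
      print("matches the reference on all samples")
      ```
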
      • I'm against placing one person in charge of anything important but I'd trust a Schneier standard a hell of a lot more than a government standard. If I could believe he hadn't been leaned upon by the government. Can we responsibly believe that?

        • by pla ( 258480 )
          If I could believe he hadn't been leaned upon by the government. Can we responsibly believe that?

          Unfortunately, I would have to say conclusively "no". We've already seen quite a few big names on our side tacitly admit that the NSA has pushed on them - Phil Zimmermann, PJ of Groklaw, even Linus Torvalds.

          Currently, I'd say we've reached the point where we can't trust any software in the wild. At an absolute minimum, if we didn't personally compile something, it goes in the "likely compromised" pile. An
          • Comment removed based on user account deletion
            • by HiThere ( 15173 )

              We don't know that there was any direct pressure. We don't know there wasn't, either. And there was clearly indirect pressure.

              They're guilty; we just aren't sure exactly HOW guilty.

        • Why do we have to go with Schneier? Why not have a standardized version of all the final candidate algorithms?

          • by amorsen ( 7485 )

            Having support for a large number of different algorithms in a program or standard increases the risk of downgrade attacks. If just one of the algorithms turns out to be weak, an attacker might be able to lure the two parties into picking a less secure algorithm when they negotiate.
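
            A toy sketch of that downgrade risk (the names and negotiation logic here are invented for illustration): if the offer lists aren't authenticated, a man-in-the-middle can strip the strong options, and the honest logic below settles on the weak survivor.

            ```python
            PREFERENCE = ["sha3-512", "sha3-256", "sha1", "md5"]   # strongest first

            def negotiate(client_offers, server_offers):
                # Pick the strongest algorithm both sides claim to support.
                for alg in PREFERENCE:
                    if alg in client_offers and alg in server_offers:
                        return alg
                raise ValueError("no common algorithm")

            server = ["sha3-512", "sha3-256", "sha1", "md5"]
            print(negotiate(["sha3-512", "sha1", "md5"], server))  # sha3-512
            # An attacker deletes everything but the weak legacy entries:
            print(negotiate(["sha1", "md5"], server))              # sha1 (downgraded)
            ```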

        • You can just use the NSA standard when doing work for the government and something else when doing other work; what's the big deal?

      • With a standard you can have confidence that everyone's implementation of SHA-3 has been compromised and crippled by the NSA.

    • by ledow ( 319597 ) on Saturday September 28, 2013 @08:10AM (#44978337) Homepage

      In case you haven't noticed, the NSA are spies. They do nothing but infiltrate groups of interest all day long.

      Such a group of OS programmers would be the perfect target. And why do we trust Schneier more than anyone else such that his involvement means something is acceptable? I love the guy, but no, that's not how trust works for mass-public security systems. If the NSA/GCHQ spies are working at anywhere near the levels they were back in their heyday of WW2, then Bruce would be my prime candidate for "beyond suspicion" and thus my first inclination that - somewhere, somehow - he could be a shill for them. I'm not seriously saying he is or isn't, but the point of security is that NOBODY should hold any special power over anyone else, certainly not the ability to single-handedly "approve" a worldwide security standard.

      No, what we do is carry on as normal. Put all the algorithms to public testing. As attacks are found, knock out the vulnerable ones like a game of Guess Who, and only ever use whatever is still standing. You can't defend against attacks that you do not know about and if such agencies really ARE as worried as we think they might be about the world moving to encryption they can't break, then my first thought would be "what are they moving us towards, without trying to look like they are doing so?" - and there you run into Blowfish/Twofish and similar algorithms that they've had the opportunity to analyse for years now. It would be the perfect coup - make people think you are attacking them, then "be involved" with the only alternative of elliptic-curves and thus make everyone think that's your preference and hence subtly move them onto something else of your choice without even MENTIONING it or being involved with it.

      Don't try to out-think a bunch of geniuses working with military-level funding and a real interest in keeping you on something broken. Just follow procedure - stay on what you've got until there's actual evidence it's broken. Don't jump ship to new and interesting and relatively untested things for no reason other than you feel uncomfortable.

      • by Alef ( 605149 ) on Saturday September 28, 2013 @10:48AM (#44979033)

        It would be an insanely unlikely coup. Think about what you are suggesting: First they get the entire world to use AES, to the point where leading CPU manufacturers have even included special instructions in the hardware specifically for encrypting and decrypting AES. They do this only so that an alternative algorithm (Twofish) would get less scrutiny from independent researchers for a number of years. They then orchestrate an elaborate leak indicating that they have attacks against some unnamed publicly used crypto algorithm. Meanwhile, or even before that, they have recruited an established and well-known writer and cryptographer, and have him attack them openly in the public debate, only to give apparent credibility to the algorithms he has designed. The intent of this is to get everyone in the industry to suddenly switch all cryptography to his somewhat less scrutinised algorithm (probably after reading about it on Slashdot), despite the fact that the author, whom they had recruited to attack them, still claims that the math behind AES is solid, and despite the fact that replacing AES would now require replacing hardware and software that permeates our entire society at enormous cost.

        If there is ever a time for the tinfoil hat metaphor...

        • It would be an insanely unlikely coup. (...) If there is ever a time for the tinfoil hat metaphor...

          "Professor Quirrell had remarked over their lunch that Harry really needed to conceal his state of mind better than putting on a blank face when someone discussed a dangerous topic, and had explained about one-level deceptions, two-level deceptions, and so on. So either Severus was in fact modeling Harry as a one-level player, which made Severus himself two-level, and Harry's three-level move had been successful; or Severus was a four-level player and wanted Harry to think the deception had been successful.

      • In addition to Alef's comment

        That Guardian article where he teaches everyone, including terrorists, how to avoid the NSA, undoing 10 years of infiltration work:

        http://www.theguardian.com/world/2013/sep/05/nsa-how-to-remain-secure-surveillance [theguardian.com]

        Also, he's been helping anti-surveillance campaigns including NO2ID for years.

    • Because if your software does not comply with FIPS or whatever other standard of the day is in effect, the government cannot purchase it. When hundreds of millions (sometimes billions) of dollars in revenue are on the line, people will make a lot of concessions.

    • by vadim_t ( 324782 )

      Because the US government has requirements about what it accepts.

      You can't just implement whatever algorithm you like, then sell a router with that to the government. It must comply with whatever standard the government decided to adopt. And given that the government buys a lot of things, it wouldn't make economical sense to make equipment you could never sell to them.

      This snowballs, and effectively sets a global standard for encryption. Sure, in your home you can do whatever you like, but the important thi

    • Can someone please make an open source "Schneier Suite" of cryptography written in C for the world to make use of already!?

      Working on it for my master's thesis ;)
      Just a "Schneier Suite" would be limiting, though. We need more than just the basic algorithms, and not only from Schneier.

      Anyway, I'm developing a new transport/encryption/authentication/federated protocol, which combines ideas from SSL, Kerberos and a lot more, plus some new...
      I've already written the full specification; I'm starting to code it now.

      Keep your ears open for the "Fenrir" project, I'll probably release something in 3-4 months... Although the stable r

    • Because if the NSA points out a cryptographic weakness, it's there.

    • by memnock ( 466995 )

      Probably because with a name like National Institute of Standards and Technology, it sounds like a neutral organization. Some kind of innocuous academic committee or such. Not to mention, when it was first named, there was probably a benevolent view of such government agencies.

      Now though, people who seem to be paying attention are distrustful of the government's "national security" policies. And with good reason, considering what the NSA has been doing since (and probably before) 9/11.

      Now anything that ment

  • by Anonymous Coward on Saturday September 28, 2013 @07:36AM (#44978261)

    The way I see it, I think it's wise to avoid all PKI standards using elliptic curve cryptography algorithms. In contrast to the mathematical basis of prime-based algorithms, these mathematics are relatively recent - and they have been pushed by the NSA (which is said to be decades ahead of publicly known mathematics).

    There is no mathematical indication for me to believe that elliptic curve cryptography is fundamentally broken. But why use 'new mathematics' when hundreds of years of public mathematical geniuses have been thinking about fast factoring of prime numbers?
    I don't get that...

    The most important argument used is that key length is more manageable. One could also interpret it as an indication that there might be security-bit-reduction attacks still unknown to us, but known to the NSA. Possibly. Possibly not.

    But why take the risk?

    Some more info about elliptic curve cryptography:

    http://www.linuxjournal.com/content/elliptic-curve-cryptography

    • by fatphil ( 181876 )
      > http://www.linuxjournal.com/content/elliptic-curve-cryptography

      """
      They do this by splitting the shared secret key used in traditional cryptography into two parts: a public key for identifying oneself and a secret key for proving an identity electronically.
      """
      That's bordering on the "not even wrong" level of fucked-upness. Alas, it falls on the side of being woefully incorrect. Possibly dangerously misleading, too.
    • Factoring primes is easy if you have massive computer arrays or quantum computers. It's reasonable to assume the NSA has both.
      • You could have a million times all the computers in the world and it would not be enough to factor a 2048-bit RSA key. Considering no public researcher in the world has been able to make a quantum computer with more than a handful of qubits, I don't think the quantum computer thing is reasonable either.
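
        A back-of-envelope sketch of why: the best published factoring algorithm (the general number field sieve) has heuristic cost exp((64/9)^(1/3) (ln n)^(1/3) (ln ln n)^(2/3)). Converting that to "bits of work" (ignoring lower-order terms, so treat the numbers as rough):

        ```python
        import math

        def gnfs_work_bits(modulus_bits: int) -> float:
            # Heuristic GNFS complexity, expressed as an exponent in bits.
            ln_n = modulus_bits * math.log(2)
            work = (64 / 9) ** (1 / 3) * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3)
            return work / math.log(2)

        print(round(gnfs_work_bits(1024)))   # ~87 bits of work
        print(round(gnfs_work_bits(2048)))   # ~117 bits; NIST rates RSA-2048 near 112-bit
        ```
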
      • by HiThere ( 15173 )

        Not this year. Massive computer arrays will never handle factoring large numbers with large prime factors until there is a major theoretical breakthrough (which can't be predicted).

        Quantum computers ARE a major threat, but not this year. The NSA has publicly ordered a large one, but large this year is probably only large enough to test their approach. So if your data is time sensitive, you are still safe.

        Two or three years from now...who can say. Progress *is* being made on quantum computers. Perhaps i

    • We'll eventually need to move to ECC or something similar to deal with the rapidly-increasing key sizes required in more traditional asymmetric encryption, but as far as we know that need won't be for at least another decade or three.
    • Just an FYI, breaking RSA is probably not equivalent to factoring. If you can factor you can certainly break RSA, but no one has proven that you cannot break RSA without factoring. The problem that is actually equivalent to breaking RSA is finding Nth roots in a composite group, which has not been studied for hundreds of years.
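
      A toy illustration of that point, with the classic textbook primes 61 and 53 (real moduli are thousands of bits): the attacker's problem is exactly finding an e-th root of c modulo n without knowing the factors.

      ```python
      p, q = 61, 53
      n, e = p * q, 17          # public key: (n=3233, e=17)
      m = 42
      c = pow(m, e, n)          # the attacker sees only (n, e, c)

      # Breaking RSA = finding m with m**e % n == c, an e-th root mod a composite.
      # Knowing the factors makes it easy (this is decryption, not an attack):
      d = pow(e, -1, (p - 1) * (q - 1))   # modular inverse, Python 3.8+
      assert pow(c, d, n) == m
      ```
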
    • by bonniot ( 633930 )

      hundreds of years of public mathematical geniuses have been thinking about fast factoring of prime numbers

      There is a pretty fast algorithm for factoring prime numbers.

  • Sinister (Score:5, Informative)

    by pterry ( 100705 ) on Saturday September 28, 2013 @07:57AM (#44978291) Homepage
    A crippled cipher can be used to read your private data. A crippled hash function can be used to substitute bad data for good.
    • actually, it's about password security, and the "pre-image" problem.
      more collisions mean it's easier to find a password that gives a stored hash.
      but it's not crippled; it's just that a 512-bit digest gives you n/2 security - 256-bit security.

      afaics, anyway
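
      A quick demo of that n/2 birthday bound, shrinking the digest to 24 bits so the collision shows up after on the order of 2^12 random messages rather than 2^24:

      ```python
      import hashlib, os

      seen, tries = {}, 0
      while True:
          tries += 1
          msg = os.urandom(16)
          tag = hashlib.sha3_256(msg).digest()[:3]   # keep only 24 bits
          if tag in seen and seen[tag] != msg:
              break
          seen[tag] = msg
      print(f"24-bit collision after {tries} messages (~2**12 expected, not 2**24)")
      ```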

    • In a crippled hash function, you can add a Trojan horse to a download while keeping the same hash value. Even Linux repositories would be vulnerable (the hashes are usually gpg-signed, but the substituted file would keep the same hash), allowing execution of arbitrary code.

  • and gave a master key to the CIA/FBI/NSA and all the other three-letter goons & spooks that are part of the US Govt. Now the blowback for such a breach of trust is that nobody trusts the US Govt anymore. I am sure MS Windows will take a big hit in sales because of this (since it is closed source); at least BSD & Linux can have their code audited, and I bet other nations are scrambling to do just that for the systems they want to keep secure.

    i wonder how much data and info the US Govt spys stea
  • by Anonymous Coward

    He has told me stories of NSA personnel coming by for meetings. He said he had no idea why they were there, so YMMV.

    That said, the NSA has indeed been on the NIST campus.

    • by sphealey ( 2855 )

      NIST is required by law to consult with the NSA before publishing cryptographic standards. What "consult" means is unknown.

      More conventionally, it stands to reason that NSA personnel would be participating in NIST projects on computer security, cryptography, and theoretical math, since they [the NSA] have a lot of experts in those fields working for them.

      sPh

    • by mclearn ( 86140 )
      NIST and NSA have all sorts of partnerships (look at NIAP as an example). On the whole, however, they are distinct organizations with some overlapping functions. NIST, for example, validates cryptography implementations through the CMVP and the CAVP. Also of note is that the NSA has two arms: an offensive arm and a defensive arm. I'm somewhat annoyed with the /. crowd for not recognizing this and realizing that it is the offensive NSA arm which is potentially responsible for deliberate cryptographic weakening.
  • eat THEIR dog food? (Score:5, Interesting)

    by v1 ( 525388 ) on Saturday September 28, 2013 @08:16AM (#44978353) Homepage Journal

    so why don't we just look at what organizations like the US military use to secure and sign their data, and use that? (The methods, of course, not their keys.) That sounds to me like the only way to make sure they're not suggesting or influencing us to use something they (or their opponents) could easily break.

    • Indeed, that's been SOP among cypherphreaks for some time. Even coming down to using "military-strength" encryption keys and the like; if the government says 1024 bits is enough, use 4096. And so on.

    • Because who says they're using what they tell uncleared opponents they are using? Maybe the wrapper is what they say they're using and underneath there's a more secure method that they have never disclosed to the public.

    • by rriven ( 737681 )
      The main workhorse to protect the SIPRNET [wikipedia.org] is the KG-175D [gdc4s.com] or Taclane Micro. The next problem you run into is getting one with the same software the military uses.
    • Comment removed based on user account deletion
    • by Kjella ( 173770 )

      so why don't we just look at what organizations like the US military use to secure and sign their data, and use that? (the methods of course, not their keys)

      Well if we're going for the spectacularly evil I'd pick an algorithm that has many subtly flawed weak keys and a small number of secure keys, then secretly implement additional key generation checks in military software. You both use the same cipher, but they can still read your data and you can't read theirs. Vendors can even supply software built on public cipher standards to be used with government-provided keys and be none the wiser. As long as the ones issuing the keys is in on the charade, it could be

    • so why don't we just look at what organizations like the US military use to secure and sign their data, and use that?

      Because you can't trust that either. The US military is not fighting enemies at its own level, thus it can afford to risk operational data leaking, especially if it still takes a while to decode. And even if it doesn't want to risk it, who's to say the NSA wouldn't? It's not like they are the ones at risk of lead poisoning, and it'll make their job easier.

      That's the problem with corruption:

      • The US military is not fighting enemies at its own level, thus it can afford to risk operational data leaking, especially if it still takes a while to decode

        Doesn't anyone here remember all that footage from planes in Bosnia that was sent unencrypted and downloaded by people with slightly tweaked satellite TV gear? Some stuff isn't even encoded at all.

    • by AHuxley ( 892839 )
      The US military might not be all that "trusted", and the NSA likes to keep tabs on .mil too.
      Like the domestic and international codes sold and made weak, why would US military staff get a free pass to real crypto?
      As long as it kept the Russians out, what is at the base/camp/fort is fair game :)
      Would US political leaders not want some insight into the mindsets of their top generals (or emerging top staff) using US gov networks?
      They could be under the influence of a charismatic leader/"spy" or new fait
  • Here's why... (Score:3, Insightful)

    by Anonymous Coward on Saturday September 28, 2013 @08:54AM (#44978473)

    When the SHA-3 competition was announced, pretty much the only working method of building a hash function was the Merkle-Damgård construction. Bit-security limits were set under the assumption that the submitted proposals used MD, since nothing else was known. However, Keccak does not use it and gains better security guarantees. For this reason, NIST had an opportunity to weaken it a bit while still keeping the old security requirements, making the hash function much more efficient in the process.

    • Just like that time A5/1 GSM encryption was weakened from 64 to 56 bits in the US to make it "much more efficient".

  • Developers should implement Keccak, and NIST and NSA can have their SHA-3, whatever it becomes, all to themselves.
  • The real truth in the slides is that the algorithms are expected to have a collision and pre-image resistance of one half the digest size. In this case, the 128 and 256 numbers mean that the collision resistance is 2^128 and 2^256, respectively.

  • by fuujuhi ( 2088482 ) on Saturday September 28, 2013 @01:44PM (#44980021)

    NIST's proposal (presented at the last CHES conference) is NOT reducing the internal strength of Keccak.

    NIST proposes some standard values for a parameter called "capacity" in Keccak, which Keccak's authors always said can be freely chosen by designers. A higher capacity means higher security; a lower capacity means better performance. NIST's current forecast for FIPS 202 specifies two values for the capacity, namely 256 and 512, which would bring the SHA-3 standard to a security level equivalent to AES (2^128 operations required to break c=256 and 2^256 operations required to break c=512). One may actually consider these security levels the same as in the original submission, because they are the minimum security levels offered by *ALL* finalists (including Keccak). Indeed, all candidates for SHA3-256 offer a collision resistance of 2^128 operations, and 2^256 operations for SHA3-512.

    The discussion here is that choosing c=256 means the cost of finding a pre-image is also reduced to 2^128 operations, instead of 2^256 as in, say, SHA2-256. There are ongoing discussions on the mailing list about the theoretical consequences of this choice, but what strikes me most is why people focus so much on the strongest security bound of a primitive (pre-image here) while completely ignoring the weakest security bound (collision resistance). Of course one may always design an application that does not depend on collision resistance, but looking at the primitive alone, saying that SHA2-256 offers a security of 2^256 because it has pre-image resistance at that level is clearly fooling oneself. In that sense, NIST's proposal levels the security bound of the primitive to its guaranteed minimum, as for block ciphers, and allows a security bound of either 2^128 (c=256) or 2^256 (c=512). Those with an ounce of common sense will observe that 2^128 is completely astronomical, and absolutely out of reach of any thinkable device in the future, even for the NSA! And if you don't care about performance (you probably don't design products, then) and are absolutely paranoid, you still have the freedom to choose a capacity of c=512, as allowed in the current proposal, and probably waste computer cycles for no gain whatsoever.
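
    For concreteness: the two capacities discussed here are the ones that later shipped in FIPS 202 as the SHAKE128 (c=256) and SHAKE256 (c=512) extendable-output functions. A small sketch with Python's hashlib (3.6+), showing that the output length is chosen by the caller, independently of the capacity:

    ```python
    import hashlib

    msg = b"capacity vs. output length"
    # SHAKE128: capacity 256 bits, so at most ~128-bit security however long the output.
    print(hashlib.shake_128(msg).hexdigest(64))   # 64 bytes = 512 bits of output
    # SHAKE256: capacity 512 bits, up to ~256-bit security.
    print(hashlib.shake_256(msg).hexdigest(64))
    ```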

    I of course have no clue about the possible influence of the NSA, but having attended SHA-3 and similar conferences, I must say that NIST's work on SHA-3 is remarkable and *unprecedented* in the cryptographic community. NIST ran the most *OPEN* process ever for the evaluation and selection of the new SHA-3 standard. I think NIST's intention is to write a standard that will satisfy the majority of the community (hence their openness and the presentation at CHES) and that will offer the most of the winning candidate's potential. Keccak is really a "new" object in the cryptographic community, quite different from previous proposals, and it's no wonder to me that its adoption triggers some questions. However, the hidden suggestion that NIST has a secret agenda clearly plays into the current tin-foil propaganda of some would-be security specialists who are trying to attract attention, and it adds nothing to the current standardization process.

  • Time to switch to open standards instead of this NIST bullshit.
  • We should not have one SHA-3 with the security parameters selected by NIST or anyone else. For the vast majority of usages the speed of hashing is a non-issue (they are all plenty fast enough), yet some implementations, specifically those on limited hardware, may have other concerns. We should approve the basic algorithm and have a family of hash functions with different security parameters, to be selected for each usage. Most of us should use an extra-secure variant most of the time.
