New SHA Functions Boost Crypto On 64-bit Chips

An anonymous reader writes "The National Institute of Standards and Technology, guardian of America's cryptography standards, has announced a new extension to the SHA-2 hashing algorithm family that promises to boost performance on modern chips. Announced this week, two new standards, SHA-512/224 and SHA-512/256, have been created as direct replacements for the SHA-224 and SHA-256 standards. They take advantage of the speed of SHA-512 on 64-bit processors to produce checksums more rapidly than their predecessors, truncating the output to a shorter length so the digests stay as compact as those of SHA-224 and SHA-256." Further details are available from NIST (PDF).


  • Wasn't there an article recently complaining that the speed of SHA made it relatively useless as a hashing algorithm to protect passwords? Surely the increase in speed would have a greater effect on cracking speed than on the speed of legitimate authentication.
    • Re:faster?? (Score:4, Interesting)

      by sl3xd ( 111641 ) * on Friday February 18, 2011 @05:34PM (#35249392) Journal

      I thought this as well - you'd think being able to compute a hash faster makes it a bit easier to compute a rainbow table with the hash.

      Then again, there are many other perfectly reasonable ways you'd want the hash to be faster - for instance, how git uses the sha1 hash throughout - or any hash-summing of a file to verify the contents are unchanged.

      So the 'faster hash' really only means that it might be something to consider when using it for a password hash - but for data integrity checking, it can be a real boon.

      • Re: (Score:2, Interesting)

        by parlancex ( 1322105 )
        Does git seriously use SHA1 for file integrity verification? Different hashes are for different purposes. The CRC class of hash functions actually makes certain statistical guarantees about the longest run of errant bytes it can detect and is much faster, making it far more suitable for file integrity checks.
        • Re:faster?? (Score:4, Informative)

          by petermgreen ( 876956 ) <plugwash@p10linI ... inus threevowels> on Friday February 18, 2011 @06:00PM (#35249606) Homepage

          IIRC, CRC hashes are only designed to protect against accidental changes, while secure hashes are designed to protect against both accidental and malicious changes. That makes secure hashes better suited to distributed systems where not every participant is trustworthy.

          • by Kjella ( 173770 )

            Yep, CRC doesn't deserve being called a hash at all. It's just a checksum really, like the control digit on your credit card number. Good against random corruption, no good against a maliciously inserted payload.
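
            To see that concretely, here's a quick Python check (illustrative, using zlib's standard CRC-32): for equal-length messages, the CRC of an XOR combination is fully determined by the individual CRCs. That linear structure is exactly what an attacker exploits, and exactly what a cryptographic hash lacks.

            import zlib

            # Three equal-length (12-byte) messages.
            a = b"hello world!"
            b = b"attack at 9!"
            c = b"pay bob $999"
            xor_abc = bytes(x ^ y ^ z for x, y, z in zip(a, b, c))
            # CRC-32 is affine over GF(2): for fixed-length inputs the
            # length-dependent constants cancel, so this always holds.
            assert zlib.crc32(xor_abc) == zlib.crc32(a) ^ zlib.crc32(b) ^ zlib.crc32(c)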

        • CRC has its limits. (Score:4, Informative)

          by jhantin ( 252660 ) on Friday February 18, 2011 @06:04PM (#35249638)

          Different hashes are for different purposes.

          No argument there.

          The CRC class of hash functions actually makes certain statistical guarantees about the longest run of errant bytes it can detect and is much faster, making it far more suitable for file integrity checks.

          CRC is great for packet-sized input, but not so great over larger chunks of data; also, the way its design targets burst errors means that widely separated point errors aren't as effectively caught. There's a reason Ethernet jumbo frames haven't gone much over 9000 bytes -- Ethernet's CRC-32 is much less effective at message sizes over 12000 bytes [wareonearth.com] or so. Cryptographically strong hashes tend to be less sensitive to input length.

          • That's apples and oranges.

            The 12k limit is more related to just having 32 bits than to cryptographic/non-cryptographic nature.

            64k × 64k ≈ 2^32, so at a message size of 64k bits there are roughly as many possible two-bit errors as there are 32-bit checksum values, and you are guaranteed that at least one two-bit error goes undetected. At 2048 you are also virtually guaranteed that some 3-bit error goes undetected -- and that's true whether the hash is a plain CRC or a cryptographic one.

            A CRC-128 would have been as good for this purpose as MD5 or SHA1.

    • Re: (Score:1, Informative)

      by sexconker ( 1179573 )

      Wasn't there an article recently complaining that the speed of SHA made it relatively useless as a hashing algorithm to protect passwords? Surely the increase in speed would have a greater effect on cracking speed than on the speed of legitimate authentication.

      Yes and no.

      Yes, there was such an article.
      No, it doesn't mean shit - that's what salts and multiple rounds of an algorithm are for.

      But then again, yes this is bad news bears because nobody can seem to keep their password file out of reach of hackers, nobody can seem to figure out why and how they should use a salt, and no one ever configures their crypto to do anything but the bog standard shit. This is a result of idiots mindlessly screaming "Don't roll your own crypto!!!!" and forgetting that the last wor

      • Yes, but a) assuming your password file is invulnerable is stupid, otherwise we would all just use plaintext; b) multiple rounds are great, but tbh they only exist because current crypto algorithms are so fast that you need multiple cycles to slow cracking; and c) salts are great, but with a fast algorithm, factoring them into your cracking process isn't impossible. TBH though, most of these issues boil down to the fact that MD5 and SHA are stupid algorithms to use for encryption, and if you take the encrypt
        • by blair1q ( 305137 )

          My password file is in plaintext.

          Its location, however, is knowable only by breaking a 4096-bit key that changes daily.

          • by reiisi ( 1211052 )

            I assume, then, that you have plenty of (probably dynamically generated) decoys scattered around so that it really can't be found without knowing the location in advance?

    • Re:faster?? (Score:4, Insightful)

      by Goaway ( 82658 ) on Friday February 18, 2011 @05:39PM (#35249440) Homepage

      Cryptographic hashes are used for a huge number of things besides protecting passwords, which indeed they are somewhat poorly suited for.

    • by shish ( 588640 )
      If you want it to be slow, just hash password + salt + 10 gigabytes of pseudo-random noise
      • Hmm, pseudo-random noise? Really? Then the output could depend on the pseudo-random generator you use (which may be non-portable unless you implement the PRNG yourself).

        I would just use more iterations:

        digest = HMAC( pass, salt );
        for ( i = 2048; i --> 0; ) digest = H( digest );
        return digest;
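
        Roughly that scheme in runnable Python, for anyone who wants to play with it (a sketch, assuming SHA-256 as H; for real systems, use a vetted KDF rather than hand-rolling this):

        import hashlib, hmac

        def slow_hash(password: bytes, salt: bytes, iterations: int = 2048) -> bytes:
            # digest = HMAC(pass, salt), then re-hash repeatedly to add work.
            digest = hmac.new(password, salt, hashlib.sha256).digest()
            for _ in range(iterations):
                digest = hashlib.sha256(digest).digest()
            return digest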

    • by Jessta ( 666101 )

      If your goal is to hide the original data that was hashed, then SHA on small amounts of data is not a good idea.
      If your goal is to verify that the data you have matches the hash you have, then a faster SHA is good.

    • by Cato ( 8296 )

      Using password salt and multiple iterations of SHA-xxx is enough to defeat rainbow tables, particularly if you choose a non-standard number of iterations - see http://slashdot.org/comments.pl?sid=1987632&cid=35150388 [slashdot.org] for a bit more.

  • Unless I'm missing something, why would we ever want performance improvements in a secure hash function? SHA isn't for verifying data; there are superior hashes in that respect with regard to performance and certain statistical guarantees. A secure hash is supposed to have two properties: 1) it needs to be irreversible; 2) it should be slow, so as to reduce the feasibility of brute-force attacks.

    A very slow hash function that takes maybe 5ms to process would be extremely usable for authentication in practical
    • by Goaway ( 82658 )

      The single use for hashes where you want them to be slow is protecting passwords in databases. This is a tiny fraction of the use cases for cryptographic hashes.

      • The people using cryptographic hashes for non-cryptographic purposes are idiots. As an example already cited in the comments above, the CRC class of hash functions actually makes certain statistical guarantees about the longest run of errant bytes it can detect and is much faster, making it far more suitable for file integrity checks and other similar tasks.
        • There are plenty of uses for cryptographic hashes that do not involve passwords.

          For one thing, many people's definition of "integrity" includes protection against deliberate tampering, not just protection against bit rot/transmission errors, and CRC's linear nature makes it completely unsuitable for such use. For another, CRC's "statistical guarantees for the longest run of possible errant bytes" make it good at detecting burst errors, but also make it possible for single-bit errors to go completely unnoticed.

        • Your view is too narrow. Algorithms designed for one purpose may also suit another.

          I created a library that uses any cryptographic hash function (MD5, Whirlpool, SHA-1 through SHA-512, etc.) in a special Cipher Block Chaining (CBC) mode to provide a flexible, strong encryption & decryption system.

          Having a more secure & faster encryption/decryption system for real-time encryption is great (bonus: without rewriting any of the encryption code I can take advantage of the new hash algorithms as they become available).

          • by jvonk ( 315830 )

            See: using one-way encryption (hashes) to perform two-way (reversible) encryption.

            Okay, I have been wondering about this since last evening.

            I'll bite: how does one perform reversible encryption using hashes, given that hashes are not bijective from the domain to the fixed-length codomain of hash outputs? Even if you form an algorithmic construct that limits the domain such that the hash function could potentially be injective into the codomain, how do you ensure that it is surjective? Furthermore, isn't the entire point of cryptographic hash function design to make the inverse function infeasible to compute?

            • by Goaway ( 82658 )

              Read up on modes of operation of ciphers. (For instance, see http://en.wikipedia.org/wiki/Block_cipher_modes_of_operation [wikipedia.org])

              Specifically, you will notice that cipher-feedback, output-feedback and counter modes (CFB, OFB, CTR) all use only the encryption half of the symmetric cipher they are based on. Thus, you can trivially replace the symmetric cipher with a one-way hash function.
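
              As a toy illustration of the idea (a sketch, not a vetted cipher), here is a CTR-style construction in Python using HMAC-SHA-256 as the keyed one-way function:

              import hashlib, hmac

              def hash_ctr(key: bytes, nonce: bytes, data: bytes) -> bytes:
                  # Keystream block i = HMAC(key, nonce || counter); XOR it with
                  # the data. The same call both encrypts and decrypts.
                  out = bytearray()
                  for i in range((len(data) + 31) // 32):
                      block = hmac.new(key, nonce + i.to_bytes(8, "big"), hashlib.sha256).digest()
                      chunk = data[i * 32:(i + 1) * 32]
                      out.extend(b ^ k for b, k in zip(chunk, block))
                  return bytes(out)

              ct = hash_ctr(b"secret key", b"unique nonce", b"attack at dawn")
              assert hash_ctr(b"secret key", b"unique nonce", ct) == b"attack at dawn"

              The usual CTR caveat applies: never reuse a (key, nonce) pair, or the two keystreams cancel and leak the XOR of the plaintexts.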

              • I always appreciate the opportunity to learn... I hadn't considered the approach of using XOR symmetry in this particular way.
    • A fast hash can be made slow very easily: just pipe its result through the function again, and do this a million times, and use this as the hash.

      • by Goaway ( 82658 )

        Somewhat easily, yes, but not quite that easily. You should never use a cryptographic algorithm carelessly like that. Always look up the recommended ways to do these things, because naïve algorithms like the one you suggest tend to have unexpected weaknesses.
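
        In Python, for instance, the standard library already ships such a vetted construction (the parameters here are just illustrative):

        import hashlib, os

        salt = os.urandom(16)
        # PBKDF2-HMAC-SHA256; the iteration count is a tunable cost knob
        # you can raise as hardware gets faster.
        key = hashlib.pbkdf2_hmac("sha256", b"correct horse battery staple", salt, 100000)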

    • by Bengie ( 1121981 )

      I'm curious why slowness should even be considered useful. Even with no hash slowing things down, brute-forcing a 10-character password at 1 billion combinations/sec would take over a century.

      Even if they made hashing 10x faster, it wouldn't accomplish much.

      The real question is how easily a collision can be found.

      • by pjt33 ( 739471 )

        What's in view isn't brute-forcing over the universe of possible passwords but over the universe of probable passwords - i.e. /usr/share/dict/words.
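
        Which is why an unsalted fast hash is so weak: a dictionary attack is a few lines of Python (a sketch, assuming the usual wordlist path and an unsalted SHA-256 password hash; salting and stretching are what make this expensive):

        import hashlib

        target = hashlib.sha256(b"sunshine").hexdigest()  # the leaked, unsalted hash
        with open("/usr/share/dict/words", "rb") as f:
            for word in f:
                word = word.strip()
                if hashlib.sha256(word).hexdigest() == target:
                    print("cracked:", word.decode())
                    break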

  • Just what I would want to use to encrypt that really sensitive data... No thanks. Let's keep encryption software-based. As it is, it isn't auditable for backdoors by 99.999% of the people depending on it; let's not make that 100%.
    • by maxume ( 22995 )

      If you don't trust the hardware to be secure for some activity, it hardly matters what software you are running on it.

    • I read this as an algorithm that is better suited for modern 64-bit processors, NOT one which is implemented specially in hardware. At the very minimum, this would mean that it can easily be calculated using 64-bit integers (and using the entire 64 bits, not just the low 32 bits), and perhaps also easily implemented using SSE2, and allow lots of parallelism, etc.
      • by bk2204 ( 310841 )

        SHA-512 is indeed faster than SHA-256 on 64-bit processors. SHA-512 uses 80 rounds operating on 64-bit variables over 128-byte blocks, while SHA-256 uses 64 rounds operating on 32-bit variables over 64-byte blocks. Since on most 64-bit machines 64-bit operations are roughly as fast as 32-bit operations, you see a speed increase because you're processing twice as much data per block while doing only a little more work (80 rounds versus 64). Both algorithms are very similar internally, so a round in each algorithm generally costs about the same.
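
        Back-of-the-envelope, assuming a 64-bit round costs about the same as a 32-bit one:

        # Work per byte of input:
        sha256 = 64 / 64    # 64 rounds per 64-byte block  = 1.000 rounds/byte
        sha512 = 80 / 128   # 80 rounds per 128-byte block = 0.625 rounds/byte
        print(sha256 / sha512)  # => 1.6, i.e. roughly 60% more throughput for SHA-512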

  • This is absolutely silly. I can't see why anyone, let alone NIST, would want this. They should know better:

    - SHA-512 is only faster than SHA-256 in pure x86-64 versus x86; add SSE to the mix and start doing four SHA-256 blocks in parallel, and SHA-512 is about the same speed, or slower!
    - SHA-256 is not particularly slow, overall: 150MB/s is quite possible with it. Half the speed of SHA-1, yes, but still not bad. That is gigabit on one core, and more than the sustained read speed of a hard disk (although not

    • What I don't get is the focus on 64-bit x86 computers. They take the fastest personal computers and see how well they run a hash function. It's much more interesting to see what happens on smaller chips. Personally, I think the reference platform for the latest hash method should have been a 32-bit ARM or something, not a little-endian 64-bit Intel chip.

    • You need to remember the concept of secret intelligence. The US government won't release any information on something unless they already have something better. So, I would assume that our intelligence communities allowed this 'release' because they already have something that may or may not be better than even SHA-3.
  • Link to the standard (Score:4, Informative)

    by owlstead ( 636356 ) on Friday February 18, 2011 @07:47PM (#35250522)

    If anyone is interested in the source material, here it is:

    http://csrc.nist.gov/publications/drafts/fips180-4/Draft-FIPS180-4_Feb2011.pdf [nist.gov]

    Fresh from the press, it seems.

    By the way, SHA-512/224, SHA-512/256, SHA-384 and SHA-512 differ only in their initial hash values, so it is very easy to implement these algorithms: just change the constants and truncate the output to the required number of bits. Personally, I think it is at least two hash functions too many.
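
    You can even see the differing initial values from Python, if your OpenSSL build exposes the truncated variants ("sha512_256" availability is platform-dependent):

    import hashlib

    naive = hashlib.sha512(b"hello").hexdigest()[:64]  # plain truncation of SHA-512
    try:
        real = hashlib.new("sha512_256", b"hello").hexdigest()
        print(naive == real)  # False: SHA-512/256 starts from different initial values
    except ValueError:
        print("sha512_256 not available in this build")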

  • Remember, governments want their cake and they want to eat it, plus more. I recall, many years ago before 9/11, being part of the movement that sent mass "keywords" to force the FBI into admitting the existence of Carnivore. These were the days before Wikileaks, and the source code was leaked onto planetsourcecode.com for 8 hours.

    Eventually the FBI admitted what they were doing then scrapped the code and set about an entire overhaul of systems including the CIA DHS etc. The new code name was
  • Further details are available from NIST (PDF).

    Unfortunately, no, they aren't... Too bad, since there's so little detail in the summary I have no idea what this is actually about.

    Not Found

      The requested URL /publications/drafts/fips180-4/Draft-FIPS180-4_Feb2011.pdf was not found on this server.

    • The requested URL /publications/drafts/fips180-4/Draft-FIPS180-4_Feb2011.pdf was not found on this server.

      Please see owlstead's post [slashdot.org] which contains a good link to the document.

  • Does the new standard change the actual hash function, or does it merely provide a faster way to compute SHA2?
    • I'm not an expert on crypto, but it seems to me that, for instance, SHA-512/256 would not produce the same digest from the same input as SHA-256. I just conducted the following test on the Linux command line:

      $ echo hello | sha512sum
      e7c22b994c59d9cf2b48e549b1e24666636045930d3da7c1acb299d1c3b7f931f94aae41edda2c2b207a36e10f8bcb8d45223e54878f5b316e7ce3b6bc019629  -

      $ echo hello | sha256sum
      5891b5b522d5df086d0ff0b110fbd9d21bb4fc7163af34d08286a2e846f6be03  -

      The first is the SHA-512 hash of the word "hello" (plus the trailing newline echo appends), the second the SHA-256 hash.

      • by butlerm ( 3112 )

        Since we would already be calculating the 512-bit hash, why not just use it instead of truncating it?

        Because there are many applications where carrying the extra 256 bits either breaks compatibility or is cost-prohibitive in storage or transmission for some reason or another. ZFS-style block checksums, for example. Hashed authentication of network packets is another.
