New SHA Functions Boost Crypto On 64-bit Chips
An anonymous reader writes "The National Institute of Standards and Technology, guardian of America's cryptography standards, has announced a new extension to the SHA-2 hashing algorithm family that promises to boost performance on modern chips. Announced this week, two new standards — SHA-512/224 and SHA-512/256 — have been created as direct replacements for the SHA-224 and SHA-256 standards. They take advantage of the speed improvements inherent in SHA-512 on 64-bit processors to produce checksums more rapidly than their predecessors, but truncate the output to a shorter length, keeping the digests as compact as those of the algorithms they replace."
Further details are available from NIST (PDF).
Re: (Score:1)
Yes I do ;-) Want to be next, sweetie?
faster?? (Score:2)
Re:faster?? (Score:4, Interesting)
I thought this as well - you'd think being able to compute a hash faster makes it a bit easier to compute a rainbow table with the hash.
Then again, there are many other perfectly reasonable reasons you'd want the hash to be faster - for instance, how git uses the sha1 hash throughout - or any hash-summing of a file to verify the contents are unchanged.
So the 'faster hash' really only means that it might be something to consider when using it for a password hash - but for data integrity checking, it can be a real boon.
Re: (Score:2, Interesting)
Re:faster?? (Score:4, Informative)
IIRC the CRC hashes are only designed to protect against accidental changes while secure hashes are designed to protect against both accidental and malicious changes. This makes them more suited to distributed systems where not every participant is trustworthy.
Re: (Score:2)
Yep, CRC doesn't deserve to be called a hash at all. It's just a checksum really, like the check digit on your credit card number. Good against random corruption, no good against a maliciously inserted payload.
CRC has its limits. (Score:4, Informative)
Different hashes are for different purposes.
No argument there.
The CRC class of hash functions actually makes certain statistical guarantees about the longest run of errant bytes it can detect in the source data, and it is far faster, making it far more suitable for file integrity checks.
CRC is great for packet-sized input, but not so great over larger chunks of data; also, the way its design targets burst errors means that widely separated point errors aren't as effectively caught. There's a reason Ethernet jumbo frames haven't gone much over 9000 bytes -- Ethernet's CRC-32 is much less effective at message sizes over 12000 bytes [wareonearth.com] or so. Cryptographically strong hashes tend to be less sensitive to input length.
Re: (Score:2)
That's apples and oranges.
The 12k limit is more related to just having 32 bits than to cryptographic/non-cryptographic nature.
64k × 64k = 2^32, so at 64k you are guaranteed that there is at least one two-bit error that goes undetected. At 2048, you are also virtually guaranteed that some 3-bit error goes undetected -- that's true whether the hash is a plain CRC or a cryptographic one.
A CRC-128 would have been as good for this purpose as MD5 or SHA1.
Re: (Score:1, Informative)
Wasn't there an article recently complaining that the speed of SHA made it relatively useless as a hashing algorithm to protect passwords? Surely the increase in speed would have a greater effect on cracking speed than on the speed of legitimate authentication.
Yes and no.
Yes there was such an article.
No it doesn't mean shit - that's what salts and multiple rounds of an algorithm are for.
But then again, yes this is bad news bears because nobody can seem to keep their password file out of reach of hackers, nobody can seem to figure out why and how they should use a salt, and no one ever configures their crypto to do anything but the bog standard shit. This is a result of idiots mindlessly screaming "Don't roll your own crypto!!!!" and forgetting that the last wor
Re: (Score:2)
Re: (Score:3)
My password file is in plaintext.
Its location, however, is knowable only by breaking a 4096-bit key that changes daily.
Re: (Score:2)
I assume, then, that you have plenty of (probably dynamically generated) decoys scattered around so that it really can't be found without knowing the location in advance?
Re: (Score:2)
It looks exactly like a /. posting. So yeah.
Re:faster?? (Score:4, Insightful)
Cryptographic hashes are used for a huge number of things besides protecting passwords, which indeed they are somewhat poorly suited for.
Re: (Score:2)
Re: (Score:2)
I would just use more iterations:
digest = HMAC( pass, salt );   /* salt the password first, keyed by the salt */
for ( i = 0; i < 2048; i++ )   /* key stretching: many extra rounds */
    digest = H( digest );      /* H = any cryptographic hash */
return digest;
Re: (Score:2)
If your goal is to hide the original data that was hashed then SHA on small amounts of data is not a good idea.
If your goal is to verify that the data you have matches the hash you have, then a faster SHA is good.
Re: (Score:2)
Using password salt and multiple iterations of SHA-xxx is enough to defeat rainbow tables, particularly if you choose a non-standard number of iterations - see http://slashdot.org/comments.pl?sid=1987632&cid=35150388 [slashdot.org] for a bit more.
Why? (Score:2)
A very slow hash function that takes maybe 5ms to process would be extremely usable for authentication in practical
Re: (Score:2)
The single use for hashes where you want them to be slow is protecting passwords in databases. This is a tiny fraction of the use cases for cryptographic hashes.
Re: (Score:2)
Re: (Score:2)
There are plenty of uses for cryptographic hashes that do not involve passwords.
For one thing, many people's definition of "integrity" includes protection against deliberate tampering, not just protection against bit rot/transmission errors, and CRC's linear nature makes it completely unsuitable for such use. For another CRC's "statistical guarantees for the longest run of possible errant bytes" make it good at detecting burst errors, but also make it possible for single-bit errors to go completely unnotice
Re: (Score:2)
Your view is too narrow. Algorithms designed for one purpose may also suit another.
I created a library that uses any cryptographic hash function (MD5, Whirlpool, SHA-1 through SHA-512, etc.), in a special Cipher Block Chaining (CBC) mode, to provide a flexible strong encryption & decryption system.
Having a more secure & faster encryption/decryption system for real-time encryption is great (bonus, without rewriting any of the encryption code I can take advantage of the new hash algorithms as they become availab
Re: (Score:2)
See: using one way encryption (hashes) to perform two-way (reversible) encryption.
Okay, I have been wondering about this since last evening.
I'll bite: how does one perform reversible encryption using hashes, given that hashes are not bijective from the domain to the fixed hash output length codomain? Even if you form an algorithmic construct that limits the domain such that the hash function could potentially be injective in the codomain, how do you ensure that it is surjective? Furthermore, isn't the entire point of cryptographic hash function design to make it so that the inverse fu
Re: (Score:2)
Read up on modes of operation of ciphers. (For instance, see http://en.wikipedia.org/wiki/Block_cipher_modes_of_operation [wikipedia.org])
Specifically, you will notice that cipher-feedback, output-feedback and counter modes (CFB, OFB, CTR) all use only the encryption half of the symmetric cipher they are based on. Thus, you can trivially replace the symmetric cipher with a one-way hash function.
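To make that concrete, here's a rough Python sketch (my own illustration, not the parent's library): SHA-256 keyed with a secret and a counter generates a keystream, and XOR-ing with it both encrypts and decrypts. No authentication and no real key handling -- just enough to show the idea.

import hashlib

def hash_ctr_xor(key, nonce, data):
    # Keystream block i = SHA-256(key || nonce || counter i), XORed with the data.
    # XOR is its own inverse, so the same call performs encryption and decryption.
    out = bytearray()
    for i in range((len(data) + 31) // 32):
        block = hashlib.sha256(key + nonce + i.to_bytes(8, "big")).digest()
        chunk = data[i * 32:(i + 1) * 32]
        out += bytes(a ^ b for a, b in zip(chunk, block))
    return bytes(out)

ciphertext = hash_ctr_xor(b"secret key", b"unique nonce", b"attack at dawn")
plaintext = hash_ctr_xor(b"secret key", b"unique nonce", ciphertext)  # round-trips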
Thank you (Score:2)
Re: (Score:2)
A fast hash can be made slow very easily: just pipe its result through the function again, and do this a million times, and use this as the hash.
Re: (Score:3)
Somewhat easily, yes, but not quite that easily. You should never use a cryptographic algorithm carelessly like that. Always look up the recommended ways to do these things, because naïve algorithms like the one you suggest tend to have unexpected weaknesses.
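For instance, rather than hand-rolling the loop above, Python's standard library exposes one of the vetted constructions (PBKDF2-HMAC); the salt, password and iteration count here are just placeholder values:

import hashlib, os

salt = os.urandom(16)                  # random per-user salt, stored alongside the hash
# PBKDF2-HMAC-SHA256 with a deliberately large iteration count (tune to taste)
key = hashlib.pbkdf2_hmac("sha256", b"hunter2", salt, 100000)
print(key.hex())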
Re: (Score:2)
I'm curious why slowness should even be considered useful. Even without a deliberately slow hash, brute forcing a 10-character password at 1 billion combinations/sec would take over a century.
Even if they made hashing 10x faster, it wouldn't accomplish much.
The real question is how easily can a collision be found.
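Back-of-the-envelope numbers for that claim (assuming 95 printable ASCII characters and a round 10^9 guesses per second):

keyspace = 95 ** 10                    # every 10-character printable-ASCII password
rate = 10 ** 9                         # guesses per second
years = keyspace / rate / (3600 * 24 * 365)
print(int(years))                      # roughly 1,900 years at these assumptions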
Re: (Score:2)
What's in view isn't brute-forcing over the universe of possible passwords but over the universe of probable passwords - i.e. /usr/share/dict/words.
Encryption on chip approved by (Score:2)
Re: (Score:1)
If you don't trust the hardware to be secure for some activity, it hardly matters what software you are running on it.
Re: (Score:1)
Re: (Score:3)
SHA-512 is indeed faster than SHA-256 on 64-bit processors. SHA-512 uses 80 rounds using 64-bit variables on block sizes of 128 bytes, and SHA-256 uses 64 rounds using 32-bit variables on block sizes of 64 bytes. Since on most 64-bit machines 64-bit operations are roughly as fast as 32-bit operations, you see a speed increase because you're processing twice as much data and doing only a little more work (80 rounds versus 64). Both algorithms are very similar internally, so a round in each algorithm gener
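You can see the effect for yourself with a quick, rough benchmark; the absolute numbers depend entirely on the CPU and on how hashlib was built, so treat them as illustrative only:

import hashlib, time

data = b"\x00" * (64 * 1024 * 1024)    # 64 MiB of input
for name in ("sha256", "sha512"):
    start = time.perf_counter()
    hashlib.new(name, data).digest()
    elapsed = time.perf_counter() - start
    print(name, round(len(data) / elapsed / 1e6), "MB/s")

On a typical 64-bit machine the sha512 line comes out ahead; on 32-bit hardware the ordering usually flips.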
Re: (Score:3)
I suppose that would be one way to make Bittorrent CPU-bound rather than IO-bound.
Re: (Score:1)
...and you want an inefficient algorithm to check the integrity of your disk image why? It's not like someone's going to reverse engineer a 200MB file out of a tiny hash. It's also unlikely that they'll be able to force a collision.
Hashes have many uses; hashing an access key (such as an ascii password string) is only one, and there are hashing algorithms designed for that (the SHA family is not included). For that matter, there are meta-algorithms designed to safely use SHA-style algorithms cryptographi
Re: (Score:1)
No, they are not meant to be inefficient. For the specific case of storing passwords, yes, you need a cryptographic construct that is deliberately slow, but the hash is simply a component in that construct; you make it slow with salt and key stretching.
A hash like SHA-256 gets its strength from the fact that no matter how fast you make it run, it will still take more time and energy than is left in the universe to brute-force the entire key space.
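Rough arithmetic to back that up (the 10^18 hashes-per-second rate is a wildly generous assumption):

attempts = 2 ** 256                    # full 256-bit output space
rate = 10 ** 18                        # hashes per second, far beyond any real hardware
years = attempts / rate / (3600 * 24 * 365)
print("%.1e years" % years)            # on the order of 4e51 years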
What's the point of this? SHA-3 is next year. (Score:1)
This is absolutely silly. I can't see why anyone, let alone NIST, would want this. They should know better:
- SHA-512 is only faster than SHA-256 in pure x86-64 versus x86; add SSE to the mix and start doing four SHA-256 blocks in parallel, and SHA-512 is about the same speed, or slower!
- SHA-256 is not particularly slow, overall: 150MB/s is quite possible with it. Half the speed of SHA-1, yes, but still not bad. That is gigabit on one core, and more than the sustained read speed of a hard disk (although not
Re: (Score:2)
What I don't get is the focus on 64-bit x86 computers. They take the fastest personal computers and see how well they run a hash function. It's way more interesting to see what happens when using smaller chips. Personally I think that the reference platform for the latest hash method should have been a 32-bit ARM or something, not a little-endian 64-bit Intel chip.
Re: (Score:1)
Link to the standard (Score:4, Informative)
If anyone is interested in the source material, here it is:
http://csrc.nist.gov/publications/drafts/fips180-4/Draft-FIPS180-4_Feb2011.pdf [nist.gov]
Fresh from the press, it seems.
By the way, SHA-512/224, SHA-512/256, SHA-384 and SHA-512 differ only in their initial hash values, so these algorithms are very easy to implement: just change the constants and cut the output to the required number of bits. Personally, I think it is at least two hash functions too many.
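Which also means a hand-truncated SHA-512 digest is not the same thing as SHA-512/256 -- the different initial values change every output bit. A quick Python check (the sha512_256 name is only present if the underlying OpenSSL build provides it, hence the availability test):

import hashlib

msg = b"abc"
print(hashlib.sha512(msg).hexdigest()[:64])            # SHA-512 truncated by hand
if "sha512_256" in hashlib.algorithms_available:       # needs a recent OpenSSL
    print(hashlib.new("sha512_256", msg).hexdigest())  # real SHA-512/256: differs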
Ability to Spy (Score:2)
Eventually the FBI admitted what they were doing, then scrapped the code and set about an entire overhaul of systems, including the CIA, DHS, etc. The new code name was
She's real fine my 404... (Score:2)
Unfortunately, no, they aren't... Too bad, since there's so little detail in the summary I have no idea what this is actually about.
Not Found
The requested URL /publications/drafts/fips180-4/Draft-FIPS180-4_Feb2011.pdf was not found on this server.
Re: (Score:2)
The requested URL /publications/drafts/fips180-4/Draft-FIPS180-4_Feb2011.pdf was not found on this server.
Please see owlstead's post [slashdot.org] which contains a good link to the document.
Does SHA2 still produce the same results? (Score:1)
Re: (Score:2)
I'm not an expert on crypto, but it seems to me that, for instance, SHA-512/256 would not produce the same digest from the same input as SHA-256. I just conducted the following test on the linux command line:
The first is the SHA-512 hash of the word "hello"
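For reference, the same comparison in Python (note that echo appends a trailing newline unless you pass -n, which changes the digest):

import hashlib

word = b"hello\n"                          # what a plain `echo hello` actually pipes in
print(hashlib.sha512(word).hexdigest())    # the SHA-512 digest of "hello"
print(hashlib.sha256(word).hexdigest())    # SHA-256: no relation, truncated or not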
Re: (Score:3)
Since we would already be calculating the 512-bit hash, why not just use it instead of truncating it?
Because there are many applications where carrying the extra 256 bits either breaks compatibility or is storage/transmission cost prohibitive for some reason or another. ZFS style block checksums, for example. Hashed authentication of network packets is another.