
SHA-3 Winner Announced

An anonymous reader writes "The National Institute of Standards and Technology (NIST) has just announced the winner of the SHA-3 competition: Keccak, created by Guido Bertoni, Joan Daemen and Gilles Van Assche of STMicroelectronics and Michaël Peeters of NXP Semiconductors. 'Keccak has the added advantage of not being vulnerable in the same ways SHA-2 might be,' says NIST computer security expert Tim Polk. 'An attack that could work on SHA-2 most likely would not work on Keccak because the two algorithms are designed so differently.' For Joan Daemen it must feel like two in a row, since he is also one of the designers of AES."
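A minimal sketch of using the winner from code, assuming Python 3.6 or later, where the standard library's hashlib exposes the eventual FIPS 202 standardization of Keccak (whose padding differs slightly from the competition-era submission):

    import hashlib

    # SHA3-256: the FIPS 202 standardization of Keccak
    digest = hashlib.sha3_256(b"hello, world").hexdigest()
    print(digest)  # 64 hex characters = a 256-bit digest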

  • by Anonymous Coward on Tuesday October 02, 2012 @06:24PM (#41531799)
    It's time to start building some new rainbow tables?
  • Re:Rolls Eyes (Score:5, Interesting)

    by mlts ( 1038732 ) * on Tuesday October 02, 2012 @07:01PM (#41532083)

    Having a good SHA algorithm is a good thing. Hash collisions may not seem like something that happens often, but if there is any chance that a document saying "hell no" can be swapped for one saying "yes, definitely" while keeping the same hash, that can bankrupt a company.

    Hashes also have other uses, especially as "bit blenders" that mix entropy sources together, so if someone figures out a way to decrease the entropy of the output, keys generated from a device like /dev/random can be significantly less secure. (The first sketch after this comment illustrates that use.)

    Each crypto algorithm is important. I just wish NIST would pick not just one candidate, but perhaps 2-3 at a time [1]. The reason is that if something happened that made one algorithm insecure, the standard libraries would have a backup. It also means that embedded controllers built to the standard wouldn't have to be chucked and replaced should one algorithm be cracked.

    [1]: Not just hashing, but encryption as well. I wish NIST had gone with not just Rijndael, but Serpent and Twofish as standards. Similarly, not just RSA, but RSA, Merkle, DSA, ElGamal, and elliptic-curve encryption. That way, should an attack like TWIRL or quantum computers make RSA pointless, people could switch over to Merkle signatures or another algorithm without needing a hardware upgrade. Plus, for high-security work, multiple algorithms can be cascaded [2] to ensure that one weak link won't compromise everything.

    [2]: No, three 256-bit algorithms will not get you 768 bits; in reality you end up with roughly 258 bits of security. However, if one of the algorithms ends up broken and only offering 32 bits of unique keys, the other two would still provide at least 256 bits of key strength. (The second sketch below applies the same backup idea to hashes.)
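    A sketch of the "bit blender" use mentioned above: mixing several entropy sources through a hash so the output stays uniform even if one source is weak. The sources named here are illustrative, not a real RNG design (Python 3.7+ for time_ns):

        import hashlib
        import os
        import time

        # Mix two entropy sources; the pooled output is no weaker than
        # the best source fed into it.
        pool = hashlib.sha3_256()
        pool.update(os.urandom(32))                     # OS entropy
        pool.update(time.time_ns().to_bytes(8, "big"))  # timing jitter (low quality)
        key = pool.digest()                             # 256-bit mixed key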
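    And a sketch of the footnote's backup idea applied to hashing: concatenating two structurally different hashes, so producing a collision for the pair requires colliding both at once. This illustrates the principle, not a vetted combiner construction:

        import hashlib

        def combined_hash(data: bytes) -> bytes:
            # Collision-resistant as long as EITHER component still is:
            # a colliding pair would have to collide under both at once.
            return hashlib.sha256(data).digest() + hashlib.sha3_256(data).digest()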

  • Re:Rolls Eyes (Score:5, Interesting)

    by plover ( 150551 ) on Tuesday October 02, 2012 @09:05PM (#41533103) Homepage Journal

    I just wish NIST would pick not just one candidate, but perhaps 2-3 at a time [1]. The reason is that if something happened that made one algorithm insecure, the standard libraries would have a backup. It also means that embedded controllers built to the standard wouldn't have to be chucked and replaced should one algorithm be cracked.

    As with anything, be careful what you wish for. I've seen successful attacks on protocols that support multiple versions or algorithms, made possible by devices that support them all for various compatibility reasons. Let's suppose someone discovers an attack on SHA-3A but SHA-3B remains secure. Everybody switches their servers to SHA-3B. A flexible protocol might allow an attacker to generate an error that forces clients to re-hash their passwords with SHA-3A. This has happened more often than you might think; NTLMv2 implementations falling back to NTLM is one of the more spectacular examples. (A sketch of a negotiation that refuses this kind of fallback follows this comment.)

    Your example creates a theoretical weakness: a failure in either SHA-3A or SHA-3B could put such a device at risk. If you try to prevent this by building in an environmental protocol switch, so the device could later be set to use only one algorithm, why not instead design the device to support loading a more modern algorithm in the future?
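    A sketch of negotiation that refuses that kind of fallback, reusing the hypothetical SHA-3A/SHA-3B names from above: the server only offers algorithms it still trusts, and a client that supports only the broken one gets an error rather than a silent downgrade.

        # Hypothetical algorithm names (SHA-3A, SHA-3B) from the comment above.
        SERVER_PREFERENCE = ["SHA-3B"]  # SHA-3A dropped after the break

        def negotiate(client_supported):
            # Pick the first server-trusted algorithm the client supports;
            # never fall through to anything the server no longer offers.
            for alg in SERVER_PREFERENCE:
                if alg in client_supported:
                    return alg
            raise ValueError("no mutually trusted algorithm; refusing to fall back")

        print(negotiate(["SHA-3A", "SHA-3B"]))  # -> SHA-3B
        try:
            negotiate(["SHA-3A"])               # old client: error, not a downgrade
        except ValueError as err:
            print(err)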

  • by gstrickler ( 920733 ) on Wednesday October 03, 2012 @01:07AM (#41534673)

    It's only silly if mainstream implementations actually vary those parameters across installations. If something like Windows uses the same parameters for several hundred million installations, a rainbow table will work just fine, and given the history of major software vendors, proper variation is not a guaranteed outcome. If they use the salt properly (randomly generated for each installation, or better, for each encoded item), it makes rainbow tables pointless. (See the sketch below.)
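    A sketch of per-item salting, assuming the usual pattern: a random salt stored alongside each hash means an attacker would need a separate precomputed table per entry, which defeats the point of rainbow tables. The iteration count is illustrative.

        import hashlib
        import hmac
        import os

        ITERATIONS = 100_000  # illustrative work factor

        def hash_password(password: str):
            salt = os.urandom(16)  # unique per stored item
            digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
            return salt, digest   # store both; the salt is not secret

        def verify(password: str, salt: bytes, digest: bytes) -> bool:
            candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
            return hmac.compare_digest(candidate, digest)  # constant-time compare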
