How Does Heartbleed Alter the 'Open Source Is Safer' Discussion?

Posted by Soulskill
from the or-at-least-marginally-less-unsafe dept.
jammag writes: "Heartbleed has dealt a blow to the image of free and open source software. In the self-mythology of FOSS, bugs like Heartbleed aren't supposed to happen when the source code is freely available and being worked with daily. As Eric Raymond famously said, 'given enough eyeballs, all bugs are shallow.' Many users of proprietary software, tired of FOSS's continual claims of superior security, welcome the idea that Heartbleed has punctured FOSS's pretensions. But is that what has happened?"
  • Also (Score:4, Informative)

    by danheskett (178529) <danheskett AT gmail DOT com> on Tuesday April 15, 2014 @05:41PM (#46761341)

    I would like to just point out this is a huge win in my book for Debian. Those of us running an all-Debian oldstable environment, getting backported security patches, and sticking with the tried-and-true version of OpenSSL instead of that newfangled 1.0 code release got to write nice letters to our customers saying we still don't use Windows and were never vulnerable.

    LONG LIVE OLDSTABLE.

  • by Cid Highwind (9258) on Tuesday April 15, 2014 @05:56PM (#46761517) Homepage

    "Yes, we can trace the changelogs in the software & note who was checking the changes and missed them, but that all can be circumvented."

    Actually it can't. That's kind of the point of git.

    "The fact is we don't know if Heartbleed was an honest mistake or not...we don't know who knew and when..."

    We do know who and what and when, because the person who wrote it and the person who signed off on it have commented publicly about the bug.

    Maybe you're thinking of Apple's "goto fail" SSL exploit where we really don't know who or what or when and probably never will because it's not likely Apple is going to release their RCS logs.
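    The tamper-evidence being alluded to comes from git's content addressing: every object's name is a hash of its contents, and every commit's hash covers its tree, author, timestamp, and parent hash, so silently rewriting history changes every later ID. A minimal sketch in Python, reproducing git's blob-hash scheme:

```python
import hashlib

def git_blob_hash(content: bytes) -> str:
    # Git names a blob by the SHA-1 of a "blob <size>\0" header plus the bytes.
    header = b"blob %d\x00" % len(content)
    return hashlib.sha1(header + content).hexdigest()

# Matches `git hash-object` for the same bytes:
print(git_blob_hash(b"hello\n"))  # ce013625030ba8dba906f756967f9e9ca394464a
```

    Changing even one byte of any historical object changes its hash, and therefore the hash of every commit built on top of it -- that is what makes circumventing the changelog hard.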

  • Nobody was seriously interested in forking it... But the OpenBSD people have now gotten their claws into it, and chances are it's gonna get fixed bigtime... or else!

    The problem was found because the code was Open Source. If it had been closed source, then the bug would still be secret. To the extent to which the bug was recognized (or commissioned) and exploited by the likes of the NSA, it would have probably remained secret for a lot longer.

    According to Microsoft's EULA, for example, finding -- much less fixing -- such a bug is illegal. If the NSA had paid them to put such a bug into the Windows version of SSL, then it would probably remain unpatched for years after someone had pointed it out to them as an exploitable bug... and anybody openly reporting such a bug, even after 6 months of trying to get MS to fix it, would be roundly criticized for disclosing it 'prematurely'.
    Even then, it would probably not be fixed by Microsoft until at least the next monthly patch cycle (or even the one after that).

    With the code being Open Source, the problem got fixed faster than yesterday. Period. If the OpenSSL people refused to fix it, then it would have been forked. ... and more to the point: Such a security-centric fork would have been legal.

    ... and that is the power and freedom of Free and Open Source software.

  • by almitydave (2452422) on Tuesday April 15, 2014 @06:26PM (#46761809)

    " just about every SSL-encrypted internet communication over the last two years has been compromised."

    No, it really hasn't.

    It's accurate to say that just about every OpenSSL-encrypted session on servers running the NEW versions of OpenSSL (not all the ones out there still stuck on 0.9.8(whatever), which never had the bug) was potentially vulnerable to attack.

    That's bad, but it's a universe away from "every SSL session is compromised!!!" because that's not really true.

    They were vulnerable to attack, that is to say, the security was compromised. He didn't say they were hacked, stolen, eavesdropped, or surreptitiously recorded.

    compromise [reference.com]: to expose or make vulnerable to danger, suspicion, scandal, etc.; jeopardize: a military oversight that compromised the nation's defenses.

    I've noticed that a lot of TV sci-fi confuses "compromise" with "breach"; as in hull, shields, defenses, etc.
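    For reference, the bug itself was a missing bounds check: the TLS heartbeat handler echoed back as many bytes as the request's length field claimed, not as many as the request actually carried. A simplified Python model of that shape (the real code is C inside OpenSSL; the names and the memory layout here are illustrative):

```python
def heartbeat_vulnerable(memory: bytes, claimed_len: int) -> bytes:
    # Echoes `claimed_len` bytes starting at the payload, trusting the
    # attacker-supplied length field -- the Heartbleed mistake.
    return memory[:claimed_len]

def heartbeat_fixed(memory: bytes, claimed_len: int, actual_len: int) -> bytes:
    # The fix: silently discard requests whose claimed length exceeds
    # the real payload length.
    if claimed_len > actual_len:
        return b""
    return memory[:claimed_len]

# A 4-byte payload "bird" sitting next to unrelated process memory:
memory = b"bird" + b"...SECRET_KEY..."
print(heartbeat_vulnerable(memory, 20))  # leaks bytes beyond the payload
print(heartbeat_fixed(memory, 20, 4))    # request dropped, nothing leaks
```
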

  • by Darinbob (1142669) on Tuesday April 15, 2014 @10:09PM (#46763367)

    Encryption is meant to make the original text obscure; the means of encryption, however, should not be obscure. What "security through obscurity" refers to is the common and naive practice of assuming that no one will guess your security methods, when in practice people do find this stuff out -- e.g., assuming that no one will guess your backdoor debugger password. It is fine to start with a strong set of security practices and, only after those are in place, add obscurity on top. But usually when something is made obscure it is because the security was weak in the first place.

    As for ActiveX, the problem was not that the end user would go and hunt down a trusted plug in and install it, but that it relied upon the web to tell you if something was trusted and then automatically install it (and for the average user this happened even without their knowledge). This was done at the same time that Java was promoted as an alternative, a system that was intended to be designed for security by sandboxing the code (though of course it had flaws) as well as being cross platform, whereas ActiveX was all about taking plain x86 code and executing it as long as it was signed.

    The real problem with ActiveX was the idiotic idea from Microsoft that it should be installed automatically without bothering the users with annoying questions such as asking for permission first; they made the same boneheaded move by allowing executables in emails to be executed without confirmation. It wasn't until they started adding UAC that they seemed to understand what the problem was.
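    The "backdoor debugger password" scenario above shows why obscurity alone fails: anything baked into a shipped binary can be recovered by trivial analysis. A toy sketch in Python (the firmware bytes and password here are invented for illustration):

```python
import re

# Hypothetical firmware image with a hardcoded backdoor password inside.
firmware = b"\x00\x7fELF\x01boot_init\x00debug_login:letmein2014\x00main_loop\x00"

# The classic `strings`-style scan: pull out every printable run of 6+ bytes.
found = re.findall(rb"[ -~]{6,}", firmware)
print(found)  # the "secret" password is right there in the output
```

    Real security assumes the attacker knows the method (Kerckhoffs's principle) and keeps only the key secret.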

  • by perpenso (1613749) on Tuesday April 15, 2014 @11:51PM (#46763917)

    The quote is "given enough eyeballs, all bugs are shallow." That's a clear admission that open software, like all other software, contains bugs; that's why you want the many eyeballs. Any claim otherwise is a symptom of not understanding plain English. Eric's whole point was that the bugs in open software will be found and fixed faster than the bugs in other software, due to the population of interested people who will study it, looking for the bugs.

    Perhaps it is not being stated clearly, but the point that you are missing is that this bug, in some of the most critical network software in use, had been around for 2 years. That fact demonstrates the hyperbole of the quote. It's a well-crafted quote and illustrates a concept well, but people read way too much into it. Few FOSS users are developers, and few developers are qualified readers. Eyeballs are a plus, but not a panacea. The gap between proprietary and open exists, but it is exaggerated.

    A second and more important fact is that the bug was not discovered by eyeballs on source code. The techniques used seem to be the same applied to proprietary closed source code.
    “We developed a product called Safeguard, which automatically tests things like encryption and authentication,” Chartier said. “We started testing the product on our own infrastructure, which uses Open SSL. And that’s how we found the bug.”
    http://readwrite.com/2014/04/1... [readwrite.com]

    Nothing in that quote implies (to anyone with reasonable understanding of English and basic logic) that open software doesn't have bugs.

    Straw man.
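    The Codenomicon approach quoted above is black-box protocol testing: send malformed inputs and flag anomalous responses, no source code in sight. A sketch of that idea against a toy vulnerable handler (the `respond` function here is a stand-in for a peer, not OpenSSL):

```python
import random

# Stand-in for a peer with the vulnerable trust-the-length behavior.
SECRET = b"-----BEGIN PRIVATE KEY-----"

def respond(payload: bytes, claimed_len: int) -> bytes:
    memory = payload + SECRET   # payload sits adjacent to other process data
    return memory[:claimed_len] # echoes whatever the length field claims

def probe(n_trials: int = 100) -> bool:
    # Fuzz the length field; a reply longer than what we sent means a leak.
    rng = random.Random(0)
    for _ in range(n_trials):
        payload = bytes(rng.randrange(256) for _ in range(16))
        lying_len = len(payload) + rng.randrange(1, 32)
        if len(respond(payload, lying_len)) > len(payload):
            return True         # anomaly: more bytes came back than went in
    return False

print(probe())  # True: the leak shows up without reading any source
```

    Nothing in this technique depends on having the source, which is the point: the discovery method was the same one used against proprietary code.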

  • by perpenso (1613749) on Tuesday April 15, 2014 @11:56PM (#46763939)

    The visibility doesn't make it so bugs don't exist. It makes them more likely to be found. This one existed and was found.

    After two years in the wild. And apparently *not* by eyeballs on source code. Proprietary or open seems irrelevant to this discovery.

    “We developed a product called Safeguard, which automatically tests things like encryption and authentication,” Chartier said. “We started testing the product on our own infrastructure, which uses Open SSL. And that’s how we found the bug.”
    http://readwrite.com/2014/04/1... [readwrite.com]

  • by yanyan (302849) on Wednesday April 16, 2014 @04:44AM (#46765083)

    So 'git' is just unhackable... it's perfectly secure... no way someone could've put a gun to the guy's head while he sat in front of his computer to make these changes...

    with a hot chick giving him a blowjob, can't leave that part out.
