
How Does Heartbleed Alter the 'Open Source Is Safer' Discussion?

jammag writes: "Heartbleed has dealt a blow to the image of free and open source software. In the self-mythology of FOSS, bugs like Heartbleed aren't supposed to happen when the source code is freely available and being worked with daily. As Eric Raymond famously said, 'given enough eyeballs, all bugs are shallow.' Many users of proprietary software, tired of FOSS's continual claims of superior security, welcome the idea that Heartbleed has punctured FOSS's pretensions. But is that what has happened?"
This discussion has been archived. No new comments can be posted.

  • by symbolset ( 646467 ) * on Tuesday April 15, 2014 @05:19PM (#46761113) Journal
    Which is run by a former Microsoft executive who was in charge of security. I guess he can gloat about being personally responsible.
  • This was positive (Score:5, Interesting)

    by danheskett ( 178529 ) <danheskett&gmail,com> on Tuesday April 15, 2014 @05:37PM (#46761289)

    Heartbleed was positive for the world. The bug was found by code review, independently, twice within a span of days. It was patched rapidly across a hundred different versions and platforms, and now the world is vastly safer. The system worked exactly as it should.

    It is entirely likely that a Heartbleed is out there for some closed platform. Or worse. And it's likely that it is being exploited right now, not only by our own government in the US but by foreign rivals seeking economic and political gain. Worse still, there is probably defunct code out there, full of Heartbleeds, bleeding exploits into the wild uncontrollably.

    The only downside it exposed is that some projects have a lock on what they do. OpenSSL is so good that everyone uses it, and no one is seriously interested in forking it or doing a new implementation.
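    The bug class behind Heartbleed is easy to sketch. The following is a hypothetical, simplified illustration in C, not OpenSSL's actual code: a response routine trusts a length field supplied by the peer rather than the number of bytes actually received, and the fix is the one-line bounds check the patch added. All names here are made up for illustration.

    ```c
    #include <assert.h>
    #include <stdio.h>
    #include <string.h>

    /* Buggy version: copies `claimed_len` bytes even though the peer only
     * sent `actual_len`, so adjacent memory leaks into the response. */
    static size_t build_response(const unsigned char *payload, size_t actual_len,
                                 size_t claimed_len, unsigned char *out)
    {
        (void)actual_len;                 /* unused: that IS the bug */
        memcpy(out, payload, claimed_len);
        return claimed_len;
    }

    /* Fixed version: the bounds check the patch added. */
    static size_t build_response_fixed(const unsigned char *payload, size_t actual_len,
                                       size_t claimed_len, unsigned char *out)
    {
        if (claimed_len > actual_len)
            return 0;                     /* silently drop the malformed record */
        memcpy(out, payload, claimed_len);
        return claimed_len;
    }

    int main(void)
    {
        unsigned char memory[64] = "ping";  /* 4 real payload bytes in a larger buffer */
        unsigned char out[64];

        /* Honest request: both versions behave identically. */
        assert(build_response(memory, 4, 4, out) == 4);
        assert(build_response_fixed(memory, 4, 4, out) == 4);

        /* Malicious request claiming 16 bytes: only the fixed version refuses. */
        assert(build_response_fixed(memory, 4, 16, out) == 0);

        puts("ok");
        return 0;
    }
    ```

    The review that caught the real bug amounted to spotting exactly this pattern: a length taken from the wire that is never compared against the bytes actually present.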

  • Re:Wat? (Score:5, Interesting)

    by tysonedwards ( 969693 ) on Tuesday April 15, 2014 @05:44PM (#46761367)
    It is a double-edged sword. Because one can see the code, there is visibility into the process. Because OpenSSL is such a common tool, and arguably vital to the function of the Internet as we know it, this sort of bug really is one of those "worst case scenarios" PR-wise, as opposed to being cleanly swept under the rug as is possible with many closed-source 0-day vulnerabilities.

    The problem here is that people have been using the argument that Open Source is better because these issues can't happen "because" of the visibility. And the argument "Open Source is inherently safer" has been very heavily damaged by Heartbleed and now ranks up there with "Macs don't get viruses" and "Women are worse drivers".

    If this happened in Microsoft, Adobe, or Oracle Land it would be "yet another 0-day" and largely ignored by the public. Because it is in an area with such a vocal group of people spouting "impenetrable" for decades, it all of a sudden becomes quite newsworthy in a way that a "yet-another-remote-code-execution-with-privilege-escalation-in-Acrobat-Reader" vulnerability doesn't.

    And if you doubt any of this for a moment, have you ever heard the name of the developer who was at fault for introducing a bug into Flash on the local news? Now did you hear the name "Robin Seggelmann" in connection to Heartbleed?
  • by Jeremiah Cornelius ( 137 ) on Tuesday April 15, 2014 @05:51PM (#46761423) Homepage Journal

    Closed source is not inherently safer. Raymond's proposition is theoretically sound, however in actual practice, the NSA has "many eyes"...

  • Re:Mr Fixit (Score:3, Interesting)

    by Desler ( 1608317 ) on Tuesday April 15, 2014 @05:57PM (#46761523)

    Which is a ridiculous statement to make in this situation. That's like patting your security company on the back for not noticing for two years that someone was secretly stealing money out of your bank vault, and only doing something after a third party told them there was a problem. But hey, they reacted fast two years after the fact, right?

  • by erroneus ( 253617 ) on Tuesday April 15, 2014 @06:05PM (#46761601) Homepage

    Closed source is hazardous in many ways. Along with being more frequently targeted, the NSA revelations showed that Microsoft worked with the NSA when deciding how quickly to close some holes. Another hazard is the threat of being attacked and/or sued by companies whose products were found to have problems.

    No question, the Heartbleed thing is a huge and embarrassing problem. But you know? It's actually kind of hard to count the number of high-profile vulnerabilities in F/OSS software, as not a whole lot come to mind. On the other hand, the list for closed source from large companies is enormous... also hard to count, but for another reason.

    It does highlight one important thing about F/OSS, though. Just because a project has enjoyed long, stable, and wide deployment doesn't mean something hasn't been lurking unnoticed for a long, long time; code auditing and other security practices are still pretty important even in a very mature project. People need wakeup calls from time to time, and F/OSS developers can be among the worst when it comes to their attitudes about their territories and kingdoms. (I can't ever pass up the opportunity to complain about GIMP and GNOME... jackasses, the lot of them.)

  • Re:It doesn't. (Score:4, Interesting)

    by ratboy666 ( 104074 ) on Tuesday April 15, 2014 @06:17PM (#46761723) Journal

    This myth gets trotted out again. It is arguably easier to find exploits without source. The source distracts from the discovery of an exploit. The binary simply is. The black-hat is looking for a way to subvert a system. Typically she is not interested in the documented (by source or documentation) functionality. That simply distracts from the issue which is finding out what the software actually does, especially in edge circumstances.

    This is what fuzzers do. Typically not aware of the utility of the program, they simply inject tons of junk until something breaks.

    Source availability tends to benefit people auditing and repairing more than black-hats.

    Yes, it took years for Heartbleed to surface. If Heartbleed (or a defect like it) is discovered by a code audit, that speaks to the superiority of open source over closed source. If the defect is instead found by fuzzing or binary analysis, it is much harder to repair, as users are at the mercy of the holder of the source. Build a matrix of Open/Closed Source vs. bug found in source vs. bug found by fuzzing/binary analysis.

    "Bug found in source" vs. Closed Source is not applicable, giving three elements: bug found in source vs. Open Source (where the bug can be repaired in the source by anyone), and bug found by fuzzing, where the bug is repaired in the source by anyone (Open Source) or only by the vendor (Closed Source).

    The question then is (as I said at the start): is it easier to find bugs by source inspection? Assume big threats will HAVE the source anyway. If a bug were easy to find by inspection, it would be easy to fix (for example: OpenBSD continuously audits, and security has been a priority at Microsoft for the past decade). Fuzzing and binary analysis are still the preferred (quickest) methods, giving the edge to Open Source. The reason is simple -- the black-hat cares about what is actually happening, not what the source says is happening.
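    What fuzzers do can be sketched in a few lines of C. The record format and parser below are made up for illustration; the point is that random junk plus a simple oracle trips an off-by-one edge case without anyone reading the source.

    ```c
    #include <assert.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* Hypothetical record parser with a hidden edge-case bug: the first
     * byte declares the payload size, but the check accepts a declared
     * size one byte past the bytes actually present (off-by-one). */
    static int parse_record(const unsigned char *buf, size_t buf_len)
    {
        if (buf_len < 1)
            return -1;
        size_t declared = buf[0];
        if (declared > buf_len)          /* BUG: should be buf_len - 1 */
            return -1;                   /* reject oversized records */
        return (int)declared;            /* "reads" declared payload bytes */
    }

    int main(void)
    {
        srand(12345);
        unsigned char buf[16];
        int bug_found = 0;

        /* The fuzzer knows nothing about the format: random sizes, random bytes. */
        for (int i = 0; i < 100000; i++) {
            size_t len = 1 + (size_t)(rand() % 16);
            for (size_t j = 0; j < len; j++)
                buf[j] = (unsigned char)(rand() % 256);

            int declared = parse_record(buf, len);
            /* Oracle: an accepted record must fit in the actual payload. */
            if (declared >= 0 && (size_t)declared > len - 1)
                bug_found = 1;
        }
        assert(bug_found);               /* random junk trips the off-by-one */
        puts("bug found by fuzzing");
        return 0;
    }
    ```

    Note what the fuzzer never needed: the source, the documentation, or any idea what the program is for. It only needed junk input and a way to notice when something broke.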

  • by Joe U ( 443617 ) on Tuesday April 15, 2014 @06:25PM (#46761785) Homepage Journal

    One word for you: Microsoft.

    2003 called, they want your Microsoft back. The Microsoft of 2014 has a better security record than almost every other vendor in the consumer field.

    I would worry more about Flash, Java, Firefox and Android.

  • by Thiarna ( 111890 ) on Tuesday April 15, 2014 @07:03PM (#46762099)

    I had to dig for direct connections between Codenomicon and Microsoft, but the chairman of the board seems a fairly strong link. The way Codenomicon has behaved in this has seemed reckless; I've never seen a bug so heavily marketed. The stats floating around initially seem way off the mark: to begin with, quotes were of 66% of web servers being affected, later revised to 17% running affected versions. Both numbers look too round to be anything other than made up.

  • by Anonymous Brave Guy ( 457657 ) on Tuesday April 15, 2014 @07:14PM (#46762181)

    Raymond's proposition is theoretically sound

    No, it isn't. It's nonsense and it always has been.

    There is plenty of evidence for the effectiveness of good code reviews, but most of it shows rapidly diminishing returns with the number of reviewers. You get much of the benefit from having even one or two additional people read over something. By the time you've had more than four or five people take a look, the difference in effectiveness from adding more barely even registers, unless one of the additional reviewers has some sort of unique perspective or expertise that makes them not like the others.

    Given that almost every major FOSS system software project has had its share of security bugs, there is really very little evidence to support Raymond's claim at all. It's not like it has ever been taken seriously outside the FOSS fan club, but there are a lot of FOSS fans on Slashdot, and so plenty of comments (and positive moderations) reinforce the groupthink as though it's some inherent truth.

  • by styrotech ( 136124 ) on Tuesday April 15, 2014 @08:03PM (#46762509)

    I think the grandparent was right. MS now is hugely better than the MS of 10-15 years ago. I'm not going to try and objectively prove that as I don't care enough about MS and probably couldn't anyway.

    But the NT4 to XP/2003 era was appalling security-wise; they changed that, though. IIS went from Swiss cheese to one of the tougher web servers to break. You just don't hear any more about the kinds of problems they used to have. If you endured those days, or just laughed from the sidelines, you don't need hard data to see that they have improved a lot.

    I found this paper from Theo de Raadt illuminating, though. He steps through 10+ years of OS hardening techniques OpenBSD has put in place to prevent badly written applications from misbehaving. Towards the end he summarises how other platforms do this stuff -- the only other platform that did it all by default was Windows (yikes!).

  • by Bugler412 ( 2610815 ) on Tuesday April 15, 2014 @10:13PM (#46763385)
    I think that it's really not about open or closed source. It's about monoculture; the whole net would be more resilient if we didn't do that. So many warned about that issue with desktops and laptops running Windows, and that risk is still there and real, but while worrying about it we went and built the same monoculture on servers anyway, just in a non-OS-specific way.
