Every Browser Hacked At Pwn2Own 2015, HP Pays Out $557,500 In Awards

darthcamaro writes: Every year, browser vendors patch their browsers ahead of the annual HP Pwn2Own browser-hacking competition in a bid to prevent exploitation. The sad truth is that it's never enough. This year, security researchers were able to exploit fully patched versions of Mozilla Firefox, Google Chrome, Microsoft Internet Explorer 11 and Apple Safari in record time. For their efforts, HP awarded researchers $557,500. Is it reasonable to expect browser makers to hold their own in an arms race against exploits? "Every year, we run the competition, the browsers get stronger, but attackers react to changes in defenses by taking different, and sometimes unexpected, approaches," said Brian Gorenc, manager of vulnerability research for HP Security Research.
  • Along with Firefox on the first day.
    • by Anonymous Coward on Friday March 20, 2015 @12:38PM (#49303073)

      IE Fell First...

      But then George Lucas decided to edit it?

    • Re:IE Fell first. (Score:4, Insightful)

      by suutar ( 1860506 ) on Friday March 20, 2015 @12:41PM (#49303109)

      So, since the attackers came with prewritten exploits, that essentially means IE got tested first. And this means what?

    • by Anonymous Coward on Friday March 20, 2015 @12:42PM (#49303119)

      Shaka, When the Browsers Fell

  • by ron_ivi ( 607351 ) <sdotno@NOSpAM.cheapcomplexdevices.com> on Friday March 20, 2015 @11:30AM (#49302243)

    Is it reasonable to expect browser makers to hold their own in an arms race against exploits?

    The problem is that browsers are trying to become an OS - with all the complexities associated with one.

    If we went back to a world where HTML was mostly about content -- content that could be displayed in everything down to the Lynx browser -- browsers could be made secure.

    People wanted more, though -- so they decided to allow extensions like Java applets, Flash plug-ins, and ActiveX controls. Being far more complex, those were unsurprisingly insecure.

    So now people decide to take all the complexity and insecurity and build it directly into the browser itself?!? WTF.

    Makes me miss gopher clients. Maybe we should go back.

    TL;DR: JavaScript+HTML5 is the new Java applet + Flash Player + ActiveX control.

    • by dave420 ( 699308 ) on Friday March 20, 2015 @11:36AM (#49302309)
      There's nothing stopping you from going back. The rest of us can still use the vastly more functional modern web applications to get stuff done. Yes, there are security issues, but security issues exist regardless of whether they are in the browser or in other software. It's not as if we never had any computer security issues before Web 2.0...
      • There's nothing stopping you from going back.

        Actually, there is. You can't use any of the popular plug-ins on a lot of mobile devices. Chrome is so buggy that even the most basic functionality doesn't work with some of the plug-ins now. As a developer, trying to actually produce a good user experience using any of the formerly popular plug-ins is futile with all the security warnings and all-but-invisible switches to override them in modern browsers.

        And yet, after all their bitching about how insecurity is Java's fault or Flash's fault or whatever, it

        • by ThePhilips ( 752041 ) on Friday March 20, 2015 @12:30PM (#49302981) Homepage Journal

          [...] they are also trying to write secure software in unsuitable programming languages like C++.

          Right. So tell me, what "suitable" language would allow the browser to parse 200-500K of minified JS code in under 0.5 seconds? (200K = jQuery plus a few plug-ins; 500K = jQuery plus lots of plug-ins.) Anyway, browsers already resort to optimizations in assembler, because even C++ is not fast enough for what the web has become.

          So now we can't use tried and tested plug-in technologies to actually make stuff, and we all have to use HTML5+JS instead, even though in some areas they are still far inferior to what we had before with Flash or Silverlight or Java applets.

          Integration with 3rd parties is a bitch. That was and remains the main reason why plug-ins suck.

          Portability is another big reason. Windows, iOS and Android do things in starkly different ways, making portable plug-ins even harder.

          The problem is not plug-ins per se. The problem is that Google steers development of the Web toward its own goal, which is to make OSs obsolete. That short-sighted strategy has resulted in overbloated browsers, with all the consequences for security. Worse, they keep "optimizing" the browsers instead of, e.g., integrating jQuery and the like right into the browser, to avoid reloading the same code every time a user clicks a link.

          • So tell me, what "suitable" language would allow the browser to parse 200-500K of minified JS code in under 0.5 second?

            It's not as if I have a handy JS engine implemented in every safer language to benchmark, but there are plenty of them out there that compile down to speeds close enough to C that the difference is mostly academic. The trouble is, every one of them is currently in the range of "obscure" to "extremely obscure" and lacks the surrounding ecosystem to be a viable alternative today.

            This is a big general problem with the software industry right now. There is so much momentum behind the C and C++ ecosystem that cr

            • So tell me, what "suitable" language would allow the browser to parse 200-500K of minified JS code in under 0.5 second?

              It's not as if I have a handy JS engine implemented in every safer language to benchmark, but there are plenty of them out there that compile down to speeds close enough to C that the difference is mostly academic.

              Benchmarks in studio.

              All comparisons I have seen so far were about executing JS code. But data-parsing performance still largely sucks in all managed languages. And the modern web is overloaded with a ridiculous amount of code. Parsing the JS nowadays takes more time than executing it, because execution can be optimized down to the bare necessary minimum, while one still has to parse the whole thing to know what to execute.

              The trouble is, every one of them is currently in the range of "obscure" to "extremely obscure" and lacks the surrounding ecosystem to be a viable alternative today.

              This is a highly specific task, really. And browsers have already literally excluded themselves from the rest of the software ecosystem. They come with their own network libraries, DNS libraries, security libraries, video/audio decoding libraries, GUI libraries and so on.

              • But data-parsing performance still largely sucks in all managed languages.

                Are we talking about parsing the JS, or work being done by JS code here? I'm certainly not suggesting rewriting the browsers themselves or major components like the JS engine in a managed language. There are plenty of ways to make much more security-friendly languages than C++ that still compile to self-contained, native code without depending on a heavyweight VM.

                This is a highly specific task, really. And browsers have already literally excluded themselves from the rest of the software ecosystem. They come with their own network libraries, DNS libraries, security libraries, video/audio decoding libraries, GUI libraries and so on.

                I don't think it's as specific as you're suggesting. The same general balance between needing the control and speed vs. needing security and robus

                • I see where you're coming from, and actually I do not disagree with you.

                  Basically we just have different priorities.

                  Firefox has been hanging more for us in recent months than it has for years. This appears to be due to a couple of popular add-ons we use rather than Firefox itself, but the fact that a failing add-on can take out the entire Firefox process is itself a damning indictment of Firefox's basic process isolation and security model, which is still fundamentally flawed many years after every other major browser dealt with this issue.

                  Add-ons have literally unlimited access to the Firefox innards. That's by design. That's why Firefox add-ons are actually useful, compared for example to their castrated and harmless Chrome counterparts.

                  If you want to blame something, blame Firefox's rolling-release strategy. It's basically a cat-and-mouse game: the browser changes, add-on authors have to change their add-ons, and by the time they are fi

                  • Yes, it seems we do generally agree about this.

                    I have criticised the rapid release cycle of Chrome and now Firefox many times myself for the instability and crazy amount of regressions it seems to bring with it (though expressing this opinion seems almost guaranteed to get you down-modded/voted on almost any web development forum on-line) and I'm disappointed that Microsoft is reportedly going to move in the same direction with its new browser project.

                • In the old days my browser would crash from java or flash at least once a week, but I can't even remember the last time Chrome or Firefox crashed now. Chrome stays open with dozens of tabs 24/7, the only reason I ever restart it is to get security updates, probably been a year since a crash. If you're getting regular crashes in everything but IE, there's something wrong.

                  • If you're getting regular crashes in everything but IE, there's something wrong.

                    I'm a professional web developer, I work on a wide range of projects, and in recent versions I've seen fatal errors with Firefox and Chrome several times per week on average. Chrome comes up with a "didn't shut down properly" message a lot of the time just from a session loading it, leaving a real-time screen on Google Analytics open for a while, and then closing it down again!

                    It may be that we are using different definitions here. I'm including things like Firefox hanging (requiring the process tree to be

      • Re: (Score:3, Insightful)

        I really don't know what "vastly more functional modern web applications" even means. I get what AJAX and HTML4 added... and even there it seems like just a bit of an optimization over plain HTML. But I still have no clue what HTML5 added that is useful... other than built-in video/audio playback.

        As far as I can tell, the biggest users of the new technology are trackers/ads.

        And there is a lot stopping me from going back. Old, functional pages keep getting replaced with JS-ridden bullshit. Look, if

        • Canvas, local storage and a bunch of other stuff important to developers. Why do you think Flash and ActiveX are pretty much dead?
          • I know about some of the features, among other things canvas and local storage. I wasn't saying "what technical features"; I was saying "why do I, as a consumer, want this?". It's unclear to me what value Canvas will supply. Nor do I particularly want local storage from websites. One of the first things I did on new installations of Flash was turn off its local storage. Again, I see why developers^H^H^H^H^H^H^H^H advertisers want it. But I have no idea why I as a consumer would.

            To be honest, I have no

      • The thing is - everybody is responsible for their own security. We don't need to "go back" - we need to teach users how to be safe. I check my parents' computer whenever I come to see them. No toolbars, no malware, no viruses - because my brother and I took the time to teach them the basics of computer security (and mostly to click "no/cancel" if unsure).
        • This is totally wrong.

          If the system were built right, your parents would just use the thing and move on.

          It's not the users who need educating.

          It's the fucking coders.

          Why in Sam Hill can't a COMPUTER PROGRAM do whatever it is you do on your parents' computer????

    • I agree.

      The old UNIX paradigm of "less is more" and "small is beautiful" should be revisited for browsers.

      The integration today is too convoluted and oversteps too many borders.
      If the modularity could be made stricter and communication between the modules be open and clear, then we could have all the functionality we want, but with less vulnerability to the whole system.

      In OO language, we don't want any friends and we want to make sure that no data is exposed and all functions that provide functionality (get, set, do_something, whatever) are checked properly.

      • by Viol8 ( 599362 )

        "Firefox is open source right? If it has gotten out of control, why can't the good pieces be carried over to something better, and the old Firefox be shut down?"

        That's pretty much what happened with Netscape Navigator - and it became Firefox. Thing is, rewriting code may free up some cruft and get rid of some bugs and make the devs feel like they're doing something more productive than simply firefighting, but in general it simply replaces like for like - you simply get new cruft (after a few revisions) and ne

      • In OO language, we don't want any friends and we want to make sure that no data is exposed and all functions that provide functionality (get, set, do_something, whatever) are checked properly.

        Friends are irrelevant. In C and C++, you have the ability to set pointers to arbitrary values, cast them to whatever you want, and then use them to overwrite arbitrary memory. Friends matter for minimizing code complexity, but, as Stroustrup said, C++'s object model is intended to prevent accidents, not fraud. If you have evil code with access to an object, whether or not the code is friends with the object's class is entirely irrelevant.
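        To make that concrete, here is a minimal C++ sketch (illustrative names, nothing from any real codebase) of the point above: access control exists only at compile time, and a raw pointer plus a cast walks straight past it, friend or no friend.

        #include <cstring>
        #include <iostream>

        class Account {
            long balance = 0;             // private; no friend declarations anywhere
        public:
            long get() const { return balance; }
        };

        int main() {
            Account a;
            // Evil code holding a pointer to the object needs no accessor and no
            // friendship: just reinterpret the object's storage and overwrite the
            // private member directly.
            long forged = 1000000;
            std::memcpy(reinterpret_cast<char *>(&a), &forged, sizeof forged);
            std::cout << a.get() << "\n"; // prints 1000000
        }

        The access specifiers vanish after compilation; nothing at runtime stands between a stray (or hostile) write and the object's bytes.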

    • by jellomizer ( 103300 ) on Friday March 20, 2015 @11:45AM (#49302409)

      I wouldn't say a browser is trying to be an OS but more of an interpreted language compiler.
      But if you turn off those nostalgia blinders and recall the days of the old web: we needed to install a program for almost everything. If you needed an encyclopedia, you put in that Encarta CD. Every piece of software worked for a particular OS. We had some multi-platform software, but it required other software, and you needed to be lucky enough to have a version for your system as well. You needed ports open to share data with another system...

      This is why back in the 1990s nearly everyone had to use Windows: buying a Mac or using Linux put you at a disadvantage in available software. The advanced browser opened up your Linux box and your Mac to the world, and people really don't care much what freaking OS you are using, because the content renders nearly the same.

      • The thing is, that was all true with even relatively early browsers, because it's the uniform access to information that was the radical improvement on what we had before.

        Nothing about that necessarily means moving complex executable software to the browsers or making browsers a thin client for code running in the cloud is a similarly significant improvement. Plenty of us would argue that in many ways it has been a huge step backward, leading to dumbed-down software, security and privacy concerns, rent-seek

        • That is arguing that using web standards as a means of application deployment isn't the best method. I agree... However, it became the only practical one, organically.
          It offered a few key advantages.
          1. Everyone had a browser. Not everyone had any other thin-client protocol that worked with Windows, Mac, Linux, and Unix systems. So for a thin-client solution it was the only tool you had without additional downloads.

          2. The Web gives you your data, then disconnects. This design, while an issue in programming, does

      • Of the days of the old web. We needed to install a program for almost everything, you needed an encyclopedia, then you put in that Encarta CD.

        You're going to tell me that flat content is the killer app that JS/HTML5 solves???

        Also, what's the disadvantage of software that I install vs. software I download and run? If it really were just security, we could just allow downloaded sandboxed apps... like phones do. Is it that I own it and cannot be forced to pay (in dollars or privacy) for continued access? T

    • The problem is that browsers are trying to become an OS

      No, the problem is executing downloaded content. Doing that directly in the OS would be even worse than doing it in a browser. At least the browser executes it in a sandbox. A defective sandbox is better than none at all.

      If you want security, then don't execute downloaded content. Disable Java, disable Flash, disable JavaScript. Only turn them on for sites that you trust.

    • by tlhIngan ( 30335 ) <slashdot@worf.ERDOSnet minus math_god> on Friday March 20, 2015 @12:07PM (#49302667)

      TL/DR: Javascript+HTML5 is the new Java applet + Flash Player + ActiveX control.

      But it's far better than before, because with Flash Player and ActiveX you were limited to waiting for a third party to fix the flaw. There was nothing the browser vendor or the user could do. JavaScript/HTML5? The browser vendor's at fault, and hell, it may even be possible to fix it yourself.

      JavaScript/HTML5 may be the new vulnerability, but it's a lot easier to fix the issue. If the vulnerability was in Flash Player or some random ActiveX object, you're stuck waiting for Adobe or another third party to make the fix. With JavaScript/HTML5, the browser vendor can fix it; if it's open source, you or the community can fix it.

      So yeah, there are vulnerabilities, but the resolution is far easier. It may even be simply switching browsers!

      • I seriously doubt they can ever be completely secure. That's the problem with running unknown code essentially at random. That's what JavaScript is these days: a full-blown programming language whose code your browser executes automatically when you visit a site hosting it. Everyone should be running a script blocker that lets them selectively whitelist trusted sites.

      • But it's far better than before. Because Flash Player and ActiveX you were limited to waiting for a third party to fix the flaw. There's nothing the browser vendor or the user could do.

        So you're saying that this is better because I'm waiting on Microsoft instead of Adobe?

        if it's open source, you or the community can fix it.

        There was an open-source Flash plugin

        the resolution is far easier. It may even be simply switching browsers!

        Which brings different holes.

        Look, the great thing about Flash was yo

    • Makes me miss gopher clients. Maybe we should go back.

      gopher://gopher.floodgap.com/1 [floodgap.com]

      That's an actual gopher link, you'll need something like lynx or the OverbiteFF extension to use it.

    • by Creepy ( 93888 )

      Except vanilla HTML5/JavaScript won't let you touch the filesystem other than to load files (you can with extensions or by using some other method like PHP). That both makes it difficult to design an exploit and creates a safety sandbox for the program itself. Flash is essentially an OS, so exploiting it makes exploiting the machine much easier. I've been hacked so many times through PHP vulnerabilities that I've stopped using it and use my own hand-coded CGI calls for file access.

      Speaking of CGI, CGI's been around since

    • You'll be happy to hear Chrome is killing insecure plugin support. It's already deprecated, but come September, only sandboxed plugins will be allowed.
    • by Bengie ( 1121981 )
      We want interactive content that can do perceptibly real-time processing. Static content is for books.
    • /Oblg.

      The birth and death of Javascript
      https://www.destroyallsoftware... [destroyallsoftware.com]

  • by Viol8 ( 599362 ) on Friday March 20, 2015 @11:31AM (#49302249) Homepage

    ... getting their code airtight and less time constantly fucking about with GUI and javascript interpreter - sorry, "engine" - changes, perhaps these exploits could become less of an issue.

    • Your post displays an astonishing level of both confidence and ignorance. Find me a piece of software half as complex as a browser (which has the unenviable task of running arbitrary code from untrusted sources in a secure manner) that doesn't have any CVEs, and I'd happily retract my statement.

      • half as complex as a browser (which has the unenviable task of running arbitrary code from untrusted sources in a secure manner)

        What if we redefined the browser's goal from "run arbitrary code" to "show static pages that people uploaded"? And then maybe add some small subset of the interactivity we have now. You know, as though the primary things people did all day weren't easy to redo. Even Twitter and Facebook could be rewritten to be mostly static pages pretty easily.

        Now, it would push the "run arbit

        • by Richard_at_work ( 517087 ) on Friday March 20, 2015 @12:56PM (#49303297)

          Most people don't want shitty static pages; they want the application experience. Which is why we have the heavy browsers we have today.

          • I'm pretty confident most people would be happier with static pages, whether they know it or not. The only exception I can think of is video and audio, which could still be done easily enough without building massive pages of shitty JavaScript. I have used NoScript for years, and it is amazing how improved most sites are when you block the scripts from their two dozen partner sites.

            • by marsu_k ( 701360 )

              Right. If I'm on a mobile device with data caps (which I don't have here, and they're an abomination, but that's another discussion), I sure as hell would be much more happy to reload an entire page (yes, some of the content such as CSS and images can be cached, but the HTML markup can't) instead of loading a little bit of JSON that would be rendered as the new data I wish to retrieve. Static HTML is obviously so much better.

              (I'm not saying blocking third-party scripts and cookies is a bad thing, I do that

  • I built a better sword,
    they built a better shield.
    I got a longbow,
    they got armor.
    I fortified a wall,
    they built a ballista.

    • With every improvement to defensive equipment/strategy you reduce the number of attacks you can receive.

      It's the same with software. With every improvement you reduce the threat level. It just means a higher level of expertise is required, and that means more time before the next break-in. I remember in the '90s when everybody and their uncle could easily break into a web server using Telnet. You didn't even have to cover your tracks that well, because many of the devices didn't log ins and outs, and if they did, the lo

  • A security researcher identified by HP only as ilxu1a delivered the first exploit of the day with an out-of-bounds memory vulnerability in Firefox that took less than one second to execute. For his efforts, ilxu1a was awarded $15,000.

    To successfully exploit such a vulnerability (other than simply crashing the browser), an attacker needs to craft the attack to place just the right content into memory.

    By building the browser yourself (with CFLAGS, CXXFLAGS and even CC and CXX set to something unusual — such as to target only your specific -march) — rather than downloading prebuilt binaries — you make the attacker's job much harder. To successfully exploit your browser, he'll now need to make a custom exploit just for you.

    And, if you include -fstack-protector or equivalent [drdobbs.com] among your compiler flags, you may even be able to make such attacks impossible for good.
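    For the curious, this is the bug class those flags target -- a minimal C++ sketch (hypothetical names, not from any browser's code) of an unchecked copy into a stack buffer:

    // Build with the hardening flags described above, e.g.:
    //   c++ -O2 -fstack-protector -march=native overflow.cpp -o overflow
    #include <cstdio>
    #include <cstring>

    // Copies attacker-controlled input into a fixed-size stack buffer
    // with no bounds check -- a textbook out-of-bounds write.
    static void parse_record(const char *input) {
        char buf[16];
        strcpy(buf, input);          // overflows buf past 15 characters
        printf("parsed: %s\n", buf);
    }

    int main(int argc, char **argv) {
        if (argc > 1)
            parse_record(argv[1]);   // e.g. a 100-byte argument
        return 0;
    }

    With the stack protector enabled, the clobbered canary is detected when parse_record returns, and the process aborts ("stack smashing detected") instead of handing control to a corrupted return address. Strictly speaking that turns exploitation into a crash rather than making it impossible, but it raises the bar considerably.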

    • by rudy_wayne ( 414635 ) on Friday March 20, 2015 @12:01PM (#49302601)

      By building the browser yourself (with CFLAGS, CXXFLAGS and even CC and CXX set to something unusual — such as to target only your specific -march) — rather than downloading prebuilt binaries — you make the attacker's job much harder. To successfully exploit your browser, he'll now need to make a custom exploit just for you.

      And, if you include -fstack-protector or equivalent [drdobbs.com] among your compiler flags, you may even be able to make such attacks impossible for good.

      Technically, this is correct.

      However, I've tried to make my own custom builds of Firefox and it's a nightmare. The build process used by Firefox is so complicated and convoluted, it would make Rube Goldberg laugh. I haven't tried building Chrome, but reading the build instructions, it appears to be only marginally better.

      • However, I've tried to make my own custom builds of Firefox and it's a nightmare. The build process used by Firefox is so complicated and convoluted, it would make Rube Goldberg laugh.

        I once did a Firefox build on a PS2; I had the libs, so I could enable a ton of features that the community builds didn't. The process was so horrible I only did it once... and then settled for the community builds, even if they lacked gtk2 support (and other things).

        I figured out that it was easier to IGNORE most of the build instructions and just do a standard traditional ./configure, because setting up a frickin MOZCONFIG that worked was the devil's work.

      • by mi ( 197448 )

        However, I've tried to make my own custom builds of Firefox and it's a nightmare.

        Not if you use FreeBSD:

        make -C /usr/ports/www/firefox install clean

  • I want to know how vulnerable browsers are, not whether they are. Always assume what you are using is vulnerable; if you feel completely safe with your software, then you are the one most likely to get hacked. But I want to know the level of effort it will take to perform such exploits. Is it some interestingly coded HTML/XML/JavaScript where you can drop the files on any web server and perform the exploit? Or perhaps it is in the HTTP protocol, where you need to write a server-side application to perform the HTTP call

  • by Pinky's Brain ( 1158667 ) on Friday March 20, 2015 @11:38AM (#49302337)

    Are the majority of exploits due to bugs which would be trivially detected at compile time, let alone runtime, in a modern language, as usual?

    • by Lunix Nutcase ( 1092239 ) on Friday March 20, 2015 @11:46AM (#49302423)

      What a joke. "Modern" languages allow all sorts of security exploits through. Such as this [pcworld.com] hilarious one involving Ruby on Rails.

      • I agree ... implicit typing and functional programming introduce new failure modes. In fact I would argue they're both mistakes; no one is good enough to avoid being frequently caught out by implicit typing, and only superstar programmers can internalize the compiler enough to have an idea of execution flows with functional programming.

        Presenting an implicitly typed multiparadigm language as the alternative is strawmanning though.

        • Presenting an implicitly typed multiparadigm language as the alternative is strawmanning though.

          So you want examples for programs written in Python, C#, Java, etc? Or are you going to keep moving the goalposts?

          • What goalpost? I asked how many of the exploits could be blamed on C (language features). How many exploits can be blamed on another language (feature) is an interesting discussion ... but it's entirely orthogonal.

  • by Nermal ( 7573 ) on Friday March 20, 2015 @11:41AM (#49302369) Homepage

    The article doesn't provide many details on what these exploits actually were, but in case anyone else is curious like I was, they appear to be published on the ZDI site:

    Broad strokes for new discoveries [zerodayinitiative.com]

    Details for older exploits [zerodayinitiative.com]

  • by SirBitBucket ( 1292924 ) on Friday March 20, 2015 @11:48AM (#49302445)
    Curious how much NoScript would mitigate the Firefox vulnerabilities. I find the mild annoyance of having to enable scripting occasionally is well worth it.
    • Re: (Score:3, Insightful)

      by Anonymous Coward

      Nearly all of them likely.

      99.9% of exploits are delivered via JS as a sort of obfuscation mechanism. Otherwise said exploits would immediately be caught by simple heuristic facilities and input sanitization.

      Honestly, that's the real problem with Web security. We've got a system where it's seen as acceptable to run untrusted, unsigned code from unknown or untrusted sources, requested by websites with similarly deficient credentials.

      Really, just point your browser at any modern website and you'll be loa

      • This is true; I was amazed at how much crap Ghostery caught. At first I thought that a common news website would have ads from two or three domains and a couple of analytics scripts from third parties; it actually filters around 50 different things from third parties, so the chance of any single one of them being compromised is pretty big...

  • The best way to deal with the situation is to run browsers under a hardened type-1 hypervisor that has a tiny attack surface itself. Create an 'untrusted' domain and tool around the Internet to your heart's content, or use disposable VMs that appear for risky temporary tasks [invisiblethings.org] and then self-delete.

    If we want this rich content in our lives we have to accept the complexity and the risk to some degree. Using an OS built on security by isolation allows us all that complexity, but behind very strong, simple securi

  • A few hours ago Mozilla released Firefox 36.0.3 for Windows, OS X, Linux, and Android to patch the vulnerabilities revealed at the Pwn2Own contest.
