Firefox 4's JavaScript Now Faster Than Chrome's

An anonymous reader writes "Firefox 4's JavaScript engine is now faster than V8 (used in Chrome) and Nitro (used in Safari) in the SunSpider benchmark on x86. On Mozilla's test system Nitro completes the benchmark in 369.7 milliseconds, V8 in 356.5 milliseconds, and Firefox 4's TraceMonkey and JaegerMonkey combination in 350.3 milliseconds. Conceivably Tech has a brief rundown of some benchmark figures from their test system obtained with the latest JS preview build of Firefox 4: 'Our AMD Phenom X6-based Dell XPS 7100 PC completed the Sunspider test with the latest Firefox JS (4.0 b8-pre) build in 478.6 ms this morning, while Chrome 8.0.560.0 clocked in at 589.8 ms.' On x86-64 Nitro still has the lead over V8 and TraceMonkey+JaegerMonkey in the SunSpider benchmark."
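For context: a SunSpider-style score is just the wall-clock time for a batch of small JavaScript kernels. A minimal sketch of the idea -- not SunSpider's actual harness, and the workload below is made up:

    function workload() {
        // Made-up math-heavy kernel standing in for one SunSpider test case.
        var sum = 0;
        for (var i = 1; i < 500000; i++) {
            sum += Math.sin(i) / i;
        }
        return sum;
    }

    var start = new Date().getTime();
    workload();
    var elapsed = new Date().getTime() - start;
    // Engines are then ranked on total elapsed milliseconds across all kernels.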
  • FF4 (Score:5, Funny)

    by Anonymous Coward on Sunday October 24, 2010 @07:30AM (#34003162)

    FF4 crashes when I try to open Gmail since the change. This makes it slower for opening my mail.

    1. connect to gmail with FF4
    2. FF4 crashes.
    3. Open chrome and go to gmail
    4. ??? (train monkeys to joust)
    5. Profit

    • by yuriyg ( 926419 )
      A beta version of software crashes? Surely you must be joking!
  • by Joce640k ( 829181 ) on Sunday October 24, 2010 @07:35AM (#34003194) Homepage

    I'll be able to do one more mouse click every three weeks or so.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Having good JavaScript runtimes will help the web go to the next level. This is useful for the gaming industry to tap into the pool of non-gamers and casual gamers; see, e.g., this port of Quake that runs in JavaScript as a proof of concept:

          http://code.google.com/p/quake2-gwt-port/

      But this can also be useful outside of games: applications such as Google Street View and Google Earth could soon be embedded in regular web apps without the need for a Flash plugin, for instance.

      • by Anonymous Coward on Sunday October 24, 2010 @08:33AM (#34003488)

        I'm not looking forward to 'the next level' of the web. It will only have more dancing and blinking crap on the page.

        Want to make your site fast? You don't need Ajax, Flash, or any other "Hype du Jour". Toss it all out, stick with plain old HTML, and make it look decent with simple CSS. Wham, your site is now an order of magnitude faster. You don't need those five load balancers and those twenty application servers just to serve up a page that could easily run on one server if you actually had a clue.

        The Web is rapidly going the way of television: once it was about content, then ads came 'to pay for the content', and now it is all ads with the absolute minimum of content. Spreading a two-paragraph article over eight pages just to get more ad impressions. Six pictures that just have to be in a slide show. Ads. Profit. Bottom line.

        Get me a bucket, I'm going to hurl...
         

        • by Jahava ( 946858 ) on Sunday October 24, 2010 @08:58AM (#34003636)

          I'm not looking forward to 'the next level' of the web. It will only have more dancing and blinking crap on the page.

          Want to make your site fast? You don't need Ajax, Flash, or any other "Hype du Jour". Toss it all out, stick with plain old HTML, and make it look decent with simple CSS. Wham, your site is now an order of magnitude faster. You don't need those five load balancers and those twenty application servers just to serve up a page that could easily run on one server if you actually had a clue.

          Want to view content? I agree with your theme in that case, and there are plenty of sites out there that are designed around just that: simple presentation-focused static content display.

          However, most of the impetus for "Web 2.0" has not been around content viewing, but rather about utilizing the web browser as an effective, cross-platform thin client for applications. Now, granted, some sites are (ab)using AJAX and whatnot for purposes ranging from nefarious to just annoying, and there is some spillover from the dynamic application-based web pages into the static information-based ones, but it's generally kept in balance by the ease with which people can transition to a competing website if yours is too annoying.

          Recent advancements in Javascript execution speed are oriented towards polishing the thin client experience and capabilities. If fast Javascript execution becomes ubiquitous, sites can design much more successful thin clients because they can take that execution speed for granted. It's not all just flashing lights and annoying ads: take a look at the stunning Deluge BitTorrent Client's Web UI [deluge-torrent.org] to see how nicely "Web 2.0" can be used.

          • by sznupi ( 719324 )

            Now, granted, some sites are (ab)using AJAX and whatnot for purposes ranging from nefarious to just annoying ... it's generally kept in balance by the ease with which people can transition to a competing website if yours is too annoying

            So what keeps you here?

            • Maybe he enabled the "Slashdot Classic Discussion System" like I did? It's right there in the preferences.

              • by sznupi ( 719324 )

                I know that; I was using it until quite recently. The fact that I now think the AJAX one is better doesn't excuse its abuses (and, you know, some parts of the site simply don't work without JS anymore).

        • by Jorl17 ( 1716772 )
          The Bright Minds Of The World have to get away from this crap and create a new web. But this time, do it right -- don't let ignorant people get to it.
        • Cool, the luddites are finally on board with CSS!

        • Tired of HTTP? Let's move on to Gopher...

        • by Wraithlyn ( 133796 ) on Sunday October 24, 2010 @01:09PM (#34005258)

          1) Nobody uses load balancers and multiple app servers because they're serving "dancing and blinking crap" (Flash- and JS-heavy) sites. Those things slow down people's browsers, not the servers. Heavy server resources are needed when you need lots of server-side processing, which generally comes from delivering customized pages to every user (i.e., you can't just cache everything). Furthermore, using AJAX helps REDUCE server load by requesting only snippets of content instead of complete pages (see the sketch at the end of this comment); do you think GMail would be faster and less server-intensive if every click required a full page response? How about Google Maps?

          2) You seem to be under the impression that developers actually design sites. Maybe in some tiny one-man-show setup, but in the real world a UX/IA specialist designs the user experience, a designer does the visuals, the client signs off on it, and then the developer makes it all happen using whatever tools and techniques are necessary. They don't have the option to "toss it all out" and make it as simple as they like, much as they'd like to. Heck, I have to fight to make sure there are accessible fallback versions of the fancy JS-enhanced UIs everyone designs these days. "Throw it all out"? You live in a dream world, buddy, or you do work of extremely limited scope.

          In short, your post is just "get off my lawn!". More and more clients demand rich user experiences, and this will continue to grow. Welcome to 2010.
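          To make the AJAX point concrete, here is a minimal sketch of a partial-content request; the endpoint and element ID are hypothetical, and this is illustrative rather than GMail's actual code:

              // Fetch only the new messages and patch them into the page,
              // instead of asking the server to re-render the whole thing.
              var xhr = new XMLHttpRequest();
              xhr.open("GET", "/inbox/new-messages", true); // hypothetical endpoint
              xhr.onreadystatechange = function () {
                  if (xhr.readyState === 4 && xhr.status === 200) {
                      document.getElementById("message-list").innerHTML = xhr.responseText;
                  }
              };
              xhr.send(null);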

      • Faster JavaScript is also good when you visit websites that were written by your redneck third cousin who is also your wife (i.e., none too bright):

        For example, NBC.com's feedback site made my Firefox freeze for about 10 seconds when I visited it yesterday. A 6 ms faster response on a "good" JavaScript site might translate to a full second faster on a poorly-coded site, i.e., waiting 9 seconds to "unfreeze" NBC.com instead of 10 seconds. For me that would make it worth switching from Chrome to Firefox.

        • >>>worth switching from Chrome to Firefox

          Or SeaMonkey, which I've been trying these last few days. It has the same engine as FF4 but with far less bloat (~150,000 vs. ~300,000 kilobytes). Copied from another guy's post yesterday:

          Q: Why not use Firefox instead of Seamonkey?

          A: "Yes, Firefox web browser, Thunderbird email and news client, Sunbird Calendar, and NVU HTML editor are useful programs. The Mozilla/SeaMonkey suite, with all of this functionality, is about 11M compressed, whereas the separa

          • by sznupi ( 719324 )

            SeaMonkey definitely feels snappier than FF every time I try it, and is able to gracefully withstand much heavier browsing (why? Less reckless code in the UI / more caution with changes there / not having to accommodate extensions? The UI, which also runs via JS, isn't the speediest by design...); a bit funny, considering the stated goals at the start of the Phoenix effort.

          • Are you sure that still applies? Back in 2006 Mozilla launched XULRunner, which is shared between the applications, including SeaMonkey, and provides the base functionality (all the XPCOM libraries, if I'm not mistaken, are in XULRunner).

            Also, I use neither Thunderbird nor NVU, and Sunbird has been deprecated, rendering the "shared functionality" less useful.

        • by tenco ( 773732 )
          I would just turn off JS for that site. If it becomes unusable by doing so, well, good riddance.
      • by Anonymous Coward

        Quake II was released in 1997. That's 13 years ago. At the time of its release, Intel's top-end CPU was the Pentium II running at 233 MHz, and even that had only just been released. Most Quake II players were still using Pentium or high-end 486 systems.

        Today, a decade and a half later, we have cell phones that are many hundreds of times faster than those Pentium and Pentium II systems, and desktop systems that are thousands or tens of thousands of times more powerful. Yet with all that raw processing power,

        • Re: (Score:3, Interesting)

          by Bert64 ( 520050 )

          There is a general trend toward higher-level languages (which are bigger/slower); look at the current fascination with Ruby...

          The general trend has been that while hardware gets faster, software gets slower, so the overall user experience remains the same... If you want a laugh, install some really old software on new hardware: I ran Windows 3.0 on a P133 with 32 MB a few years ago, and it booted almost instantly compared to the 386/486 machines it was typically used on.

        • Re: (Score:3, Insightful)

          by moonbender ( 547943 )

          Hyperbole much? Cell phones aren't many hundreds of times faster than a Pentium or a P2, and desktops aren't thousands, much less tens of thousands of times faster. Hell, the clock is only about 10x faster (2GHz to 3GHz) -- where exactly do you think a 100- to 1000-fold increase in per-clock performance is coming from?

          For the record, the Quake 2 software renderer apparently does about 250 fps at 800x600 on today's top-of-the-line Intel CPUs. I still remember how Quake 1 was choppy in software mode even at pai

        • Re: (Score:3, Informative)

          Today, a decade and a half later, we have cell phones that are many hundreds of times faster than those Pentium and Pentium II systems,

          A hundred times faster than a 233 MHz processor? That's 23 GHz. What phone has a 23 GHz processor?

  • by Anonymous Coward on Sunday October 24, 2010 @07:40AM (#34003236)

    Why do I have a feeling that Slashcode's terrible AJAX interface is going to get even worse in the near future?

    This is quite possibly the lamest e-peen measuring contest ever.

    • by DarkOx ( 621550 ) on Sunday October 24, 2010 @08:58AM (#34003644) Journal

      Yeah, what is even more irritating is that they keep subjecting us to it. Even though the boxes stay checked, I have had to go to my user preferences and turn the new comment system on and then back off several times in the last two weeks. I hate it, and based on the comments I read here on Slashdot, just about everyone else hates it too. Some of them are just Luddites who want the Slashdot of 1997, period, but the rest of us hate it because it's an awful way to browse and read comments, awful (GET IT, TACO? AWFUL). So many other sites have gotten it right; if you feel the need to update the look and feel of Slashdot, go look at what others are doing!

    • I am glad I am not the only one who *HATES* Slashdot's cutesy Ajax stuff. Every time they make a change, I have to go search all over the place to find a way to TURN IT BACK OFF so I can use the logical, easy, fast, simple, efficient comment system of the past. UGH!

    • I think Slashdot's AJAX is pretty poor, certainly not in line with what some of the better AJAXy sites do. That said, and even though I also keep having to look for stuff, I really love being able to do things like in-line previewing and replying. I used to open the reply link in another tab, but that's a bad workaround. And changing the threshold without reloading the whole page is also nice, as is opening up "titles only" posts.

  • It's great that Firefox is working on certain areas of speed, but they always seem to do it in the wrong areas; or, more to the point, their browser is built on top of a slow, memory-leaking turd. I run a computer with an E2200 on Win7 at work. Firefox is sluggish; I've even tried the latest beta, and it's still slow. Chrome is somehow very fast, so none of these tests are that relevant to me. I haven't liked Firefox since version 2.
    • Re: (Score:2, Insightful)

      Given that a lot of the browser is implemented in JavaScript, faster JS should also make the browser itself faster.

      • Re: (Score:3, Informative)

        by TheRaven64 ( 641858 )

        Note the times for this article, however. The benchmarks that they are running are taking less than half a second from loading the JavaScript code to finishing running. That's a fairly good test for typical web pages, but it's a pretty pointless benchmark. Any script that does that little will run at an adequate speed with a fairly naive bytecode interpreter.

        The things that really benefit from JavaScript speed are long-running scripts. Consider something like a Flash game that runs for minutes at a t
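        To illustrate the contrast, here is a hedged sketch of the kind of long-running script that actually exercises a JIT -- a made-up per-frame update, not taken from any real game:

            // 5000 particles updated ~60 times per second, indefinitely. Run
            // for minutes at a time, this loop is where compiled JS pays off --
            // not in a benchmark that finishes in well under a second.
            var particles = [];
            for (var i = 0; i < 5000; i++) {
                particles.push({ x: Math.random(), y: Math.random(), vx: 0.01, vy: 0.02 });
            }

            function step() {
                for (var i = 0; i < particles.length; i++) {
                    var p = particles[i];
                    p.x += p.vx;
                    p.y += p.vy;
                    if (p.x < 0 || p.x > 1) p.vx = -p.vx;
                    if (p.y < 0 || p.y > 1) p.vy = -p.vy;
                }
            }

            setInterval(step, 16); // roughly 60 updates per second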

  • Benchmarks (Score:4, Insightful)

    by markdavis ( 642305 ) on Sunday October 24, 2010 @07:57AM (#34003322)

    I am sure this will set off a whole series of arguments over benchmarks, tuning, fairness, etc. But from this article I will just take this: I don't care which one is fastest to the few dozen milliseconds, they are probably all in the same "class" now. Everybody wins. (I can sorta understand not including IE, but wonder why they didn't include Opera?)

    Now that Javascript is so much faster, perhaps the browsers can focus on giving some type of automated/intelligent control over when it is used, and how, so older machines won't come to a CRAWL because of all the cutesy animation and junk spread over most big sites now. (And no, NoScript doesn't cut it -- too complicated for most users, not automatic, too easy to break Javascript that is actually needed, etc.) Suppress time-delayed actions, disable tight loops, throw artificial delays into loops under user control, visually tag elements to manually "play" on demand only or stop after X seconds. I know, keep dreaming.
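    One piece of that wishlist can at least be sketched: a user script could clamp timer-driven animation by wrapping setInterval. Purely illustrative -- the minimum delay and the wrapping approach are assumptions, not an existing browser feature:

        // Enforce a user-chosen floor on timer delays so tight animation
        // loops can't peg the CPU. MIN_DELAY is an arbitrary example value.
        var realSetInterval = window.setInterval;
        var MIN_DELAY = 250; // milliseconds

        window.setInterval = function (handler, delay) {
            var clamped = Math.max(delay || 0, MIN_DELAY);
            return realSetInterval.call(window, handler, clamped);
        };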

    • I am sure this will set off a whole series of arguments over benchmarks, tuning, fairness, etc. But from this article I will just take this: I don't care which one is fastest to the few dozen milliseconds, they are probably all in the same "class" now. Everybody wins.

      I agree. Even IE9 should be good enough by now not to make JS much of a bottleneck when loading web pages, even pretty JS-heavy ones.

      Also, as for tuning, it's no secret that Mozilla has tuned Firefox 4 specifically to win SunSpider and V8.

    • That's solved by Flashblock (or, even better, uninstalling Flash completely if you can: I only use Flash for YouTube, and starting with Firefox 4 even that won't be necessary, thanks to native WebM video) and, most importantly, Adblock Plus with a subscription to EasyList. Your browser will suddenly become much faster.
      • Re: (Score:3, Insightful)

        by markdavis ( 642305 )

        Flashblock only stops Flash, not Javascript animation. First it was animated GIF, then Flash, and now it is Javascript animation. Animated GIF and Flash are both easy to control. But Javascript animation is a whole different story. And although Adblock helps, a lot of the stuff is not ads.

        Web site designers don't seem to give a damn how much horsepower their sites need or use. It is apparent when you try to browse the web using an older machine or a smartphone. And on a portable device, all that extra

    • Seconded! NoScript breaks too many pages, and it takes too much fiddling to figure out which scripts to allow. I'm constantly faced with broken pages that are importing JavaScript from a dozen different places, and it's nigh impossible to figure out which ones to allow. Right now FlashBlock plus a custom AdBlock blacklist does a good job of stopping 99% of annoying ads, but as things move from Flash to HTML5, I'm sure that will change.

    • Re: (Score:3, Interesting)

      by asa ( 33102 )

      I am sure this will set off a whole series of arguments over benchmarks, tuning, fairness, etc. But from this article I will just take this: I don't care which one is fastest to the few dozen milliseconds, they are probably all in the same "class" now. Everybody wins. (I can sorta understand not including IE, but wonder why they didn't include Opera?)

      Now that Javascript is so much faster, ....

      SunSpider is a particularly bad benchmark. It was developed before any of the browsers had JITs. V8 is a bit better but still doesn't really stress the browser with the kinds of tasks that are still slow in JS. Mozilla's Kraken benchmark, http://krakenbenchmark.com/ [krakenbenchmark.com] does that. If you want to see why we still have a long way to go, compare the speeds all browsers get on the Kraken tests with compiled-code implementations of those features. Firefox leads the way on much of that, but it's still slow compared

  • ...is NoScript. They have brought my JavaScript load times down to 0.00 seconds. Thanks, NoScript.
  • by DontLickJesus ( 1141027 ) on Sunday October 24, 2010 @08:05AM (#34003372) Homepage Journal
    Seeing that Firefox only a few weeks ago was starting to lag pretty severely behind Chrome, I applaud and thank the Firefox team for their hard work. This is also a boon for their technique, the so-called "shotgunning" method of pushing compilation through the old way if it will complete faster than the optimizations. I had become afraid I might have to move to Chrome; looks like that won't be necessary.

    As a developer I completely understand the dislike of the "everything in a browser" attitude, but we need to look beyond that. The next version of ECMAScript will give us the security we've been wanting, and this round of browsers will give us the speed we need. Enabling universal, secure process level interaction between machines is the goal. You can think of it as widgets, .Net, or whichever other poison you want, but Javascript is free of ownership and frankly a damn good language when written properly.

    Now give me a 100% on the Acid3 test, please; that way I'll have multiple tools to leverage against my boss next time he asks me to make a web app IE6-compatible.
    • Call your Microsoft rep and tell him about your boss; he'll take him golfing, and you'll be using IE9!

    • by n0-0p ( 325773 )

      Seeing that Firefox only a few weeks ago was starting to lag pretty severely behind Chrome, I applaud and thank the Firefox team for their hard work. This is also a boon for their technique, the so-called "shotgunning" method of pushing compilation through the old way if it will complete faster than the optimizations. I had become afraid I might have to move to Chrome; looks like that won't be necessary.

      You don't seem to understand how JaegerMonkey works, or what the SunSpider benchmark actually tests. The entire speedup here can be attributed to Firefox not compiling JS "the old way." Instead of defaulting to bytecode like they were previously, they always emit compiled instructions via Nitro's assembler. And given how the SunSpider benchmark works, all that is being tested is their parsing plus Nitro's assembly. The SunSpider benchmark doesn't even run long enough for Mozilla's tracing engine to be a sig

      • by BZ ( 40346 ) on Monday October 25, 2010 @01:01AM (#34009118)

        There are several things wrong here:

        1) Spidermonkey still compiles the AST to bytecode.
        2) An assembler does just that: assembles. This means a 1-1 mapping of some other sort of
                representation of the exact machine instructions you want into actual bits in memory.
                There are no smarts here and no optimization going on; the only question is how fast
                you generate those bits in memory; an ideal assembler does this as fast as possible
                and without using too much memory. Now you have to generate assembly (or whatever
                representation the assembler takes as input) for it to assemble. That's the job of
                the compiler. JaegerMonkey takes the bytecode generated in step 1 and compiles it,
                passing the output to the assembler borrowed from Nitro. This compilation step is
                where (some of) the optimization takes place, and this is not code shared
                with Nitro. (A toy sketch of this compiler/assembler split follows this comment.)
        3) Tracemonkey is most certainly useful for Sunspider; just not as useful as for other
                things. See, for example, http://arewefastyet.com/?machine=6 [arewefastyet.com] where the purple line is
                below the black one solely because of Tracemonkey. Alternately, see
                https://bug580468.bugzilla.mozilla.org/attachment.cgi?id=482609 [mozilla.org] where you can see the
                scores on each sunspider subtest as of a week or so ago; the -m column is JaegerMonkey
                without Tracemonkey, -j is Tracemonkey without Jaegermonkey, and -mjp is what's
                actually being used now (a combination of the two, with some smarts about deciding
                when to use which one).
        4) The goal of Kraken is in fact to include anticipated use cases. If you know
                anticipated use cases it doesn't include, I'm sure Rob Sayre would love to know what
                they are.
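        As a toy illustration of the compiler/assembler split in point 2 -- every opcode here is invented, and real JaegerMonkey is vastly more involved:

            // Bytecode -> compiler (decides WHICH instructions to emit; the
            // smarts live here) -> assembler (dumb 1-1 encoding into bits).
            var bytecode = ["PUSH 2", "PUSH 3", "ADD"];

            function compile(ops) {
                var asm = [];
                for (var i = 0; i < ops.length; i++) {
                    var parts = ops[i].split(" ");
                    if (parts[0] === "PUSH") asm.push({ op: 0x01, arg: Number(parts[1]) });
                    else if (parts[0] === "ADD") asm.push({ op: 0x02, arg: 0 });
                }
                return asm;
            }

            function assemble(asm) {
                var bytes = [];
                for (var i = 0; i < asm.length; i++) {
                    // In a real engine these bytes would be executable machine code.
                    bytes.push(asm[i].op, asm[i].arg);
                }
                return bytes;
            }

            assemble(compile(bytecode)); // [1, 2, 1, 3, 2, 0]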

  • by electrosoccertux ( 874415 ) on Sunday October 24, 2010 @08:37AM (#34003510)

    Tell me, Mr. Anderson, what good is JavaScript performance if you are unable to use multiple cores?

    I wish someone would get on this and make Firefox work with multiple cores better. As it is, I use the "|" character in my home page settings to open about 20 tabs -- forums, review sites, Slashdot, economics blogs, etc. -- and Firefox slows to a grinding halt for the roughly 15 seconds (just timed it) it takes to render all those pages.
    Chrome does it in about 4 seconds and pegs all 4 of my cores at 100%.

    Please Mozilla, I know this would require a serious redesign, but it's seriously needed. Hitching while scrolling up/down because a tab is loading in the background (I make use of middle click to open tabs in the background extensively) is very annoying.

    • by hedwards ( 940851 ) on Sunday October 24, 2010 @11:15AM (#34004532)
      They're working on it. It's just a matter of wanting to do it correctly rather than just doing it to say they've done it. Sort of like how they've resisted cheating on the Acid tests like some of the other browsers have been.

      Just about any moron can make a new browser window per tab and not have them talking to each other. But it takes a fair amount of work to get them connected enough for performance reasons without causing one tab to crash others.
    • >I wish someone would get on this and make firefox work with multiple cores better.

      And right after adding multithreading support (which is generally a good thing), PLEASE MAKE SURE to give the user the option to TURN IT OFF (or at least compile without it). In a thin-client environment, Firefox/Web 2.0 is already destroying the application servers. The only thing that limits it from completely ruining them is that the servers are multicore. Once Firefox can take over multiple cores, the game is over :

  • Interesting? (Score:4, Informative)

    by mseeger ( 40923 ) on Sunday October 24, 2010 @08:38AM (#34003526)

    I really don't see the point in a posting like this. It's all

            My _______ (1) is _______ (2) than yours

    with typical choices for (1):

    - car
    - wife / husband / significant other
    - d*ck
    - browser
    - javascript
    - OS

    and choices for (2) like:

    - faster
    - harder
    - more expensive
    - longer
    - more open
    - prettier

    Now that we have covered all these discussions, can we move on please?

    CU, Martin

    • Re: (Score:2, Funny)

      by Anonymous Coward

      You're just jealous that my wife is harder than yours.

  • Really, all this focus on faster Javascript puzzles me. JS, used correctly, should be a thin layer of glue, representing only a fraction of the total run time for a browser. The only real use I could begin to see would be if they could apply the same speed-ups to the Actionscript engine within Flash to improve the decoding of Hulu's encryption system - but since all the client sees is the bytecoded form of the decryption, not the AS source, and since this speedup is in the JS in the browser rather than the

    • Re: (Score:3, Insightful)

      by takowl ( 905807 )

      Really, all this focus on faster Javascript puzzles me. JS, used correctly, should be a thin layer of glue,

      That was the original idea of JS. It's already being used much more heavily in current web apps. But the main point of speeding it up isn't for today's websites, it's so that websites can do entirely new things without bringing the browser to a crawl. Think image processing, online mini-games, and no doubt hundreds of more imaginative uses.
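      As a concrete example of image processing in JS, here is a minimal grayscale filter over canvas pixels; the element ID is hypothetical:

          // Four array accesses per pixel over hundreds of thousands of
          // pixels: exactly the tight numeric loop that needs a fast engine.
          var canvas = document.getElementById("photo"); // hypothetical canvas
          var ctx = canvas.getContext("2d");
          var image = ctx.getImageData(0, 0, canvas.width, canvas.height);
          var data = image.data; // RGBA bytes

          for (var i = 0; i < data.length; i += 4) {
              var gray = 0.3 * data[i] + 0.59 * data[i + 1] + 0.11 * data[i + 2];
              data[i] = data[i + 1] = data[i + 2] = gray;
          }
          ctx.putImageData(image, 0, 0);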

      • by Hatta ( 162192 )

        Think image processing, online mini-games, and no doubt hundreds of more imaginative uses.

        All of which would be better done with a native app.

        • by takowl ( 905807 )

          The sensible answer: Web apps have various advantages over native apps (and of course, various disadvantages). For example, easy collaboration, and no installation or upgrade procedure for the user. There's even a security benefit: if a user wants to play some silly game, it's much safer to run that in a browser than it is for them to download and install something potentially dodgy. Native apps aren't dead, but web apps have a place too, and there's enough reason to expand what they can do.

          The other answer

          • by Hatta ( 162192 )

            For example, easy collaboration, and no installation or upgrade procedure for the user.

            There's no reason a native app should be any less capable of networking than a web app. Installation is trivial

            There's even a security benefit: if a user wants to play some silly game, it's much safer to run that in a browser than it is for them to download and install something potentially dodgy.

            There is no security benefit. The browser is not a sandbox. Putting all this capability into javascript increases the attack

    • by Anonymous Coward

      I know, I know, it's damn near impossible to believe, but the Firefox developers voluntarily chose to write a huge portion of Firefox in JavaScript and XML (XUL). The rendering engine and network stack are written in C++, but just about everything else is implemented using JavaScript and XUL, including all of the UI.

      This is why JavaScript performance is so important to Firefox. While other browsers didn't make the same mistake, and wrote the bulk of the browsers in a real language like C++, the Firefox deve

    • If all you want the browser for is to do the same kind of stuff that we were doing ten years ago, then Javascript "should be a thin layer of glue". And your modern browser should be seen as nothing more than a pimped up version of Lynx.

      That is not what I want. The combination of HTML, CSS, Javascript, and big pipes makes it possible to write applications that can run on multiple platforms and (more to the point) be updated across their entire user base by simply uploading the new version to your website.

  • Does Firefox support the new File API yet? The same as Chrome and Safari, not the older one which was Firefox-only.
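    For reference, the W3C File API in question can be feature-detected and used roughly like this -- a sketch only, with a hypothetical file input element:

        // Detect the newer File API, then read a user-chosen file as text.
        if (window.File && window.FileReader) {
            var input = document.getElementById("upload"); // hypothetical <input type="file">
            input.onchange = function () {
                var reader = new FileReader();
                reader.onload = function (e) {
                    alert(e.target.result.length + " characters read");
                };
                reader.readAsText(input.files[0]);
            };
        }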

  • I'm tired of looking at benchmark figures. I want blog articles on how these JIT engines work (differently) and WHY one or another is faster. Or at least why the FF engine is faster than it used to be.

  • by ttsiod ( 881575 ) on Sunday October 24, 2010 @10:51AM (#34004404) Homepage
    I, too, saw the speed of Firefox 4 in a pretty simple, math-only benchmark that rotated a 3D object [semantix.gr]. Run it for yourself and/or see the gathered statistics (bottom of the page). Here is the Reddit discussion where many people ran it and confirmed Firefox 4's supremacy [reddit.com].
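    The core of such a math-only benchmark boils down to loops like the following -- a sketch of the general idea, not the linked page's actual code:

        // Rotate a cloud of 3D points around the Y axis: pure floating-point
        // work, so the score tracks raw JS speed rather than DOM or layout.
        var points = [];
        for (var i = 0; i < 10000; i++) {
            points.push({ x: Math.random(), y: Math.random(), z: Math.random() });
        }

        var theta = 0.01, c = Math.cos(theta), s = Math.sin(theta);
        for (var frame = 0; frame < 100; frame++) {
            for (var j = 0; j < points.length; j++) {
                var p = points[j];
                var x = p.x * c + p.z * s;
                p.z = -p.x * s + p.z * c;
                p.x = x;
            }
        }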
  • Comparing Chrome 8.0.552.11 dev to Firefox 4.0b6 on SunSpider.

    Chrome: 335.6ms +/- 2.8%
    Firefox: 543.6ms +/- 2.4%

    System: AMD Phenom II 965 w/ 4GB RAM on Win7 Ultimate.

  • by mykdavies ( 1369 ) on Sunday October 24, 2010 @01:12PM (#34005272)
    Jesus, guys, can't you just congratulate the Firefox devs on the great job they're doing? Just look at the rate of improvement over the past few months [arewefastyet.com] and give the JaegerMonkey/TraceMonkey guys kudos for a really impressive job of software engineering. Have a look at David Mandelin's recent post [mozilla.com] to get an idea of how much work and planning has gone into this project.
