
Firefox 18 Beta Out With IonMonkey JavaScript Engine 182

An anonymous reader writes with a quick bite from The Next Web about the latest Firefox beta, this time featuring some under-the-hood improvements: "Mozilla on Monday announced the release of Firefox 18 beta for Windows, Mac, and Linux. You can download it now from Mozilla.org/Firefox/Beta. The biggest addition in this update is significant JavaScript improvements, courtesy of Mozilla's new JavaScript JIT compiler called IonMonkey. The company promises the performance bump should be noticeable whenever Firefox is displaying Web apps, games, and other JavaScript-heavy pages."
  • Re:So far (Score:5, Informative)

    by tlhIngan ( 30335 ) <slashdot@@@worf...net> on Tuesday November 27, 2012 @02:07AM (#42102739)

    What sites are so damned slow? It's not the JavaScript in most cases, it's the asset loading. The tubes are still the bottleneck on the web.

    If anything, JavaScript is speeding things up. AJAX content updates without a full page refresh are commonplace now, and there are more and more sites that are mostly client-side, using frameworks like Backbone.
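
    The AJAX pattern mentioned above can be sketched roughly as follows. This is a hypothetical illustration, not any particular site's code: the endpoint, field names, and markup are invented.

    ```javascript
    // Sketch of an AJAX partial update: fetch new comments as JSON and
    // re-render only the comment list, never reloading the whole page.

    // Pure templating step: turn a comment array into an HTML fragment.
    function renderComments(comments) {
      return comments
        .map(c => `<li class="comment"><b>${c.author}</b>: ${c.body}</li>`)
        .join("\n");
    }

    // Update step: in a browser this patches the DOM in place.
    // "/comments.json" is an invented endpoint for illustration.
    async function refreshComments(listElement) {
      const resp = await fetch("/comments.json");
      const comments = await resp.json();
      listElement.innerHTML = renderComments(comments); // partial update only
    }
    ```

    Splitting the pure templating step from the network step is what lets frameworks like Backbone re-render views from models without touching the rest of the page.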

    Well, let's see. Go to a site like Engadget and make sure your JavaScript settings allow the page comments to load. Now open 24 hours' worth of news postings in tabs. About halfway through, your browser will start to lag with the CPU pegged. If the CPU is pegged, it isn't waiting on content from the Internet; it's all those script instances running.

    Or you can replicate it with a busy thread, like one of their "post a comment to enter a draw" posts, where comments arrive at around 200 a minute. You'll find the browser noticeably slows down as the JavaScript runs constantly.

    Repeat same with a site like Gizmodo (probably an even worse offender). I can't open more than 10 tabs at a time before the browser locks up for a few minutes at 100% CPU.

    Latest Firefox and Chrome. These websites just gobble up JavaScript processing time; they're unusable on weaker JavaScript engines.

  • by Turnerj ( 2478588 ) on Tuesday November 27, 2012 @02:52AM (#42102931)

    Wikipedia [wikipedia.org] goes into a bit of detail about it but in basic summary...

    TraceMonkey was the first JIT compiler for SpiderMonkey released in Firefox 3.5.

    JaegerMonkey is a different design from TraceMonkey which outperforms it in certain circumstances (Some differences between TraceMonkey and JaegerMonkey [mozilla.org]).

    IonMonkey is a further refinement of the ideas behind JaegerMonkey, allowing even greater optimisations under particular circumstances.

    However, TraceMonkey, JaegerMonkey and IonMonkey are all JIT compilers within SpiderMonkey, not replacements for SpiderMonkey itself.

  • Re:So far (Score:5, Informative)

    by wmac1 ( 2478314 ) on Tuesday November 27, 2012 @03:04AM (#42102975)

    Engadget heavily embeds components from other websites in the form of inline frames, advertisements, images, trackers, etc. They have also made heavy use of JavaScript, transparent layers, and so on. One would think they purposefully applied "worst practices".

    Having either DNT+ or AdBlock (with privacy filters) will stop the commenting system altogether.

  • by BZ ( 40346 ) on Tuesday November 27, 2012 @03:37AM (#42103087)

    That's not quite true.

    TraceMonkey has in fact been removed, and JaegerMonkey may be too once the new baseline JIT being worked on now is done.

  • by BZ ( 40346 ) on Tuesday November 27, 2012 @03:48AM (#42103135)

    A short summary:

    1) TraceMonkey turned out to have very uneven performance. This was partly because it type-specialized very aggressively, and partly because it didn't deal well with very branchy code due to trace-tree explosion. As a result, when it was good it was really good (for back then), but when it hit a case it didn't handle well it was awful. JaegerMonkey was added as a way to address these shortcomings by having a baseline compiler that handled most cases, reserving tracing for very hot type-specialized codepaths.
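
    The type-specialization problem described above can be illustrated with a toy example (invented for illustration, not engine code): a tracing JIT records the types it saw at a call site and compiles a fast path for exactly those types, so a site that later sees different types invalidates the trace and falls back to a slow path.

    ```javascript
    // '+' in JavaScript means numeric addition OR string concatenation,
    // so a trace specialized for (number, number) is only valid as long
    // as every later call matches those recorded types.
    function add(a, b) {
      return a + b;
    }

    // Monomorphic: every call matches a (number, number) trace.
    let n = 0;
    for (let i = 0; i < 1000; i++) n = add(n, 1);

    // Polymorphic: the same call site now mixes in strings, which is
    // the kind of case that forced a tracing JIT to bail out.
    let s = "";
    for (let i = 0; i < 3; i++) s = add(s, "x");
    ```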

    2) As work on JaegerMonkey progressed and as Brian Hackett's type inference system was being put in place, it turned out that JaegerMonkey + type inference could give performance similar to TraceMonkey, with somewhat less complexity than supporting both compilers on top of type inference. So when TI was enabled, TraceMonkey was switched off, and later removed from the tree. But keep in mind that JaegerMonkey was designed to be a baseline JIT: run fast, compile everything, no fancy optimizations.

    3) IonMonkey exists to handle the cases TraceMonkey used to do well. It has a much slower compilation pass than JaegerMonkey, because it does more involved optimizations. So most code gets compiled with JaegerMonkey, and then particularly hot code is compiled with IonMonkey.

    This is a common design for JIT systems, actually: a faster JIT that produces slower code and a slower JIT that produces faster code for the cases where it matters.
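
    That two-tier design can be sketched in miniature. Everything below is invented for illustration (the names, the threshold, the "compile" step); real engines tier at the level of generated machine code, not JavaScript closures.

    ```javascript
    // Toy two-tier execution: run everything through a cheap baseline
    // path, count invocations, and swap in an "optimized" version once
    // a function becomes hot. HOT_THRESHOLD is an arbitrary stand-in
    // for an engine's real hotness heuristic.
    const HOT_THRESHOLD = 10;

    function tiered(baselineFn, makeOptimizedFn) {
      let calls = 0;
      let optimized = null;
      return function (...args) {
        if (optimized) return optimized(...args);  // tier 2: fast code
        if (++calls >= HOT_THRESHOLD) {
          optimized = makeOptimizedFn();           // slow, one-time "compile"
        }
        return baselineFn(...args);                // tier 1: baseline
      };
    }

    // Example: both tiers compute the same thing; only the cost differs.
    const square = tiered(
      x => x * x,           // baseline: always available, always correct
      () => (x => x * x)    // pretend this is the expensively optimized build
    );
    ```

    The payoff is the same trade-off described above: the cheap tier keeps startup fast for code that runs once, while the expensive tier only pays its compilation cost on code that actually runs hot.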

    https://blog.mozilla.org/dmandelin/2011/04/22/mozilla-javascript-2011/ [mozilla.org] has a bit of discussion about some of this.

  • by serviscope_minor ( 664417 ) on Tuesday November 27, 2012 @06:53AM (#42103789) Journal

    I have to support a lot of low-power systems, and since around Firefox 6 it's been completely unusable, especially for watching SD video; even opening new tabs can cause Firefox to slam the CPU to 100%.

    Well, yes. I have an Eee 900 (which is single-core). I found Firefox and Chromium both to be pretty pathetic at playing video, even in HTML5. Actually, I'm not sure how they manage to make such a meal of it. It's really terrible. They're basically no better than Flash. Perhaps even worse.

    I used to use FlashVideoReplacer, but that doesn't work any more. I now use a Greasemonkey script which basically replaces YouTube videos (perhaps others?) with MPlayer, and it can decode 720p just fine in real time.
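
    The poster's actual script isn't shown; a minimal sketch of the approach might look like this. The `mplayer://` handoff is invented (a real script would shell out via a helper or protocol handler), and the pure URL-parsing helper is the only part that isn't guesswork about the page.

    ```javascript
    // ==UserScript==
    // @name   Replace YouTube embeds (sketch)
    // @match  https://www.youtube.com/*
    // ==/UserScript==

    // Pull the 11-character video id out of a watch-page URL.
    function videoIdFromUrl(url) {
      const m = /[?&]v=([\w-]{11})/.exec(url);
      return m ? m[1] : null;
    }

    // Guarded so the file is inert outside a browser/userscript context.
    // The link target is a hypothetical external-player handoff.
    if (typeof document !== "undefined") {
      const id = videoIdFromUrl(location.href);
      if (id) {
        const link = document.createElement("a");
        link.href = "mplayer://" + id;
        link.textContent = "Play in external player";
        document.body.prepend(link);
      }
    }
    ```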

    Firefox can slam one core to 100%. Chromium is multithreaded, so it can slam the whole CPU to 100% with a load average of 57, making the machine grind completely to a halt.

    Chromium feels a bit snappier in that the UI elements give visual feedback quicker, but it doesn't actually seem to get the pages up faster. I switched back to firefox after a while.
