
Firefox 18 Beta Out With IonMonkey JavaScript Engine

An anonymous reader writes with a quick bite from The Next Web about the latest Firefox beta, this time featuring some under-the-hood improvements: "Mozilla on Monday announced the release of Firefox 18 beta for Windows, Mac, and Linux. You can download it now from Mozilla.org/Firefox/Beta. The biggest addition in this update is a set of significant JavaScript performance improvements, courtesy of Mozilla's new JIT compiler, IonMonkey. The company promises the performance bump should be noticeable whenever Firefox displays Web apps, games, and other JavaScript-heavy pages."
  • Heh, subject says it all.

    Also, first post?

    • by zrbyte ( 1666979 ) on Tuesday November 27, 2012 @02:53AM (#42103153)

      aaaand I'll be able to boot Linux [bellard.org] faster!

    • Re: (Score:2, Interesting)

      Comment removed based on user account deletion
      • by serviscope_minor ( 664417 ) on Tuesday November 27, 2012 @05:53AM (#42103789) Journal

        I have to support a lot of low-power systems, and since around FF v6 it's been completely unusable, especially for watching SD video; even opening new tabs can cause FF to slam the CPU to 100%.

        Well, yes. I have an Eee 900 (which is single-core). I found Firefox and Chromium both to be pretty pathetic at playing video, even in HTML5. Actually, I'm not sure how they manage to make such a meal of it. It's really terrible. They're basically no better than Flash. Perhaps even worse.

        I used to use FlashVideoReplacer, but that doesn't work any more. I now use a Greasemonkey script which basically replaces YouTube videos (perhaps others?) with MPlayer, and it can decode 720p just fine in realtime.

        Firefox can slam the CPU to 100%. Chromium is multi-threaded, so it can slam the CPU to 100% with a load average of 57, grinding the machine completely to a halt.

        Chromium feels a bit snappier in that the UI elements give visual feedback quicker, but it doesn't actually seem to get the pages up faster. I switched back to Firefox after a while.

        • by Lennie ( 16154 )

          Totally agree on that one; I have the same experience. Well, I never switched to Chrome, but every time I try it out I come to the same conclusion.

  • by ArchieBunker ( 132337 ) on Tuesday November 27, 2012 @12:08AM (#42102433)

    None of these improvements feel any faster. Pages still load as quickly as they did a decade ago (provided your connection was fast). Why can't they make anything render faster?

    • Re: (Score:3, Interesting)

      None of these improvements feel any faster. Pages still load as quickly as they did a decade ago (provided your connection was fast). Why can't they make anything render faster?

      Have you used Firefox 3.6 recently? It sucks very badly, which is why Hairyfeet and I have been promoting Chrome for two years. Run it in a VM and IE 9 is many multitudes a better browser. Ten years ago IE 6 broke all records in JavaScript performance. Run it today and Slashdot or its default MSN homepage will crash it within 20 seconds, as its JavaScript interpreter can only run in 20 megs of RAM.

      Old Macs in the breakroom at my employer running Safari 1.x from 2006 are simply not even usable as y

      • Re:So far (Score:5, Insightful)

        by epyT-R ( 613989 ) on Tuesday November 27, 2012 @12:17AM (#42102473)

        JavaScript was never meant to do the things it's being used for now; that's why sites are so damned slow.

        • IE 6, as much as we hate it, defined Web 2.0 by inventing AJAX. Right now the browser is a platform whether we like it or not, which is why old browsers crash on news websites. It's the frameworks being loaded, taking gigs of RAM, as the browser becomes the OS.

          Firefox 18 and Chrome can make it run decently. We need another language in the browser, I do agree, but with smartphones and applets running we need a platform, as they simply use the HTML5 browser for their UI and do AJAX for the logic.

          • by epyT-R ( 613989 )

            Yeah I know, I just don't like it. 95% of the security problems we have start with the scriptable browser. If you want an interactive application that talks to remote data sources, make one and distribute it. It's not like the current stack's attempts at virtualization/abstraction have proven any more secure. It just makes remote-access applications slower and makes them take 10x the resources they should.

          • Re:So far (Score:4, Insightful)

            by Darinbob ( 1142669 ) on Tuesday November 27, 2012 @01:23AM (#42102797)

            Or we could just turn back the clock to the good old days, when web sites were about presenting information simply, with a simple markup language, instead of trying to be full applications.

        • Re:So far (Score:5, Insightful)

          by NightHwk1 ( 172799 ) <(ten.ksalfytpme) (ta) (noj)> on Tuesday November 27, 2012 @12:28AM (#42102551) Homepage

          What sites are so damned slow? It's not the Javascript in most cases, it's the asset loading. The tubes are still the bottleneck on the web.

          If anything, Javascript is speeding things up. AJAX content updates without a full page refresh are commonplace now, and there are more and more sites that are mostly client-side, using frameworks like Backbone.
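          For the curious, the pattern being described is just a partial page update over XMLHttpRequest. A minimal sketch (the /comments URL and the #comments element are made up for illustration):

          var xhr = new XMLHttpRequest();
          xhr.open('GET', '/comments?page=2', true); // fetch just the changed fragment
          xhr.onreadystatechange = function () {
            if (xhr.readyState === 4 && xhr.status === 200) {
              // swap in only the region that changed; no full page refresh
              document.getElementById('comments').innerHTML = xhr.responseText;
            }
          };
          xhr.send();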

          • Re:So far (Score:5, Informative)

            by tlhIngan ( 30335 ) <slashdot&worf,net> on Tuesday November 27, 2012 @01:07AM (#42102739)

            What sites are so damned slow? It's not the Javascript in most cases, it's the asset loading. The tubes are still the bottleneck on the web.

            If anything, Javascript is speeding things up. AJAX content updates without a full page refresh are commonplace now, and there are more and more sites that are mostly client-side, using frameworks like Backbone.

            Well, let's see. Go to a site like Engadget and make sure your JavaScript allow settings are set to show the page comments. Now open 24 hours' worth of news postings in tabs. After about halfway through, your browser will start to lag with the CPU pegged. It's not loading content off the Internet if your CPU is pegged; it's just all the script instances running.

            Or you can replicate it with a busy thread like one of their "post a comment to enter a draw" posts, where it gets around 200 comments a minute. You'll find the browser noticeably slows down as the JavaScript is constantly running.

            Repeat same with a site like Gizmodo (probably an even worse offender). I can't open more than 10 tabs at a time before the browser locks up for a few minutes at 100% CPU.

            Latest Firefox and Chrome. Websites just gobble up JavaScript processing. Those sites are unusable on weaker JavaScript engines.

            • Re:So far (Score:5, Informative)

              by wmac1 ( 2478314 ) on Tuesday November 27, 2012 @02:04AM (#42102975)

              Engadget has heavily linked in components from other websites in the form of inline frames, advertisements, images, tracking, etc. They have also made heavy use of JavaScript, transparent layers, etc. One would think they purposely applied "worst practices".

              Having either DNT+ or AdBlock (with privacy filters) will stop the commenting system altogether.

              • Having either DNT+ or AdBlock (with privacy filters) will stop the commenting system altogether.

                Which is soooo ironic. If you are blocking their ads, the only way you can help them is to contribute to the community so that more people without ad blockers will spend time loading pages with ads. Plus, it is reasonable to assume that people blocking ads are smarter than your average dog on the internet, so their comments might be of higher calibre than the hoi polloi's.

                  Plus, it is reasonable to assume that people blocking ads are smarter than your average dog on the internet, so their comments might be of higher calibre than the hoi polloi's.

                  They don't want those people commenting; if they did, it would become clear how much smarter and better educated random yokels on the internet are than they are.

              • "Having either DNT+ or AdBlock (with privacy filters) will stop the commenting system altogether." Thats BS. What will stop the comment system is NoScript until you unblock aol and fyre. AdBlock and DNT are not the culprits and I have both on with no Engadget related exceptions.
                • by wmac1 ( 2478314 )

                  I don't have NoScript on my Firefox. I only have those two. Those two stop the inline frame and the tracking components of fyre.

                  • I would look elsewhere for your problem, as I have both AdBlock and DNT. I have posted four or five times since the changeover to the fugly version a week ago. Any problems I have had are related to script blocking, which is NoScript-related. I have not changed any defaults in AdBlock.
                    • by wmac1 ( 2478314 )

                      You still insist on your own BS. When I disable AdBlock and DNT+, comments start to work, but even one of them will stop it. I insisted in my first post that I use AdBlock WITH privacy filters (i.e. EasyPrivacy).

                    • No sir, it is you. I just added the 'easyprivacy' filter to ABP, went to Engadget and posted a comment in the thread about "Phoenix project reincarnates WebOS as Nexus S app". In fact, I left a message for you timed coincident with this post.
            • by haruchai ( 17472 )

              What are your system specs?
              I've been hearing complaints from various Slashdotters for a while but can't replicate their issues.
              I usually run the PortableApps FF package with TreeStyleTabs and a few other add-ons, but it's only with this latest FF 17 that it's been at all crashy - I've had a few crashes in the last week when I get above 10 windows & 70 tabs.

              I've routinely exceeded 15-20 windows / 80 - 175 tabs for a couple years with as many as 5 simultaneous http downloads, 3 - 5 YouTube streams, Facebook

              • by gmack ( 197796 )

                Try loading the disk first. I find that FF does very poorly if there is too much disk activity, to the point where even the menus lag. On an unloaded system it's fantastic, but it is the single worst-performing app on my system while the disk is busy. For myself, I mostly fixed the problem by going SSD.

                • by haruchai ( 17472 )

                  Hmm, I have lots of disks of various types attached to my main PCs; I do lots of transfers between them but usually when I'm stepping away.
                  I'll have to try some auto-transfers while I'm browsing to see how much impact it has.

          • Re:So far (Score:5, Interesting)

            by wvmarle ( 1070040 ) on Tuesday November 27, 2012 @02:30AM (#42103053)

            Slashdot is a prime example of a site heavily using javascript.

            Ubuntu 10.04 LTS stuck with Firefox 3.6 for a long time. When loading a /. page, particularly one with many comments, it often gave me the "script is taking too long to complete" warning. It would eventually finish, but it took a long time. When Ubuntu finally replaced the browser with a newer Firefox, that problem was solved. It now renders reasonably fast.

            And considering I have ads disabled, it is really /. itself that's so demanding.

            • Nope, no javascript on my slashdot. :) Using the older interface also has the benefit of preventing accidental moderation (you still have to click the Moderate button).

          • If anything, Javascript is speeding things up. AJAX content updates without a full page refresh are commonplace now,

            Yeah, that was always the idea. And sure, the AJAX reload on full Gmail is faster than a full refresh. That's not surprising, because the code required to run the whole program is quite large. However, on really slow connections the old fallback HTML interface works faster, and Gmail automatically switches.

            Of course it means you can have massive pages without insane repeated load times, but sma

          • by fatphil ( 181876 )
            > content updates without a full page refresh are commonplace now

            Wait a sec! Didn't we have those in the 90s? They were called "frames".
        • Re: (Score:3, Insightful)

          by ArcadeMan ( 2766669 )

          It's not really Javascript's fault, it's all the damn know-nothing Web monkeys out there.

          I'm checking the code of one website at the moment, and they're loading all that crap:
          mootools-core.js
          core.js
          caption.js
          jquery-1.6.1.min.js
          jquery.bxSlider.js
          jquery.js (yes, jquery AGAIN along with some checks to see if it's already loaded -WTF)
          script.js

          There's also the following inline javascript functions:
          - something to add captions to images on page load (calls caption.js)
          - something to track page views

          Why do they load
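          For reference, the "is it already loaded?" check they bolt on usually looks something like this (the path here is made up; the real fix is simply not to include the library twice):

          if (typeof window.jQuery === 'undefined') {
            // only fetch the library if no earlier script tag brought it in
            var s = document.createElement('script');
            s.src = '/js/jquery-1.6.1.min.js';
            document.getElementsByTagName('head')[0].appendChild(s);
          }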

          • Not sure what you think the problem with that is, actually. Yes, loading jQuery a second time is bad and should be removed. It appears that caption.js relies upon mootools, and I'm not sure what core.js is.

            The REAL problem is HTML5/CSS3 committees not getting off their collective asses and finalizing the spec. Then waiting for the browsers to all actually implement it, and then finally waiting for people to upgrade to those browsers. The last part being the most important as many refuse to upgrade even t

      • by Anonymous Coward
        Well duh. We just upgraded to FF 17. FF 3.6 must have been 12 years ago. Of course it's slow today (or even two years ago when FF 15 must have been the norm).
        • by PNutts ( 199112 )

          Well duh. We just upgraded to FF 17. FF 3.6 must have been 12 years ago. Of course it's slow today (or even two years ago when FF 15 must have been the norm).

          IIRC FF 3.6 was about 9 months ago. Ba-dum.

          • FF 4 came out in March 2011. I remember the day vividly. That is 18 months ago. 12 years ago we were waiting for the oh-so-awesome IE 6 with the correct box model, ooh and ahh.

            That might not seem a long time ago to many here, but with the way browser wars 2.0 are heating up, FF 4 is becoming the next IE 6 FAST! A world of difference in just a year and a half if you benchmarked both.

    • Re: (Score:2, Insightful)

      by Cinder6 ( 894572 )

      I find this an odd comment. I definitely notice pages rendering faster, and I can see this effect simply by changing browsers. Some are genuinely faster than others all around, while others have different rules for when they begin to display content.

      With that said, I'm getting kind of tired of all the Firefox posts. Why do we get a post for seemingly every Firefox release, including betas, and no mention at all of Chrome updates? (Note: I'm not advocating for more "stories" about Chrome.) Maybe everyon

      • by Anonymous Coward

        Firefox is open about their updates and this is significant - a whole new JIT compiler.

        This is an area that has been sorely in need of an update; people were complaining about the speed.
        So notifying them that a significant speed bump occurs in the new version is HELPFUL INFORMATION.

        Google manages their updates differently and they do usually get articles when significant milestones are reached.
        If they re-did their JIT compiling and it resulted in a speed boost, you don't think you'd hear about it?
        Maybe the proble

      • > Why do we get a post for seemingly every Firefox release,
        Because at one time FF was the poster boy for open source. It gave users what they wanted - an open source, HTML-standards-compliant, extendable browser.

        Old habits die hard around here.

        Sadly FF has jumped the shark by a) never fixing the memory leaks, b) breaking plugins, and c) fucking up the UI by removing resizable dialogs such as the bookmark menu, etc. Sure, it is getting faster, and taking the initiative in supporting the latest HTML5 standards such as Web

    • I'm still going through loading my tabs, and initial page load seems slower but of course it could just be my connection acting up. Subjective experience means much to the individual user but I can't tell yet that anything is really faster. *shrug* - HEX
    • Re: (Score:3, Insightful)

      Because all the speed improvements were used by developers not to give the user a better experience, but to develop ever-more-complex pages. "Look, this new browser is 50% faster. Now, we can make a super-complex web page and still get the old speed!" Repeat for every speed increase.
    • Because by and large web developers hate you and see it as their duty to load your CPU as much as they can.

      You want someone to blame, blame the folks who see it as their duty to match JS engine improvements with more crap on the webpage.

  • by Anonymous Coward

    I found it remarkable that the benchmarks only compared to earlier versions of the Firefox JavaScript implementation. A comparison with JavaScriptCore and v8 can be found at http://arewefastyet.com [arewefastyet.com]

    • But only marginally so. In most benchmarks, it looks to be roughly on par with v8 now. In some, what, 20% slower? I think that's pretty respectable - compare with the situation a year or two ago when Firefox's Javascript engine was several times slower than v8.

      • by Anonymous Coward

        But only marginally so. In most benchmarks, it looks to be roughly on par with v8 now. In some, what, 20% slower? I think that's pretty respectable - compare with the situation a year or two ago when Firefox's Javascript engine was several times slower than v8.

        So the browser written by Google is 20% better on the benchmark written by Google, but there's only much less difference on other benchmarks?
        Could that difference be related to the fact that the winning browser and the benchmark are written by the same company?
        And note that this doesn't need to be intentional manipulation: It may just be that Google's benchmark is the primary benchmark Google's JavaScript engine optimizations are tested against.
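        One way around benchmark politics is to time the workload you actually care about in each engine's shell (SpiderMonkey's js, V8's d8) and compare. A tiny harness like this is enough (the workload below is arbitrary):

        function bench(name, fn, iters) {
          var start = Date.now();
          for (var i = 0; i < iters; i++) fn();
          var msg = name + ': ' + (Date.now() - start) + ' ms';
          if (typeof console !== 'undefined') console.log(msg);
          else print(msg); // shell fallback: js / d8 expose print()
        }

        // example workload: naive string building
        bench('string building', function () {
          var s = '';
          for (var i = 0; i < 10000; i++) s += i;
        }, 100);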

  • by file_reaper ( 1290016 ) on Tuesday November 27, 2012 @12:28AM (#42102557)
    I haven't kept track of the JITs that have been in Firefox. I recall the days when TraceMonkey and JaegerMonkey were added to boost performance. Could somebody recap why Firefox is abandoning the older versions or redoing them? I'm truly curious as to what they learned, what worked, and what didn't. Are they finding new usage patterns that warrant a new JIT design? Thanks.
    • by MSG ( 12810 )

      They aren't being replaced. Each of these codenames is an additional optimization layer. The performance enhancements are cumulative.

    • by Turnerj ( 2478588 ) on Tuesday November 27, 2012 @01:52AM (#42102931)

      Wikipedia [wikipedia.org] goes into a bit of detail about it but in basic summary...

      TraceMonkey was the first JIT compiler for SpiderMonkey, released in Firefox 3.5.

      JaegerMonkey is a different design from TraceMonkey which outperforms it in certain circumstances (Some differences between TraceMonkey and JaegerMonkey [mozilla.org]).

      IonMonkey is another attempt at refining the idea behind JaegerMonkey, allowing even greater optimisations under particular circumstances.

      However, TraceMonkey, JaegerMonkey and IonMonkey are all JIT compilers within SpiderMonkey, not replacements for SpiderMonkey itself.

    • by BZ ( 40346 ) on Tuesday November 27, 2012 @02:48AM (#42103135)

      A short summary:

      1) TraceMonkey turned out to have very uneven performance. This was partly because it type-specialized very aggressively, and partly because it didn't deal well with very branchy code due to trace-tree explosion. As a result, when it was good it was really good (for back then), but when it hit a case it didn't handle well it was awful. JaegerMonkey was added as a way to address these shortcomings by having a baseline compiler that handled most cases, reserving tracing for very hot type-specialized codepaths.

      2) As work on JaegerMonkey progressed and as Brian Hackett's type inference system was being put in place, it turned out that JaegerMonkey + type inference could give performance similar to TraceMonkey, with somewhat less complexity than supporting both compilers on top of type inference. So when TI was enabled, TraceMonkey was switched off, and later removed from the tree. But keep in mind that JaegerMonkey was designed to be a baseline JIT: run fast, compile everything, no fancy optimizations.

      3) IonMonkey exists to handle the cases TraceMonkey used to do well. It has a much slower compilation pass than JaegerMonkey, because it does more involved optimizations. So most code gets compiled with JaegerMonkey, and then particularly hot code is compiled with IonMonkey.

      This is a common design for JIT systems, actually: a faster JIT that produces slower code and a slower JIT that produces faster code for the cases where it matters.

      https://blog.mozilla.org/dmandelin/2011/04/22/mozilla-javascript-2011/ [mozilla.org] has a bit of discussion about some of this.
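      To illustrate that two-tier idea in toy form (a sketch of the general pattern, not SpiderMonkey internals; the compile functions are stand-ins):

      function baselineCompile(fn) { return fn; }   // stand-in: cheap compile, slower code
      function optimizingCompile(fn) { return fn; } // stand-in: costly compile, faster code

      function tiered(fn, hotThreshold) {
        var calls = 0;
        var code = baselineCompile(fn); // everything starts in the baseline tier
        return function (x) {
          if (++calls === hotThreshold) {
            code = optimizingCompile(fn); // recompile only code that proves hot
          }
          return code(x);
        };
      }

      var square = tiered(function (x) { return x * x; }, 1000);
      for (var i = 0; i < 2000; i++) square(i); // tier-up happens at call 1000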

  • by detain ( 687995 ) on Tuesday November 27, 2012 @12:38AM (#42102607) Homepage
    It's good to see the focus of this release being an attempt to increase JavaScript speed by leaps and bounds. Modern webpages often use JS that goes way beyond anything people did 10 years ago (jQuery, for example), and the complexity of what people do with JavaScript noticeably slows down most webpages.
    • Re: (Score:3, Interesting)

      It's good to see the focus of this release being an attempt to increase JavaScript speed by leaps and bounds. Modern webpages often use JS that goes way beyond anything people did 10 years ago (jQuery, for example), and the complexity of what people do with JavaScript noticeably slows down most webpages.

      When I first learned to program in BASIC, I used to think that people should try speeding up C and assembly language -- make EVERYTHING run faster... Then I learned C and x86 assembly and I realized you can't speed up assembly language -- it's a perfectly optimized language; there's nothing under the hood to tweak. You might select a better algorithm, or make better use of registers, but that isn't changing ASM. C can't really be hugely optimized either; it's pretty close to the metal, but there are a fe

      • by dmomo ( 256005 )

        I'm not so sure that the JavaScript (well, ECMAScript) LANGUAGE is the problem. The challenges with respect to rendering speed have more to do with the DOM and the interaction with the browser itself. The DOM is a bulky beast. When JavaScript listeners are assigned to page elements, the code can in turn alter the DOM, creating or destroying elements, all of which can trigger JavaScript functions, any of which can create or destroy more DOM elements. It's a properly tangled mess. Memory management in this
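        A concrete example of the DOM, not the language, being the cost (the numbers are arbitrary): reading layout after every write forces a reflow per iteration, while batching the writes costs one layout pass.

        // slow: a forced reflow on every iteration
        var list = document.createElement('ul');
        document.body.appendChild(list);
        for (var i = 0; i < 1000; i++) {
          var li = document.createElement('li');
          li.textContent = 'item ' + i;
          list.appendChild(li);
          var h = list.offsetHeight; // reading layout flushes it every pass
        }

        // faster: batch in a detached fragment, append once
        var frag = document.createDocumentFragment();
        for (var j = 0; j < 1000; j++) {
          var li2 = document.createElement('li');
          li2.textContent = 'item ' + j;
          frag.appendChild(li2);
        }
        list.appendChild(frag); // one layout pass instead of a thousand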

      • by madsdyd ( 228464 ) on Tuesday November 27, 2012 @03:10AM (#42103225)

        I don't understand why this comment got +5. It is pretty misguided.

        The statement:

        > I realized you can't speed up assembly language -- it's a perfectly optimized language; there's nothing under the hood to tweak

        makes some limited sense in some contexts (one could argue that the microcode supporting the assembler on the CPU is repeatedly optimized), but none in this one. The IonMonkey JIT does essentially optimize the assembler code[*], by rearranging it in various ways to make it faster. E.g. it takes stuff like this (written in JavaScript, as I have not written assembler in years):

        for ( var i = 0; i != 10; ++i ) {
            var foo = "bar";
        }

        and changes it to e.g. this:

        for ( var i = 0; i != 10; ++i ) {
        }
        var foo = "bar";

        possibly then this:

        var foo = "bar";

        This is an optimization, and it is performed at the assembler level. (Again: the above is not meant to be read as JavaScript, but as assembler.)

        The other statement that really sticks out is this:

        > A sign of a horribly designed language is that the speed of its implementations can be repeatedly increased "by leaps and bounds"...

        This simply highlights that the poster really does not understand the goals behind cross-platform languages, such as Java, Dalvik, JavaScript, Lisp, ML, Python, Perl, and so on, or the goals of weakly typed languages.

        [*] It works on an abstract representation of the assembler code, but it might as well have been working directly on the assembler, were it not for the fact that this would require it to know too many assembler variants.

      • Then I learned C and x86 assembly and I realized you can't speed up assembly language -- it's a perfectly optimized language; there's nothing under the hood to tweak.

        Well, not really. You can't optimize perfectly optimized assembly any further, but that's just a tautology. You can optimize even quite well-written assembly further, especially on modern CPUs with their various selections of functional units, variable pipelines, multiple instruction dispatch per cycle, vectorisation, etc.

        In fact, the days have generall

      • > C can't really be hugely optimized either,

        You've never worked on a C/C++ compiler have you? :-)

        C most definitely can be sped up. Why do you think we even have the "restrict" keyword?
        http://cellperformance.beyond3d.com/articles/2006/05/demystifying-the-restrict-keyword.html [beyond3d.com]

        Function temperature provides hints to the compiler on what code should be inlined.

        The back-end of a compiler has a lot of room for generating optimal code IF it understands the target hardware. i.e. Minimizing register spill is ext

      • C makes 1970s assumptions: that there will be only one thread (no parallelism), that there's no such thing as SSE, and that there are few built-ins (thus requiring a variety of libraries that may make similar errors). Modern technology offers C extensions (OpenMP), better libraries, and better compilers which utilize modern architecture targets. Even Java now auto-parallelizes some code that it once ran linearly. There is enormous potential for non-trivial code to better fit the hardware.
    • by olau ( 314197 )

      Do you have a source to back that up?

      I'm personally under the impression that most pages are slow because of rendering speed, not because of Javascript execution itself. I see improvements in Javascript compilers mostly as an enabler of future tech, not something that significantly speeds up existing pages.

      Of course, with a couple of big libraries, parsing time is perhaps important.

  • /. is now a little bit more bearable.
  • It won't run on about 55% of the Macs out there.

    • by wisty ( 1335733 )

      I guess you are referring to the Mac OS X 10.5 requirement, or the need for x86?

      Roughly 10% of Macs run Tiger, or a previous OS. Leopard has ~20%, SL 50%, and the Lions 20% (rough figures). Many of those Leopards are PPC. I'd guess at most 30% of Macs (which are actually in use - not old TAMs or Classics, if there's even a significant number of those) can't run Firefox.

      • by thogard ( 43403 )

        Apple has reported that 55% of the people who can run Lion are not running Lion. For most users there is no reason to upgrade between major versions of OS X, and since the change between 10.5.8 and 10.6.3 is about like the change between the first 50 patches and Service Pack 2 for XP, I find it odd that a version isn't built, since all it should take is not using Xcode's default settings.

  • Actually, I shouldn't say that. Firefox started breaking on GitHub around version 17.0. Many of the sub-project pages, e.g. the Issues page and the Markdown Raw viewer, redirect to a "Page did not load in a timely fashion" error page. This happens consistently on every GitHub project. Unless the GitHub team has done something weird on their end, this is another in the lengthy list of compatibility problems Firefox is beginning to have.

  • What I liked about the previous Mozilla JavaScript engines was that they supported multithreading. That made them suitable for web-server use, in contrast with, for example, Chrome's V8, which is not suitable for server use (unless you are prepared to spawn multiple processes instead of threads, but that is very expensive performance-wise, of course).

    So I hope they support multithreading.

    • by BZ ( 40346 )

      Multithreading should work fine as long as you use separate JSRuntimes on separate threads.

  • It's not like JavaScript is the problem with any current browser, but all of them keep working on improving the JS engines instead of taking a break from the nth JS improvement to build a better browser UI. For example, Mozilla should fix stuff like the menu bar, favicons, the status bar, ... all the "we need to look like Chrome" stuff. Hey, if I want something which looks like Chrome, I'll use Chrome. If there weren't all the extensions fixing this crap, maybe I would have been a Chrome user for a long time now, since Firefox gave up its o

  • People don't change browsers much, if those I know are any indicator. When they do, it typically is because one or more of three events occurred. The first is when they're actively shown an alternative by a preacher. The next is when they compare a site in different browsers and notice a material difference, e.g. when designing it. The last is when intense frustration leads them to actively seek alternatives.

    In my (non-representative) sample, FF has been hemorrhaging irate users for several years now. And the

    • I tried switching from Chrome (using the Canary build) because it kept crashing on me. I had to use Firefox nightly builds, because the standard Firefox looks fuzzy on my Retina display MacBook Pro (high-res Retina displays have been out for almost 6 months, IIRC).

      Unfortunately, Firefox turned out to have problems with providing a cursor in the location bar after opening a new window (you had to set it by mouse!), and the privacy mode is broken: it removes all other windows and switches to anonymous brow

  • For fuck's sake - 5 layers deep of scripts? Six? More? NoScript has become nearly useless when I have to turn on 5 or 6 LAYERS and 45 different scripts just to format a page. And on a good day they slow everything down to the level of a 486DX100 machine circa 1996.

  • ./firefox
    bash: ./firefox: /lib/ld-linux.so.2: bad ELF interpreter: No such file or directory

    Linux koala 3.6.6-1.fc16.x86_64 #1 SMP Mon Nov 5 16:56:43 UTC 2012 x86_64 x86_64 x86_64 GNU/Linux

  • That's what I've been reading, anyhow. My sources didn't specify if this would only be on OS X or also on other OSes which you might run on the same hardware.

  • I have used IonMonkey for quite some time - I was running nightly IonMonkey builds before it was merged into the 'regular' nightly code. The speed-up is at best minimal from a user's perspective, except on really JS-heavy sites. And if anyone from the dev team reads this, there is a dreadful memory leak in the latest 20s.

"Why should we subsidize intellectual curiosity?" -Ronald Reagan

Working...