
Firefox To Support Google's WebP Image Format For a Faster Web (cnet.com)

Firefox has joined Google's WebP party, another endorsement for the internet giant's effort to speed up the web with a better image format. From a report: Google revealed WebP eight years ago and since then has built it into its Chrome web browser, Android phone software and many of its online properties in an effort to put websites on a diet and cut network data usage. But Google had trouble encouraging rival browser makers to embrace it. Mozilla initially rejected WebP as not offering enough of an improvement over the more widely used image formats, JPEG and PNG. It seriously evaluated WebP but chose to try to squeeze more out of JPEG. But now Mozilla -- like Microsoft with its Edge browser earlier this week -- has had a change of heart. "Mozilla is moving forward with implementing support for WebP," the nonprofit organization said. WebP will work in the versions of Firefox based on Mozilla's Gecko browser engine, namely Firefox for personal computers and Android, but not in Firefox for iOS.

  • Still don't get it (Score:3, Insightful)

    by Anonymous Coward on Friday October 05, 2018 @01:15PM (#57432900)

    What makes the web slow is unnecessary JavaScript. The "problems" with JPEG and PNG are minuscule compared to the problems with bloated JavaScript, and downright negligible if you actually take the time to optimally compress your images.

    • by Desler ( 1608317 )

      Yep, the 5 MB of minified and compressed JavaScript is far more of a problem than a 100 KB JPEG.

    • You mean downloading megs of jQuery just so I can do a couple of $("#content").html("Text") calls is slowing things down? How could that be? document.getElementById("content").innerHTML = "Text" is so much longer to write.

      • You mean downloading megs of jQuery just so I can do a couple of $("#content").html("Text") calls is slowing things down? How could that be? document.getElementById("content").innerHTML = "Text" is so much longer to write.

        If that were all that jQuery did, you'd have a point.

        jQuery abstracts away browser bugs and differences, just for one (huge) thing.

        It also makes things easy that are a huge PITA to implement in JavaScript.

        Yes, nothing you can do with jQuery can't be done with plain JavaScript - after all, jQuery is written in plain JavaScript. But you could just as well say that nothing you can do in C can't be done in assembly.

        • I am not dissing jQuery; I tend to use it often. However, the problem is that it is a big library with a lot of features, and rarely are even most of them used on any given page. So you are downloading megabytes of data for only a few kilobytes of actually executed code (a rough side-by-side sketch follows at the end of this thread).

          Perhaps major browsers should have a built-in store of jQuery and AngularJS instead of having to download them every time. (Yes, there is a way to link to them directly, without serving the JS files yourself, and they would be better cached in your browser.) But still w

          • by jrumney ( 197329 )

            However, the problem is that it is a big library with a lot of features, and rarely are even most of them used on any given page.

            Which is why you should use a common CDN-hosted copy of jQuery, and not download and serve it from your own site. At least then the same file is shared with other sites and is more likely to stay in the browser's cache.

            • by Anonymous Coward

              If you like your users being spied on by third parties, or maybe even the NSA, sure - great idea!

      • I could quip that jQuery might be used for something else on the page that would take a LOT more code in plain JavaScript and, given that jQuery is already loaded, I might as well do "$("#content").html("Text")" for convenience.

        Sure, I could roll my own optimized libraries for the other things that jQuery does but they'd be slower to load than an already-cached copy of jQuery and you'd still complain about them.

        PS: Yes, there are web pages that only do "$("#content").html("Text")". They'll go under eventually.

      • You mean downloading megs of jQuery just so I can do a couple of $("#content").html("Text") calls is slowing things down? How could that be? document.getElementById("content").innerHTML = "Text" is so much longer to write.

        I rarely do JavaScript, but when I do, I always typo getElementById as getElementByID the first time.

    • WebAssembly? It's usually much faster, and it allows you to use a language other than JS, such as C/C++, Rust, etc.
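
    For what it's worth, here is a small plain-JavaScript sketch of the trade-off this thread argues about: the quoted jQuery one-liner next to a DOM chore (delegated click handling) that a library genuinely shortens. Only the "#content" one-liner comes from the comments above; the "#list" element and "done" class are made up for illustration.

        // The one-liner the thread argues about:
        //   jQuery:     $("#content").html("Text");
        //   plain DOM:  document.getElementById("content").innerHTML = "Text";

        // Where jQuery's few dozen gzipped kilobytes used to earn their keep: chores like
        // delegated events and class toggling take noticeably more code by hand.
        //   jQuery:  $("#list").on("click", "li", function () { $(this).toggleClass("done"); });
        // Plain-DOM equivalent:
        const list = document.getElementById("list");   // assumes a <ul id="list"> exists on the page
        if (list) {
          list.addEventListener("click", (event) => {
            const item = event.target.closest("li");    // find the <li> that was actually clicked
            if (item && list.contains(item)) {
              item.classList.toggle("done");            // same effect as jQuery's toggleClass
            }
          });
        }

    Whether that difference justifies shipping the whole library to every visitor is exactly the disagreement above.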
  • by RyanFenton ( 230700 ) on Friday October 05, 2018 @01:19PM (#57432924)

    Compression levels and file size aren't the current limitations on 'web speed' as such.

    The limits are REALLY, REALLY easy to detect, if you try any sizable set of major websites with and without various levels of ad blocking and script blocking.

    The limitation is the servers that marketing companies place in between users and the content they want - servers that demand to be fetched and parsed before the page loads.

    And marketing companies don't place much priority on 100% minimal load times, compared to showing greater statistics on what makes them money.

    That's what kills the traffic flow - just as a small number of bad actors can slow any traffic system. When those actors are left in front of the others, with no way to get around them, all the traffic is slowed.

    This is a fix - but it's very much not a general fix for what most affects people's experience online.

    Adblock and script blocking are that general fix for now - but bypassing any marketing-server delay greater than 1 ms would be the more proper fix if you wanted ads to keep paying for things.

    Having marketing companies absolutely lose their chance to show ads for any, ANY, delay would fix their priorities, and fix the web for those who want to keep it an ad-loaded experience.

    Ad/script blocking works for everyone else.

    Ryan Fenton

    • Part of the problem is that you have morons [google.com] claiming "ad blocking is unethical."

      *facepalm*

      So closing my eyes is fine, but if I use technology to do the same thing, all of a sudden it becomes "unethical"??? WTF!

      I use ad-blocking to SPEED up MY browsing experience and not load images + cookies from 30+ different websites.

      • by MrL0G1C ( 867445 )

        Lol, and how ethical is spying on people and selling that information!?!?

        I've read 1984, and in many ways what we have today is worse. Most of the population is brainwashed, and the majority can't think beyond their genetic programming. Hell, the majority don't want to think; they find it too hard and want other people to think for them.

    • Being on dialup when I travel (yes, really), I have a slightly different view. If I load the page straight up, it's extremely slow (over a minute). However, my dialup service also has a compression algorithm and a "Disable Flash" option that I can turn on.

      - The page looks like crap (2 color GIFs and JPEGs are not pretty), but it loads in less than 10 seconds.

      - And of course turning-off Flash means the annoying animated videos don't load.

      From my point of view, images and Flash are the main culprits for making webpages bloated.

    • The limitation is the servers that marketing companies place in between users and the content they want - servers that demand to be fetched and parsed before the page loads.

      Generally, this isn't true. Most marketing scripts are loaded asynchronously and will load in the background while your main content loads (a minimal example follows this comment).

      Additionally, if you're a major website, you can demand network performance improvements from the marketing vendors before you'll put their pixel on your site.

      From my experience, CPU time processing JavaScript, loading large
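
      For readers wondering what "loaded asynchronously" means in practice, this is roughly the pattern third-party tags use; the vendor URL is a placeholder, not a real endpoint. It is why such scripts rarely block the first render, even though they still cost CPU and bandwidth once they arrive.

          // Typical async tag loader: the browser keeps parsing the page while the
          // script downloads in the background. Placeholder URL, for illustration only.
          const tag = document.createElement("script");
          tag.src = "https://tags.example-marketing-vendor.invalid/pixel.js";  // placeholder
          tag.async = true;   // dynamically injected scripts are async by default; set it explicitly
          document.head.appendChild(tag);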

  • But I suppose we can’t expect Google to attack the real speed killer - the calling and loading of sometimes dozens of third-party trackers and advertisements on a web page.

    • png4life. i actually do literally rage when i drag an image from my browser to the desktop and it ends up being webp. was so sure that shit would have died by now :(
  • by Anonymous Coward

    https://flif.info/ [flif.info]

  • by Waccoon ( 1186667 ) on Friday October 05, 2018 @03:44PM (#57434108)

    WebP is another one of those things at Google that was thrown together as a test vehicle for a compression algorithm. It sucks as a format and comes in multiple inconsistent flavors (versions) that make it a PITA to support. Just getting the image dimensions requires a lot of low-level bit shifting and twiddling for no damn reason, and how that's done depends on how the chunks are organized (a rough sketch of what's involved follows this comment). It's a mess. It's no surprise to me that it didn't catch on. After trying to add support for it on my image board, I just gave up. Retrieving image information is too difficult; there are too many gotchas.

    Of course, Google would prefer that you just use their huge, complex WebP library, so you don't have to worry about how to unwind that horrible mess.
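
    To make the parent's complaint concrete, here is a rough Node.js sketch (no external library) of what "just getting the image dimensions" involves: the answer lives in a different place, with different bit packing, depending on whether the first chunk is VP8, VP8L or VP8X. It is only a sketch of the happy path; a real parser would also validate chunk sizes, handle files shorter than 30 bytes, and deal with animation frames.

        // webp-size.js -- read the pixel dimensions of a WebP file (illustrative, minimal checks).
        const { readFileSync } = require("fs");

        function webpDimensions(path) {
          const b = readFileSync(path);
          if (b.toString("ascii", 0, 4) !== "RIFF" || b.toString("ascii", 8, 12) !== "WEBP") {
            throw new Error("not a WebP file");
          }
          const fourcc = b.toString("ascii", 12, 16);   // first chunk starts right after the RIFF header
          if (fourcc === "VP8 ") {
            // Simple lossy: 14-bit width/height sit inside the VP8 frame header.
            return { width: b.readUInt16LE(26) & 0x3fff, height: b.readUInt16LE(28) & 0x3fff };
          }
          if (fourcc === "VP8L") {
            // Lossless: two 14-bit "minus one" fields packed after the 0x2f signature byte.
            const bits = b.readUInt32LE(21);
            return { width: (bits & 0x3fff) + 1, height: ((bits >>> 14) & 0x3fff) + 1 };
          }
          if (fourcc === "VP8X") {
            // Extended (alpha/animation/metadata): 24-bit little-endian canvas size, minus one.
            return { width: b.readUIntLE(24, 3) + 1, height: b.readUIntLE(27, 3) + 1 };
          }
          throw new Error("unrecognised WebP flavor: " + fourcc);
        }

        console.log(webpDimensions(process.argv[2]));   // e.g. node webp-size.js photo.webp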

  • by markdavis ( 642305 ) on Friday October 05, 2018 @04:43PM (#57434574)

    Please, Mozilla, if you are going to support WebP images, make absolutely sure Firefox complies with the image.animation_mode setting so that any WebP animation can be controlled or disabled by the user!
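
    For reference, the pref the parent mentions already governs animated GIF and APNG and can be set in about:config or a user.js file; the request is simply that animated WebP honor it too. The snippet below is a hypothetical user.js example of that existing pref, not anything WebP-specific.

        // user.js example of Firefox's existing animation pref:
        user_pref("image.animation_mode", "none");   // never animate images
        // other accepted values: "once" (play a single loop) and "normal" (the default)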

  • Many times the large file size isn't due to the format but to the fact that the person creating the image didn't bother to optimize the picture. It's not unusual to find an image where you can shave 90% off the size without compromising quality. Usually it's more like 30%-60%, but I have seen 90%. I used to optimize my images when I had my own site. I still do it now and again with a slow-loading site just to see what I can do (a rough sketch of that kind of re-encoding follows this thread).

    When you have all of these sites and applications that let anyone build

    • You're right, of course. Unfortunately, stupid and lazy is the default setting of the human race, hence the need to code up solutions that don't depend on humans actually thinking.
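
    As a concrete (and hedged) illustration of the kind of re-encoding the comment at the top of this thread describes, here is a short Node.js sketch using the third-party sharp package (npm install sharp) -- the library choice, file names and quality settings are all illustrative, and the actual savings depend entirely on how bloated the source image was, not on the percentages quoted above.

        // optimize.js -- re-encode an image and report the size difference (illustrative).
        const { statSync } = require("fs");
        const sharp = require("sharp");   // third-party dependency, chosen for this example

        async function optimize(input) {
          const original = statSync(input).size;

          // Re-encode as a reasonably compressed progressive JPEG...
          const jpeg = await sharp(input).jpeg({ quality: 80, progressive: true }).toFile("out.jpg");
          // ...and as WebP at a comparable quality setting.
          const webp = await sharp(input).webp({ quality: 80 }).toFile("out.webp");

          for (const [label, size] of [["original", original], ["jpeg q80", jpeg.size], ["webp q80", webp.size]]) {
            const saved = Math.round(100 - (100 * size) / original);
            console.log(`${label}: ${(size / 1024).toFixed(1)} KB (${saved}% smaller than the original)`);
          }
        }

        optimize(process.argv[2]);   // e.g. node optimize.js huge-unoptimized.png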
