
Google Releases Open Source 'Guetzli' JPEG Encoder (betanews.com)

BrianFagioli writes: Today, Google released yet another open source project. Called "Guetzli," it is a JPEG encoder that aims to produce even smaller image file sizes. In fact, the search giant claims a whopping 35 percent improvement over existing JPEG compression. If you are wondering why smaller file sizes are important, it is quite simple -- the web. If websites can embed smaller images, users can experience faster load times while using less data. While Google didn't aim to improve JPEG image quality with Guetzli, it seems it has arguably done so. It is subjective, but the search giant surveyed human beings and found they preferred Google's open source offering 75 percent of the time. Smaller file sizes and better image quality? Wow! Google has done something amazing here.


Comments:
  • by Anonymous Coward

    Who cares... most of the bloat in today's sites comes from the JavaScript blobs.

    • Last time I checked JavaScript was transferred in character format, not binary

      • Re: (Score:2, Insightful)

        by arth1 ( 260657 )

        Which makes it even bloatier.
        And lossy compression doesn't work too well on it either...

      • Re:BFD (Score:5, Informative)

        by tlhIngan ( 30335 ) <{slashdot} {at} {worf.net}> on Friday March 17, 2017 @02:19AM (#54056351)

        Last time I checked JavaScript was transferred in character format, not binary

        You can send it as binary -- use HTTP compression. I believe most web servers support it, as do web browsers, so the JavaScript file is compressed, the compressed binary is sent over the wire, and the web browser decompresses the blob back into the original JavaScript.
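A rough sketch of what that on-the-wire compression buys, using Python's stdlib `gzip` as a stand-in for the server's `Content-Encoding: gzip` step (the payload here is a made-up repetitive JavaScript-like string, so the ratio is illustrative, not typical):

```python
import gzip

# A repetitive JavaScript-like payload, standing in for a bundled script.
js = ("function add(a, b) { return a + b; }\n" * 500).encode("utf-8")

# What the server would send with Content-Encoding: gzip.
compressed = gzip.compress(js)

print(len(js), len(compressed))

# The browser's decompression step recovers the original script exactly.
assert gzip.decompress(compressed) == js
```

Highly repetitive text compresses extremely well; real minified bundles typically shrink by a factor of three to five rather than the extreme ratio this toy payload shows.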

    • That was my reaction as well. Instead of:

      If you are wondering why smaller file sizes are important, it is quite simple -- the web. If websites can embed smaller images, users can experience faster load times while using less data.

      the summary should read:

      If you are wondering why smaller file sizes are important, it is quite simple -- the web. If websites can embed smaller images, they can use the saved space to load five different Javascript frameworks at 5MB apiece.

  • ... until they're sued by Comcast, Cox, Verizon, T-Mobile, Sprint, AT&T, Disney and my Mom over lost profits from metered bandwidth fees. After all, it worked for the Oil industry for decades...

  • by Red_Chaos1 ( 95148 ) on Thursday March 16, 2017 @10:19PM (#54055481)

    By websites not having 20 tracking-pixel GIFs, 50 different ad servers, 5 different CDNs, 10 tracking servers, and a partridge in a pear tree. Websites are built fucking stupid these days; too much shit relies on too many other sources to work correctly, and if even one doesn't respond in a timely manner, the whole thing stalls.

    • by Anonymous Coward

      Red_Chaos1 wins the thread. Next topic, please.

    • by Anonymous Coward

      google pursues better video and image compression solely to benefit *them* (more compression = less storage space for them. more compression = more room on the page for the tracking and ad bullshit, autoplaying videos and other crap) and by purely coincidental extension, their advertising partners. they don't give a flying fuck about the end user.

    • Yup, we're done here.

    • I am the web developer and I approve this message.
    • Of course you are right. But squeezing files on a web server is always welcome. I routinely optimize all the JPEG and PNG images on my site:
      optipng -q *.png
      jhead -purejpg *.jpg
      If the new algorithm is better than the current ones, I'll use it.
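For readers curious what `jhead -purejpg` is actually doing, here is a simplified sketch of JPEG metadata stripping: walk the file's marker segments and drop the APP1..APP15 and COM blocks (EXIF data, comments) that aren't needed to render the image. This is a toy illustration, not a replacement for the real tool, which handles many more edge cases:

```python
def strip_jpeg_metadata(data: bytes) -> bytes:
    """Drop APP1..APP15 and COM segments, roughly what `jhead -purejpg` does."""
    assert data[:2] == b"\xff\xd8", "not a JPEG (missing SOI marker)"
    out = bytearray(b"\xff\xd8")
    i = 2
    while i < len(data):
        marker = data[i:i + 2]
        if marker == b"\xff\xda":              # SOS: entropy-coded data follows,
            out += data[i:]                     # copy the rest verbatim
            break
        length = int.from_bytes(data[i + 2:i + 4], "big")  # includes length bytes
        # Skip metadata segments (APP1..APP15, COM); keep everything else,
        # including APP0 (JFIF), which stays for compatibility.
        if not (b"\xff\xe1" <= marker <= b"\xff\xef" or marker == b"\xff\xfe"):
            out += data[i:i + 2 + length]
        i += 2 + length
    return bytes(out)

# Demo on a hand-built JPEG skeleton: SOI, APP0 (JFIF), APP1 (Exif), SOS, EOI.
fake = (b"\xff\xd8"
        + b"\xff\xe0\x00\x07JFIF\x00"
        + b"\xff\xe1\x00\x06Exif"
        + b"\xff\xda\x00\x02scan\xff\xd9")
stripped = strip_jpeg_metadata(fake)
assert b"Exif" not in stripped and b"JFIF" in stripped
```

Because these segments sit outside the compressed image data, removing them is lossless: the decoded pixels are byte-for-byte identical.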
  • about time google started to use middle-out compression techniques, but I feel bad for the little guys they stole it from.

  • Weak statistics (Score:4, Insightful)

    by manu0601 ( 2221348 ) on Thursday March 16, 2017 @11:30PM (#54055837)

    the search giant surveyed human beings and found they preferred Google's open source offering 75 percent of the time.

    How many human beings? It is in the research paper:

    23 raters participated in our experiment.

    Statistics based on 23 people seem rather weak.
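As a rough check on that sample size: a normal-approximation 95% confidence interval for the reported 75% preference rate with n = 23 (this assumes one independent judgment per rater, which the actual study design may not match):

```python
import math

n = 23     # raters reported in the paper
p = 0.75   # reported preference for Guetzli
z = 1.96   # ~95% confidence

# Standard error of a sample proportion, normal approximation.
se = math.sqrt(p * (1 - p) / n)
low, high = p - z * se, p + z * se

print(f"95% CI: {low:.2f} .. {high:.2f}")  # → 95% CI: 0.57 .. 0.93
```

An interval spanning roughly 0.57 to 0.93 is consistent with anything from a slight preference to an overwhelming one, which supports the "rather weak" complaint.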

  • I never thought I'd ever see the day that a large international corporation gives its product a Swiss-German name.

    • I'm trying to make a joke about Nestle here, but I'm afraid my brain is rotten after eating everything they produce.
  • by Anonymous Coward on Friday March 17, 2017 @03:53AM (#54056595)

    Just so you know

  • That's right. I'm that one asshole using OGG and annoying people with it.

    • I like OGG, but if I have the option I just FLAC it.

    • Yes, because Vorbis is a better encoder for the MP3 format, instead of a completely different format, to continue your analogy in the most logical fashion.
  • Not impressed (Score:4, Informative)

    by Waccoon ( 1186667 ) on Friday March 17, 2017 @08:26PM (#54063039)

    I just gave this thing a try on a Win7 x64 box with a Core i5 3770K @ 3.4 GHz, and the results are... interesting, but not in a good way.

    First, I tried to explicitly use a quality level of 78% to test it against some low-quality images, and the program immediately yelled at me that if I want to use a setting below 84%, I need to modify the source code and recompile. WTF? Do we really need idiot-proofing of command-line utilities, too? Second, I found out that setting a quality of 88% actually sets the quality much higher than 90%, resulting in huge, HQ files. I have no idea how they determine the quality from the command-line flags as I haven't looked at the source, but apparently this program is pretty buggy. Unsurprisingly, like most programs these days, Guetzli only utilizes one CPU core, so it will be slow, but at least it won't lock up your machine while it works.

    Anyway, compared to the default Photoshop "save for web" feature, this program produces files of about the same size, but takes about 200-300x as long (roughly 1.5 minutes for Guetzli compared to less than a second for Photoshop). All my test images were between 0.5 and 1.0 megapixels, and consisted of gradient-shaded cartoons and a few shaded pencil drawings, which normally show horrible artifacts and are difficult to compress well. For the images I used at 90% quality, there's apparently no real advantage. I couldn't get the quality settings in Guetzli to work correctly, so I did the runs with Google's tool first, then made comparable images with Photoshop to match the quality. File size differences were less than 10%. I did find that Guetzli prioritizes chroma over luminance, so strong colors have fewer artifacts, but base colors and B&W patches have more artifacts and are blurrier. The net quality is about the same overall, so this tool is disappointing. If you're going to use this utility, it's best reserved for highly saturated pictures, but overall I didn't see any gains in compression.

    If you're looking for serious gains in compression, you're better off using PNGOut by Ken Silverman to crush PNG files. It's usually not worth trying to get more compression out of JPEG files over a utility like Photoshop, Irfanview, or ImageMagick. Even JPEGTran never gave me any significant gains over Photoshop's JPEG routines.

