Google Releases Open Source 'Guetzli' JPEG Encoder (betanews.com) 83
BrianFagioli writes: Today, Google released yet another open source project. Called "Guetzli," it is a JPEG encoder that aims to produce even smaller image file sizes. In fact, the search giant claims a whopping 35 percent improvement over existing JPEG compression. If you are wondering why smaller file sizes are important, it is quite simple -- the web. If websites can embed smaller images, users can experience faster load times while using less data. While Google didn't aim to improve JPEG image quality with Guetzli, it seems it has arguably done so. It is subjective, but the search giant surveyed human beings and found they preferred Google's open source offering 75 percent of the time. Smaller file sizes and better image quality? Wow! Google has done something amazing here.
BFD (Score:1)
who cares.. most of the bloat in today's sites comes from the javascript blobs..
Re: (Score:2)
Last time I checked, JavaScript was transferred as text, not binary
Re: (Score:2, Insightful)
Which makes it even bloatier.
And lossy compression doesn't work too well on it either...
Minification, not NoScript (Score:2)
Lossy compression of javascript works fukin awesome.
If you're referring to minification, I'm inclined to agree. It loses variable names and internal comments, relegating them to a source map: a table mapping the minified JavaScript back to its original source code [gnu.org].
Its called NoScript. Loses it entirely!
Installing that and then not whitelisting any sites makes interaction with web applications require a full page reload for each update. Good luck getting web-based chat to refresh in a timely manner or making web-based image editing programs respond quickly without script. I assume that web users who find full page reloads...
Re:BFD (Score:5, Informative)
You can send it as binary -- use HTTP compression. I believe most web servers and web browsers support it: the JavaScript file is compressed, the compressed binary is sent over the wire, and the web browser decompresses the blob back to the original JavaScript.
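A quick way to see this from the command line (the URL is a placeholder; the curl flags are standard):

curl -s -o /dev/null -D - -H "Accept-Encoding: gzip" https://example.com/app.js
# A "Content-Encoding: gzip" header in the response means the script crossed the wire compressed.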
Re: (Score:2)
That's not a JavaScript blob, that's a gzip or deflate stream.
Re: (Score:2)
That was my reaction as well. Instead of:
If you are wondering why smaller file sizes are important, it is quite simple -- the web. If websites can embed smaller images, users can experience faster load times while using less data.
the summary should read:
If you are wondering why smaller file sizes are important, it is quite simple -- the web. If websites can embed smaller images, they can use the saved space to load five different Javascript frameworks at 5MB apiece.
Re:Will probably also be useful for video keyframe (Score:5, Interesting)
Until you read the bit about it needing 300 MB of RAM per megapixel of input data -- a single 12-megapixel photo would take roughly 3.5 GB just to encode.
I don't think we'll see any hardware encoders able to implement the algorithm any time soon.
And certainly not running in real time on a CPU.
Re: (Score:1)
Bullshit. MJPEG is used in almost all security cameras running at 30 or 60 fps in real-time.
Re: (Score:2)
There is little or no value to a full I frame anymore. These days, to handle better bitrate allocation within a video stream, it's better to avoid burstiness by spreading I macroblocks across more frames. Not only that, but I frames only provided good quality based on a timer, not based on when it was needed. So, for example, a blinking traffic light might have had to be encoded as B blocks, which are actually quite inefficient when handling major changes and also require a great...
Keyframes are for seeking (Score:2)
There is little or no value to a full I frame anymore.
Adding more intra-frames reduces the time that a throbber is displayed when the user has chosen to seek to a different point in a recorded video or join a live stream in progress. And as you pointed out, "scrubbing" is the upper limit of seeking speed.
I frames only provided good quality based on a timer, not based on when it was needed.
That depends on the video encoder. Older real-time MPEG-1 and MPEG-2 encoders had a rigid "group of pictures" (GOP) structure that inserted I frames on a timer. Apple's HTTP Live Streaming breaks a video into a playlist of four-second segments, each of which needs to start with a keyframe so a player can join or seek at any segment boundary.
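For what it's worth, a sketch of lining keyframes up with segment boundaries using ffmpeg (file names are placeholders; flags per ffmpeg's docs, and behavior can vary by version):

ffmpeg -i input.mp4 -c:v libx264 -force_key_frames "expr:gte(t,n_forced*4)" -f hls -hls_time 4 out.m3u8
# Forces a keyframe every 4 seconds so each 4-second HLS segment starts with one, keeping joins and seeks fast.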
Just you wait... (Score:1)
... until they're sued by Comcast, Cox, Verizon, T-Mobile, Sprint, AT&T, Disney and my Mom over lost profits from metered bandwidth fees. After all, it worked for the Oil industry for decades...
Re: (Score:3)
I can't provide a why or why not other than "does it really matter?"
Know how else users can get faster load times? (Score:5, Insightful)
By websites not having 20 tracking-pixel GIFs, 50 different ad servers, 5 different CDNs, 10 tracking servers, and a partridge in a pear tree. Websites are built fucking stupid these days: too much shit relies on too many other sources to work correctly, and if even one doesn't respond in a timely manner, the whole thing stalls.
Re: (Score:1)
Red_Chaos1 wins the thread. Next topic, please.
Re: (Score:1)
google pursues better video and image compression solely to benefit *them* (more compression = less storage space for them. more compression = more room on the page for the tracking and ad bullshit, autoplaying videos and other crap) and by purely coincidental extension, their advertising partners. they don't give a flying fuck about the end user.
Re: (Score:2)
Yup, we're done here.
But what is a distinct site? (Score:2)
A page should not contact more than 5, or some count x, of other sites.
By "other sites" do you mean other origins, where an origin is a (protocol, hostname, port) tuple, or other registrable domains as defined by the Public Suffix List? For connection overhead purposes, you want origins, but for privacy purposes, you want registrable domains.
Re: (Score:2)
Depending on the source used, it may also have a more recent version of the JS library than might be otherwise used, as some of the CDNs that maintain the libraries automatically update to the latest compatible version.
Re: (Score:2)
optipng -q *.png
jhead -purejpg *.jpg
If the new algorithm is better than the current one, I'll use it.
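If it does turn out better, a drop-in for the same batch workflow might look like this (hypothetical file layout; guetzli takes an input file and an output file):

for f in *.jpg; do guetzli "$f" "${f%.jpg}.guetzli.jpg"; done
# Writes a Guetzli-recompressed copy alongside each original for comparison.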
Re: survey (Score:5, Informative)
I don't know what the survey compared it to. But I did a quick test with the same result.
Source: https://en.wikipedia.org/wiki/... [wikipedia.org] (a resized version was used)
Input Image: ../zayapa.jpg JPEG 1608x949 1608x949+0+0 8-bit sRGB 1.097MB 0.000u 0:00.000
Output Image: zayapa.jpg[1] JPEG 1608x949 1608x949+0+0 8-bit sRGB 641KB 0.000u 0:00.000
Only "gotcha" is:
$ time guetzli ../zayapa.jpg zayapa.jpg
real 5m39.725s
user 5m35.340s
sys 0m4.188s
It is a relatively low-end Pentium processor, but still, over 5 minutes to compress a 1 MB image is far too long.
Re: (Score:2)
It is a relatively low-end Pentium processor, but still over 5 minutes to compress a 1MB image is too high.
Is it? It only has to be compressed once and put on a website, where it will be served to a million browsers and decompressed a million times. Does decompression take any longer?
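It shouldn't: Guetzli's output is a standard JPEG, so any stock decoder reads it at normal speed. A quick sanity check (djpeg ships with libjpeg/libjpeg-turbo; file name from the test above):

time djpeg -outfile /dev/null zayapa.jpg
# Decodes the Guetzli-produced file and discards the pixels; only the timing is of interest.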
finally (Score:2)
about time google started to use middle-out compression techniques, but I feel bad for the little guys they stole it from.
Weak statistics (Score:4, Insightful)
the search giant surveyed human beings and found they preferred Google's open source offering 75 percent of the time.
How many human beings? It's in the research paper:
23 raters participated in our experiment.
Statistics based on 23 people seem rather weak.
Re:Weak statistics (Score:5, Informative)
At a 95% confidence level, a sample of 23 will give you a range of +/-18% on that 75% result (57%-93%). At 99% confidence level, the range is +/-23%. We can be pretty confident that more will prefer the Google result than not.
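A back-of-the-envelope check of those margins (normal-approximation interval; python3 is just serving as a calculator here):

python3 -c "import math; p,n=0.75,23; se=math.sqrt(p*(1-p)/n); print(1.96*se, 2.576*se)"
# Prints ~0.177 and ~0.233, i.e. the +/-18% and +/-23% margins quoted above.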
Guetzli? (Score:2)
I never thought I'd see the day a large international corporation gave its product a Swiss German name.
Guetzli means Cookie in Swiss German (Score:5, Informative)
Just so you know
Re: (Score:2)
I'm not a fan of lossy image compression anyway. It can accurately be described as "a way to damage images."
What license is required for encoding BPG? (Score:2)
There are no longer any requirements for license or royalties for the use of the HEVC components for most cases of decoding BPG images.
Citation? And what license is required for encoding, such as a server that makes BPG thumbnails of uploaded JPEG or PNG images?
Re: (Score:2)
If x265 implements the same encoding methods as HEVC, which it probably does if it creates a compatible bitstream, it's covered by the U.S. patents that cover HEVC. That's the whole concept of "essential patents".
It's like the OGG/Vorbis of JPEG (Score:2)
That's right. I'm that one asshole using OGG and annoying people with it.
Re: (Score:2)
I like OGG, but if I have the option I just FLAC it.
Not impressed (Score:4, Informative)
I just gave this thing a try on a Win7 x64 box on a Core i5 3770K @ 3.4 GHz, and the results are... interesting, but not in a good way.
First, I tried to explicitly use a quality level of 78% to test it against some low-quality images, and the program immediately yelled at me that if I want to use a setting below 84%, I need to modify the source code and recompile. WTF? Do we really need idiot-proofing of command-line utilities, too? Second, I found out that setting a quality of 88% actually sets the quality much higher than 90%, resulting in huge, HQ files. I have no idea how they determine the quality from the command-line flags as I haven't looked at the source, but apparently this program is pretty buggy. Unsurprisingly, like most programs these days, Guetzli only utilizes one CPU core, so it will be slow, but at least it won't lock up your machine while it works.
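For reference, the invocation in question (the --quality flag comes from Guetzli's own usage text; the floor of 84 matched its README at the time):

guetzli --quality 88 input.jpg output.jpg
# Anything below --quality 84 is rejected at runtime; lowering the floor means patching the source and recompiling.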
Anyway, compared to the default Photoshop "save for web" feature, this program makes files of about the same size but takes about 200-300x as long (roughly 1.5 minutes for Guetzli compared to less than a second for Photoshop). All my test images were between 0.5 and 1.0 megapixel and consisted of gradient-shaded cartoons and a few shaded pencil drawings, which normally show horrible artifacts and are difficult to compress well.
For the images I used at 90% quality, there's apparently no real advantage. I couldn't get the quality settings in Guetzli to work correctly, so I did the runs with Google's tool first, then made comparable images with Photoshop to match the quality. File size differences were less than 10%. I did find that Guetzli prioritizes chroma over luminance, so strong colors have fewer artifacts, but base colors and B&W patches have more artifacts and are blurrier. The net quality is about the same overall, so this tool is disappointing. If you're going to use this utility, it's best reserved for highly saturated pictures, but overall I didn't see any gains in compression.
If you're looking for serious gains in compression, you're better off using PNGOut by Ken Silverman to crush PNG files. It's usually not worth trying to squeeze more compression out of JPEG files than a utility like Photoshop, IrfanView, or ImageMagick already gets; even jpegtran never gave me any significant gains over Photoshop's JPEG routines.
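For the curious, typical invocations of the tools mentioned (flags per each tool's docs; options vary by version):

pngout image.png
# Ken Silverman's PNGOut; losslessly recompresses the PNG (pass a second file name to keep the original untouched).
jpegtran -optimize -progressive -copy none photo.jpg > photo_opt.jpg
# Lossless Huffman re-optimization; -copy none also strips metadata.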