Google Releases New Image Format Called WebP

An anonymous reader writes "Google has released WebP, a lossy image format based on the image encoding used by VP8 (the video codec used in Google's WebM video format) to compress keyframes. According to the FAQ, WebP achieves an average 39% more compression than JPEG and JPEG 2000 while maintaining image quality. A gallery on the WebP homepage has a selection of images which compare the original JPEG image with the WebP encoded image shown as a PNG. There's no information available yet on which browsers will support the WebP image format, but I imagine it will be all the browsers which currently have native WebM support — Firefox, Chrome, and Opera." Independent analysis of WebP is available from a few different sources.
This discussion has been archived. No new comments can be posted.

  • Great. So? (Score:5, Insightful)

    by ceoyoyo ( 59147 ) on Friday October 01, 2010 @09:22AM (#33757866)

    JPEG was cutting edge a couple of decades ago but it's not very hard to beat now. We still use it because everything supports it and it's good enough.

  • True... (Score:3, Insightful)

    by recoiledsnake ( 879048 ) on Friday October 01, 2010 @09:30AM (#33757930)

    From the x264 link:

    What a blur! Only somewhat better than VP8, and still worse than JPEG. And that’s using the same encoder and the same level of analysis — the only thing done differently is dropping the psy optimizations. Thus we come back to the conclusion I’ve made over and over on this blog — the encoder matters more than the video format, and good psy optimizations are more important than anything else for compression. libvpx, a much more powerful encoder than ffmpeg’s jpeg encoder, loses because it tries too hard to optimize for PSNR.

    These results raise an obvious question — is Google nuts? I could understand the push for “WebP” if it was better than JPEG. And sure, technically as a file format it is, and an encoder could be made for it that’s better than JPEG. But note the word “could”. Why announce it now when libvpx is still such an awful encoder? You’d have to be nuts to try to replace JPEG with this blurry mess as-is. Now, I don’t expect libvpx to be able to compete with x264, the best encoder in the world — but surely it should be able to beat an image format released in 1992?

    Earth to Google: make the encoder good first, then promote it as better than the alternatives. The reverse doesn’t work quite as well.

  • Re:Not as Sharp (Score:5, Insightful)

    by m2pc ( 546641 ) on Friday October 01, 2010 @09:31AM (#33757938)
    I disagree. Look at images #3 and #4. The WebP versions are clearly sharper and more detailed than their JPEG counterparts. Other than that, the rest of the images are so close it's difficult to tell which is better. For a 39% size reduction, I think WebP has a clear advantage over JPEG. Some questions remaining are a) will companies actually adopt WebP and popularize it, or will it die a quiet death, and b) how CPU and memory-intensive is the algorithm to implement compared to JPEG, especially in mobile devices with limited resources and CPU power?
  • What a load a crap (Score:5, Insightful)

    by suso ( 153703 ) * on Friday October 01, 2010 @09:34AM (#33757976) Journal

    Most of the formats in general use are over a decade old, and the company says that they're consistently responsible for most of the latency users experience

    BULL SHIT! Images are nothing anymore. It's poor JavaScript coding, Flash ads, and all the dependent site components that are responsible for most of the experienced latency now. Images don't mean squat unless you're still on a 28.8 modem.

    Also, one way you can make JPEG images smaller is to set the quality value to 75 or 80; most people won't notice the difference, and the size of your image will shrink dramatically. The problem today is that people leave their images at full quality straight off the camera and upload a 2MB image file when it really only needs to be 138KB. WebP won't fix that user mistake.
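    To the parent's point about quality settings, here's a quick sketch using Pillow. The image data is generated in-memory as a stand-in for a real camera photo, and the exact byte counts are illustrative, not measurements:

    ```python
    from io import BytesIO
    from PIL import Image

    # Stand-in for a camera photo: a noisy 800x600 greyscale image
    # converted to RGB (hypothetical data; any real photo would do).
    img = Image.effect_noise((800, 600), sigma=40).convert("RGB")

    buf_q95, buf_q80 = BytesIO(), BytesIO()
    img.save(buf_q95, "JPEG", quality=95)  # roughly "straight off the camera"
    img.save(buf_q80, "JPEG", quality=80)  # visually near-identical, far smaller

    size_q95 = len(buf_q95.getvalue())
    size_q80 = len(buf_q80.getvalue())
    print(f"quality 95: {size_q95} bytes, quality 80: {size_q80} bytes")
    ```

    Re-saving at quality 80 typically cuts the file size substantially with little visible loss, exactly the "user mistake" fix the parent describes.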

  • Re:Well... (Score:4, Insightful)

    by Anonymous Coward on Friday October 01, 2010 @09:36AM (#33757998)

    There's always need for more compression. It all adds up. One loser at home isn't going to care, whereas an ISP with 20 million users might. Users might care when we eventually switch over to being billed by the byte, or when they're stuck on slow connections like cellular networks.

  • by Aphoxema ( 1088507 ) on Friday October 01, 2010 @09:37AM (#33758004) Journal

    Something suspiciously absent is any mention of the license. I don't think it is necessary for me to describe why that's a problem.

  • Re:Well... (Score:5, Insightful)

    by Cornelius the Great ( 555189 ) on Friday October 01, 2010 @09:45AM (#33758052)
    Memory is a concern, especially on embedded devices. Plus, many mobile devices have built-in hardware encoding/decoding support for JPEG to minimize CPU and memory usage.

    PNG is a great format, but we don't need lossless for most pictures on the net.

    Rather than replacing everything with PNG, the web needs a lossy image format with alpha support and the ability to fall back to lossless when needed. Oddly enough, (currently) WebP does neither...
  • by Aphoxema ( 1088507 ) on Friday October 01, 2010 @09:53AM (#33758130) Journal

    Yeah, considering that this is /. I'm surprised more people aren't asking about that right away.

  • by Moraelin ( 679338 ) on Friday October 01, 2010 @09:56AM (#33758162) Journal

    Did you look at the full-size images offered for download? Because the ones on the page are scaled down, and any artefacts will be inherently "antialiased" out. But when you look at them at 1:1 zoom and flip between the two, it's not hard to notice small differences. E.g., the wood texture in picture 4 is notably different, and the chairs in picture 9 look blurrier, IMHO.

  • by glatiak ( 617813 ) on Friday October 01, 2010 @09:59AM (#33758182)
    I can only read this with horror -- yet another lossy image format to burden everyone. When I set up a media management system, the number of different formats I need to accommodate already makes my head hurt. We are all dancing around the black hole that says the ultimate lossy compression can be achieved by writing to the null device. I guess that is the problem of software -- since it is intangible, one can claim "better" by making it different (and incompatible). One sees few cars on the road with five wheels -- that standardized pretty quickly, and a long time ago. And I guess everyone likes keeping it art rather than science. Means 'engineer' is just a courtesy.
  • by Chrisq ( 894406 ) on Friday October 01, 2010 @10:01AM (#33758202)
    If it is going to be used instead of JPEG, they are going to have to include EXIF data [wikipedia.org]. I am not clear whether you currently can; evidently some RIFF-based formats can, but I am not sure whether just using RIFF enables this.
  • Re:Not as Sharp (Score:5, Insightful)

    by DarkIye ( 875062 ) on Friday October 01, 2010 @10:12AM (#33758356) Journal
    Eh? The pixels you refer to are only slightly darker, not black.

    I'm very impressed with WebP overall. The images are sharper and have better colour tones, and obviously lack those awful JPEG colour smudges. The resolutions are unimportant; the important thing is that WebP produces images at the same dimensions with similar or superior quality, at significantly smaller file sizes.

    I'd just like this opportunity to say "fuck the shitty Slashdot comments system". Try and guess which of the myriad reasons is causing me to complain this time!

  • by RingBus ( 1912660 ) on Friday October 01, 2010 @10:20AM (#33758510)

    No, the OP doesn't have a clue and is just ranting.

    This isn't about site loading speeds. WebP is about wasted bandwidth. Serving WebP versions of images is going to be a huge win in sites' bandwidth costs, for virtually identical results in the end user's browser.

  • by ooooli ( 1496283 ) on Friday October 01, 2010 @10:30AM (#33758636)

    They're comparing WebP to JPEG by testing how well both algorithms can recompress (a set of almost entirely) JPEG images? Really? Really???
    More to the point, JPEG compression artifacts (discontinuities) are a *nightmare* for wavelet coders, so this is in no way fair to JPEG 2000.

    Also, from TFA:

    Predictive coding uses the values in neighboring blocks of pixels to predict the values in a block, and then encodes only the difference (residual) between the actual values and the prediction. The residuals typically contain many zero values, which can be compressed much more effectively.

    WTF, this is exactly what a wavelet coder like JPEG 2000 does, only in a much more sophisticated way.

    This whole thing is so far below any accepted standard of image compression research, it should just be silently ignored.
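    For the record, the predictive coding quoted from TFA can be illustrated with a toy, stdlib-only sketch. This uses a left-neighbour predictor on a single row; real codecs predict whole blocks from several neighbours, so the numbers here are purely illustrative:

    ```python
    from itertools import accumulate

    # Toy version of predictive coding: predict each pixel from its left
    # neighbour and keep only the residual (difference from the prediction).
    row = [100, 101, 101, 102, 104, 104, 103, 103]  # one smooth image row
    residual = [row[0]] + [b - a for a, b in zip(row, row[1:])]

    # Residuals of smooth content cluster near zero, which entropy coders
    # compress far better than the raw values. The decoder recovers the
    # row exactly with a running sum.
    reconstructed = list(accumulate(residual))
    ```

    The point of contention above is not whether this works (it does, and losslessly at this stage), but that wavelet coders achieve the same decorrelation more elegantly.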

  • by Anonymous Coward on Friday October 01, 2010 @10:32AM (#33758678)

    So that only leaves one option: cui bono? Could Google want to reduce bandwidth just for its own benefit?

  • Solution: (Score:4, Insightful)

    by thijsh ( 910751 ) on Friday October 01, 2010 @10:48AM (#33758888) Journal
    Good point. A real addition that would help mitigate uselessly big photos would be an image format that contains a thumbnail, small version, larger version, and huge version of the photo in progressive order, and only downloads the parts needed to display at the on-screen size. JPEG and GIF supported progressive images; with WebP they could enhance this to have real images within boundaries, clearly segmented into chunks... So when a user uploads a 16MP photo and the website displays it at 320x240, you only download the first couple of chunks; if you zoom in, the browser downloads the rest of the same file. When launching a new format they have a chance to create something a little revolutionary, since the work to add the code to all browsers needs to happen anyway.

    Multiple chunks in progressive sizes will get rid of all the extra thumbnail and small version files that need to be created, stored and downloaded. For example searching an image on Google image search shows:
    - 125x125 thumbnail in results
    - 250x250 zoomin thumbnail over results
    - 550x550 preview over webpage (scaled version of full image)
    - 16MP image when downloading

    If, for example, you don't like the preview image and don't want to save it, you will still have downloaded several MiB... very wasteful, and my cache is littered with several thumbnails per image.
    With the progressive chunked version you would only have downloaded the first few percent of the image, up to the point where 'chunk_pixels > viewport_pixels'.

    Some other advantages:
    - VP8 is a video codec, so you can predict parts of the larger chunks based on the small chunks before that (basically a gradually focusing video). It may require some specific optimizations but should not increase the total size by a lot (so thumbnails are a free bonus).
    - The images are displayed faster while loading, and not top-down but gradually sharper (the old advantage of progressive encoding, but fuck those JPEGS were ugly).
    - You can display a photo at a low resolution on the webpage but still get sharp high resolution prints without wasting bandwidth of all users just viewing the page.
    - This will make it easier for browsers to scale down large images smoothly (try viewing a 50MP image, no browser scales that smoothly) without requiring massive amounts of CPU.
    - Reduce bandwidth, storage and caching requirement for websites and for clients.

    So Google if you want to save bandwidth: make a format that stores large images in progressive chunks so browsers only need to download as much of the image as is needed to display the current size on screen.
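    The selection logic above is simple enough to sketch. The chunk sizes and byte offsets here are hypothetical, just to show the 'chunk_pixels > viewport_pixels' rule in action:

    ```python
    # Hypothetical progressive-chunk layout: (pixel count of that refinement
    # level, byte offset where it ends in the file). Not any real WebP
    # structure; purely illustrative numbers.
    CHUNKS = [
        (125 * 125, 18_000),
        (250 * 250, 60_000),
        (550 * 550, 240_000),
        (4000 * 4000, 5_000_000),  # full 16 MP image
    ]

    def bytes_needed(viewport_w: int, viewport_h: int) -> int:
        """Return how many bytes to fetch: up to the end of the smallest
        chunk whose pixel count covers the viewport."""
        needed = viewport_w * viewport_h
        for pixels, end_offset in CHUNKS:
            if pixels >= needed:
                return end_offset
        return CHUNKS[-1][1]  # viewport larger than the image: fetch it all
    ```

    A browser would issue an HTTP range request for the first `bytes_needed(w, h)` bytes and decode only those refinement levels, fetching more later if the user zooms in.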
  • Re:Rendering Speed (Score:1, Insightful)

    by Anonymous Coward on Friday October 01, 2010 @11:02AM (#33759134)

    There are no actual WebP formatted pictures there. Those are all standard PNG images.

  • Re:Not as Sharp (Score:1, Insightful)

    by Anonymous Coward on Friday October 01, 2010 @11:12AM (#33759278)
    Well, two reasons off the top of my head: Google maintains image search, and they also have their own infrastructure on the backbone that shifts a lot of data. Reducing a 600k image by 37% might not seem like a world-changing feat to us, but if Google could make this the de facto standard it could give them some real benefits. It's similar to the way Google hosts JavaScript libraries: you might ask why, since it costs them space and bandwidth, but the caching aspect probably saves them a ton of bandwidth/indexing.
  • Re:Rendering Speed (Score:1, Insightful)

    by Anonymous Coward on Friday October 01, 2010 @11:13AM (#33759296)

    Your computer is not rendering WebP, it is rendering a WebP image exported losslessly to a PNG. So you are just rendering a PNG.

  • by jandrese ( 485 ) <kensama@vt.edu> on Friday October 01, 2010 @11:19AM (#33759374) Homepage Journal
    The assumption with JPEG2000 is that it's going to be torpedoed by some jerk with a patent if it ever takes off. That's why nobody is willing to invest too heavily in it. They were already burned by GIF, they learned their lesson the first time.
  • Re:Not as Sharp (Score:3, Insightful)

    by clone53421 ( 1310749 ) on Friday October 01, 2010 @12:58PM (#33761234) Journal

    Somebody moderated this overrated. I’m not sure why.

    Maybe they thought I uploaded a black picture to be cute. I didn’t, but you’d probably have to tilt your LCD to notice that there’s anything there. Here... I’ll kick the levels way up so you can see the difference.
    http://ompldr.org/vNXAyZw/webp_vs_jpeg_enhanced.png [ompldr.org]

  • Re:Not as Sharp (Score:4, Insightful)

    by fractoid ( 1076465 ) on Friday October 01, 2010 @01:18PM (#33761672) Homepage
    Yeah, I realised that just after posting. *sigh* My bad.

    So I downloaded them and I'm flipping between the two images. I agree that the difference is somewhere between bugger all and diddly squat.

    For the preview images on the page, I still maintain that presenting the two images side by side as they do is misleading: they call it "JPEG vs. WebP" when in actual fact it's "JPEG vs. PNG", since they seem to have compressed the right-hand side with WebP at full resolution, then scaled it down and saved it as PNG, most likely hiding any artifacts.
