
Lucas Digital Releases OpenEXR Format 171

Posted by chrisd
from the no-oscars-till-2005 dept.
frankie writes "Although George Lucas may have gone over to the dark side, at least some of his staff prefer Freedom and light. ILM has released OpenEXR, a graphics file format and related utilities, under a BSD-style license. Among other things, it supports the same 16 bit format used by Nvidia CG and the Geforce FX. OpenEXR runs on Linux, Jaguar, and Irix; other platforms are likely to work with a little help from the community."
  • by gpinzone (531794) on Wednesday January 22, 2003 @05:09PM (#5138233) Homepage Journal
    Jar Jar in my own home! Thanks Lucasfilm!
  • by goodwid (102323)
    Always nice to see more platforms getting development for graphics..
  • Now everybody will be able to create Jar-Jar Binks models and insert them into substandard movies!
  • by Anonymous Coward on Wednesday January 22, 2003 @05:12PM (#5138262)
    it's [], not 'www.openexr'. Sigh.
  • by stratjakt (596332) on Wednesday January 22, 2003 @05:14PM (#5138286) Journal
    And I would doubt he played any role whatsoever in the decision.

    But it's great that now we can all remaster his original films and add our own awkward, out-of-place looking robots, aliens and spaceships.

    I'll have Jar Jar and Indiana Jones doing the hoochie-coo on the roof of a car in American Graffiti.
    • Re:ILM isn't Lucas (Score:5, Informative)

      by dhess (65762) on Wednesday January 22, 2003 @05:23PM (#5138342)
      Actually, he did, since this is the first time that ILM or any other Lucas Digital company has released source code for free.

      It was a group of developers who first floated the idea, but ultimately it was George's call whether or not to do it, and he gave the OK, which is pretty cool, I think.

      • Re:ILM isn't Lucas (Score:3, Informative)

        by _ph1ux_ (216706)
        I believe the parent is Drew Hess - who packaged and maintains the source for OpenEXR.

        Hello Drew, and thanks for all the fish.
      • Lucas controls all (Score:3, Interesting)

        by McSpew (316871)

        I saw a rare interview/profile of Lucas just before AOTC was released, and it pointed out that Lucas is intimately involved in the important decisions for all of his businesses (and he has lots of them). While he might allow small decisions to be made by subordinates, Lucas pretty much micromanages his empire. You can't argue with his management style, because it's clearly worked for him. Come to think of it, I wonder if the folks at Pixar would have preferred to stay with Lucas vs. going to work for Steve "Reality Distortion Field" Jobs.

        • That is just perception, mostly, I guess. Of course he had to make that decision; it's a pretty big one, with not only legal repercussions but economic and business ones as well. After all, this can be seen as ILM giving away something that gave them a competitive advantage over others in the field.

          But his empire is huge; I doubt he micromanages anything, with the possible exception of Lucasfilm. I went to VES last year and it was funny: this guy at the opening reception kept pestering Lucas about this software, and he said he didn't know about it and didn't handle that, and referred him to Jim Morris. Sometimes it really seems he doesn't know all the details. In an interview (Millimeter, American Cinematographer, or something like that), he was asked if he was aware of the headaches the ILM model shop had because of using the HD cameras, and he wasn't, even though the previous article had detailed all that. Actually he seems pretty laid back since he has to take care of his kids; I guess he lets everyone else worry about everything ;-).
    • Yeah, in the same sense that MS isn't Bill Gates...he isn't even CEO anymore.

      ProfQuotes []
  • by valisk (622262)
    Looks like we could well see a nice improvement in editing software for all those with DV cams in the near future.
    Thank you very much ILM

  • OpenExr []
  • Well, the Jaguar *was* a 64-bit console...
  • by jj_johny (626460) on Wednesday January 22, 2003 @05:20PM (#5138320)
    Get it folks. They designed a format and have some tools but have decided that they want to tap into the great pool of OSS talent. Who says this is not a dark side ploy?

    If all goes as planned all the great OSS software will be written to output this format in no time.

  • Not for Windows? (Score:3, Interesting)

    by Anonvmous Coward (589068) on Wednesday January 22, 2003 @05:26PM (#5138377)
    That's a bummer. Lightwave loves HDRI imagery.

    Out of curiosity, has anybody used HDRI images for textures? I'm curious if the floating point data makes a difference. I could see it being particularly useful for the diffuse and luminosity channels. What about color?
    • Re:Not for Windows? (Score:3, Interesting)

      by dhess (65762)
      I'm sure somebody will port it to Windows in time. The libraries themselves are pretty vanilla code, so it should be easy to port. We don't really use Windows for effects work here so it hasn't been a priority for us.

      • "I'm sure somebody will port it to Windows in time. The libraries themselves are pretty vanilla code, so it should be easy to port. We don't really use Windows for effects work here so it hasn't been a priority for us."

        May I ask where you're at? I'm sorry, I don't mean to pry. I'm just curious because I want to work at an FX studio in the next year or so. Any tidbit of info I can get about how things work in a place like that are extremely valuable to me.

        Question: Do you use a tool there like Photoshop (Gimp maybe?) that works in that format? I'm very interested in 'painting' HDRI textures.
    • Not off topic. (Score:5, Interesting)

      by Anonvmous Coward (589068) on Wednesday January 22, 2003 @06:30PM (#5138784)
      "Out of curiosity, has anybody used HDRI images for textures? I'm curious if the floating point data makes a difference. I could see it being particularly useful for the diffuse and luminosity channels. What about color?"

      Okay, somebody modded me down as 'Off-Topic'. I'm just going to assume he/she/but probably he didn't understand what I was talking about here.

      OpenEXR is a format for High Dynamic Range Imagery. What this essentially means is that instead of describing a pixel by having 3 channels @ 8-bits per channel (which has a maximum value of 255), you get a floating point 16-bit value per channel which is a measure of intensity. The result? Instead of having just color data there, you have color data & intensity data. The sky's blue, right? If you take a 24-bit picture of the sky, you get blue pixels. Is that enough data? No. Try looking up at the sky without squinting your eyes. Can't do it, can ya? The sky is *very* bright. With the HDRI format, you can store that luminosity as well as the color. That's why they use it for global illumination. You're capturing light sources, intensities, and color at the same time.

      Thing is though, a floating point format has uses in other areas of 3D such as texture mapping. It means you can create/capture textures that deal in intensity as well (just like real life), thus you get a much more realistic response from lights in the scene.

      I have no idea if I'm making any sense here or not, but the main point I'm trying to make is that I am nowhere near off-topic. That's the reason this format is interesting. It's not another .PNG or .JPG, it's a more accurate way of storing information about light, and those of us who work in 3D have a lot to be excited about. Since it's only recently become supported by the major 3D apps out there, its capabilities are still in their infancy, and I'm curious what people have discovered about it.
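To make the parent's point concrete, here is a tiny sketch (my own toy code, nothing to do with the actual OpenEXR library) of why a float channel beats an 8-bit one: once the sky's brightness has been clipped to 255, no later exposure adjustment can recover the detail.

```python
# Toy illustration of HDR vs. 8-bit storage (not the OpenEXR API).
# An 8-bit channel clips everything above its white point; a float
# channel keeps the true intensity, so a bright sky survives a
# later exposure change.

def store_8bit(intensity):
    """Quantize a linear intensity (1.0 = display white) into 8 bits."""
    return min(255, round(intensity * 255))

def store_float(intensity):
    """An HDR channel just keeps the value, even above 1.0."""
    return float(intensity)

sky = 7.5                        # a sky ~7.5x brighter than display white
ldr = store_8bit(sky)            # clips to 255, i.e. plain white
hdr = store_float(sky)

# Stop the exposure down by 3 stops (divide by 8):
ldr_exposed = (ldr / 255) / 8    # 0.125 -- just dimmed white, detail gone
hdr_exposed = hdr / 8            # 0.9375 -- the real brightness re-emerges
```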
      • It allows gamma correction on steroids, as seen in their sample section - they clearly emphasize the many-stops gamma adjustments they can achieve, especially in the very bright or dark areas of the picture.
      • How is having a 16-bit floating point value per channel different from having a 16-bit integer per channel? You can't represent more data with a 16-bit floating point number than you can with a 16-bit integer. 16 bits of information is 16 bits of information no matter how you slice it.

        I also don't see the difference you are talking about with the "intensity" values as opposed to color. If the 8-bit 'r' value in a 24-bit pixel isn't measuring the intensity of red light at that pixel, then what is it measuring?

        I guess my basic question is: Why is this so special? Why couldn't someone, for example, extend PNG to have 16 bits per channel (if it doesn't already) and forget about OpenEXR?

        • Re:Not off topic. (Score:3, Informative)

          by alannon (54117)
          It is VERY different.
          A 16 bit (unsigned) integer value has a range of 0 to 65535, in 1 unit increments.

          A 16 bit floating point (half) value has a range between about .00000006 and 65000 (and also zero).

          The nice thing about floating point numbers is that you can use the precision that it gives you in the most optimal way for your image whereas with integer values, the precision is spread out evenly over your entire range of values.
          In the high range of floating point values (highlights), distances between discrete values will be large. In the low range, they will be small (shadows). Since the eye (and film) is not a linear light sensor (they are close to logarithmic), it makes more sense to deal with pixel values that are floating point instead of integer.

          FP numbers are also nicer to work with when you're doing image manipulation, since scaling up the data-type size (to 32-bit floating point) leaves you with data where 0.0 (black) and 1.0 (white), for example, have exactly the same meanings, but you now have extra precision for doing intermediate work on the pixel values. If you shift from a 16-bit integer to a 32-bit integer data type, the values of 0 and 1, for example, now have very DIFFERENT meanings, since the value of 'white' for the 32-bit pixel has to be shifted upwards to take advantage of the extra precision.

          There are a whole series of advantages, though I'm not sure I've stated them well here. Go to their web site for more information, obviously.
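The precision trade-off described above can be poked at directly: Python's `struct` module understands the same binary16 layout as EXR's "half" type (1 sign, 5 exponent, 10 mantissa bits), via the format character 'e'. A small sketch; the `to_half` helper is my own:

```python
# The precision trade-off described above, demonstrated with Python's
# built-in binary16 conversion (struct format 'e'), which has the same
# bit layout as a "half": 1 sign, 5 exponent, 10 mantissa bits. The
# 65536 codes are spread logarithmically, so steps are tiny in the
# shadows and huge in the highlights.
import struct

def to_half(x):
    """Round a Python float to the nearest representable 16-bit float."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

# Around 1.0 (mid-tones) adjacent half values are 2**-10 apart:
assert to_half(1.0004) == 1.0            # rounds back down to 1.0
assert to_half(1.001) == 1.0009765625    # the next representable value

# Near the top of the range the step between values is a whopping 32:
assert to_half(60010.0) == 60000.0

# Yet values as small as 2**-24 (~6e-8) survive in the shadows:
assert 0.0 < to_half(6e-8) < 1e-7
```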
        • How is having a 16-bit floating point value per channel different from having a 16-bit integer

          Because light intensity is exponential rather than linear. One canonical example is that a black sheet of paper in full sun is actually brighter (in absolute terms) than a 100 watt bulb in a closed room. Therefore, FP makes more sense than integer.

      • Just to clarify, there is a separate floating-point intensity for each color (RGB). The above description implies there is a separate color and intensity.

        This would be practical if you used a linear space like XYZ, but so little software uses that that it is probably much more useful to store RGB. Don't even think about trying to store "hue", or non-linear values like CIE xyz, or any of that color-management stuff; none of it is defined well enough for CGI.

        • "Just to clarify, there is a seperate floating-point intensity for each color (rgb). The above description implies there is a seperate color and intensity."

          Yeah, you're right. I should have written that more carefully.

          Is Digital Domain using floating point image formats for anything besides Global Illumination? I started experimenting with using .HDR files for textures in Lightwave, and was surprised at the results. I was just curious if D2 was doing something similar? (Note: I'm not limiting it to Lightwave, that's just the tool I have in front of me right now.)
          • Yes we are using them for rendering output a lot. The renderers will correctly calculate brightness well outside the 0-1 range and this allows us to recover that information in compositing.

            Biggest problem with renderers is that they refuse to treat incoming texture maps as anything other than linear. It would help a lot if they assumed 8-bit files were sRGB brightness levels. In some cases this can be fixed in shaders, but that is rather tedious. Also, GUI input of colors is often wrong, since it assumes the 8-bit color displayed by the GUI is linear.

            We have been using an in-house format we call RGBA. This is simply SGI's format with the "exponent" of the largest of RGB put in the alpha channel, and the top 8 bits (with a leading 1) of the mantissa of RGB put into the RGB channels. This is an old trick from Graphics Gems. We have not really used Debevec's HDR format, RGBA was much easier to get into our software as it all already read/wrote SGI's format. But EXR is much better and we expect to use it extensively.
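The exponent-in-the-alpha-channel trick described above (essentially Greg Ward's shared-exponent RGBE encoding from Graphics Gems) can be sketched like this. The function names are mine and this is a simplified illustration, not their in-house code; a real implementation would also handle rounding and negative values.

```python
# A sketch of the shared-exponent trick described above: store the
# exponent of the largest of R, G, B in the fourth byte, and scale
# all three mantissas into the 8-bit RGB channels. Simplified toy
# code; not ILM's or D2's actual implementation.
import math

def pack_rgbe(r, g, b):
    """Pack three non-negative floats into four bytes."""
    m = max(r, g, b)
    if m < 1e-38:
        return (0, 0, 0, 0)          # treat as pure black
    _, exp = math.frexp(m)           # m = mantissa * 2**exp, mantissa in [0.5, 1)
    scale = 256.0 * 2.0 ** -exp      # maps the largest channel into [128, 256)
    return (int(r * scale), int(g * scale), int(b * scale), exp + 128)

def unpack_rgbe(re, ge, be, e):
    """Recover the floats; relative error is at most ~1/128 per channel."""
    if e == 0:
        return (0.0, 0.0, 0.0)
    scale = 2.0 ** (e - 128) / 256.0
    return (re * scale, ge * scale, be * scale)

packed = pack_rgbe(5.0, 0.5, 0.01)
r, g, b = unpack_rgbe(*packed)   # (5.0, 0.5, 0.0): dim channels lose precision
```

The cost of sharing one exponent is visible in the example: the channel far dimmer than the brightest one quantizes to zero, which is why a per-channel float format like EXR's half is a better fit for serious compositing.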

            • "We have been using an in-house format we call RGBA. This is simply SGI's format with the "exponent" of the largest of RGB put in the alpha channel, and the top 8 bits (with a leading 1) of the mantissa of RGB put into the RGB channels. ..."

              I think I understand what you're saying. It's a bit of a surprise to me, I didn't realize how much programming work is involved in an FX studio.

              "Biggest problem with renderers is that they refuse to treat incoming texture maps as anything other than linear."

              Has Lightwave been tested in this sense? Before I responded to your post, I used an HDRI image I found and textured a cube with it, comparing it to a 24-bit version of the same texture. In playing with light intensities, I noticed the HDRI image behaved much differently in comparison to the 24-bit image. It responded much better to the light source, even better than a well defined diffuse channel. But I'm still not completely sure how I can use that to my advantage without tools to really create a texture that way. I think I can photograph some textures though that'd work really well. It might be a fun experiment at some point.

              So yeah, I was just curious if D2 had tested Lightwave with HDRI images as textures lately. It seems to support them, but maybe I'm not totally understanding what you mean?

              What got my interest in HDRI started is that my company has a 360 degree spherical video camera with an adjustable shutter. It occurred to me that we could replace a light-probe with this camera placed in the center of the action. Since video can be captured, changing lighting conditions (like stage lights) could be captured as well. I wish time were more abundant so I could pursue experimentation with it.


              P.S. Not sure if you remember me or not, we had an email conversation about a month ago. I showed you the rocket image I did. :)
    • That depends what you call a "texture". You can think of a planar reflection map, for example, as just a texture in camera space. If it's to reflect very bright lights, it looks just plain wrong without the extra dynamic range.
  • Hey George! (Score:3, Funny)

    by Jethro On Deathrow (641338) on Wednesday January 22, 2003 @05:28PM (#5138394) Journal
    Quit wasting time with this crap and release the real Star Wars on DVD. And while you're at it, get the Indiana Jones trilogy out on DVD too.

  • by stratjakt (596332) on Wednesday January 22, 2003 @05:35PM (#5138449) Journal
    Copyright (c) 2002, Industrial Light & Magic, a division of Lucas Digital Ltd. LLC All rights reserved.

    Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:

    - Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.

    - Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.

    - Neither the name of Industrial Light & Magic nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.


    • How are any of these clauses 'interesting'?

      Replace "Industrial Light & Magic" with "The Regents of the University of California" and it is just the standard BSD license.

    • This Software Is Provided By The Copyright Holders And Contributors "As Is" And Any Express Or Implied Warranties, Including, But Not Limited To, The Implied Warranties Of Merchantability And Fitness For A Particular Purpose Are Disclaimed. STAR TREK IS STUPID. In No Event Shall The Copyright Owner Or Contributors Be Liable For Any Direct, Indirect, Incidental, Special, Exemplary, Or Consequential Damages (Including, But Not Limited To, Procurement Of Substitute Goods Or Services; Loss Of Use, Data, Or Profits; Or Business Interruption) However Caused And On Any Theory Of Liability, Whether In Contract, Strict Liability, Or Tort (Including Negligence Or Otherwise) Arising In Any Way Out Of The Use Of This Software, Even If Advised Of The Possibility Of Such Damage.

      George has a problem with the Mountain?
    • What this shows is that you don't need to print something in nanopoint type to ensure that most people won't notice it. YELLING is also an effective way to make sure that no one hears you, because it activates their mental this-is-worthless-crap filters.

      "Your honor, my client did not consent to the terms, for he was not informed of them. After all, the terms were clearly shouted right in his face, in bold, underlined, and blinking. There's no way he could have seen that."

  • Leaving Windows development to their community is a bad idea.

    How long before we step in and port the Linux version?
  • by Amsterdam Vallon (639622) <> on Wednesday January 22, 2003 @05:36PM (#5138458) Homepage
    Before you spend a half-hour downloading any packages, please note that shared libraries aren't supported yet for Mac OS X version 10.2.

    Well, to rephrase: you can build them, but Lucasfilm haven't gotten them to link due to undefined symbols, and are probably doing something wrong in the Makefile system.

    The test suite will automatically try to link shared libraries if you've built them, so 'make check' will fail. To run the confidence tests, tell configure not to build shared libraries ("./configure --enable-shared=no").

    More details are available in the README document.
  • by NanoGator (522640) on Wednesday January 22, 2003 @05:39PM (#5138473) Homepage Journal
    Hey dudes,

    I was just curious if anybody out there uses HDR imagery (like the OpenEXR format) for anything besides global illumination?

    I've been fiddling with the .HDR format (similar to OpenEXR, I imagine) in Lightwave's various texture channels and have gotten interesting results. (Especially the diffuse channel.) It strikes me that you could lose the diffuse channel altogether in favor of a floating point color channel. In English, that means you'd have one texture that responds properly to light, as opposed to having to assign the color of the surface in one channel and its light reflectance in a separate one.

    That's seriously cool, but I'm in my infancy here with regards to these floating point formats. I'm just curious, who's using HDR in ways besides global lighting? It seems like there's a whole new door opening here.
    • It's great for 2D image processing. HDR images blur really nicely - the highlights stay bright, instead of going murky grey.
    • Rendering into HDR images is really useful if you want to process the color later using a 2D program. Post process blurs, lens flare, and motion blur are far, far better when done to an HDR image. Also since no data is lost the 3D person does not have to waste lots of time adjusting the lights to get the render into range.
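A toy one-dimensional sketch of why those post-process blurs come out better (my own illustration, not production code): once a highlight has been clipped to the 0-1 display range, blurring can only smear grey, while the unclipped float data keeps the glow hot.

```python
# Blur a one-pixel highlight in an HDR row vs. the same row clipped
# to the 0-1 display range. Toy code illustrating the comments above.

def box_blur(pixels, radius=1):
    """Simple 1D box blur: average each pixel with its neighbors."""
    out = []
    for i in range(len(pixels)):
        window = pixels[max(0, i - radius): i + radius + 1]
        out.append(sum(window) / len(window))
    return out

scene = [0.1, 0.1, 50.0, 0.1, 0.1]     # one very bright highlight
ldr = [min(1.0, p) for p in scene]     # clipped to display range

hdr_blur = box_blur(scene)             # highlight's neighbors stay > 1.0
ldr_blur = box_blur(ldr)               # highlight smears into murky grey
```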

  • They release an extensible file format and some libraries, and it's the best thing in the world?
    Anybody having to manage pictures with more than 16**3 colors will think of something similar. And the extensibility reminds me of TIFF (TAGged Image File Format). In fact, I'm not sure they wouldn't have been able to store everything they wanted in TIFF. So what?
  • by exp(pi*sqrt(163)) (613870) on Wednesday January 22, 2003 @05:54PM (#5138560) Journal
    But have you watched any movies with ILM effects lately? The dynamic range sucks! Episode II was basically characters jumping between matte paintings and each painting looked like it had been painted with an 8 bit paint package. Unless you actually bother to collect data on set that is high dynamic range having the file format is as good as useless.
  • PNG has been accepted as far as browser support, but is relatively (in comparison to JPG and GIF) unused. Unless this image format has vastly improved abilities over the conventional method, this is a non-starter.


    • PNG has been accepted as far as browser support, but is relatively (in comparison to JPG and GIF) unused. Unless this image format has vastly improved abilities over the conventional method, this is a non-starter.

      Don't hold back, tell us what you really think.

      This might come as a shock to some, but the entire world isn't the same as you. They have different needs and different desires. In this case, ILM has a need for an image format which allows for high dynamic range and lossy compression. PNG doesn't supply that. TIFF doesn't supply that. JPG doesn't supply that. So they invented their own, and released it for all to use.

      They really don't care very much about whether your browser supports it (although a nice plugin would be a cool idea, and golly, it is possible because they were kind enough to release the source). They are busy making movies. If you aren't making movies or interested in high dynamic range photography, you probably don't care. But then, they never said you had to care, did they?

      • ILM has a need for an image format which allows for high dynamic range and lossy compression.

        The EXR compression schemes (there are three) are lossless.
        However, your general point is valid.
    • While I disagree with Kickstart70's comment (clearly the new format has advantages that PNG does not have), his comment is obviously on topic. Whatever moderator marked it offtopic should have a bite taken out of his/her karma.

    • This serves a different purpose. This is well-defined lossless storage for floating-point intensity data. It is going to be at least twice as big as PNG (probably more, as the compression is not going to be as good). For images that are not being manipulated, but just shown to the user, there is no need for more than 8 bits. For photographic images (which this is likely to be used for), neither it nor PNG is good for the net; use lossy compression like JPEG and it will be 10 times smaller.
  • by Anonymous Coward on Wednesday January 22, 2003 @06:11PM (#5138663)
    The submitter doesn't even understand what ILM is offering. "Uses the same 16 bit format as..." - no, it uses a special datatype that Cg has and the GeForce FX will natively support. (Pssst, Cg is dead too, thanks to both MS and the OpenGL consortium end-running them by implementing their own high-level shader languages.)

    The only thing I see this library even offering is the capability to store HDR (High Dynamic Range) information, which offers better lighting techniques and edge detection. *Free* code to do the exact same thing is available from ATI, nVidia, SIGGRAPH, Usenet, any number of graphics books, etc.

    This story is useless. This code is useless. HDR relies on the rendering technique, not the file format.

    • I'll bite.

      16 bit float is just one of the datatypes it supports. The particular format they chose is not limited to Cg or the GeForceFX, it's the most common 16 bit float format out there, even if it isn't an IEEE standard.

      The DirectX HLSL is (deliberately) syntactically identical to Cg, so that's actually good for Cg, rather than killing it off as you suggest. OpenGL2's HLSL has yet to be confirmed, but if (as may be likely) it isn't also just like Cg, Cg will still be able to compile to OpenGL2 - it's just another render target, along with DX8/9, OpenGL 1.3/1.4, and nVidia's own extensions.

      HDR info is useful for many, many things in both 3D and 2D work (though I'm doubtful about edge detection). Other HDR-supporting formats do exist, including HDRI, TIFF, FLX, and RLA. Even Cineon/DPX supports limited HDR info. Each has its own advantages & disadvantages - OpenEXR is no worse than most, and better than many.

      Rendering technique is only one small part of the whole job. If you want to take HDR info from one device/app/system to another, you have to write it into a file, so you need a file format that won't clip all your highlights...

    • The reason that this is useful is that until recently there hasn't been a standard image format for storing HDR files. There are a couple of formats that support floating point, but they're mainly 32bit/channel, which in most cases is overkill.

      By providing a format specifically designed for HDR images, and providing a library and viewer for it, they will help enable VFX companies to share their data between companies and applications without reinventing the wheel every time.
  • by exp(pi*sqrt(163)) (613870) on Wednesday January 22, 2003 @06:28PM (#5138775) Journal
    Movie making requires heavy-duty image processing. Often thousands of layers need to be processed together with very complex operations. In order to do this at film res you need to break the image up into tiles. A package like Apple's Shake works with 128x128 or 256x256 tiles, I can't remember exactly. For maximum efficiency the image files need to be stored as tiles too. So popular file formats such as Kodak's DPX/Cineon or TIFF support tiling. Without tiling you end up with major cache thrashing, as the entire image needs to be read in any time a single tile gets dropped from the cache. (I'm talking about the application cache - not the CPU or memory cache.) Even if you do lower-quality work at lower res (e.g. ILM do much of their work at hi-def resolution) you can still suffer from this.

    It's not a show-stopper but tiling really ought to be there. This format doesn't really add much to already existing formats and subtracts something important.

    • You're right, tiling support is missing. We've been able to get away without it because we don't typically work in tiles.

      You can load the image in pieces using the FrameBuffer object, but it's scanline-oriented, not tiled. Dunno if apps can get away with that or not.

      Does Shake actually load the original file-based image in tiles, or does it simply tile its internal representation of the image and page that out to/from disk?

      • It's been my opinion that if you really need tiles, you can always do it as an additional layer on top of the image file format. Rather than hacking tiles into the format (and all the complexity that involves), just store each image as a directory containing one tile per file.

        I don't know about Shake but I know Nuke is completely scanline-oriented. Processing in scanlines makes sense because each line fits in the CPU cache, whereas an entire image does not. Performing repeated ops on a single scanline without leaving the cache is a huge performance win. I don't see as clear a benefit for tiling horizontally also, except in specialized cases where you are working on HUGE images (like that 40Kx20K Earth mosaic) or using lots of sub-regions of larger images.
    • Tiling is irrelevant (Score:5, Interesting)

      by Namarrgon (105036) on Wednesday January 22, 2003 @08:13PM (#5139719) Homepage
      Thousands of layers? The most complex composition I've seen personally was the Swordfish Ventura Bank explosion, and that required somewhat over 500 layers (at 4K). Definitely qualifies as heavy duty, but still a far cry from "thousands" of layers. "Often" would be less than 100 layers, in my experience.

      Anyway, tiling as you describe is rarely used in motion picture image processing work, regardless of the number of layers. Breaking down a large (4000x3000 or larger) image does improve memory usage (sometimes at a cost in efficiency for certain algorithms), but when this is done, it's usually broken into scanlines or groups of scanlines, not square tiles. This works just as well and fits better with how images are processed, stored, displayed etc. The number of layers to be composited does not affect this at all.

      DPX and Cineon do not support tiled image packing. TIFF does, but I've never seen a post-production app actually output a tiled image - it just complicates things unnecessarily.

      And it's rarely necessary to re-read an entire image if you just want a subrectangle of it - many formats make it relatively easy to read a limited region. Compression can complicate things, but you can usually limit your reading to just the scanlines involved.

      • "Thousands of layers?"

        It happens. There's all kinds of curious non-standard work out there that uses large procedurally built flowgraphs. But I'd agree 100s is more common.

        "tiling as you describe is rarely used in motion picture image processing work, regardless of the number of layers"

        False. Cineon did it. Shake does it. The ImageVision library from SGI, upon which a number of visual effects houses built code, used tiles. Many people use tiled code. But also many people don't.

        "DPX and Cineon do not support tiled image packing"

        My mistake. IFF, another popular choice in the visual effects world, does support tiling.

        Also, it's not just on input that tiles are useful. Many 3D renderers generate tiled output from buckets, because that provides better coherence than scan-line order. It's much more convenient to do this with a tiled image format.

        It's worth noting that even if the final render is only at 3k by 2k there may still be much higher res intermediate files that need processing.

        As I say - it's not a show-stopper. But it's nice to have.

    • Tiles are not needed (Score:3, Informative)

      by spitzak (4019)
      Tiles were invented back when image processing meant running Photoshop on a Mac with 1 MByte of memory.

      In fact, tiles are a complete hindrance to modern programs that want to access arbitrary rectangles of the image rather than obey some predefined cutting into tiles. For these programs, "tiles" like those in TIFF files require reading the entire image into memory before any of it can be returned, completely inverting the purpose of tiles. In the software I am writing, our TIFF reader refuses any tiled TIFFs (i.e. it only accepts files that are one big tile), and we have yet to encounter any TIFF that is not just one big tile.

      Many modern programs "tile" the image by cutting it into scan lines or groups of scan lines, which you could consider long narrow tiles. But this requires no special support by the file other than storing the pixels in horizontal order.

      • Tiled TIFFs are used by PRMan and other RenderMan-type renderers to store texture maps. But there are very good reasons for this - mip-mapping and cache locality - and nobody edits the tiled files directly, they just work on a master non-tiled file and convert it to the tiled format before rendering.
        • Actually PRMan *reads* tiff files, but it writes the mip-maps to its own file format (a .tx file or something like that).

          Pixar has refused to document the texture file format, forcing us to write everything to TIFF and then run their converter, which is a real pain; we could probably produce better mipmaps if we could start from our original floating-point data rather than converting to TIFF first.

  • Is there a 16 bit floating point camera? Does Povray generate 16 bit floating point renders? Are TV stations going to start broadcasting 16 bit floating point? It looks like the only way to do it is to spend a few months in front of Maya creating a scene from scratch and rendering a few hundred variations in brightness.
    • All film cameras (still and moving) record some high-dynamic range information, yes, and the better digital cameras do too. Video cameras don't, however, and couldn't store it if they did.

      Most 3D packages render directly to high-dynamic range formats, including (I think) Povray. You don't need to render multiple exposures.

    • A film scanner like the ones used at VFX houses can produce material with up to 14 bits per channel of color resolution. So can Panoscan's MK1 [] HDR camera. For reasons outlined in another thread [], there are advantages to using FP numbers rather than integers to represent these values.

      The CCDs used in these devices are pretty expensive and aren't available in pro-sumer or consumer devices. For now.

      Apps like Idruna's Photogenics [], Paul Debevec's HDRShop [], and Greg Ward's Photophile [] can produce HDR FP images from scans of photos of the same scene using different exposures. This works with the cheap color scanner that you bought at Fry's or Best Buy.

      As for synthetic images, RenderMan and Mental Ray use 32-bit FP internally. They can already produce 32-bit TIFF images. We're working on making the OpenEXR display drivers for these apps available with the rest of the OpenEXR software distribution.

    • Check out Paul Debevec's web site []. He seems to have pioneered (correct me if I'm wrong) a lot of image-based rendering techniques. HDR images are an important part of this. He describes how to recover HDR images from photographs [], how to create "light probes" [] (HDR environment maps), and then how to light synthetic scenes with a light probe [].

  • Available for Atari? (Score:3, Interesting)

    by cant_get_a_good_nick (172131) on Wednesday January 22, 2003 @07:05PM (#5139069)
    OpenEXR runs on Linux, Jaguar, and Irix

    I'm glad someone is finally releasing software for the Atari Jaguar, it was such an unloved system.

    Bad jokes aside, too many damn codenames that mean the same thing. Sometimes I realize why folks make stupid names like Itanium and Infinium.... no one else will be stupid enough to use them.
  • Just curious if there was a Photoshop plugin or Gimp feature or something where one could paint using the 16-bit format?

    I'd like to paint textures this way, it's a lot more natural than today's 24-bit formats. Kinda sad, really. Since HDRI (high dynamic range imagery) came along, 24-bit seems so limited! So I'm hoping that something like Photoshop comes along soon and supports it.

    Today what you have to do is make a sequence of images (3 or 4) that represent the image at different intensities so that a program can analyze them and develop a luminance curve. Which is fine, but it's a bit tricky to paint a texture that way. (works fine with photographs, though...)

    Just curious about what kinds of tools are out there. I'm only recently developing an interest in this format.
    • by dhess (65762) on Wednesday January 22, 2003 @07:38PM (#5139391)
      We submitted an OpenEXR plugin to the Film Gimp [] team, and I understand it'll show up in the next release.

      Also, Idruna Software [] is working on OpenEXR support for their Photogenics package. It already supports creation of and painting on HDR formats.
      • "We submitted an OpenEXR plugin to the Film Gimp [] team, and I understand it'll show up in the next release.

        Also, Idruna Software [] is working on OpenEXR support for their Photogenics package. It already supports creation of and painting on HDR formats."

        Interesting. I'm glad you mentioned Idruna Software because I was under the impression Gimp was the only paint prog on the block for Linux.
  • Good C++ style (Score:3, Insightful)

    by captaineo (87164) on Wednesday January 22, 2003 @09:38PM (#5140350)
    I've been reading over the code - anyone who wants to study good C++ style should definitely check this out, even if you aren't interested in graphics! The ILM libraries make good use of templates, exceptions, operator overloading, and iostreams - in ways that are clear and easy to understand (as opposed to many other C++ libraries I've seen...). You'll have to look hard to find a more appropriate application of C++ features.
