ILM Now Capable of Realtime CGI

Sandman1971 writes "According to the Sydney Morning Herald, special effects company ILM is now capable of doing realtime CGI, allowing actors and directors to see rough CGI immediately after a scene is filmed. Actors on the latest Star Wars film watch instant replays of their battles with CG characters. ILM CTO Cliff Plumer attributes this leap to the increase in processing power and a migration from Silicon Graphics RISC/Unix workstations to Intel-based Dell systems running Linux."
  • Don't get so excited (Score:4, Informative)

    by derrickh ( 157646 ) on Thursday April 17, 2003 @08:54AM (#5750472) Homepage
    The realtime images aren't -final- renders of the scene. They're just rough drafts. The scene still has to be rendered in full res/texture, which still takes hours per frame.

    What ILM has is a supercharged 'preview' button. Just like when you switch to wireframe mode in Lightwave [newtek.com] or Maya [aliaswavefront.com] and see a 'realtime' preview of the animation you're working on. But I'm sure ILM's version looks a little bit better.

    D
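
    A rough way to picture that gap (a toy sketch with made-up numbers and names, not ILM's actual pipeline): the preview pass trades quality for speed so it can play back on set, while the final pass grinds through full-resolution frames offline.

        import time

        def preview_render(frame):
            # Hardware-style draft: low resolution, simple shading --
            # fast enough to replay moments after the scene is shot.
            return {"frame": frame, "res": (320, 240), "quality": "draft"}

        def final_render(frame):
            # Software (CPU) render at full res/texture; in production
            # this step can still take hours per frame.
            time.sleep(0.01)  # stand-in for hours of offline rendering
            return {"frame": frame, "res": (2048, 1556), "quality": "final"}

        # On set: instant replay built from the draft pass.
        replay = [preview_render(f) for f in range(24)]

        # Back at the farm: the same frames rendered for real, offline.
        finals = [final_render(f) for f in range(24)]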
  • Re:Two Towers (Score:4, Informative)

    by Zzootnik ( 179922 ) on Thursday April 17, 2003 @08:59AM (#5750500)
    No-no-no-no-no....That was Motion Capture.

    Sensors on Andy S's Body capturing movement data and feeding it into a computer...Much like the mouse you're waving around right now...
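
    Conceptually, motion capture is just a stream of sampled marker positions retargeted onto a CG rig. A toy sketch of that data flow (hypothetical names and values, nothing like the actual toolchain):

        # Each sample: (timestamp, marker name, x, y, z) from the suit.
        samples = [
            (0.000, "wrist_L", 0.41, 1.02, 0.33),
            (0.008, "wrist_L", 0.42, 1.04, 0.31),
        ]

        skeleton = {}  # marker name -> latest position driving the rig

        for t, marker, x, y, z in samples:
            # Like mouse input, each reading just updates a tracked position.
            skeleton[marker] = (x, y, z)

        print(skeleton["wrist_L"])  # -> (0.42, 1.04, 0.31)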

  • Re:Two Towers (Score:4, Informative)

    by chrisseaton ( 573490 ) on Thursday April 17, 2003 @09:28AM (#5750667) Homepage
    "Sensors on Andy S's Body capturing movement data and feeding it into a computer"

    Yes... and then rendering the character on top of the real footage using the motion capture info.

    It's still realtime rendering.
  • by _|()|\| ( 159991 ) on Thursday April 17, 2003 @09:49AM (#5750791)
    SGI laughed at the unassuming threat of the video chipsets, thinking that they would never be as fast as brute force. ... You can set up a cheap-ass render farm ... that can do the same job as a SGI render farm ... (Shuttle SFF PC w/ 3 gig CPU + ATI 9700)

    A high-end video card is used in a workstation for content creation. Final rendering, however, is still done in software (i.e., by the CPU), whether it's LightWave, Mental Ray or RenderMan. Don't waste your money on a Radeon for your render node.
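
    The reason the GPU doesn't matter on a render node: final-frame rendering is embarrassingly parallel across frames, so farms just carve the frame range up among CPUs. A rough sketch of that scheduling idea (hypothetical, not any real farm manager):

        def split_frames(first, last, nodes):
            """Deal a frame range out to render nodes round-robin."""
            jobs = {n: [] for n in range(nodes)}
            for frame in range(first, last + 1):
                jobs[frame % nodes].append(frame)
            return jobs

        # 240 frames across a 16-node farm: each CPU grinds ~15 frames
        # in software, independent of whatever video card is installed.
        jobs = split_frames(1, 240, 16)
        print(len(jobs[0]))  # -> 15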

  • by Anonymous Coward on Thursday April 17, 2003 @09:51AM (#5750803)
    Actually, the Radeon 9700/9800 supports only 24-bit floating point precision in the pixel shader, and thus is unable to produce the same quality as software renderers, which use 32-bit floating point precision.

    The GeForce FX supports 32-bit floats, and the Quadro FX can be up to 20 times faster [nvidia.com] than the fastest CPU in Maya 5, which supports hardware rendering.
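
    The precision argument is easy to demonstrate in miniature. Using numpy's float16 and float64 as stand-ins for a short shader float and a full software-renderer float (the exact 24-bit shader format isn't exposed on CPUs), repeated blending drifts visibly at the lower precision:

        import numpy as np

        def blend_many(dtype, passes=1000):
            # Accumulate many small contributions, as layered shading does.
            color = np.array(0.0, dtype=dtype)
            for _ in range(passes):
                color = color + np.array(0.001, dtype=dtype)
            return float(color)

        print(blend_many(np.float64))  # ~1.0000, the software baseline
        print(blend_many(np.float16))  # visibly off: rounding accumulates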
  • Re:Open? (Score:5, Informative)

    by Kupek ( 75469 ) on Thursday April 17, 2003 @09:53AM (#5750814)
    It [openexr.net] was released under a modified BSD license [ilm.com].
  • SGI (Score:3, Informative)

    by zero_offset ( 200586 ) on Thursday April 17, 2003 @10:11AM (#5750955) Homepage
    I wonder if this will bring Silicon Graphics back around to Intel boxes - for a while they were okay with WinNT and Intel boxes, but then they dropped all of that, presumably for a higher profit margin and less hassle maintaining multiple systems (also likely some break in business politics - perhaps someone at MS pissed off someone at SGI).

    I was friends with several SGI employees when SGI decided to ditch their Intel/WinNT support. Two of my friends were directly involved with the NT-related operations. The decision was mainly related to a series of falling-outs with Microsoft over things like the Fahrenheit relationship. Officially it was attributed to cost-cutting and re-focusing on core competencies, though (SGI has been in a bad way financially for quite a long time).

    These days a large part of their revenue stream depends heavily on service contracts for their custom hardware. It would take a seriously impressive balancing act for them to support commodity hardware and remain afloat...

    (As a side note, since their custom hardware is so heavily graphics oriented, when things go wrong it often does really interesting things... like an entire render-job where everything ends up with the same bump map, but is otherwise normal...)

  • Re:Errm... (Score:3, Informative)

    by Sentry21 ( 8183 ) on Thursday April 17, 2003 @10:24AM (#5751092) Journal
    According to the Sydney Morning Herald, special effects company ILM is now capable of doing realtime CGI, allowing actors and directors to see rough CGI immediately after a scene is filmed.


    Wouldn't realtime be WHILE the scene is filmed?

    Well, it's hard to act and watch a monitor at the same time. Besides, the CGI they're doing in realtime is just a preview that they can overlay onto the video feed to see sort-of what it would look like.
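
    That overlay step is plain alpha compositing: the draft CG goes over the live plate with the standard 'over' operator. A minimal sketch using numpy (assumed array shapes, not ILM's compositor):

        import numpy as np

        def over(fg, alpha, bg):
            # Porter-Duff "over": draft CG on top of the live-action plate.
            return fg * alpha + bg * (1.0 - alpha)

        plate = np.full((480, 640, 3), 0.5)   # live video frame from camera
        cg    = np.zeros((480, 640, 3))       # rough realtime render
        matte = np.zeros((480, 640, 1))       # CG coverage (alpha)

        preview = over(cg, matte, plate)      # what the director sees on set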
  • It's neither (Score:3, Informative)

    by sgi_admin ( 666721 ) on Thursday April 17, 2003 @10:33AM (#5751172)
    You need to be able to render something in realtime, combine the rendered scenes with a live video feed, and then display that on a monitor or write it out to digital storage - all in real time.

    This is not a function of the CPU or the GFX card - they both play a part, but not as much as you seem to assume.

    The main thing here is *bandwidth*. You have to have a large amount of non-contentious bandwidth to chuck all that data around.

    Point to point bandwidth between multiple devices - CPU to memory, CPU to GFX, GFX to memory, CPU to output pipe, input pipe to GFX, etc. etc.

    One approach is UMA (unified memory architecture), which SGI pioneered in the O2. A seven-year-old SGI O2, even a low-end one, can handle 800MB textures in realtime with ease. There is little available today that can even approach that.

    Another approach is some sort of IO switch. SGI have the crossbar in the Octane/Octane2, which is old tech based on the IO infrastructure of the Origin 2000.

    The Octane's crossbar switch gives you a large amount of non-contentious bandwidth, which is why, for instance, a low-end 195MHz SGI Octane from 1998 can apply an incoming digital feed as a texture to an object, and display that on digital output, in real time.
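
    The bandwidth claim is easy to put numbers on: a single uncompressed video feed already costs tens of megabytes per second, before counting texture upload, readback, and display. A back-of-the-envelope calculation (assumed frame sizes and hop count, not measured figures):

        def feed_bandwidth(width, height, bytes_per_pixel, fps):
            """Raw bandwidth of one uncompressed video stream, in MB/s."""
            return width * height * bytes_per_pixel * fps / 1e6

        # CCIR 601-ish SD feed, 32-bit RGBA, 25 fps:
        sd = feed_bandwidth(720, 576, 4, 25)   # ~41 MB/s

        # That stream crosses the system several times: capture -> memory,
        # memory -> texture, texture -> framebuffer, framebuffer -> output.
        # On a shared bus every hop contends; on a crossbar they don't.
        print(f"{sd:.0f} MB/s per hop, ~{4 * sd:.0f} MB/s aggregate")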

    Remember the Anubis warriors in The Mummy Returns? Their skin was a texture from a feed that was applied in real time - this was so the animators could get a good feel for how their animation changes affected the scene.

    Sun have a similar sort of setup with the UPA crossbar on their UltraSPARC kit. IBM and DEC^WCompaq^WHP have a similar deal.

    The reason UNIX vendors can charge lots for their kit is the years of R&D they've put into solving problems like this, which just don't appear on low end commodity kit.

    The Origin2000 can scale to 1024 CPUs in a single system image - it's one OS, no partitions. Sure, that's for a niche market, but it doesn't change the fact that there is a serious IO contention issue that needs to be solved in that scenario. SGI have then taken that solution and thrown it into their graphics workstations.

    CPU speed and gfx speed will not help here - neither will the OS. Linux or Windows, ATI or Nvidia, PC solutions will not cut it in this area - the underlying architecture is poorly suited to those sorts of tasks.

    Niche markets for people like SGI will always remain small, but despite the ill-informed naysayers, they will never die - because there is a need there, and there is no commodity kit that can do the job.
  • by nr ( 27070 ) on Thursday April 17, 2003 @11:34AM (#5751693) Homepage
    They were strange beasts, yes, but kickass at the time. Wasn't there a specially designed crossbar architecture which glued the onboard GFX circuits to the memory to allow jaw-dropping access times between the GFX and memory banks? I remember they had amazing memory I/O compared to the vanilla Intel boxes they competed with, which only had the first generation of AGP. Another thing I remember is that they managed to do the impossible (according to Intel) of running the P-III CPU in a quad setup.
  • Re:Serious Question (Score:1, Informative)

    by Anonymous Coward on Thursday April 17, 2003 @04:50PM (#5754389)
    No competitive effects company uses SGI workstations as its primary artists' desktops.

    SGI still has a serious advantage in servers though.

"I've seen it. It's rubbish." -- Marvin the Paranoid Android

Working...