ILM Now Capable of Realtime CGI
Sandman1971 writes "According to the Sydney Morning Herald, special-effects company ILM is now capable of doing realtime CGI, allowing actors and directors to see rough CGI immediately after a scene is filmed. Actors on the latest Star Wars film watch instant replays of their battles with CG characters. ILM CTO Cliff Plumer attributes this amazing leap to the increase in processing power and a migration from Silicon Graphics RISC-Unix workstations to Intel-based Dell systems running Linux."
Don't get so excited (Score:4, Informative)
What ILM has is a supercharged 'preview' button. Just like when you switch to wireframe mode in Lightwave [newtek.com] or Maya [aliaswavefront.com] and see a 'realtime' preview of the animation you're working on. But I'm sure ILM's version looks a little bit better.
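For the curious, that kind of viewport preview toggle boils down to something like this in raw OpenGL - a sketch only, assuming a current GL context, with drawScene() standing in for whatever geometry submission the app already does:

    #include <GL/gl.h>

    /* Toggle between a cheap wireframe preview and shaded display.
       Assumes a current OpenGL context; drawScene() is a stand-in for
       the app's own geometry submission. */
    void drawPreview(int wireframe, void (*drawScene)(void))
    {
        if (wireframe) {
            glPolygonMode(GL_FRONT_AND_BACK, GL_LINE); /* outlines only */
            glDisable(GL_LIGHTING);                    /* skip shading cost */
        } else {
            glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);
            glEnable(GL_LIGHTING);
        }
        drawScene();
    }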
Re:Two Towers (Score:4, Informative)
Sensors on Andy Serkis's body capture movement data and feed it into a computer... much like the mouse you're waving around right now...
Re:Two Towers (Score:4, Informative)
Yes... and then rendering the character on top of the real footage using the motion capture info.
It's still realtime rendering.
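In sketch form the per-frame loop is something like the following - made-up types and names, not anyone's actual pipeline:

    #include <stdio.h>

    /* One motion-capture sample: a joint id plus its tracked rotation.
       The struct and the whole flow here are illustrative only. */
    typedef struct { int joint; float rx, ry, rz; } MocapSample;

    /* Drive the CG skeleton from the tracked data. */
    void applyPose(const MocapSample *s, int n)
    {
        for (int i = 0; i < n; i++)
            printf("joint %d -> rot(%.1f, %.1f, %.1f)\n",
                   s[i].joint, s[i].rx, s[i].ry, s[i].rz);
    }

    int main(void)
    {
        /* per frame: read the sensors, pose the character, render it,
           then composite the render over the live footage */
        MocapSample frame[] = { {0, 10.f, 0.f, 0.f}, {1, 0.f, 45.f, 0.f} };
        applyPose(frame, 2);
        return 0;
    }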
Re:It's the video card, not the CPU.... (Score:4, Informative)
A high-end video card is used in a workstation for content creation. Final rendering, however, is still done in software (i.e., by the CPU), whether it's LightWave, Mental Ray or RenderMan. Don't waste your money on a Radeon for your render node.
Re:It's the video card, not the CPU.... (Score:1, Informative)
The GeForce FX supports 32-bit floats, and the Quadro FX can be up to 20 times faster [nvidia.com] than the fastest CPU in Maya 5, which supports hardware rendering.
SGI (Score:3, Informative)
I was friends with several SGI employees when SGI decided to ditch their Intel/WinNT support. Two of my friends were directly involved with the NT-related operations. The decision was mainly related to a series of falling-outs with Microsoft over things like the Fahrenheit relationship. Officially it was attributed to cost-cutting and re-focusing on core competencies, though (SGI has been in a bad way financially for quite a long time).
These days a large part of their revenue stream depends heavily on service contracts for their custom hardware. It would take a seriously impressive balancing act for them to support commodity hardware and remain afloat...
(As a side note, since their custom hardware is so heavily graphics oriented, when things go wrong it often does really interesting things... like an entire render-job where everything ends up with the same bump map, but is otherwise normal...)
Re:Errm... (Score:3, Informative)
Wouldn't realtime be WHILE the scene is filmed?
Well, it's hard to act and watch a monitor at the same time. Besides, the CGI they're doing in realtime is just a preview that they can overlay onto the video feed to see roughly what it will look like.
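The overlay itself is just the standard 'over' compositing operator. Here's a minimal CPU version, assuming 8-bit premultiplied RGBA buffers (an assumption on my part - real rigs do this in dedicated hardware):

    #include <stdint.h>

    /* Lay a rendered RGBA preview on top of the camera feed.
       Assumes 8-bit premultiplied RGBA and an opaque video plate. */
    void compositeOver(const uint8_t *cg, const uint8_t *video,
                       uint8_t *out, int npixels)
    {
        for (int i = 0; i < npixels; i++) {
            uint8_t a = cg[i*4 + 3];   /* coverage of the CG element */
            for (int c = 0; c < 3; c++)
                out[i*4 + c] = cg[i*4 + c]
                             + (video[i*4 + c] * (255 - a)) / 255;
            out[i*4 + 3] = 255;        /* result is fully opaque */
        }
    }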
It's neither (Score:3, Informative)
This is not a function of the CPU or the GFX card - they both play a part, but not as much as you seem to assume.
The main thing here is *bandwidth*. You have to have a large amount of non-contentious bandwidth to chuck all that data around.
Point to point bandwidth between multiple devices - CPU to memory, CPU to GFX, GFX to memory, CPU to output pipe, input pipe to GFX, etc. etc.
One approach was UMA (unified memory architecture), which SGI pioneered in the O2. A seven-year-old SGI O2, even a low-end one, can handle 800MB textures in realtime with ease. There is little available today that can even approach that.
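To put rough numbers on why bandwidth is the bottleneck, here's a back-of-envelope calculation (the figures are illustrative, not any specific machine's):

    #include <stdio.h>

    /* Bandwidth needed just to stream ONE video feed into texture
       memory each frame. Illustrative numbers only. */
    int main(void)
    {
        const double w = 720, h = 486; /* NTSC-ish digital video frame */
        const double bytes_pp = 4;     /* RGBA, 8 bits per channel */
        const double fps = 30;
        double mb_per_sec = w * h * bytes_pp * fps / (1024 * 1024);
        printf("texture upload alone: %.0f MB/s\n", mb_per_sec);
        /* ~40 MB/s - and that's one stream, before geometry,
           framebuffer traffic, CPU<->memory and readback all start
           contending for the same bus */
        return 0;
    }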
Another approach is some sort of IO switch. SGI have the crossbar in the Octane/Octane2, which is old tech based on the IO infrastructure of the Origin 2000.
The Octane's crossbar switch gives you a large amount of non-contentious bandwidth, which is why, for instance, a low-end 195MHz SGI Octane from 1998 can apply an incoming digital feed as a texture to an object, and display that on digital output, in real time.
Remember the Anubis warriors in The Mummy Returns? Their skin was a texture from a feed that was applied in real time - this was so the animators could get a good feel for how their animation changes affected the scene.
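In OpenGL terms, streaming a feed into a texture each frame looks roughly like this (a sketch; assumes a current GL context, a texture already created at the same size with glTexImage2D, and pixels pointing at the latest captured frame):

    #include <GL/gl.h>

    /* Rewrite an existing texture with the latest video frame so it
       can be mapped onto geometry. glTexSubImage2D updates the texels
       in place - no reallocation, so the per-frame cost is basically
       the raw upload bandwidth, which is exactly where the crossbar
       earns its keep. */
    void updateVideoTexture(GLuint tex, int w, int h, const void *pixels)
    {
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h,
                        GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    }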
Sun have a similar sort of setup with their UPA crossbar on their UltraSPARC kit. IBM and DEC^WCompaq^WHP have a similar deal.
The reason UNIX vendors can charge lots for their kit is the years of R&D they've put into solving problems like this, which just don't appear on low end commodity kit.
The Origin2000 can scale to 1024 CPUs in a single system image - it's one OS, no partitions. Sure, that's for a niche market, but it doesn't change the fact that there is a serious IO contention issue that needs to be solved in that scenario. SGI have then taken that solution and thrown it into their graphics workstations.
CPU speed and gfx speed will not help here - neither will the OS. Linux or Windows, ATI or Nvidia, PC solutions will not cut it in this area - the underlying architecture is poorly suited to those sorts of tasks.
Niche markets for vendors like SGI will always remain small, but despite the ill-informed naysayers, they will never die - because there is a need there, and there is no commodity kit that can do the job.
Re:Serious Question (Score:1, Informative)
SGI still has a serious advantage in servers though.