ILM Now Capable of Realtime CGI 262

Sandman1971 writes "According to the Sydney Morning Herald, special FX company ILM is now capable of doing realtime CGI, allowing actors and directors to see rough CGI immediately after a scene is filmed. Actors on the latest Star Wars film watch instant replays of their battles with CG characters. ILM CTO Cliff Plumer attributes this amazing leap to the increase in processing power and a migration from Silicon Graphics RISC-Unix workstations to Intel-based Dell systems running Linux."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Errm... (Score:4, Insightful)

    by bconway ( 63464 ) on Thursday April 17, 2003 @07:40AM (#5750380) Homepage
    According to the Sydney Morning Herald, special FX company ILM is now capable of doing realtime CGI, allowing actors and directors to see rough CGI immediately after a scene is filmed.

    Wouldn't realtime be WHILE the scene is filmed?
    • Re:Errm... (Score:2, Interesting)

      by thona ( 556334 )
      Well, how can the ACTOR look at the scene WHILE he is playing it without looking like he is looking at a scene? Also, the director is probably more concentrated on the screenplay.
      • Re:Errm... (Score:2, Insightful)

        by sigep_ohio ( 115364 )
        'Also, the director is probably more concentrated on the screenplay.'

        Not if your name is George Lucas. Then it is all about the eye candy.
      • Re:Errm... (Score:2, Redundant)

        by The_K4 ( 627653 )
        DARK HELMET: What the hell am I looking at? When does this happen in the movie?
        COL. SANDERS: Now. You're looking at now, sir. Everything that happens now is happening now.
        DARK HELMET: What happened to then?
        COL. SANDERS: We passed then.
        DARK HELMET: When?
        COL. SANDERS: Just now. We're at now, now.
        DARK HELMET: Go back to then.
        COL. SANDERS: When?
        DARK HELMET: Now!
        COL. SANDERS: Now?
        DARK HELMET: Now!
        COL. SANDERS: I can't.
        DARK HELMET: Why?
        COL. SANDERS: We missed it.
        DARK HELMET: When?
        COL. SANDERS: Just
    • Wouldn't realtime be WHILE the scene is filmed?

      As for the term 'realtime': without a reference to what it is realtime against, it could mean anything. That's just how language works; without a relation to something else, we have to assume something and hope the author intended what we assume.

      Just goes to show the inadequacy of language, and why so much confusion takes place in this world. No need to get upset about it, though.
    • Re:Errm... (Score:5, Insightful)

      by UCRowerG ( 523510 ) <UCRowerG@@@yahoo...com> on Thursday April 17, 2003 @07:53AM (#5750463) Homepage Journal
      Technically, perhaps. I think this is a great tool for directors and actors. Instead of having to wait weeks/months to incorporate CGI and see the interaction, it can be done in minutes/hours, or as fast as the CGI people can splice things together. The director can give near-immediate feedback to the actor(s), which could really help the movie get done more quickly and at lower cost in the long run. Think about it: changing the expression/pose/color on a CGI character is fairly easy. Re-filming live actors, especially with live fx, can take much longer and be more expensive (salaries for actor, director, film crew... lighting, film, makeup, fx expenses).
      • I can't wait until we can wholesale replace Hollywood with a desktop box. No more listening to Tim Robbins flap his jaw about politics. No more waiting to replace the prima donna actress that just walked off the set because the production crew never ordered her Folgers high colonic. It's a director's dream (just like in S1m0ne, the Pacino movie no one saw).

        There are drawbacks, however. Getting a half dog for a CG character would definitely throw up some flags. We'd have to update the Bible. No gay sex,
        • About S1M0NE... I just watched this movie last weekend. The problem with completely CG "actors" is that, at some point, the computer guy is going to do something stupid that tips people off that they're actually watching a CG "actor", not a human.

          In the movie, at the awards show, S1M0NE won an award, and being CG, couldn't be live at the show. So, she appeared via satellite from some 3rd world country where she was doing "Charity work". In the background, there's a "huge" wind storm, and her behav

      • Re:Errm... (Score:5, Insightful)

        by jeffgreenberg ( 443923 ) on Thursday April 17, 2003 @08:12AM (#5750571) Homepage
        This is particularly important as they aren't using film.

        With HD, Lucas is shooting actors on video... and now doing previsualization with the CG elements on set.

        Did Liam look in the general direction of, but not AT the eyes of the CG character? Reshoot. etc. etc. etc.

        Additionally, a rough edit can be done off the video tap on set, with the rough CG included.

        Unfortunately, this still means nothing without good acting, a good script, or alternate footage to make decisions from.

        You make a film three times.

        Once on the page, once while directing, and once in the edit. But if everything is so storyboarded and timed down to the moment that you can't have options, you can't discover anything in the edit at all.

        Oh well, at least you can see what the giant CG creature looks like
        • Re:Errm... (Score:3, Insightful)

          by UCRowerG ( 523510 )
          Once on the page, once while directing, and once in the edit. But if everything is so storyboarded and timed down to the moment that you can't have options, you can't discover anything in the edit at all.

          I think this is exactly the point of this story. Whereas before, a director would have to fit the CGI to the live action already filmed, or expend a *lot* more money in bringing the actors, crew, etc. back to re-shoot the scene (several times). Now, a director can find a good "fit" for a scene almost immedi
        • Re:Errm... (Score:2, Insightful)

          by Stickster ( 72198 )

          But if everything is so storyboarded and timed down to the moment that you can't have options, you can't discover anything in the edit at all.

          If you're George Lucas, you don't discover anything in the edit, you simply use CGI to change the actors' bodies to fit what you want. If you listen to the Episode I and II DVD commentaries, you will hear some very interesting details about how actors' positions on "set," their limbs, and even their faces were changed in post to suit Lucas' direction. It's no wonder th

      • I think this is a great tool for directors and actors. Instead of having to wait weeks/months to incorporate CGI and see the interaction, it can be done in minutes/hours or as fast as the CGI people can splice things together.

        From the article:

        "It's not at full resolution, but at least it gives them something to work with rather than working completely blind after each take."

        So how is this different from using wireframe models to do live action takes? That's been done for years now. All I can tell from
    • Re:Errm... (Score:3, Insightful)

      by mrtroy ( 640746 )
      No, adding the effects in "realtime" would still force you to rewind and watch it after.

      That would be like saying videotaping isn't "realtime" since you have to rewind!
    • Re:Errm... (Score:3, Informative)

      by Sentry21 ( 8183 )
      According to the Sydney Morning Herald, special FX company ILM is now capable of doing realtime CGI, allowing actors and directors to see rough CGI immediately after a scene is filmed.


      Wouldn't realtime be WHILE the scene is filmed?

      Well, it's hard to act and watch a monitor at the same time. Besides, the CGI they're doing in realtime is just a preview that they can overlay onto the video feed to see sort of what it would look like.
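      To make the "overlay a rough CG preview onto the video feed" idea concrete, here is a minimal Python/NumPy sketch of the standard "over" composite. The frame size, the toy CG element and the alpha mask are made up for illustration; this is not ILM's pipeline, just the basic operation a video-preview overlay performs.

        import numpy as np

        def overlay_preview(plate, cg_rgb, cg_alpha):
            # "Over" composite: where the CG element has coverage (alpha),
            # it replaces the live-action plate; elsewhere the plate shows through.
            return cg_rgb * cg_alpha + plate * (1.0 - cg_alpha)

        # Toy data: a mid-grey "camera frame" with a red CG blob in one corner.
        h, w = 480, 640
        plate = np.full((h, w, 3), 0.5)
        cg_rgb = np.zeros((h, w, 3))
        cg_rgb[:100, :100] = [1.0, 0.2, 0.2]
        cg_alpha = np.zeros((h, w, 1))
        cg_alpha[:100, :100] = 1.0

        composite = overlay_preview(plate, cg_rgb, cg_alpha)
        print(composite.shape, float(composite.max()))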
    • "Wouldn't realtime by WHILE the scene is filmed? "

      No. It means they can play it back at any camera view without having to wait for it to render.
    • According to the Sydney Morning Herald, special FX company ILM is now capable of doing realtime CGI, allowing actors and directors to see rough CGI immediately after a scene is filmed.

      Well, if you have figured out a way they can watch a monitor and do their scene at the same time, let them know.

      It could usher in a whole new era: I want to watch myself actually doing work while I'm doing it. But if I'm watching it, then I'm not doing it... This could possibly create a paradox and make the universe disappea

  • by say ( 191220 ) <.sigve. .at. .wolfraidah.no.> on Thursday April 17, 2003 @07:40AM (#5750381) Homepage
    ...my webserver has been doing realtime CGI for years.
  • by BeninOcala ( 664162 ) on Thursday April 17, 2003 @07:41AM (#5750387) Homepage
    We have Real-time CGI Porn?
  • Realtime (Score:4, Funny)

    by Anonymous Coward on Thursday April 17, 2003 @07:41AM (#5750390)
    Maybe it IS realtime, but the actors just don't have the skill to watch themselves on a monitor WHILE acting, so they use the obvious 'I'll watch when I'm done' method
  • by cdemon6 ( 443233 ) on Thursday April 17, 2003 @07:42AM (#5750393) Homepage
    "Realtime CGI in Movie Quality" would be impressive, but:

    "It's not at full resolution, but at least it gives them something to work with rather than working completely blind after each take."
    • This will be resolved with the new nVidia GeForce FMXP 41000x [nvidia.com] It's supposed to be 3rd quarter this year. It's supposed to totally smoke anything ATI has right now, with photographic quality and with at least 6 times the Q3 fps. And it has outputs to a film projector. And built-in lasers. And 7.1 audio onboard. And a soda/pizza exhaust port.

      ....and it's supposed to be profitable.
  • Serious Question (Score:4, Interesting)

    by Anonymous Coward on Thursday April 17, 2003 @07:42AM (#5750394)
    With all the excitement over ILM using Linux I'm wondering exactly how many Hollywood visual effects studios use Linux.
    • by MrMickS ( 568778 ) on Thursday April 17, 2003 @10:09AM (#5751519) Homepage Journal
      Linux will be used on commodity x86 hardware for render farms by all effects studios, if not now then in the near future. The reason for this: bang-for-buck density. In order to render complex scenes you need a large render farm, and the more (and faster) units you have in the farm the better. It's cheaper to do this with x86 kit than anything else, and the render software has Linux render engines written for it.

      More and more manufacturers are coming out with blade servers using x86 processors which will increase this density and likely increase the use.

      This is not saying that the studios are not running SGI kit for animation, modelling etc. Linux/x86 kit has a way to go to catch up there.
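      To put "bang-for-buck density" in concrete terms: each frame of a shot is an independent job, so a render farm is essentially a work queue feeding whichever node is free. Below is a minimal Python sketch of that dispatch pattern; the render_frame stub, frame count, worker count and output names are placeholders for illustration, not any studio's actual pipeline.

        import time
        from concurrent.futures import ProcessPoolExecutor

        def render_frame(frame_number):
            # Placeholder for launching a real per-frame render on one node;
            # here it just sleeps briefly and returns a made-up output name.
            time.sleep(0.1)
            return f"shot01_frame{frame_number:04d}.exr"

        if __name__ == "__main__":
            frames = range(1, 49)  # two seconds of footage at 24 fps
            # Each worker process stands in for one commodity render node.
            with ProcessPoolExecutor(max_workers=8) as farm:
                for output in farm.map(render_frame, frames):
                    print("done:", output)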

  • by AssFace ( 118098 ) <stenz77.gmail@com> on Thursday April 17, 2003 @07:44AM (#5750399) Homepage Journal
    The way that is worded makes it sound as if the processing power of an Intel/Linux combination is superior - whereas it is really a matter of bang for the buck.

    You can get more processing power with the latter since it is cheaper (I would imagine even more so with AMD) and easier to maintain, not because it is inherently special or faster in any way.

    I wonder if this will bring Silicon Graphics back into the favor of Intel boxes - for a while they were okay with WinNT and Intel boxes, but then they dropped all of that - presumably for a higher profit margin and less hassle of maintaining multiple systems (also likely some break in business politics - perhaps someone at MS pissed someone off at SGI).
    • SGI (Score:3, Informative)

      by zero_offset ( 200586 )
      I wonder if this will bring Silicon Graphics back into the favor of Intel boxes - for a while they were okay with WinNT and Intel boxes, but then they dropped all of that - presumably for a higher profit margin and less hassle of maintaining multiple systems (also likely some break in business politics - perhaps someone at MS pissed someone off at SGI).

      I was friends with several SGI employees when SGI decided to ditch their Intel/WinNT support. Two of my friends were directly involved with the NT-related o

      • by rf0 ( 159958 )
        SGI are still actually supporting all the NT boxes on a hardware level. Going from what I've heard from a friend I have in the UK office, they are a pain to maintain now. One reason cited for them getting out of the market was the ongoing support costs; the prices charged for the support contracts didn't actually cover the costs of repairs

        Rus
    • I wonder if this will bring Silicon Graphics back into the favor of Intel boxes

      The SGI x86 boxen were strange beasts with freaky graphics cards designed to be used as graphics workstations. They were very good but very expensive, for x86 hardware. Why would anyone buy an expensive SGI branded PC when they can buy a cheaper one, slap a good graphics card in, and get good enough performance?

      There's no money for SGI in PC hardware; they've tried it and it flopped. SGI's market is making kit that can do thi

      • They were strange beasts, yes, but kickass at the time. Wasn't there a specially designed crossbar architecture which glued the onboard GFX circuits to the memory to allow jaw-dropping access times between the GFX and the memory banks? I remember they had amazing memory I/O compared to the vanilla Intel boxes they competed with, which only had the first generation of AGP. Another thing I remember is that they managed to do the impossible (according to Intel) of running the P-III CPU in a quad setup.
        • I used one for a couple of years (SGI 320) - the graphics were ok, but nothing amazing, certainly slower than the Geforce 2 for games use. Also it was very much tied to Windows NT/2000. Nothing else will run on them as far as I know. RAM was ludicrously expensive due to the shared memory pool I guess. USB ports were non-standard (voltages were slightly different AFAIK).

          In short, it was "alright", but certainly not worth the price, IMO. I only had a single 450mhz P3 in mine though, so maybe a dual CPU would
  • further proof (Score:2, Insightful)

    by Anonymous Coward
    that proprietary unix is dying
    • Re:further proof (Score:5, Interesting)

      by Alan Partridge ( 516639 ) on Thursday April 17, 2003 @07:56AM (#5750482) Journal
      not even close

      further proof that commodity hardware is killing innovative companies like SGI, and a FREE UNIX is helping it happen.

      Linux is great for a company like ILM which is stuffed full of coders who can adapt it to suit their needs, not so good for many other companies.
      • If commodity hardware is killing SGI, then they must not be sufficiently innovative anymore; either that, or their innovation was irrelevant to the market they once served.
    • mid 1980's - Mainframes are dying
      mid 1990's - NT will replace Unix
      late 1990's - Linux will replace Windows
      2000's - Linux will replace proprietary Unix

      None of these have happened yet. I doubt that the last will ever happen.

      oops missed one:
      late 1980's to current day - Apple is dying

      Pls mod down parent as overrated - it's an opinion based on an emotional response rather than reason.

  • Two Towers (Score:5, Insightful)

    by alnya ( 513364 ) on Thursday April 17, 2003 @07:44AM (#5750407)
    In the Fellowship of the Ring DVD, Peter Jackson can clearly be seen watching Gollum on a monitor (low polygon, but Gollum nonetheless) performing the mo-cap Andy Serkis is performing IN REAL TIME; as it is happening (not after).

    So does this make this old news??

    I dunno, I feel ILM have been behind the bleeding edge for some time now...

    alnya

    • Re:Two Towers (Score:4, Informative)

      by Zzootnik ( 179922 ) on Thursday April 17, 2003 @07:59AM (#5750500)
      No-no-no-no-no....That was Motion Capture.

      Sensors on Andy S's Body capturing movement data and feeding it into a computer...Much like the mouse you're waving around right now...

      • Re:Two Towers (Score:4, Informative)

        by chrisseaton ( 573490 ) on Thursday April 17, 2003 @08:28AM (#5750667) Homepage
        "Sensors on Andy S's Body capturing movement data and feeding it into a computer"

        Yes... and then rendering the character on top of the real footage using the motion capture info.

        It's still realtime rendering.
        • Re:Two Towers (Score:3, Interesting)

          no, I think that was wireframing. Yeah, it's true that it's still technically rendering, but not really very useful to the average person...
          This is much higher res (though obviously not THAT great) rendering, which is really useful.
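          For what it's worth, the "feed mocap data into a computer and render a character on top" step discussed above boils down, per frame, to forward kinematics: captured joint angles go in, posed joint positions come out, and a preview renderer draws them. Below is a toy planar-chain sketch in Python; the bone lengths and the three-frame angle stream are invented for illustration, not real capture data.

            import math

            BONE_LENGTHS = [1.0, 1.0, 1.0]  # made-up three-bone chain

            def pose_chain(joint_angles_deg):
                # One captured frame of joint angles (each relative to its parent
                # bone) in; the 2D position of every joint out, ready to draw.
                x, y, heading = 0.0, 0.0, 0.0
                points = [(x, y)]
                for length, angle in zip(BONE_LENGTHS, joint_angles_deg):
                    heading += math.radians(angle)
                    x += length * math.cos(heading)
                    y += length * math.sin(heading)
                    points.append((x, y))
                return points

            # A tiny fake "mocap stream": three frames of a bending arm.
            for frame, angles in enumerate([(90, 0, 0), (90, -30, -15), (90, -60, -30)]):
                pose = [(round(px, 2), round(py, 2)) for px, py in pose_chain(angles)]
                print("frame", frame, pose)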
      • In one of the interviews I watched some early LotR 3D visualizations which had nothing whatsoever to do with mocap. It might have been a feature on some other documentary DVD that my girlfriend's cousin bought, and not on the four-DVD set, which I also own.
    • In the Fellowship of the Ring DVD, Peter Jackson can clearly be seen watching Gollum on a monitor (low polygon, but Gollum nonetheless) performing the mo-cap Andy Serkis is performing IN REAL TIME; as it is happening (not after).

      That was motion capture... but it was indeed real-time... very similar to what ILM is now doing. In fact, it's actually a bit more hardcore.

      I dunno, I feel ILM have been behind the bleeding edge for some time now...

      I would have to agree there. ILM these days is a lot like M
    • If I'm right, he actually mentions that this is done in realtime via SGI hardware :)
    • Re:Two Towers (Score:3, Insightful)

      by Sandman1971 ( 516283 )
      *GASP* Motion capture is not the same thing as inserting a complete animated CG character out of thin air.

      As far as motion capture goes, I remember seeing a Phantom Menace special which showed exactly that: Ahmed Best in a motion capture shoot, and a rough CG of Jar Jar on a monitor moving along with the actor. So to those naysayers out there, this was being done way before WETA did it for LotR.
  • Yay! (Score:2, Funny)

    by Plissken ( 666719 )
    This will probably help on release dates for movies.

    We'll get to see Episode III sooner!
    • Re:Yay! (Score:2, Funny)

      by sporty ( 27564 )
      Too bad you can't use that 3D technology in Duke Nukem Forever.

      Wait a minute...

      [/joke]
  • I always thought that with the current 3D cards coming out and the horsepower they can throw at things, they would eventually be able to do TV-quality 3D animation in real time.

    Hopefully this is going to lead to a lot more 3D animated series on TV in the near future, and in time pick up from where Final Fantasy left off. I still think it was such a pity that film didn't get the people into the cinema to watch it.

    But I think the advances they made will pave the way for the future. Mainstream 3D anime, here we come!
  • Oh well (Score:5, Funny)

    by stephenry ( 648792 ) on Thursday April 17, 2003 @07:49AM (#5750433)
    It's a pity they haven't got one of those to write the script!

    Steve.
  • Hrmm (Score:5, Funny)

    by acehole ( 174372 ) on Thursday April 17, 2003 @07:49AM (#5750435) Homepage
    Well, I guess they need to get that Jar Jar Binks death scene juuuuust right.

    • Re:Hrmm (Score:5, Funny)

      by docbrown42 ( 535974 ) on Thursday April 17, 2003 @07:58AM (#5750494) Homepage
      Well, I guess they need to get that Jar Jar Binks death scene juuuuust right.

      Naw. The actors kept screwing up just so they could kill Jar Jar again...and again...and again. Given the chance, I think most fans would do the same thing.

  • It's hard to tell if this is anything more than a toy at this point. Marginal quality control is now possible. The time from pre-production to release might only differ by a few days.

    The actors might be able to play their roles slightly better if they know what the final result will be. In movies like Episode II they were acting totally blind in front of a screen for most of the movie. Very little of it was actually built.

    The biggest question is "When will we have it at home?"
  • Don't get so excited (Score:4, Informative)

    by derrickh ( 157646 ) on Thursday April 17, 2003 @07:54AM (#5750472) Homepage
    The realtime images aren't -final- renders of the scene. They're just rough drafts. The scene still has to be rendered in full res/texture, which still takes hours per frame.

    What ILM has is a supercharged 'preview' button. Just like when you switch to wireframe mode in Lightwave [newtek.com] or Maya [aliaswavefront.com] and see a 'realtime' preview of the animation you're working on. But I'm sure ILM's version looks a little bit better.

    D
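    Some rough arithmetic on why that preview button matters. The hours-per-frame figure below is an assumption for illustration (the parent post only says final renders still take hours per frame), but it shows the gap between final-quality rendering and realtime playback:

      # Back-of-envelope: assumed 2 hours per final-quality frame.
      frames_per_second = 24
      final_hours_per_frame = 2.0
      shot_seconds = 10
      shot_frames = shot_seconds * frames_per_second

      final_render_hours = shot_frames * final_hours_per_frame
      print(f"{shot_frames} frames -> {final_render_hours:.0f} hours at final quality")

      # A realtime preview plays the same shot back in its own running time.
      slowdown = final_render_hours * 3600 / shot_seconds
      print(f"final render is roughly {slowdown:,.0f}x slower than realtime playback")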
  • by binaryDigit ( 557647 ) on Thursday April 17, 2003 @07:56AM (#5750479)
    Well, as more and more CGI houses move off of SGI (and on to whatever), they are only really left with their server business. It's really a shame to see a once-proud pioneer in the industry reduced to a mere shadow of their former selves, though I guess in this industry it's very common (e.g. DEC, Lotus, Compaq, etc.). At this rate it's hard to even see them being around in 4 years; a definite takeover target.

    ob /. comment:

    SGI (aka Silicon Graphics Inc.) was found dead today at the age of 20. After being a high flyer in his youth, often seen hobnobbing with Hollywood's power elite, the latter years were not so kind and saw him in the throes of an identity crisis. Eventually his reliance on a small circle of friends was his undoing, as he was slowly replaced by more mainstream competitors. He will be sorely missed, as while he was at the top, he was a role model for "cool" in the industry, and helped to usher in one of the most exciting (and abused) technology shifts in the motion picture/video entertainment industry since the advent of talkies and color.
    • Nobody in the Hollywood sector really misses SGI. Maybe for the stability of the recent hardware and the ease of maintaining a single machine rather than a network of smaller compute servers, but that's about it.

      SGI really dropped the ball when it came to pricing. They still make top-of-the-line gear and awesome graphics engines, but the cost is insane. SGI even makes a machine (the Origin 3900) that can scale to 512 CPUs in a single system. Every 16 CPUs fits into a 4U rackmount "brick". They did this for
      • That right there is the problem. I don't think there would be a problem paying a 2x or even a 3x price difference to go with SGI over x86/Linux. But 10x? That's the deal breaker.

        You don't know what you're talking about, do you? You could buy 256 cheapo Dell boxes for one tenth that price, but that won't make a cluster.

        I recently saw negotiations to buy a Linux cluster from either IBM or Dell. The price they were talking about came to around $100,000 for a 24-processor configuration, with gigabit networ
    • SGI actually owns Maya, which is widely used in the WinTel market.
      They recently won an Oscar for their software. The only other company to even win an Oscar was Pixar. I'm not too worried about them "dying" just now...
    • As anyone familiar with SGI's recent business strategy will tell you, they realized CGI was a dead horse and gave up on that market years ago.

      SGI has been pushing high-performance computing for engineering and science for the last couple of years, and they have had a few high-profile sales in this arena.

      NASA Ames (not too far from SGI BTW) has purchased a 1024 processor Origin [sgi.com]. I saw the guy in charge of this lab at an HPC conference, and he was very gung-ho about the Origin's shared-memory architecture
  • by imadork ( 226897 ) on Thursday April 17, 2003 @07:57AM (#5750488) Homepage
    would be to develop a program that re-writes Lucas's inane dialogue in real time...
    • Maybe we already had the technology. Let's try Eliza [manifestation.com].

      DV: When I left you, I was but a learner.

      Obi: We were discussing you, not me.

      DV: Now I am the master!

      Obi: How long have you been the master?

      DV: I am your father!

      Obi: Did you come to me because you are my father?

      DV: Search your feelings, you know this to be true.

      Obi: You're not really talking about me, are you?

      Hmm. Just about a wash between that and the real thing.

  • by sgi_admin ( 666721 ) on Thursday April 17, 2003 @08:08AM (#5750549)
    This is, largely, nonsense.

    These images are *not* realtime! A PC is not capable of rendering a CGI screen, in realtime, and merging that, in realtime, with a video feed, and then displaying that, in *realtime*.

    Say what you like about Linux, or high speed CPUs, or XXX vendor's high end GFX card - the architecture and the tools are physically incapable of this.

    If you look at the extras on the LOTR:FOTR DVD set, you'll see people walking around, with a camera on a stick. This *is* displaying real time camera images, merged into a low res, non final rendered, scene of the Cave Troll fight in Moria.

    A point of reference - the machines they are using for this are SGI Octanes. Not Octane2s, but Octanes.

    They did that work around, what, 3 years ago? And the Octane, at that time, was only 3-4 years old.

    Can anyone show me a PC from 1997 that can manage that? Anyone?

    Despite the fact that the Octane is an ancient piece of kit, there is nothing from the PC world that can match its capabilities.

    SGI have always been, and always will be, a niche player.

    You would be a fool to buy expensive SGI kit for a render farm - buy Intel PCs with Linux. Similarly, you would be a fool to try and do realtime CGI with that same kit - that's a specialist task that calls for specialist skills.

    This article does not show that SGI is dying, or that they're being thrown out of the GFX workstation market.

    This article *does* confirm what is widely known - the once cutting-edge ILM are now many years behind people like Weta Digital.

    Throwing around "Linux" and "Intel replacing SGI" sound bites to try and get some news coverage for a dated effects house isn't going to change that.
    • A modern dual-proc Xeon can come very, very close to what an Octane was able to do in 1997. It's not the same thing, but it's close enough to do the job. An Octane2 (with the right software) would be overkill, so here are the differences between a used Octane and a dual Xeon:
      The Xeon is new. That means you can get a good warranty and not have to worry about using used equipment.
      The whiz-bang factor. These days most SFX software runs on both IRIX and Linux. Even Apple's Shake does. So does all of the latest Linux
      • Good points, well made.

        As an aside, I would say that you can buy 2nd hand Octanes, phone up SGI, and they will give you a support contract - they will even check the machines over for you.

        The total cost will be a fraction of the cost of a new dual Xeon workstation.

        Again, this would be a foolish decision to apply across the board, but if you're doing the sort of effects work where the strengths of something like Octane are a bonus, it's a good solution.

        I know someone who runs an effects house who has bou
      • The 18 wheeler can haul a lot more, but the F1 race car will get you to the local Wal-Mart a lot faster. For small tasks, the Xeon will feel a lot faster.
        Yes, that's actually a very good analogy. Keep going, though. Is there really any computing task that involves 'hauling' more data than rendering 3D video? Would you propose using an F1 to move a truckload of stock to the warehouse? Hm, the benefits of the F1's speed just went down the toilet, eh?

        Frankly, most video rendering companies only use an 18-wh
    • Thanks, dude. I work with both Linux and SGIs, and I can confirm pretty much everything you say. But for my situation it's actually the reverse. SGI workstations were traditionally very big in my field, but we use Linux for almost everything now. However, we now have two Origin 300s for file serving, and the stability and performance has been far superior to anything we could get out of a PC. We don't need raw speed - we need something with bandwidth, that won't crash when NFS goes out of control. (Bu
    • From about 1995-98 I worked for an effects company, Tippett Studio, in Berkeley CA. We did giant bugs for the film Starship Troopers using a range of SGI boxes, from a few years old to brand spanking new. At the time those machines, running IRIX, were a totally different experience from running a typical PC: they were fast and WAY STABLE, but all $10,000+. Working there felt like having a ticket to the future, and you felt like a race car driver sitting behind one.

      And then I

    • A PC is not capable of rendering a CGI screen, in realtime, and merging that, in realtime, with a video feed, and then displaying that, in *realtime*.

      They said that a baseline VGA adapter couldn't display more than 256 colors simultaneously, but coders on the demoscene figured out ways to do that. And did I mention these coders were often in their teens or early 20's?

      I don't think anyone thought they meant ILM was doing final-rendering of scenes that took days per frame a year ago, now in real-time. Of
  • by dr.robotnik ( 205595 ) on Thursday April 17, 2003 @08:10AM (#5750558)
    Simply prop a big mirror against the wall, and Hey Presto! you can watch a virtual representation of the actors performing the scene in real time!
  • C'mon now.... (Score:4, Interesting)

    by curtisk ( 191737 ) on Thursday April 17, 2003 @08:14AM (#5750577) Homepage Journal
    This is pretty cool, no doubt, but I believe they were using this in a similar form for Two Towers (?)

    I think it's funny that two recent articles call Linux immature or maturing, and that Novell gets bashed for calling Linux immature, but no one (as of this writing) makes mention of ILM saying the exact same thing.

    ILM's move to Linux has been a long and gradual process. "Linux is still maturing," says Plumer. "When we started working with it a few years ago, it was still in its infancy and a lot of things we took for granted coming from (Silicon Graphics') IRIX operating system just weren't there - supportive drivers from different graphics cards and other things. "It took a while to mature, but right now it's going extremely well."

    Ahhhh.... slashdot :)

  • Not new.. (Score:2, Insightful)

    Pah - Jim Henson's Creature Shop, Weta and Framestore were doing this sort of thing long before ILM. Framestore did this for Dinotopia, Weta for Gollum, and JHC for a variety of different things - all too numerous to mention here.
  • When the audience can watch the finished scene, complete with CGI, as the actors are filming it -- now that would be realtime!
  • by Faeton ( 522316 ) on Thursday April 17, 2003 @08:18AM (#5750609) Homepage Journal
    Carmack himself [slashdot.org] (on Slashdot no less) has predicted this would come to pass, due to the increasingly feature-rich and faster video chipsets.

    SGI laughed at the unassuming threat of the video chipsets, thinking that they would never be as fast as brute force. Even Pixar thought the same [siliconinvestor.com]. Boy, were they wrong, though. You can set up a cheap-ass render farm for about $250k, taking up minimal space, that can do the same job as an SGI render farm that costs a cool $2 million (Shuttle SFF PC w/ 3 GHz CPU + ATI 9700). Of course, there's still the software side.

    Nvidia's GeForce FX and ATI's Radeon 9800 both contain features that, even through the marketing hype, have some real value to programmers out there. Just look at Doom 3. It will run well on some computers that are just 6 months old. Now, imagine taking 250 of them, as a Beowulf cluster!!1

    • by _|()|\| ( 159991 ) on Thursday April 17, 2003 @08:49AM (#5750791)
      SGI laughed at the unassuming threat of the video chipsets, thinking that they would never be as fast as brute force. ... You can set up a cheap-ass render farm ... that can do the same job as a SGI render farm ... (Shuttle SFF PC w/ 3 gig CPU + ATI 9700)

      A high-end video card is used in a workstation for content creation. Final rendering, however, is still done in software (i.e., by the CPU), whether it's LightWave, Mental Ray or RenderMan. Don't waste your money on a Radeon for your render node.

      • Final rendering, however, is still done in software

        Erm, then why does SGI get all hot and bothered over the ability to add more hardware pipelines with G-Bricks? I think it's more that video cards are designed from the standpoint that you have to render in realtime, and your quality is what is variable, while SGIs are set up to render photorealistically, with time the variable. And I remember some older article about the new video cards being potentially great for rendering, but having huge hangups in pu
    • It's neither (Score:3, Informative)

      by sgi_admin ( 666721 )
      You need to be able to render something in realtime, combining rendered scenes and a live video feed, and then display that on a monitor or write it out to digital storage - all in real time.

      This is not a function of the CPU or the GFX card - they both play a part, but not as much as you seem to assume.

      The main thing here is *bandwidth*. You have to have a large amount of non-contentious bandwidth to chuck all that data around.

      Point to point bandwidth between multiple devices - CPU to memory, CP
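      Some back-of-envelope numbers behind that bandwidth point. The resolution and bit depth below are assumptions (roughly HD-sized frames, uncompressed 8-bit RGB, film rate), not what any particular effects house actually pushes around, but the totals show why the buses matter as much as the processors:

        width, height = 1920, 1080   # assumed frame size
        bytes_per_pixel = 3          # uncompressed 8-bit RGB
        fps = 24

        frame_bytes = width * height * bytes_per_pixel
        stream_mb_per_s = frame_bytes * fps / 1e6
        print(f"one uncompressed frame: {frame_bytes / 1e6:.1f} MB")
        print(f"one uncompressed stream: {stream_mb_per_s:.0f} MB/s")

        # A realtime composite touches at least three such streams every frame:
        # the live plate in, the CG element in, and the combined picture out.
        print(f"plate + CG + output: {3 * stream_mb_per_s:.0f} MB/s sustained")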
    • It depends on which part of the production world you're talking about...

      For rendering, you need raw CPU power and middle of the road networking. A rack of dual proc PCs and a 100BaseT switch is plenty for most 3D people.

      For 3D modeling, a good graphics card and a strong PC behind it is what's needed. You want a card that can handle the polygons, can handle the textures, and has enough cache for all of the display lists. A 3DLabs Wildcat-series card on a modern PC is good enough for almost any 3D animato
    • You can set up a cheap-ass render farm for about $250k

      Yes, but:

      1. Subtract from that at least 10% of the CPU power, because that's going to have downtime. (This is probably a low estimate.)

      2. Add quite a few dollars for increased admin time. If you don't care when boxes die, this won't be quite so bad.

      3. Add quite a few dollars for increased power consumption.

      4. A lot of that money will be spent on networking equipment.

      5. Quite a bit will be spent on a central file server as well.

      6. Forget about d
  • Open? (Score:5, Interesting)

    by Diabolical ( 2110 ) on Thursday April 17, 2003 @08:27AM (#5750659) Homepage
    ILM developed its proprietary file format, OpenEXR

    Hmm... I sense a trend of calling things open when they are actually closed. This is eroding the intended meaning of "Open" in front of file formats or products.

    • Re:Open? (Score:5, Informative)

      by Kupek ( 75469 ) on Thursday April 17, 2003 @08:53AM (#5750814)
      It [openexr.net] was released under a modified BSD license [ilm.com].
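      Since the format really is open, here is a minimal sketch of reading an EXR frame with the separately installed OpenEXR/Imath Python bindings; "frame.exr" is a placeholder filename, and error handling is omitted:

        import array

        import Imath
        import OpenEXR

        exr = OpenEXR.InputFile("frame.exr")   # placeholder path
        dw = exr.header()["dataWindow"]
        width = dw.max.x - dw.min.x + 1
        height = dw.max.y - dw.min.y + 1

        # EXR stores high-dynamic-range pixels; read the red channel as 32-bit floats.
        float_type = Imath.PixelType(Imath.PixelType.FLOAT)
        red = array.array("f", exr.channel("R", float_type))

        print(f"{width}x{height} image, {len(red)} red samples, max R = {max(red):.3f}")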
    • Yeah- it's just like organics. There are starting to be a real bunch of latte-liberal techies out there who just like to hear that they're consuming something vaguely open, and they feel that much better, regardless of whether or not it's true.
    • Hmm... I sense a trend of calling things open when they are actually closed. This is eroding the intended meaning of "Open" in front of file formats or products.

      It's a continuing trend, pioneered by Avid with their totally closed Open Media Framework (OMF).

    • The question is, is the file format available for anyone else to download or not? If the file format is not available without "buying" a license, then the name is wrong.

      If you can download it somewhere freely, then it's a case of yet another so-called journalist who writes for emotion and knows nothing of the English language, and you should beat the writer with rubber chickens for using the wrong word.
    • Hmm... I sense a trend of calling things open when they are actually closed. This is eroding the intended meaning of "Open" in front of file formats or products.


      That's nothing new - 10 years ago, the proprietary Motif toolkit was being put out by the Open Software Foundation. "Open" has been used as a euphemism for closed for quite a long time.

  • by caveat ( 26803 ) on Thursday April 17, 2003 @08:42AM (#5750750)
    Dark Helmet - "What the hell am I looking at? When does this happen in the movie?"
    Col Sandurz - "Now. You're looking at now, sir. Everything that happens now, is happening now."
    Dark Helmet - "What happened to then?"
    Col Sandurz - "We passed then?"
    Dark Helmet - "When?"
    Col Sandurz - "Just now. We're at now, now."
    Dark Helmet - "Go back to then."
    Col Sandurz - "When?"
    Dark Helmet - "Now."
    Col Sandurz - "Now?"
    Dark Helmet - "Now."
    Col Sandurz - "I can't."
    Dark Helmet - "Why?"
    Col Sandurz - "We missed it."
    Dark Helmet - "When?"
    Col Sandurz - "Just now."
    Dark Helmet - "When will then be now?"
    Col Sandurz - "Soon."
    • DARK HELMET: How soon?
      CORPORAL: Sir.
      DARK HELMET: What?
      CORPORAL: We've identified their location.
      DARK HELMET: Where?
      CORPORAL: It's the Moon of Vega.
      COL SANDURZ: Good work. Set a course, and prepare for our arrival.
      DARK HELMET: When?
      CORPORAL: Nineteen-hundred hours, sir.
      COL SANDURZ: By high-noon tomorrow, they will be our prisoners.
      DARK HELMET: WHO?
  • by Jonboy X ( 319895 ) <jonathan,oexner&alum,wpi,edu> on Thursday April 17, 2003 @08:45AM (#5750768) Journal

    Intel-based Dell systems running Linux

    So conflicted...Intel bad...Linux good...Dell ambivalent...
  • ...he didn't have real time rendering when he created the Earth and the Stars.

    He might have changed his mind several times along the way, and we'd all be living inside a soap bubble right now.
  • How portable/scalable is this? I mean, if they had it during LOTR, they couldn't have used it when they were filming scenes out in nature, could they? A lot of stuff is done on sets, but a lot is also done in remote locations, and I'd think it would be seriously hindered under such circumstances.

    Jack
  • they could watch the CGI *before* it happened.

    Now that's Cost Savings!

  • As much as I love ILM and what they are able to do with technology and movies, this doesn't seem like that great of a thing. If all they are writing about in the article is being able to see how the film they just shot will line up with a rough animatic, then that's not that great. I'm guessing what they have is much like what Weta Digital had to make the cave troll and other stuff in Balin's Tomb. Now, I would have been shocked and surprised if they said they could render a CGI scene with full effects, s
  • by Michael_Burton ( 608237 ) <michaelburton@brainrow.com> on Thursday April 17, 2003 @02:16PM (#5753619) Homepage

    ILM CTO Cliff Plumer attributes this amazing leap to the increase in processing power and a migration from using Silicon Graphics RISC-Unix workstations to Intel-based Dell systems running Linux.

    Well, I hope all you open-source advocates are happy now. You worked to develop Linux and other open source software because it was "cool," and I'm sure you all had a great time making it more and more powerful. I'll bet you never gave one minute of thought to the fact that the software you were producing might make it easier to make those awful, awful movies, did you?

    Well, it's too late now. I just hope you're satisfied!
