ILM Now Capable of Realtime CGI
Sandman1971 writes "According to the Sydney Morning Herald, special effects company ILM is now capable of doing realtime CGI, allowing actors and directors to see rough CGI immediately after a scene is filmed. Actors on the latest Star Wars film watch instant replays of their battles with CG characters. ILM CTO Cliff Plumer attributes this amazing leap to the increase in processing power and a migration from using Silicon Graphics RISC-Unix workstations to Intel-based Dell systems running Linux."
Errm... (Score:4, Insightful)
Wouldn't realtime be WHILE the scene is filmed?
Re:Errm... (Score:2, Interesting)
Re:Errm... (Score:2, Insightful)
Not if your name is George Lucas. Then it is all about the eye candy.
Re:Errm... (Score:2, Redundant)
Re:Errm... (Score:2)
As for the term 'realtime', without a reference to what it is realtime against, it could mean anything. It's just how language works: without a relation to something else, we have to assume something and hope the author intended what we assume.
Just goes to show the inadequacy of language, and why so much confusion arises in this world. No need to get upset about it though.
Re:Errm... (Score:5, Insightful)
Re:Errm... (Score:2)
There are drawbacks, however. Getting a half dog for a CG character would definitely throw up some flags. We'd have to update the Bible. No gay sex,
Re:Errm... (Score:2)
About S1M0NE... I just watched this movie last weekend. The problem with completely CG "actors" is that, at some point in time, the computer guy is going to do something stupid that tips people off that they're actually watching a CG "actor", not a human.
In the movie, at the awards show, S1M0NE won an award, and being CG, couldn't be live at the show. So, she appeared via satellite from some 3rd world country where she was doing "Charity work". In the background, there's a "huge" wind storm, and her behav
Re:Errm... (Score:5, Insightful)
With HD, Lucas is shooting actors on video... and now doing previsualization with the CG elements on set.
Did Liam look in the general direction of, but not AT, the eyes of the CG character? Reshoot. Etc. etc. etc.
Additionally, a rough edit can be done off the video tap on set with the rough CG edit.
Unfortunately, this still means nothing without good acting, a good script, or alternate footage to make decisions from.
You make a film three times.
Once on the page, once while directing, and once in the edit. But if everything is so storyboarded and timed down to the moment that you can't have options, you can't discover anything in the edit at all.
Oh well, at least you can see what the giant CG creature looks like.
Re:Errm... (Score:3, Insightful)
Re:Errm... (Score:2, Insightful)
If you're George Lucas, you don't discover anything in the edit, you simply use CGI to change the actors' bodies to fit what you want. If you listen to the Episode I and II DVD commentaries, you will hear some very interesting details about how actors' positions on "set," their limbs, and even their faces were changed in post to suit Lucas' direction. It's no wonder th
Re:Errm... (Score:2)
From the article:
"It's not at full resolution, but at least it gives them something to work with rather than working completely blind after each take."
So how is this different from using wireframe models to do live action takes? That's been done for years now. All I can tell from
Re:Errm... (Score:3, Insightful)
That would be like saying videotaping isn't "realtime" since you have to rewind!
Re:Errm... (Score:3, Informative)
Wouldn't realtime be WHILE the scene is filmed
Well, it's hard to act and watch a monitor at the same time. Besides, the CGI they're doing in realtime is just a preview that they can overlay onto the video feed to see sort of what it would look like.
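For anyone wondering what that overlay amounts to, it's conceptually just an alpha "over" composite of the rough render against each captured frame. Here's a minimal sketch of the idea (my own illustration, not ILM's pipeline; the frame sizes, NumPy approach, and the `over` helper are all assumptions):

```python
# Illustrative only: a straight-alpha "over" composite of a rough CG preview
# onto a live video frame. Not ILM's actual code.
import numpy as np

def over(cg_rgba, video_rgb):
    """Composite an RGBA CG layer over an RGB video frame (values in 0..1)."""
    alpha = cg_rgba[..., 3:4]                        # per-pixel coverage of the CG layer
    return cg_rgba[..., :3] * alpha + video_rgb * (1.0 - alpha)

video_frame = np.random.rand(480, 640, 3)            # stand-in for one captured frame
cg_preview = np.zeros((480, 640, 4))                  # mostly transparent CG layer
cg_preview[200:280, 300:380] = [0.8, 0.7, 0.2, 1.0]   # a crude opaque "creature"

composite = over(cg_preview, video_frame)             # what would go to the on-set monitor
```

Repeat that per frame fast enough and you have a realtime preview, which is all the article seems to be claiming.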
Re:Errm... (Score:2)
No. It means they can play it back at any camera view without having to wait for it to render.
Re:Errm... (Score:2)
Well, if you have figured out a way they can watch a monitor and do their scene at the same time, let them know.
It could usher in a whole new era: I want to watch myself actually doing work, while I'm doing it. But if I'm watching it, then I'm not doing it... This could possibly create a paradox and make the universe disappear.
That's nothing... (Score:5, Funny)
oh yeah (Score:5, Funny)
Re:That's nothing... (Score:2)
How long til... (Score:5, Funny)
How long... (Score:2)
Re:How long til... (Score:2)
Almost there... we already have CGI porn, and we already have real-time porn... and let me tell ya', I've got 100 people hard at work on real-time CGI porn!
it'll be awhile. (Score:2)
Realtime (Score:4, Funny)
What's the point about this? (Score:4, Interesting)
"It's not at full resolution, but at least it gives them something to work with rather than working completely blind after each take."
Re:What's the point about this? (Score:3, Funny)
....and it's supposed to be profitable.
Serious Question (Score:4, Interesting)
Re:Serious Question (Score:4, Insightful)
More and more manufacturers are coming out with blade servers using x86 processors, which will increase this density and likely increase their use.
That's not to say the studios are not running SGI kit for animation, modelling, etc. Linux/x86 kit has a way to go to catch up there.
nothing inherently special about dell/linux (Score:5, Interesting)
You can get more processing power with the latter since it is cheaper (I would imagine even more so with AMD) and easier to maintain. But not because it is inherently special or faster in any way.
I wonder if this will bring Silicon Graphics back around to Intel boxes - for a while they were okay with WinNT and Intel boxes, but then they dropped all of that - presumably for a higher profit margin and less hassle maintaining multiple systems (also likely some break in business politics - perhaps someone at MS pissed off someone at SGI).
SGI (Score:3, Informative)
I was friends with several SGI employees when SGI decided to ditch their Intel/WinNT support. Two of my friends were directly involved with the NT-related o
Re:SGI (Score:2)
Rus
Re:nothing inherently special about dell/linux (Score:2)
The SGI x86 boxen were strange beasts with freaky graphics cards designed to be used as graphics workstations. They were very good but very expensive, for x86 hardware. Why would anyone buy an expensive SGI branded PC when they can buy a cheaper one, slap a good graphics card in, and get good enough performance?
There's no money for SGI in PC hardware, they've tried it and it flopped. SGI's market is making kit that can do thi
Re:nothing inherently special about dell/linux (Score:2, Informative)
Re:nothing inherently special about dell/linux (Score:2)
In short, it was "alright", but certainly not worth the price, IMO. I only had a single 450MHz P3 in mine though, so maybe a dual CPU would
There are reasons for buying SGI... (Score:3, Interesting)
It's amazing they still have a few dollars in the bank. They've sold some patents and sold off Cray (albeit for pennies on the dollar from what they originally paid), but to last that long is impressive in itself.
At any rate, SGI does offer unique and unmatched products *in certain areas*...
Do you need a shared-memory supercomputer that can scale to 512 proc
Re:nothing inherently special about dell/linux (Score:2, Interesting)
If you are talking about "running RISC chip A at 1GHz is this much slower doing this than were I to run it on chip B at 1GHz" - then that is totally different.
That is a useless benchmark - especially in terms of real-world usage.
What is useful is exactly what I said in the first post - bang for the buck.
If you run Dell/Linux and you pay $500 for one entry level node, and your budget is $50K for this project, then you can have
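The back-of-the-envelope arithmetic implied above is trivial; a quick sketch using only the dollar figures quoted in this comment (illustrative numbers, not real ILM or Dell pricing):

```python
# Back-of-the-envelope only; figures are the ones quoted in the comment above.
budget = 50_000      # project budget in USD
node_cost = 500      # one entry-level Dell/Linux node in USD
nodes = budget // node_cost
print(f"{nodes} nodes for ${budget:,}")   # -> 100 nodes for $50,000
```

Networking, power, and admin overhead would eat into that figure, but the bang-for-the-buck point stands.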
Re:nothing inherently special about dell/linux (Score:4, Interesting)
I work in TV, and I know firsthand that SGI is losing out to commodity hardware running Linux and Windows, and even to the Mac. SGI gear is just about hanging on thanks to discreet - but it's just a matter of time before an Inferno-for-Intel product lands and a lot of Onyx racks hit eBay.
Unless, of course, SGI fights back...
Re:nothing inherently special about dell/linux (Score:3, Interesting)
The article states that they upgraded their hardware and the new hardware is faster and cheaper than the prior hardware... uhhh, right - I'm pretty sure that is how the hardware world works.
Where you could argue that Linux has its edge is stated right in the article - it is the driver support. SGI doesn't support certain drivers, and for
Right... (Score:2)
You're Alan Partridge, and you work in TV
I think you mean you used to work in TV
And even The Day Today's brilliant graphic effects weren't that advanced from a hardware standpoint...
Re:Right... (Score:2, Funny)
On the Partridge point, you're forgetting my combat-based gameshow that runs on digital TV.
further proof (Score:2, Insightful)
Re:further proof (Score:5, Interesting)
further proof that commodity hardware is killing innovative companies like SGI, and a FREE UNIX is helping it happen.
Linux is great for a company like ILM which is stuffed full of coders who can adapt it to suit their needs, not so good for many other companies.
Re:further proof (Score:2)
Re:further proof (Score:3, Funny)
mid-1990s - NT will replace Unix
late 1990s - Linux will replace Windows
2000s - Linux will replace proprietary Unix
None of these have happened yet. I doubt that the last will ever happen.
oops, missed one:
late 1980s to current day - Apple is dying
Pls mod down parent as overrated - it's an opinion based on an emotional response rather than reason.
Two Towers (Score:5, Insightful)
So does this make this old news??
I dunno, I feel ILM have been behind the bleeding edge for some time now...
alnya
Re:Two Towers (Score:4, Informative)
Sensors on Andy S's body capturing movement data and feeding it into a computer... much like the mouse you're waving around right now...
Re:Two Towers (Score:4, Informative)
Yes... and then rendering the character on top of the real footage using the motion capture info.
It's still realtime rendering.
Re:Two Towers (Score:3, Interesting)
This is much higher res (though obviously not THAT great) rendering, which is really useful.
Re:Two Towers (Score:2)
Re:Two Towers (Score:2)
That was motion capture... but it was indeed real-time... very similar to what ILM is now doing. In fact, it's actually a bit more hardcore.
I dunno, I feel ILM have been behind the bleeding edge for some time now...
I would have to agree there. ILM these days is a lot like M
Re:Two Towers - Yes, and on SGI hardware :) (Score:2)
Re:Two Towers (Score:3, Insightful)
As far as motion capture goes, I remember seeing a Phantom Menace special which showed exactly that: Ahmed Best in a motion capture shoot, and a rough CG of Jar Jar on a monitor moving along with the actor. So to those naysayers out there, this was being done way before WETA did it for LotR.
Yay! (Score:2, Funny)
We'll get to see Episode III sooner!
Re:Yay! (Score:2, Funny)
Wait a minute...
[/joke]
This has been coming for a while (Score:2, Interesting)
...out and the horsepower they can throw at things, they would eventually be able to do TV-quality 3D animation programs in real time. Hopefully this is going to lead to a lot more 3D animated series on TV in the near future and, in time, pick up from where Final Fantasy left off. I still think it was such a pity that film didn't get people into the cinema to watch it. But I think the advances they made will pave the way for the future. Mainstream 3D anime, here we come!
Oh well (Score:5, Funny)
Steve.
Hrmm (Score:5, Funny)
Re:Hrmm (Score:5, Funny)
Naw. The actors kept screwing up just so they could kill Jar Jar again...and again...and again. Given the chance, I think most fans would do the same thing.
Will this really improve movies? (Score:2, Interesting)
The actors might be able to play their roles slightly better if they know what the final result will be. In movies like Episode II they were acting totally blind in front of a screen for most of the movie. Very little of it was actually built.
The biggest question is "When will we have it at home?"
Don't get so excited (Score:4, Informative)
What ILM has is a supercharged 'preview' button. Just like when you switch to wireframe mode in Lightwave [newtek.com] or Maya [aliaswavefront.com] and see a 'realtime' preview of the animation you're working on. But I'm sure ILM's version looks a little bit better.
D
Another nail in the SGI coffin (Score:5, Interesting)
ob
SGI (aka Silicon Graphics Inc.) was found dead today at the age of 20. After being a high flyer in his youth, often seen hobnobbing with Hollywood's power elite, the latter years were not so kind and saw him in the throes of an identity crisis. Eventually his reliance on a small circle of friends was his undoing, as he was slowly replaced by more mainstream competitors. He will be sorely missed, as while he was at the top, he was a role model for "cool" in the industry, and helped to usher in one of the most exciting (and abused) technology shifts in the motion picture/video entertainment industry since the advent of talkies and color.
Re:Another nail in the SGI coffin (Score:2)
SGI really dropped the ball when it came to pricing. They still make top-of-the-line gear and awesome graphics engines, but the cost is insane. SGI even makes a machine (the Origin 3900) that can scale to 512 CPUs in a single system. Each group of 16 CPUs fits into a 4U rackmount "brick". They did this for
Re:Another nail in the SGI coffin (Score:2)
You don't know what you're talking about, do you? You could buy 256 cheapo Dell boxes for one tenth that price, but that won't make a cluster.
I recently saw negotiations to buy a Linux cluster from either IBM or Dell. The price they were talking about came to around $100,000 for a 24-processor configuration, with gigabit networ
Re:Another nail in the SGI coffin (Score:2)
They recently won an Oscar for their software. The only other company to even win an Oscar was Pixar. I'm not too worried about them "dying" just now...
Re:Another nail in the SGI coffin (Score:2)
SGI has been pushing high-performance computing for engineering and science over the last couple of years, and they have had a few high-profile sales in this arena.
NASA Ames (not too far from SGI, BTW) has purchased a 1024-processor Origin [sgi.com]. I saw the guy in charge of this lab at an HPC conference, and he was very gung-ho about the Origin's shared-memory architecture
The next innovation (Score:5, Funny)
Re:The next innovation (Score:2)
Hmm. Just about a wash between that and the real thing.
A stunningly inaccurate article (Score:4, Interesting)
These images are *not* realtime! A PC is not capable of rendering a CGI screen, in realtime, and merging that, in realtime, with a video feed, and then displaying that, in *realtime*.
Say what you like about Linux, or high-speed CPUs, or XXX vendor's high-end GFX card - the architecture and the tools are physically incapable of this.
If you look at the extras on the LOTR:FOTR DVD set, you'll see people walking around with a camera on a stick. This *is* displaying real-time camera images, merged into a low-res, non-final-rendered scene of the Cave Troll fight in Moria.
A point of reference - the machines they are using for this are SGI Octanes. Not Octane2s, but Octanes.
They did that work around, what, 3 years ago? And the Octane, at that time, was only 3-4 years old.
Can anyone show me a PC from 1997 that can manage that? Anyone?
Despite the fact that the Octane is an ancient piece of kit, there is nothing from the PC world that can match its capabilities.
SGI have always been, and always will be, a niche player.
You would be a fool to buy expensive SGI kit for a renderfarm - buy Intel PCs with Linux. Similarly, you would be a fool to try and do realtime CGI with that same kit - that's a specialist task that calls for specialist skills.
This article does not show that SGI is dying, or that they're being thrown out of the GFX workstation market.
This article *does* confirm what is widely known - the once cutting-edge ILM are now many years behind people like Weta Digital.
Throwing around "Linux" and "Intel replacing SGI" sound bites to try and get some news coverage for a dated effects house isn't going to change that.
I agree... somewhat (Score:3, Interesting)
The Xeon is new. That means you can get a good warranty and not have to worry about using used equipment.
The whiz-bang factor. These days most SFX software runs on both IRIX and Linux. Even Apple's Shake does. So does all of the latest Linux
Re:I agree... somewhat (Score:2, Insightful)
As an aside, I would say that you can buy second-hand Octanes, phone up SGI, and they will give you a support contract - they will even check the machines over for you.
The total cost will be a fraction of the cost of a new dual Xeon workstation.
Again, this would be a foolish decision to apply across the board, but if you're doing the sort of effects work where the strengths of something like Octane are a bonus, it's a good solution.
I know someone who runs an effects house who has bou
Re:I agree... somewhat (Score:2)
Yes, that's actually a very good analogy. Keep going, though. Is there really any computing task that involves 'hauling' more data than rendering 3D video? Would you propose using an F1 to move a truckload of stock to the warehouse? Hm, the benefits of the F1's speed just went down the toilet, eh?
Frankly, most video rendering companies only use an 18-wh
Re:A stunningly inaccurate article (Score:2)
Re:A stunningly inaccurate article (Score:2, Interesting)
From about 1995-98 I worked for an effects company, Tippett Studio, in Berkeley, CA. We did giant bugs for the film Starship Troopers using a range of SGI boxes, from a few years old to brand spanking new. At the time those machines, running IRIX, were a totally different experience from running a typical PC: they were fast and WAY STABLE, but all $10,000+. Working there felt like having a ticket to the future, and you felt like a race car driver sitting behind one.
And then I
Re:A stunningly inaccurate article (Score:2)
They said that a baseline VGA adapter couldn't display more than 256 colors simultaneously, but coders on the demoscene figured out ways to do that. And did I mention these coders were often in their teens or early 20s?
I don't think anyone thought they meant ILM was doing final-rendering of scenes that took days per frame a year ago, now in real-time. Of
Low tech solution (Score:3, Funny)
C'mon now.... (Score:4, Interesting)
I think it's funny that two recent articles call Linux immature or maturing and that Novell gets bashed for calling Linux immature, but no one (as of this writing) makes mention of ILM saying the exact same thing.
ILM's move to Linux has been a long and gradual process. "Linux is still maturing," says Plumer. "When we started working with it a few years ago, it was still in its infancy and a lot of things we took for granted coming from (Silicon Graphics') IRIX operating system just weren't there - supportive drivers from different graphics cards and other things. "It took a while to mature, but right now it's going extremely well."
Ahhhh.... slashdot :)
Not new.. (Score:2, Insightful)
Re:Not new.. (Score:2)
Real Realtime In My Dreams (Score:2)
It's the video card, not the CPU.... (Score:5, Insightful)
SGI laughed at the unassuming threat of the video chipsets, thinking that they would never be as fast as brute force. Even Pixar thought the same [siliconinvestor.com]. Boy, were they wrong though. You can set up a cheap-ass render farm for about $250k that takes up minimal space and can do the same job as an SGI render farm that costs a cool $2 million (Shuttle SFF PC w/ 3 gig CPU + ATI 9700). Of course, there's still the software side.
Nvidia's GeForce FX and ATI's Radeon 9800 both contain features that, even through the marketing hype, have some real value to programmers out there. Just look at Doom 3. It will run well on some computers that are just 6 months old. Now, imagine taking 250 of them, as a Beowulf cluster!!1
Re:It's the video card, not the CPU.... (Score:4, Informative)
A high-end video card is used in a workstation for content creation. Final rendering, however, is still done in software (i.e., by the CPU), whether it's LightWave, Mental Ray or RenderMan. Don't waste your money on a Radeon for your render node.
Re:It's the video card, not the CPU.... (Score:2)
Erm, then why does SGI get all hot and bothered over the ability to add more hardware pipelines with G-Bricks? I think it's more that video cards are designed from the standpoint that you have to render in realtime, and your quality is what is variable, while SGIs are set up to render photorealistically, with time the variable. And I remember some older article about the new video cards being potentially great for rendering, but having huge hangups in pu
It's neither (Score:3, Informative)
This is not a function of the CPU or the GFX card - they both play a part, but not as much as you seem to assume.
The main thing here is *bandwidth*. You have to have a large amount of contention-free bandwidth to chuck all that data around.
Point-to-point bandwidth between multiple devices - CPU to memory, CP
It's the... BOTH (Score:3, Offtopic)
For rendering, you need raw CPU power and middle-of-the-road networking. A rack of dual-proc PCs and a 100BaseT switch is plenty for most 3D people.
For 3D modeling, a good graphics card and a strong PC behind it is what's needed. You want a card that can handle the polygons, can handle the textures, and has enough cache for all of the display lists. A 3DLabs Wildcat-series card on a modern PC is good enough for almost any 3D animato
Re:It's the video card, not the CPU.... (Score:2)
Yes, but:
1. Subtract from that at least 10% of the CPU power, because that's going to have downtime. (This is probably a low estimate.)
2. Add quite a few dollars for increased admin time. If you don't care when boxes die, this won't be quite so bad.
3. Add quite a few dollars for increased power consumption.
4. A lot of that money will be spent on networking equipment.
5. Quite a bit will be spent on a central file server as well.
6. Forget about d
Open? (Score:5, Interesting)
Hmm... I sense a trend of calling things open when they are actually closed. This is eroding the intended meaning of "Open" in front of file formats or products.
Re:Open? (Score:5, Informative)
Re:Open? (Score:2)
Re:Open? (Score:2)
It's a continuing trend, pioneered by Avid with their totally closed Open Media Framework (OMF).
Re:Open? (Score:2)
If you can download it somewhere freely, then it's a case of yet another so-called journalist who writes for emotion and knows nothing of the English language, and you should beat the writer with rubber chickens for using the wrong word.
Re:Open? (Score:2)
That's nothing new - 10 years ago, the proprietary Motif toolkit was being put out by the Open Software Foundation. "Open" has been used as a euphemism for closed for quite a long time.
Had to be said... [Spaceballs ref] (Score:5, Funny)
Col Sandurz - "Now. You're looking at now, sir. Everything that happens now, is happening now."
Dark Helmet - "What happened to then?"
Col Sandurz - "We passed then?"
Dark Helmet - "When?"
Col Sandurz - "Just now. We're at now, now."
Dark Helmet - "Go back to then."
Col Sandurz - "When?"
Dark Helmet - "Now."
Col Sandurz - "Now?"
Dark Helmet - "Now."
Col Sandurz - "I can't."
Dark Helmet - "Why?"
Col Sandurz - "We missed it."
Dark Helmet - "When?"
Col Sandurz - "Just now."
Dark Helmet - "When will then be now?"
Col Sandurz - "Soon."
Re:Had to be said... [Spaceballs ref] (Score:2)
CORPORAL: Sir.
DARK HELMET: What?
CORPORAL: We've identified their location.
DARK HELMET: Where?
CORPORAL: It's the Moon of Vega.
COL SANDURZ: Good work. Set a course, and prepare for our arrival.
DARK HELMET: When?
CORPORAL: Nineteen-hundred hours, sir.
COL SANDURZ: By high-noon tomorrow, they will be our prisoners.
DARK HELMET: WHO?
Internal monologue (Score:3, Funny)
Intel-based Dell systems running Linux
So conflicted...Intel bad...Linux good...Dell ambivalent...
Thank God.... (Score:2)
He might have changed his mind several times along the way, and we'd all be living inside a soap bubble right now.
Where/When can they do this? (Score:2)
Jack
If they were using the preemptive kernel (Score:5, Funny)
Now that's Cost Savings!
doesn't seem that great (Score:2, Insightful)
Too Late for Regrets (Score:3, Funny)
ILM CTO Cliff Plumer attributes this amazing leap to the increase in processing power and a migration from using Silicon Graphics RISC-Unix workstations to Intel-based Dell systems running Linux.
Well, I hope all you open-source advocates are happy now. You worked to develop Linux and other open source software because it was "cool," and I'm sure you all had a great time making it more and more powerful. I'll bet you never gave one minute of thought to the fact that the software you were producing might make it easier to make those awful, awful movies, did you?
Well, it's too late now. I just hope you're satisfied!
sweet with a slightly bitter taste (Score:2)
I have been very surprised that they are FULL THROTTLE on trying to create something like real-time virtual hosts. Their goal is to have people like Bernard Shaw giving the news in, say, 2050 (long after he's dead). They even make allusions to Max Headroom. But wasn't he a state-controlled, computer-generated news and info person? Hmmm.
I also found it interesting that hardly any of the reporters at the confere