ILM Now Capable of Realtime CGI 262
Sandman1971 writes "According to the Sydney Morning Herald, special effects company ILM is now capable of doing realtime CGI, allowing actors and directors to see rough CGI immediately after a scene is filmed. Actors on the latest Star Wars film watch instant replays of their battles with CG characters. ILM CTO Cliff Plumer attributes this amazing leap to the increase in processing power and a migration from Silicon Graphics RISC-Unix workstations to Intel-based Dell systems running Linux."
Re:Errm... (Score:2, Interesting)
What's the point about this? (Score:4, Interesting)
"It's not at full resolution, but at least it gives them something to work with rather than working completely blind after each take."
Serious Question (Score:4, Interesting)
nothing inherently special about dell/linux (Score:5, Interesting)
You can get more processing power with the latter since it is cheaper (I would imagine even moreso with AMD) and easier to maintain. But not because it is inherently special or faster in any way.
I wonder if this will bring Silicon Graphics back to favoring Intel boxes - for a while they were okay with WinNT and Intel boxes, but then they dropped all of that - presumably for a higher profit margin and less hassle maintaining multiple systems (also likely some break in business politics - perhaps someone at MS pissed someone off at SGI).
NVIDIA + Maya 5? (Score:1, Interesting)
This has been coming for a while (Score:2, Interesting)
out and the horsepower they can throw at things, they would eventually be able to do TV-quality 3D animation programs in real time. Hopefully this is going to lead to a lot more 3D animated series on TV in the near future and, in time, pick up from where Final Fantasy left off. I still think it was such a pity that film didn't get people into the cinema to watch it. But I think the advances they made will pave the way for the future. Mainstream 3D anime here we come.
Will this really improve movies? (Score:2, Interesting)
The actors might be able to play their roles slightly better if they know what the final result will be. In movies like Episode II they were acting totally blind in front of a screen for most of the movie. Very little of it was actually built.
The biggest question is "When will we have it at home?"
Another nail in the SGI coffin (Score:5, Interesting)
Obituary:
SGI (aka Silicon Graphics Inc.) was found dead today at the age of 20. After being a high flyer in his youth, often seen hobnobbing with Hollywood's power elite, the latter years were not so kind and saw him in the throes of an identity crisis. Eventually his reliance on a small circle of friends was his undoing, as he was slowly replaced by more mainstream competitors. He will be sorely missed; while he was at the top, he was a role model for "cool" in the industry, and helped to usher in one of the most exciting (and abused) technology shifts in the motion picture/video entertainment industry since the advent of talkies and color.
Re:further proof (Score:5, Interesting)
further proof that commodity hardware is killing innovative companies like SGI, and a FREE UNIX is helping it happen.
Linux is great for a company like ILM which is stuffed full of coders who can adapt it to suit their needs, not so good for many other companies.
Re:nothing inherently special about dell/linux (Score:2, Interesting)
if you are talking about "running RISC chip A at 1Ghz is this much slower doing this than were I to run it on chip B at 1Ghz" - then that is totally different.
that is a benchmark that is useless - especially in terms of real world usage.
what is useful is exactly what I said in the first post - bang for the buck.
If you run Dell/Linux and pay $500 for one entry-level node, and your budget is $50K for the project, then you can have 100 nodes to crunch data on.
If you run SGI and pay $2000 for one entry-level node with the same budget, you get 25 nodes - so you are going to get more bang for the buck from the Dell/Intel/Linux combo.
But it isn't that Dell and Linux are somehow special - they are just cheap. SGI has plenty of solutions that kick the shit out of anything an Intel/Linux combo ever could - but they are cost prohibitive.
you can point to Spec.org all you want, but that won't change basic economic theory.
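The bang-for-the-buck argument is just division. A minimal sketch of it in Python - the node prices and the $50K budget are the hypothetical figures from this thread, not real quotes:

```python
def nodes_for_budget(budget: float, price_per_node: float) -> int:
    """How many render nodes a fixed budget buys outright."""
    return int(budget // price_per_node)

budget = 50_000  # the hypothetical $50K project budget from the post above

dell_nodes = nodes_for_budget(budget, 500)    # $500 entry-level Dell/Linux node
sgi_nodes = nodes_for_budget(budget, 2_000)   # $2000 entry-level SGI node

print(dell_nodes, sgi_nodes)  # 100 vs 25 nodes for the same money
```

Whether 100 cheap nodes out-render 25 expensive ones then depends entirely on per-node throughput, which is the part Spec.org numbers don't settle for you.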
A stunningly inaccurate article (Score:4, Interesting)
These images are *not* realtime! A PC is not capable of rendering a CGI screen, in realtime, and merging that, in realtime, with a video feed, and then displaying that, in *realtime*.
Say what you like about Linux, or high speed CPUs, or XXX vendor's high end GFX card - the architecture and the tools are physically incapable of this.
If you look at the extras on the LOTR:FOTR DVD set, you'll see people walking around, with a camera on a stick. This *is* displaying real time camera images, merged into a low res, non final rendered, scene of the Cave Troll fight in Moria.
A point of reference - the machines they are using for this are SGI Octanes. Not Octane2s, but Octanes.
They did that work around, what, 3 years ago? And the Octane, at that time, was only 3-4 years old.
Can anyone show me a PC from 1997 that can manage that? Anyone?
Despite the fact that the Octane is an ancient piece of kit, there is nothing from the PC world that can match its capabilities.
SGI have always been, and always will be, a niche player.
You would be a fool to buy expensive SGI kit for a renderfarm - buy Intel PCs with Linux. Similarly, you would be a fool to try and do realtime CGI with that same kit - that's a specialist task that calls for specialist skills.
This article does not show that SGI is dying, or that they're being thrown out of the GFX workstation market.
This article *does* confirm what is widely known - the once cutting edge ILM are now many years behind people like Weta Digital.
Throwing around "Linux" and "Intel replacing SGI" sound bites to try and get some news coverage for a dated effects house isn't going to change that.
Re:nothing inherently special about dell/linux (Score:4, Interesting)
I work in TV, and I know first hand that SGI is losing out to commodity hardware running Linux, Windows and even to the Mac. SGI gear is just about hanging on thanks to Discreet - but it's just a matter of time before an Inferno for Intel product lands and a lot of Onyx racks hit eBay.
Unless, of course, SGI fights back...
C'mon now.... (Score:4, Interesting)
I think it's funny that two recent articles call Linux immature or maturing and that Novell gets bashed for calling Linux immature, but no one (as of this writing) makes mention of ILM saying the exact same thing.
ILM's move to Linux has been a long and gradual process. "Linux is still maturing," says Plumer. "When we started working with it a few years ago, it was still in its infancy and a lot of things we took for granted coming from (Silicon Graphics') IRIX operating system just weren't there - supportive drivers from different graphics cards and other things. "It took a while to mature, but right now it's going extremely well."
Ahhhh.... slashdot :)
Open? (Score:5, Interesting)
Hmm... I sense a trend in calling things open when they are actually closed. This is eroding the intended meaning of "Open" in front of file formats or products.
Re:nothing inherently special about dell/linux (Score:3, Interesting)
The article states that they upgraded their hardware and the new hardware is faster and cheaper than the prior hardware... uhhh, right - I'm pretty sure that is how the hardware world works.
Where you could argue that Linux has its edge is stated right in the article - it is the driver support. SGI doesn't support certain drivers, and for good reason - they want to push their own stuff. So if they want to work with new hardware - like the new NVidia chips for realtime rendering the same way SquareSoft did, then SGI isn't going to help.
Also, workstation speed is all relative - it depends on what you are doing on the particular workstation. Are they slower at working with real-time video? Are they slower at network filesharing? Is their memory bandwidth too slow for the hardware to make full use of the processor?
To say it is too slow is a cop-out - the hardware exists for a specific reason. SGI makes very task-specific workstations, and they are arguably useless outside of that realm.
And while it's a fantastic thing to throw around that you "work in TV", as if what you say is now backed by a whole industry instead of just your opinion - by that logic, since I once worked at a special effects house, I should now have more weight in what I say, right?
I assure you that whether the effects house is SquareSoft, ILM, Digital Domain, or whatever - they all are businesses and have a single bottom line - they need to make money.
In order to make money, they won't ignore cost as you say. But it might look like that if they are rationalizing cost (a 100-node cluster of SGIs might be a million dollars, but a 200-node cluster of Alpha boxes might be $1.75 million - they are spending more money, but they are getting a much faster overall cluster).
To argue over their workstations is silly in the end - the workstations are constantly being turned over at these places and nobody is ever satisfied with their performance. They don't really care if your workstation is top notch - what they care about is how fast the end product can be realized - if a faster workstation would result in that, then you get a faster one based on cost - but almost always, the entire focus of the drive of machines purchased is the rendering farm.
Even then, it hardly ever is truly purchased - it is a lease type deal since the turnover is so high.
I personally hated SGI when we worked with them and I much preferred the Intel boxes. So I'm not exactly standing up for SGI here; I mainly just thought the article was poorly written and should have called out the reason for the switch better, instead of just adding one more article to the Linux circle jerk.
Also I should note that I wrote SGI/Intel on WinNT up there - that is wrong - it was SGI/Alphas with WinNT. I would imagine that Intel and AMD now making the new 64bit chips will lead to a lot bigger jump over SGI.
Re:Two Towers (Score:3, Interesting)
This is much higher res (though obviously not THAT great) rendering, which is really useful.
I agree... somewhat (Score:3, Interesting)
The Xeon is new. That means you can get a good warranty and not have to worry about using used equipment.
The whiz-bang factor. These days most SFX software runs on both IRIX and Linux. Even Apple's Shake does. So do all of the latest Linux utilities. It's "cooler" to many people to use a Linux workstation over an Octane.
The CPU. Granted, the Octane's torque came from its architecture, not its CPU... but that alone does not make up for the raw power of those Xeons. It's like racing an 18 wheeler with an F1 race car. The 18 wheeler can haul a lot more, but the F1 race car will get you to the local Wal-Mart a lot faster. For small tasks, the Xeon will feel a lot faster.
This is why you'll still see a lot of existing Octanes, Octane2s, Onyx/Onyx2/Onyx3000 systems in use by Hollywood. They work fine. But for new employees, and for replacement hardware, you'll almost certainly see a dual-proc PC running Linux. There are, of course, some artists that prefer one over the other.
As for render farms, you're right. It only makes sense to use Intel or AMD. Using SGI (or Sun) big iron for rendering would be insane. The render software isn't even optimized for IRIX or the SGI Origin architecture anymore. I think the very last holdout was ILM, who still had their three huge Origin 2000s. (Running 1999 R10K 250MHz processors [about the equiv of a PIII/550]..... no wonder they find their new renderfarm to be faster.....).
You're right on the money about ILM being behind Weta (and possibly others). ILM is still a cool shop, but not as current as many of the others. Hell, ILM did most of the work for Episode 1 on SGI O2s. O2s! The O2 was an OK video editor, by no means a 3D or CPU powerhouse! I can't believe they got it done at all. There are HUGE differences between the O2 and Octane.
ahh for the days of 1hr=1sec (Score:1, Interesting)
now many of those folks from 20yrs ago are running the 3d shops of today. Cheers to you all!
Re:A stunningly inaccurate article (Score:2, Interesting)
From about 1995-98 I worked for an effects company, Tippett Studio, in Berkeley CA. We did giant bugs for the film Starship Troopers using a range of SGI boxes from a few years old to brand spanking new. At the time those machines, running IRIX, were a totally different experience from running a typical PC: they were fast and WAY STABLE, but all $10,000+. Working there felt like having a ticket to the future, and you felt like a race car driver sitting behind one.
And then I departed Tippett Studio and bought a PC for a couple thousand bucks, running Softimage on NT, and guess what? - The sucker was faster than any SGI I had ever used, and almost as stable! Now I use Maya running on Linux - and it is also faster than any SGI I have ever used, and just as stable! Most animators I've talked to have had similar experiences - it's not that they want it to be that way, it just is that way.
Now I'm sure SGI can cook-up a box that's more impressive than a typical PC of today, but I'd have to sell my house in order to buy it, and I'd be stuck with it for a decade, struggling to save up to buy a new one. I'll stick with cheap PC's and Linux, thank you.
To say that SGI will always be a niche player is just ridiculous in my book. People use what's fast and cheap - PERIOD. Fancy logos and claims of superiority don't help people who just want to get the job done with minimum damage to the pocketbook.
I don't think owning an SGI can even get you laid any more, so why bother?
Better to be the idiot who has learnt than the genius who can't.
Are those your ass-lips moving? (Score:1, Interesting)
Stuff like this is inevitable. Computers get faster, so what used to be very difficult is now easy, so you can do it with commodity parts.
What used to be impossible is now difficult. That's what SGI does. They don't charge 2 million for a computer because they're snobs, they charge it because 1) That's what it costs to develop the things and 2) They solve problems that people are willing to pay $2 million to solve.
All this says is that the CGI problem has become simple enough to solve that you don't need $2 million to solve it anymore. So SGI just moves on to solving other $2 million problems that cost $50 million to solve a few years ago.
All we're seeing is one problem set fall out of SGI's target market as different problem sets fall into it.
There are reasons for buying SGI... (Score:3, Interesting)
It's amazing they still have a few dollars in the bank. They've sold some patents and sold off Cray (albeit for pennies on the dollar from what they originally paid), but to last that long is impressive in itself.
At any rate, SGI does offer unique and unmatched products *in certain areas*...
Do you need a shared-memory supercomputer that can scale to 512 processors with the same exact kernel that runs on a desktop model? Or how about 1024 processors with a simple kernel patch? Very few people need that much IO across that many processors, but for those that do, there is no better choice.
Do you need a machine that can handle dozens of channels of 2gbit fibrechannel without breaking a sweat?
Do you have an insanely complicated set of HD video files and other material that need to be layered/composited? Does this job need to be done yesterday? Is full-resolution/full-quality realtime effects work needed? Piranha or Inferno running on an Onyx 3000 (plus gobs of ram and disk arrays on several channels) can do this for you.
Are you interested in seeing the true potential of Linux? Do you want to work with a true Itanium2/Linux supercomputer... one that is way more than a cluster? Want to see a single machine (again, not a cluster) with 64 processors and 512 GB of RAM? Yes, Linux can handle it too, because of SGI's kernel patches and hw/sw architecture.
Not many people need or can afford SGI big iron... but for those that do, nothing beats the SGI Origin and its baby cousin, the Altix.
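A rough illustration of what "one machine, not a cluster" means: on a single-system-image box like an Altix, one kernel owns every processor and all the RAM, so plain /proc reports the whole machine with no cluster middleware involved. A minimal Linux-only sketch (machine_size is a hypothetical helper; the counts depend entirely on the box you run it on):

```python
def machine_size():
    """Count CPUs and total RAM visible to this one kernel via /proc."""
    with open("/proc/cpuinfo") as f:
        # one "processor" stanza per logical CPU
        cpus = sum(1 for line in f if line.startswith("processor"))
    with open("/proc/meminfo") as f:
        # MemTotal is reported in kB
        mem_kb = next(int(line.split()[1])
                      for line in f if line.startswith("MemTotal"))
    return cpus, mem_kb

cpus, mem_kb = machine_size()
print(f"CPUs visible to one kernel: {cpus}, RAM: {mem_kb // 1024} MB")
```

On a 64-processor, 512 GB Altix this would report the whole machine from a single process; on a 64-node cluster, each node would only ever see its own slice.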