Lord Of The Rings Being Rendered Under Linux

Along with an adventuring band of others, tmatysik writes: "Came across this article in the New Zealand Herald the other day. Seems that Weta Digital is now moving over to Linux for the rendering work on Lord of the Rings. Two quotes from the article especially caught my eye: [1)] 'We were able to get the SGI 1200 servers for about $15,000 each or $7500 a processor, and they run more than twice as fast as the [$40,000] Octanes for pure rendering.' [and 2)] 'Just by putting in a Linux processor, the price to do a frame is up to a tenth of the cost as on an SGI workstation [running SGI's Irix operating system] so the things we can attempt are more complex.'" Update: 08/27 09:35 PM by CT : Rebecca from WetaFX sent us pictures of the team, and the mighty stack that shall render Lord of the Rings.
This discussion has been archived. No new comments can be posted.

  • by -Harlequin- ( 169395 ) on Saturday August 26, 2000 @10:44AM (#825633)
    A lot of comments here equate LoTR with the MPAA. While the film is being produced for and funded by a MPAA member, I think it bears pointing out that the LoTR people have negotiated a fair amount of freedom from MPAA influence, and that the production companies involved (WetaFX for example) are not subsidiaries of MPAA companies (even if most of their more recent income has come from MPAA members).

    (And just to rub it in, the MPAA's region coding system is an illegal trade barrier in New Zealand :-))

    Unfortunately, if LoTR is as cool as it looks so far, the MPAA will reap the financial reward. On the bright side, it might convince them to break the mould again rather than pump out more puerile cringe-comedy films and the other garbage they won't leave alone.

    Another pain for the boycott idea - how to support people like the LOTR teams while not supporting the MPAA. Sigh.
  • (OT, yes I know...... sorry!)

    You should try Free Pascal (freepascal.org). I've done quite a bit with it, it's 32-bit, compatible with DOS (via extender) and Linux......

    -----
  • Thanks for the info! I'm relatively new to /. and am still learning (sometimes the hard way) what constitutes "helpful" information versus "redundant" and the like. I expect I'll keep making mistakes, but I *do* try and learn from them!

  • Remember, Amiga isn't just a software package, it's hardware. Old, slow hardware.

    Linux, OTOH, is portable software. It can be ported to every new machine, and take advantage of the new speed to be competitive with other new OSs.

    It's entirely possible for it to still be in fairly common use twenty years from now.

    ---
    Despite rumors to the contrary, I am not a turnip.
  • I actually would like to try Win2K, but unfortunately I cannot afford either the OS itself, or the hardware it needs to run.

    If Microsoft would like to make available a stripped down version that would run on my system, at a price that a student can afford (remember that what to you might be the cost of a music CD means the difference between eating and starving for the next week to me), I would gladly try it out.

    It is not a matter of whether I am interested in taking the time to "bother trying it out" or not, the simple fact is that I don't even have the option of trying it out.

    It is not my "narrow-minded FUD ethic", but a simple matter of economics.

  • I think it's great that linux has found a way into the effects arena. Its power and price simply cannot be ignored.

    Where linux is still really lacking though is on the actual workstations. At these effects houses, nobody but the admins even touches the linux boxes. People sitting at their irix/nt boxes just order up some processing power on their render queue, and wait for the linux machines to chew through it.

    I will be really impressed the day when linux is used for the actual production and artistic work. That day is definitely coming, but it isn't here as of yet.
  • It's a long time since the Pacific Peso was worth that much. The New Zealand dollar is around the US$0.43 mark and GBP0.29. Great for me, since I live in .nz and earn in USD and GBP 8).


    --
    My name is Sue,
    How do you do?
    Now you gonna die!
  • Fair comment, but [while I don't go shopping for this type of equipment myself] the prices on those SGI 1200's looked a steal to me.

    Buy something at half the price - buy twice as many - more than make up for the speed loss. I think that Titanic was rendered on x86's. Go figure. :-)
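
    For what it's worth, a quick back-of-envelope check of the numbers quoted in the article (treating "more than twice as fast" as simply 2x, and ignoring which currency is meant, so this is only a rough sketch):

        # Rough cost-per-unit-of-rendering comparison, using the article's figures
        octane_price = 40000.0     # quoted Octane price
        sgi1200_price = 15000.0    # dual-CPU SGI 1200 Linux server
        speed_ratio = 2.0          # "more than twice as fast" for pure rendering
        relative_cost = (sgi1200_price / speed_ratio) / octane_price
        print(relative_cost)       # 0.1875, i.e. roughly a fifth of the cost per frame

    The article's "up to a tenth of the cost" figure presumably folds in more than raw hardware price, but even this crude ratio shows why buying twice as many cheap boxes wins.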
  • Though.. for a farm... yeah. You fire out tons of little linux boxes for the price of a single sparc...
  • Why the fuck did this guy get +3 funny when #52 got flamebait? #52 was funnier and posted first. This fucking comment is redundant not +3 funny. Get your head out of your asses moderators.
  • by toofast ( 20646 )
    That was a shameless plug for your own Slashdot-like website. Did it work? Have your hits gone up? Is Slashdot a viable free-advertising place? In any case, I went. I don't know if I'll return to your site, but I went. At least now I know about it.
  • Peter Jackson has already done that sort of thing. [imdb.com]
    ---

  • Even if they had used SCO Unix, it would have been far cheaper than that equivalent SGI platform.
    Yeah, but it wouldn't have been as stable, it wouldn't have had SGI support, it might not have had renderman support and it might not have been quite as nice to develop in-house stuff on. (If you think they don't develop some of their own in-house graphics SW, you're nuts.)
  • Cracks of Doom become Blue Screen of Doom, Frodo gets hypnotized by eerie blue light and forgets about One Ring

    Halfway through the first (and second, third, and fourth) trial runs, box pops up with message "Program MegaSuperDuperRenderer has performed an illegal operation and will be shut down" followed by intense programmer swearing

    For some odd reason, no matter what they do, Sauron and his nazgul show an eerie resemblance to Steve Jobs (and gollum looks just a little bit like Linus...)

    No one can figure out why the stars move towards them all the time, making everyone sick

    By the time the movie is made, Microsoft has become Micro and Soft, and both of them want credit

  • Why is it that everyone who doesn't think Linux is the son of Zeus always gets labelled a Troll? Hey, if they rendered on PalmOS, you wouldn't print this. Linux this, linux that, who cares it's just another OS. OS's aren't everything, algorithms/software/innovation is. There is no innovation in this at all. Next Slashdot headline: Someone's mother uses Linux to check email. Hmmm, wow. Linux news! Front page!
  • Comment removed based on user account deletion
  • Ahh I remember it now... The arguments on comp.sys.amiga.advocacy:

    "The Amiga is slow, old and obsolete. It's dead, time to move on."

    "Oh yeah! They use the Amiga to render the scenes in Babylon 5. Yeah! And I heard they used an Amiga 2000 to prop the doors open when rendering Jurassic Park!"

    Years from now in comp.os.linux.advocacy when people point out how pointless Linux is in the modern world, the advocates will now be able to say "Oh yeah! Well they used Linux machines to render Lord of the Rings!"

    Kind of touching, really. :)
  • It's interesting that Linux is being used, but that's not what makes it cheap. What makes it cheap is the Intel hardware. Even if they had used SCO Unix, it would have been far cheaper than that equivalent SGI platform.

    And just to be controversial [:)], I'll say that you really have to give the credit to both Microsoft and Intel. They are really the ones who brought affordable computing to the masses, and through that economy of scale we have the incredibly cheap power that we have. [God knows Apple didn't bring cheap computers to the masses, despite their "computer for the rest of us" claims. Their tagline should have been "the computer for the arrogant elite"]


    --

  • At best the frames are 4000x4000x4 (i.e. 32bpp) or 61 MB apiece. This would give them about the same spatial resolution as a 35mm negative, and a comparable color depth. More likely they are 2Kx2K or 3Kx3K.
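
    The arithmetic behind those figures, assuming 4 bytes per pixel (32bpp) as the parent post does; real film pipelines may well use deeper formats:

        # Frame size in MB at 4 bytes/pixel (1 MB = 2**20 bytes)
        for res in (4000, 3000, 2048):
            size_mb = res * res * 4 / 2**20
            print(res, round(size_mb, 1))   # 4000 -> ~61.0, 3000 -> ~34.3, 2048 -> 16.0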
  • I believe that almost every high-end visual effects company is using Intel/Linux rendering farms now. You can't avoid it.

    One of the big reasons is that the rendering software is expensive, and it's priced 'per processor', so you really want to have the processors be as fast as they can be. Right now, that means Intel architecture (here at Hammerhead we use Athlons); and Linux is by far the nicest way to use the IA32 machines.

    Interestingly, the only company I can think of that doesn't use primarily IA32 boxes for rendering is Pixar (who write and sell RenderMan, the most popular rendering package). They use mainly Sparcs. One reason, I suppose, is that they don't have to pay for RenderMan :) The other is that on a speed/cubic-foot (as opposed to speed/dollar) metric, I'm told the Sparcs are a little better. I don't believe that, though.

    I interviewed people at every effects house a couple of months ago for a Slashdot article I never finished :( and every single one was building Linux render farms.

    thad
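
    A toy illustration of why per-processor licensing pushes render farms toward the fastest CPUs available; the license and hardware prices below are made up, not Hammerhead's or anyone else's real numbers:

        # Cost per unit of rendering throughput = (hardware + per-CPU license) / speed
        license_per_cpu = 5000.0                  # hypothetical per-processor renderer license
        cpus = {"fast x86": (2000.0, 2.0),        # (hardware cost per CPU, relative speed)
                "cheap slow CPU": (800.0, 0.7)}
        for name, (hw_cost, speed) in cpus.items():
            print(name, (hw_cost + license_per_cpu) / speed)
        # The fixed license dominates, so the faster CPU wins even though it costs more up front.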

  • Heh, just in case you were saying that NZ and Australia are the same (as your comment suggests), we aren't. We commonly don't use the same international links either.

    It has been a long time since we were reliant on small links. There are numerous satellite links, and of course the old fibre links across the Pacific. I don't know if the Southern Cross cable is fully operational yet, but if it is then various networks and ISPs will have a whole bunch of bandwidth.


    Still, given that what I've just said will be taken as overreaction, I'll just shut up now :)

    Sorry, I just can't take misinformation that involves me being lumped in with Australia!

  • That just isn't true. Australia uses the European standards (E1/3) as opposed to the American standards (T1/3). E1s are superior to T1s. Sorry.

  • Hate to break it to you guys, but when you're doing "real work" like rendering (or anything else), the OS is 100% totally and completely IRRELEVANT.

    You find the hardware with the best floating point, and best memory bandwidth. Then you try and get the rest of things as cheap as possible, which means Linux or BSD.

    Maybe this was about saving money and had nothing at all to do with the OS, but that doesn't matter anyway?

    *hears the sound of bubbles bursting*

  • most stable windows yet? that's like referring to a new missile as 'the safest nuke yet'

    -GreenHell
  • I really like IRIX (sigh). Nice environment, but limited by MIPS processors. Hope SGI plans to port over some of their nicer IRIX tools. I now run both, but still enjoy the very well tuned IRIX desktop. Go faster GNOME!

  • Hmm, couldn't you intercept the output and splice in a frame of oh, say, a flaccid penis?
  • Point the first: 3DS was originally a DOS app, I believe.

    Point the second: Cheap and generic can imply *not* prone to failure. A mass-market vendor that's shipped and supported a quadrillion units of a given PC configuration can be pretty confident that it's encountered and fixed all the problems with that configuration, and is therefore selling a solid PC. Saying that no-one can build a PC like SGI is just silly.
  • Before you "do the math" on what Weta paid per processor, you'll need a currency conversion table. When they say $15,000 per processor, they mean fifteen grand NEW ZEALAND which is about seven grand US. No kidding. Still, we do CFD on a 16-processor cluster (WAY more cpu-intensive than movie making! Sorry!) and we paid about fifteen grand NZ for the whole darn thing. I'd say they still paid at least 10 times more than they had to, just for the security of getting support (?!?Ha ha ha) from SGI (now aint that just the drizzilin' squits..).
  • Well, they are working on better networking. Clustering isn't forthcoming, but that's a user-space issue anyway. Linux SMP is better, but nowhere near as good as BeOS's yet. (actually it never will be. BeOS apps are inherently multi-threaded while Linux apps are not.)
  • I work at / attend a university with a large structural biology facility (and a number of excellent crystallographers). I'm actually learning bioinformatics, but there's a lot of crossover largely because of the shared need for high-power computing resources. Our lab is based largely on Linux, with a mix of Linux and NT on the desktop; we only have a couple of SGIs because for what we're doing (genomic sequence alignment, Perl, C, web development) Linux is obviously a more cost-effective solution (and we mostly support the systems ourselves).

    The structural biologists still use SGIs (and a few Alphas) for anything important, though. The combination of available software, long lifespan, and, yes, overall power makes the workstations very attractive despite Linux, but I also get the feeling that it's just something people are comfortable with. NT remains confined to a few boxes used for word processing, thank god. SGI also can apparently be very generous with people developing Unix software, which helps prevent mass migration to Intel-based systems.

    I'd rather use Irix, myself- though I learned Unix primarily on Linux, I was blown away by the elegance and stability of workstations that must be pushing five years old now (there's still an Indigo for special imaging use; there are some 9-year-old Suns lying around too). AND THEY WORK. I've made Linux scream in pain, on a VA Linux workstation, no less, though it's nothing compared to what I've done to NT. :) I'm hoping SGI will port their X server and window manager to Linux- It simply blows away anything comparable. I personally use Linux and VMware, but a lot of people aren't willing to put up with the inherent instability, clumsiness, and pure ugliness of X on Linux, and for this reason and the convenience of Office we now have Unix programmers using NT desktops.

    I really don't see much pressure to change systems on a large scale. With us, certainly, SGI has priced itself out of business; but we'll keep using Unix of some sort for a long time. The sysadmins here are all Unix/VMS types, and when your PHBs are themselves longtime Unix users/programmers there's simply no reason to switch. But you won't find many people doing their dissertations or research papers in TeX any more- in that arena, MS has most certainly won.
  • It has been a long time since we were reliant on small links.

    Well, I read in the Economist that NZ's primary link to the outside world was a waxed piece of twine, and the Aus. government was even demanding that NZ pay for dixie cup replacement on their end... for bigger data needs they mentioned a bi-weekly boatload of DLT tapes makes the trip.

  • I know of one computer animation company that is a massively large user of VA Linux boxes, but I can't reveal who the company is.

    I don't know how you qualify "massively large," but here at PDI [pdi.com] we've been using about 140 PCs from VA for about two years now for all of our work. This is in addition to our 200 dual-processor Origin200s. The PCs are dual 450 MHz PIII 2U boxes with 1 GB. Our new boxes will have dual 800 MHz PIIIs and 2 GB of memory and will hopefully be 1U. We primarily use proprietary tools in our pipeline, so it wasn't too hard for us to get everything up and working. The toughest part was the UI code and the tools for film resolution flipbook and QuickTime playback. We're going to start having our animators use PCs for the primary desktop machine by the end of the year.

    A few months ago, the Visual Effects Society met to discuss Linux usage. I can't find the link right now, but a fairly strong message was released indicating the industry support for Linux. Although they still have a large Sun farm, and still use SGIs on the desktop, Pixar too is working with Linux.

    We tend to run two frames at a time on each box. That way, they can share the available memory and other resources, and, when we need to render really big frames, we can just run a single frame at a time to get both processors for rendering and all the memory. I think Pixar does substantially similar things on the big Sun boxes, rather than using heavily multiprocessed code. Usually for us, it is memory, rather than processing power that presents the biggest problem.

    Daniel Wexler
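
    A minimal sketch of the "two frames per box, unless the frame is huge" policy described above; the names and numbers are hypothetical, not PDI's actual queue code:

        def frames_per_host(frame_mem_mb, host_mem_mb=1024, cpus=2):
            # Run one frame per CPU when that many frames fit in memory together;
            # otherwise give a single big frame the whole box (both CPUs, all the RAM).
            if frame_mem_mb * cpus <= host_mem_mb:
                return cpus
            return 1

        print(frames_per_host(300))   # 2: a 300 MB frame shares the 1 GB dual box
        print(frames_per_host(700))   # 1: a 700 MB frame gets the machine to itself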
  • That's ok, no one's looking at it now... :)

    Yeah, freepascal is pretty good; it's better now. I found it when it was fpk-pascal, and some basic functions weren't quite the same. (also, my event loops could take up 100% cpu in Linux until I explicitly told them to sleep; it's not like DOS anymore!)
    ---
    pb Reply or e-mail; don't vaguely moderate [ncsu.edu].
  • I think it would also have to be called Lord_Of_The@nospam.dontemailme.Rings
    [remove the nospam.dontemailme to read the title]

  • > An SGI RealityMonster could still blow whatever sorry Linux configuration they had away.

    An SGI RealityMonster would be a waste of money since this application doesn't require graphics.

    Of course, they might (!) need graphics for some other part of the movie-making process, in which case they could kill two birds with one stone.

    However, the fact that they can get away with a low-power solution such as the SGI 1200s means that they don't need all the connection fabric of a RealityMonster, so that would probably be a waste of money too.
  • > Perhaps SGI isn't supporting Linux even, I cannot remember, however

    SGI does support Linux...
  • Try again.

    The last SGI system listed (SGI 2200 2X 400 MHz R12K) is for a single processor system - notice the "1" in the column labeled "CPU". The results for that system are 319/343, which beat the fastest Intel results of 304/314 for the Intel VC820 (1 GHz Pentium III).

  • That's a really interesting mental image I got just now. Thanks. ;)

    Only a corporation could be capable of such a contortion...
  • "Linux renders ships onscreen, NT renders ships useless"

    (I run BeOS, Win2K, and BSD, so no flames :)
  • I can't find an article link for it, but with the lack of region coding here there have been recent problems where video stores have been parallel importing DVDs before the cinema releases.

    Naturally this is just now beginning to cut into ticket sales, which I guess is one of the few down sides of not having region encoding. Not that people have more choice, but that the big screens might start disappearing in some smaller areas.

    On the other hand, it might mean they'll start releasing movies immediately when they're available instead of this irritating profit maximising thing when they wait for the right season to come around. Sometimes we get films immediately or even a day before the U.S. because of the time zone differences. Other times we have to wait up to six months!


    ===
  • If more people would seriously consider using linux for their work, they might be surprised at what they could do - and how much money they could save. I'm glad that the people making LoTR decided to use Linux because now we will most likely get better effects. Now I just hope that other companies would look at Linux seriously... It would spread linux much more, AND the companies would be saving money so they could produce better/cheaper products.
  • They also used Windows NT [ssc.com] for 1/3 of the machines, which explains the presence of (and need for) those KVM switches. :-)

    The article is a good read, explaining the rationale behind their choice of Linux, and comparing their experience to NT, Digital UNIX and other platforms. (There's even the gratuitous server-room shot, looking very much like a 90s version of the "Mother" computer room in Alien -- probably the same picture you saw.)

  • by tolldog ( 1571 ) on Saturday August 26, 2000 @04:49PM (#825676) Homepage Journal
    Big Idea [bigidea.com] made the same decision not too long ago.
    I never bothered to figure out what the cost/performance ratio was between the two platforms. I just knew that Linux was a lot cheaper.

    The thing that surprises me is that they only have 16 boxes... and that this is newsworthy. We have 42 (with almost exactly the same config) and plan on ordering more in the near future.

    One other difference is that they are using RenderMan and we are using Maya's [sgi.com] renderer, which has recently been ported to Linux. For the type of work that we do, Maya is more than enough for us. Also, we hardly have any frame times of an hour... if we do, I yell and scream to get it cut down. The difference is that we are only rendering to 724x486. They use a higher resolution for the big screen; we only have to worry about NTSC for now.

    The prices quoted in the article seem to be inflated a bit too, unless they are quoted in NZ dollars, which are about 2:1 with the American dollar. Octanes should be about $20k in the States. Similarly configured Linux boxes from VA [valinux.com], which we use instead of SGI, are in the $5-6k range.

    The performance boost that we have got from adding the Linux boxes is amazing. We went from being able to render on 50 MIPS CPUs 14 hours a day to rendering on an additional 84 Intel CPUs 24 hours a day (rough math on that jump at the end of this post). The comparison in render speed is about 1:1, slightly in favor of the MIPS chip.

    When it is all said and done, a move like this should be a no brainer to any studio. Rendering needs horsepower and system memory, both of which are cheaper in the Intel world. SysAdmins need Unix for ease of administrating and lack of down time... every studio I know considers the render system to be H.A. (high availablity). When things go down, projects get delayed... and that is a big no no in the production world (think of all the billboards you see with release dates on them... months in advance). Linux is a great choice for this.

    The only drawback is that the SGI boxes have the ccNUMA interface, which is great for single-frame renders.

    I know that I have posted many times on /. about rendering and Linux vs. SGI. It is great that there is now a story dedicated to it.

    FWIW: Our next two videos are the first to use our Linux render farm. Esther [bigidea.com] is at least 3/4 rendered on Linux boxes and Penguins [www.321penguins] will be mostly, if not all, rendered on Linux boxes.

    -Tim Toll
    Render Architect
    Big Idea Productions
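
    Rough math on the capacity jump described above, taking the roughly 1:1 per-CPU render speed at face value:

        mips_cpu_hours = 50 * 14      # 700 CPU-hours/day on the MIPS boxes, 14 hours a day
        intel_cpu_hours = 84 * 24     # 2016 CPU-hours/day added by the Linux farm
        print((mips_cpu_hours + intel_cpu_hours) / mips_cpu_hours)   # ~3.9x the daily render capacity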
  • A lot of people have touted IRIX as being so vastly superior in performance to Linux under these circumstances. It's good to see that Linux has caught up quite a bit. Granted, I'm sure that Irix still has a bit of an edge, since it's well customized for the task, but the race is close enough that Linux is more cost-effective. I've always been a bit leery myself of the concept of one operating system powering everything from PDAs to render farms, but I'm thrilled to see a Free Software product that can scale in both directions so phenomenally.
  • priced per processor...

    Not all renderers are. We use Maya's renderer, which they recently opened up the license on, so the price is nil for the renderer.

    The only price for us is the boxes. That is where the Intel/Linux advantage comes in for us.
  • I've always wondered about that foot in the mouth thing. When you think about it, the MPAA having their foot in their mouths is a doubly amazing feat, since their heads are already so far up their asses. How does one get one's foot in one's mouth when said individual's head is up their ass? I've always wondered about that.
  • by torpor ( 458 ) <ibisum@@@gmail...com> on Saturday August 26, 2000 @11:04AM (#825680) Homepage Journal
    They'd have to take their head out of their ass before they could take their foot out of their mouth...

    ;)

  • the other reason PIXAR uses Sparc machines is because they don't pay much for them - they have a rather nice deal with Sun where Sun gets bragging rights and PIXAR pays less.....
  • Linux can be configured to do everything but take out your garbage.

    LOL. There is a garbage collector in Java, which runs under Linux.
  • He's talking about the 3D model for the image. I'm still not sure if this is several GB large, but it has nothing to do with the resulting pixel image.
  • I work for a visual effects company here in Australia, and we still use big SGI's along with NT renderfarms.

    We have movies such as The Matrix, Babe and Thin Red Line to our name.

    Animal Logic, mate? ;)

    The best thing about using NT is, the same machines which are used for animating and modelling in the day can be used for rendering in the night. This to me makes more sense than buying a machine for one specific role.

    The same thing could be said about SGI/Irix workstations or anything else that can run both the modeller/animator user apps and the automated network renderer.

    And since they have that much power I'm guessing that they need it to keep on schedule. But it would be useful to be able to add the workstations to the renderfarm-pool.

  • They may have originally BROUGHT cheap computing to the masses, but they don't intend to leave it there.

    Like the drug dealer who gets the junkie hooked with a "free sample", Microsoft and Intel have been steadily raising the bar, with bloatware, proprietary locks to prevent competition, and planned obsolescence. They went there to get the marketshare, now that they got it, they can do whatever the fuck they want.

    if it ain't broke, then fix it 'till it is!
  • They are talking about a render farm, not desktop machines.

    Using desktop machines for a render farm is very good. And yes, currently, the desktop machines have to be NT to get the GUI software. Where I work we are using both Linux renderfarms and NT desktop machines for running batch processing with no problems. Most rendering software is available for both. We use tcsh on the NT machines so the batch processing looks the same.

  • I couldn't agree with you more.

    We looked at NT boxes. We are a Maya-only house and we were unsure when the Maya for Linux release would be. We actually thought of buying the hardware, throwing NT on it, rendering with it and then switching to Linux when it was available.
    That was a short-lived idea. None of us wanted to be responsible for porting our code to work on the NTs. Even with a Perl backbone, it looked like a nightmare.

    Thankfully, the Linux renderer was announced and it came out, all in good time for us to use it on our current productions.

  • Yep, that's the same picture. I would have included a link to that article if I had had the link handy.
  • The New Zealand Herald site, IIRC, runs with ColdFusion on some sort of UNIX.

    From what I've heard, ColdFusion on UNIX doesn't cope terribly well with high loads; the NZ herald site has had problems with this before, even without being slashdotted.

    --
  • Plus the sparcs are 64 bit....
  • by Jeremi ( 14640 ) on Saturday August 26, 2000 @11:13AM (#825697) Homepage
    Hate to break it to you guys, but when you're doing "real work" like rendering (or anything else), the OS is 100% totally and completely IRRELEVANT.

    That's maybe 75% true. There are several requirements on the OS, though:

    1. It needs to be compatible with your rendering software, and the hardware that you want to run your rendering software on
    2. It needs to be stable (can't render much if you're blue-screening all the time)
    3. It needs to stay out of the way of the rendering app
    4. (to a lesser extent) it needs to be easy to install and maintain
  • by Malcontent ( 40834 ) on Saturday August 26, 2000 @07:06PM (#825699)
    " while (!world->perfect) {
    whine ();
    }"

    May I suggest:

    while (!world->perfect) {
    act ();
    }

    A Dick and a Bush .. You know somebody's gonna get screwed.

  • You are probably a troll, but I am hoping that you read slashdot because you are soul searching, or at the least curious, so OK, I'll bite :)

    Win2k would not ever be my choice. Much of what was discussed above does not apply. If you are interested in actually using the machine you are sitting at right now, then W2K will eventually fail miserably. Just think. What happens when a user wants something that has not been anticipated by all of the MS GUI designers? How about scripting? Real scripting, not just batch files. How about development? Granted the tools are nice in Gates land, but at what price? Visual Studio is a great place to work--provided that you are not interested in your code running anywhere else. Or even on the next version of the OS.

    As for the BIG IRON days being numbered, you are way wrong. I do believe that cheap clusters will erode and change that market, but not eliminate it. Take a look at the new SGI server line. Modular cluster computing. You get to build what you want how you want, even take it apart and make smaller clusters. These machines in their largest configurations will handle 500 processors (probably more) all running under ONE OS IMAGE. There are many classes of computing problems that require this. Nobody is even close to this level of NUMA development. Also look around. SUN has been selling lots of their E10000 series machines. They sell these because they work, and they will keep working for a really long time. Big problems sometimes require big machines. This will not change. As all the machines mature, the problems will change, and become doable on smaller hardware, but that does not mean that the big stuff will become out of date, it means that it will just get used for something else.

    I have older SGI machines pushing 10 years old doing things that are important to me. Because they run a UNIX variant, I am able to build software for them that was not even on the drawing board when the machine was made! Try that on Win 3.1.

    Now where were we with Windows? Oh yeah, accomplishing the same result. Rendering probably would work on W2K. It probably would even have similar performance. But I am certain that in 5 years' time that cluster will have cost more in license fees, administrative work, and scheduled downtime (read: upgrades) than any UNIX cluster will. Pick your vendor, it does not matter.

    As for long life, Windows does not stand a chance. Ever try to do anything useful on a 10 year old pc with windows? Can you even find the software? Take that same machine running linux, and it can be a productive member of a network with zero trouble. Log on to it from a faster machine for display, and you may not even notice its age when doing day to day admin. Makes for great file servers, ftp, routing, mp3 players, lots of good stuff. You know, since I began learning UNIX skills, I have not thrown away a machine since. I can ALWAYS FIND SOMETHING USEFUL TO DO WITH THEM.

    The OS really means everything these days. I know because I run and administer a few of them. Windows machines of any variant give me more trouble than any other. Running windows means that I have to work the way I am supposed to. UNIX of any kind means that I work the way I want, from where ever I want.

    Consider this last point. I really like working on an IRIX desktop. You know, running UNIX machines means that I can sensibly do that for just about as long as I choose to. Not one penny of investment on any of the IRIX machines I deal with will ever be wasted until I decide that they are no longer worth it. Basically this means until they break. :) One of my favorite ways to run linux is from my IRIX machines. I lose nothing. This goes for any UNIX variant. You have choice. And your investment will last. This is what the PC vendors don't want you to really know about UNIX hardware in general. Even standard PC hardware these days is pretty nice. They would much rather have you buy a new one every couple of years when windows gets too fat to run on the one you have.

    Everything that I have learned about UNIX (and I started late) applies no matter what. Some machines will do more than others, and some will require a little different tweak here and there, but it all works the same. My experience on windows machines becomes useless as they change from year to year. I really hate that. Don't you?

  • by TheDullBlade ( 28998 ) on Saturday August 26, 2000 @11:14AM (#825702)
    ...unless Natalie Portman is playing one of the trolls who gets petrified, and who insists that the best way of cooking the dwarves is by pouring hot grits down their pants.

    Also, Frodo would have to be escorted to the Cracks of Doom by a whole team led by Beowulf (IOW, a Beowulf cluster).

    Finally, this would have to be distinguished from the novels and the cartoons, not as the "live action movie", but as the "post-Columbine version".

    I'm not sure if a penis-bird would have to be somehow involved, but it would help.

    ---
    Despite rumors to the contrary, I am not a turnip.
  • Let me suggest an alternative:

    Frodo: Well, we made it to Mount Doom, the seat of Sauron's power.

    Samwise: Yes.

    Both look pensively at Mount Doom.

    Samwise: What does 'Microsoft' mean?

  • The "real work" is always done by some other program. That doesn't mean that the OS is irrelevant.

    ---
    Despite rumors to the contrary, I am not a turnip.
  • YES! Rendering under DOS!

    Wanna make a movie?

    (two weeks later...)

    What do you mean DOS is (at best) 16 bit and not running in protected mode?
    ---
    pb Reply or e-mail; don't vaguely moderate [ncsu.edu].
  • by furiousgeorge ( 30912 ) on Saturday August 26, 2000 @11:20AM (#825708)
    > Hate to break it to you guys, but when you're doing "real work" like rendering (or anything else), the OS is 100% totally and completely IRRELEVANT.

    In theory, yes. Rendering is just math.

    .... BUT. The os does come into play - context switching speed, VM system, general efficiency, etc.

    Originally production houses were switching from SGI to NT boxes to save $$$$ - hard to justify the price performance of an Octane when a cheaper dual PIII would whip it in number crunching.

    Now people are switching from NT to Linux. Personally I don't give a crap about what OS you use (i use both), but from my own personal benchmarks and graphics companies that i've contracted for, Linux can give 10-30% greater throughput on the same hardware.

    Also - most large production houses WERE based on SGI's - so it is easier to move their custom code from IRIX to Linux.

    (have you tried to administer NT boxes remotely? )

    j
  • Rendering also needs oodles of memory.

    The 1 MB limit could be a bit of a problem :-)
  • Then apparently DOS is the ultimate render farm OS.
  • Sure, but you've missed the point. I'd agree the OS itself is irrelevant where rendering is concerned. I'm sure it could be done on MS-DOS if you wanted to. But that's still not the point.

    The point is that Linux is becoming something that is useful to everyone. Not just for UNIX geeks. Not just for trying out UNIX at home, because you can't afford a Sparc.

    The shared development over the Internet towards Linux has produced something useful for entirely new groups of people. This is the point. This is the wonderful thing.

    Linux doesn't need 100% marketshare to win. Linux wins every time someone new downloads Linux and says "I like this".
  • Someone at the RenderMan newsgroup asked about it and several Pixar folks answered. RIB files (the scene descriptions sent to a RenderMan renderer) would get into the GB range excluding textures. Here are some quotes (if I may) from Tom Duff and "Wave" Johnson:

    Tom Duff wrote:
    >
    > Daniel McFarland wrote:
    >
    > > What size (mb ) is considered a large scene file.
    >
    > Around here, 2000 MB is large, 500 MB is small.
    > Not counting texture maps, which normally
    > add up to larger than RIB files.

    Note that 2GB would probably be the size after a catrib -ascii, i.e. the gzipped binary RIB file would probably be smaller. Having said that, I've routinely seen files in the 1GB+ range (after ASCIIfication) that were only a portion of the scene (i.e. they were to be composited over some other elements).

    Most RIB files of a scene would occupy only a few hundred megabytes on disk, assuming (as most are) that they are compressed binary RIB files.

    Also note that these days, many of your RIB files have Procedurals in them that are trivially compact in the RIB file, but expand out into a fair amount of data when the renderer actually evaluates them.

    2 steps forward, one step back :-)

    --
    --> Michael B. Johnson, Ph.D. -- wave@pixar.com
    --> Studio Tools, Pixar Animation Studios
    --> http://xenia.media.mit.edu/~wave

  • I am what you would probably call a Linux evangelist, but in this case I have to point out that rendering is a CPU-bound process, and probably the only OS characteristic that could significantly affect the performance of such jobs is stability, or in other words *uptime*.

    I also have to say (even though I am an SGI stockholder) that low-end SGI servers are kind of expensive, and probably the best price/performance ratio under Linux is on Alpha or Athlon boxes. And if you use 1U or 2U rack units you can fit *a lot* of CPU power in a single rack. VA's boxes are not that bad either.
    __________________________________
    Stop privacy invasion!
  • by Skald ( 140034 ) on Saturday August 26, 2000 @12:35PM (#825727)
    Hello, this is Frudu Baagins, and I pronounce Elbeereth as Elbeereth.
  • bad form to follow up on myself

    >Linux can give 10-30% greater throughput on the
    >same hardware.

    I'm speaking about *pure* rendering - number crunching.

    Interactive 3D graphics on Linux is still kinda sad and unstable. Getting better, but it ain't there yet.

    j
  • Come on guys, this is old hat. I did this four years ago with a cluster that was 5x as large! :-)

    Linux Helps Bring Titanic to Life [linuxjournal.com]

    - |Daryll

  • by pq ( 42856 ) <rfc2324&yahoo,com> on Saturday August 26, 2000 @10:13AM (#825736) Homepage
    After all, if these guys support OSS, they're supporting a bunch of long-haired hippies who "believe in downloading software freely over the internet..." So the MPAA should put its foot down on this one -- if it can get it out of its mouth, that is.

  • Three Tux's for the Elven Kings under the sky.
    Seven for the Dwarf-Lords in their halls of stone.
    Nine for Mortal Men doomed to die,
    One for the Dark Lord on his dark throne
    In the Land of New Zealand where the shadows lie.
    One Tux to rule them all, One Tux to find them,
    One Tux to bring them all and in the cluster bind them.
    In the Land of New Zealand where the shadows lie.

    Sorry, I couldn't resist :)
  • by Elvis Maximus ( 193433 ) on Saturday August 26, 2000 @10:20AM (#825740) Homepage

    Lord of the Rings Being Rendered With Linux. Well, that's it: the ultimate geek story. No point in hanging around trying to come up with something else. Let's all pack up and go home.

    -

  • After all, it is more secure, just what you need to deal with Cracks of Doom...

    No? I'll get my coat...

  • Rendering isn't a good benchmark for comparing systems' performance. Renderers use a very small subset of OS features and spend most of their time on computations.
    The best OS for a renderer on a uniprocessor machine is "NO OS" (renderers need an OS only for "activating" more processors and for very simple communication & file management).

  • It would be cool, but it will never happen, for two reasons.

    First, there's the issue of studios protecting their intellectual property: they have to be sure that those 3D Jar Jar models don't make their way onto the internet, or that images from the films don't leak out early.

    More importantly, the amount of data required to describe a frame in these kinds of movies is immense. We're talking gigabytes for a single image. So you're talking about gigabytes of communication and then a few hours of processing. It just doesn't make sense.

  • by Anonymous Coward
    linux has been used before for CGI rendering- a lot was brought up about Digital Domain turning to Linux (on DEC alphas) as a cheaper solution to do a lot of the water rendering effects in Titanic
  • I had heard that it's illegal to sell a region-locked DVD player in NZ, but what about the discs? Are they region coded?
  • thad,

    After Digital Domain successfully used Linux-based computers to do much of the computer animation rendering in TITANIC, I think a lot of people realized that modified versions of Linux can be used for computer animation purposes.

    That's why companies like VA Linux and Penguin Computing have sold many, many systems to a number of computer animation companies--especially dual and quad Pentium III/Xeon boxes, with the boxes often running in clustered fashion using Beowulf. It's a case of where the relatively cheap software can run the high-end hardware built by the companies I mentioned.

    I know of one computer animation company that is a massively large user of VA Linux boxes, but I can't reveal who the company is.
  • DVDs in New Zealand will be region coded, just like any other region of the world (although there is one 'whole world' region code that is very rarely used.)

    Some Chinese manufacturers are now selling DVD players with 'hidden' region hacks in their firmware - of course the codes to unlock these hacks are magically made known via the Internet. So there are probably ways of getting a region-free DVD player in NZ, quite apart from the whole PC and software hack approach.
  • by malducin ( 114457 ) on Saturday August 26, 2000 @11:52AM (#825751) Homepage
    For those interested, there was a BOF meeting at SIGGRAPH 2000 dealing with Linux and 3D. The notes by Brian Paul are already available: SIGGRAPH 2000 Linux / OpenGL / 3D Birds of a Feather Meeting [linux3d.org]
  • James Cameron's company Digital Domain [d2.com] was already using Linux several years ago. For "Titanic" they had a huge render farm consisting of about 100 Alpha machines running Linux. I saw a picture of it, and one of the more interesting things is that they didn't use rackmount enclosures; instead they had a room full of mini-towers networked using 100Base-T. In the middle of the room there was a giant KVM switch so that a single keyboard/monitor/mouse could control any of the machines.

    The render farm was used primarily for "helicopter shots" where the entire ship could be seen at once. The ocean waves were completely CGI, and as we all know water takes a s**tload of computational power to render realistically, hence the army of Alphas.

  • by malducin ( 114457 ) on Saturday August 26, 2000 @12:02PM (#825754) Homepage
    It's only twice as fast for rendering, which is mainly a function of processor speed, system bandwidth and memory (and to a lesser extent network bandwidth). Intel and AMD processors are way faster than MIPS processors, but SGI has high-throughput systems, and also the best OpenGL implementation. They probably use the machines for interactive work during the day (and I don't doubt the new Octanes are better than any), but during the night they join the rendering farm, where they could really shine. Your claim about the software is incorrect; PhotoRealistic RenderMan runs mainly on Unix-based OSes (Solaris, IRIX and Linux) and Windows NT. So they for sure have measured how "good" PRMan runs under the 1200 and standard Octanes, and the CPU performance would make a difference.
  • After all, it is more secure, just what you need to deal with Cracks of Doom...

    Why would I want cracks of Doom? It's already had the source released. Better would be cracks of Quake III, I think.
  • Plus the sparcs are 64 bit....


    They also have 128bit support. Don't know that any rendering software can actually make use of this though.

    The 64bits should be able to make a difference in massively rendered scenes. Also, I don't think Solaris (on sparc) has any reachable file-size limits.

    I still think of Solaris/ SPARC as a more robust / higher quality solution. If I had to guess, I'd even say better than IRIX / mips.

    This isn't to say Linux isn't catching up. Just that, those extra dollars actually do go somewhere. In a farm-like situation, however, it's really doubtful that it's worth all the extra money.
  • Dare I say it... BeOS?
  • by barracg8 ( 61682 ) on Saturday August 26, 2000 @01:06PM (#825759)
    Sorry, but when the fsck did (Score:-1, Flamebait) become (Score:4, Insightful)?

    Moderators: when the guy SHOUTS 'IRRELEVANT' he is doing it to piss people off. This is ill-informed flamebait.

    A render farm *does* require these things of its OS:

    • Reliability. If you spend half your time crashing, rebooting, and redoing work that has been lost, you need twice as much hardware, at twice the cost, to get the same amount of work done.
    • Low memory footprint. Rendering is very memory intensive. The less space that is wasted by the OS, the faster you go.
    • Runs on the hardware that you have decided is appropriate to your requirements.
    • A nice, friendly, productive operating environment. Remember, many of these animators will be happiest telling the render machines what to do in a UNIX environment, coming from an IRIX background.
    • An OS that keeps out of your way. E.g., how much time does your OS spend sitting in the scheduler?
    Sorry if I get a bit flamey, but this post is just *SO* overrated.
  • This just in!

    DVD-CCA is suing the makers of the "Lord of the Rings" movie for using Linux. One of the lawyers for the organization says that "Anyone who uses that illegal and immoral open source software deserves to get sued". Moments later the lawyer was seen setting old ladies on fire and pushing them into oncoming traffic while chanting satanic messages.
  • by Spurious George ( 225993 ) on Saturday August 26, 2000 @10:32AM (#825771)
    The scene: Frodo and Samwise are following Gollum up the windswept crags of Mordor. Suddenly, off in the distance, a strange, darkish creature appears...

    Frodo: "Aww, crap! Not another Nazgul!"

    Samwise: "Umm... Hey, Fro, that doesn't look much like a Nazgul to me!"

    Gollum: "Then what the hell issss it?"

    Samwise: (squinting) "Uh, I think it's a... it's a penguin!"

    Frodo: "WTF!!?!?! There are no frickin penguins in Middle Earth!"

    Gollum: "It'sss thosssse damn foolsss in ssspecial effectsss! They've ssscrewed up our precioussss movie, they did, they did!"

    Frodo: "Hey, penguin! Get the hell out of here! We're trying to make a movie, dammit!"

    Gollum: "Curssse thossse Open Ssssource bassstardssss! Thisss really pisssesss me off!"

    Samwise: (Squints again) "Oh, never mind... it looks like it is a Nazgul after all! My bad!"

    Frodo: (Smacks Samwise) "Dumbass!! Hobbits never, ever say 'My bad!'"
    (Looks over at Gollum) "And what are you looking at, you shriveley little gimp?"

    Gollum: "That'sss it! Ssscrew you guyssss; I'm going home!"

    THE END

    --
    while ( !universe->perfect() ) {
    hack (reality);
    }

  • The thing is that people haven't touted Irix as being superior in performance for rendering; for doing the actual creation, an SGI MIPS workstation will blow most anything away. Rendering is just pure and simple CPU, nothing really more than that. It doesn't take a big graphics pipe, just a big fast CPU, as long as the OS can get information from the drives to the CPU fast. In fact, running Linux has very little to do with its performance; the x86 CPU is the one that is key there. Irix never was the slow point for rendering, but Irix only runs on MIPS and MIPS doesn't render fast. The operating system is MUCH less important than the CPU for rendering graphics (most of the time; never make a blanket statement).

    Many shops are using commodity boxes for rendering (running Linux, BSD, Solaris x86, or even NT), but for most of those same shops you'll have to pry their SGI workstations out of their cold dead hands.

    Linux and its cost-effectiveness is a pretty moot point for most shops these days; the rendering software often costs 20+ times more than Irix does (the latest version of Irix is $600). The savings come from not needing the extra visual performance that the SGI system gives you for rendering, so you can use MUCH less expensive commodity hardware for that.

    Spelling and grammar checker off because I don't care
  • I think the reason why this story made the news and not yours is that SGI's PR company is better than VA Linux's :)
  • Maybe 50% or more of DVD players sold in the UK have region hacks as well. Mine does.
  • Pfft, CGI is for losers. They should be using Jim Henson's Muppets for LotR!

  • That is more than possible...

    SGI has one mean PR machine. But... Linux is also an area that SGI is trying to prove they can get into. VA doesn't have to do that...

    Also... I find it interesting that SGI would be behind the story... it is cutting the legs out from under their Origin line, the *true* SGI render solution.

    I think maybe this had more to do with where it was (in NZ) and what was being worked on...

  • More importantly, the amount of data required to describe a frame in these kinds of movies is immense. We're talking gigabytes for a single image. So you're talking about gigabytes of communication and then a few hours of processing. It just doesn't make sense.

    Right. What we have here is a truly heterogeneous network, with latencies you cannot calculate, as well as complete unreliability as to the availability and processing power of a node. The best use of such a processing network is when you have something like SETI@home: massive local computations on very little transfer of data across the network, and the ability to slice the processing needs into minuscule chunks so you can easily duplicate processing of those chunks in case a node goes down. Distributed internet processing is excellent when you want to solve problems with lots of computations for simple answers, much harder when you want to produce large output.
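
    A rough sense of those numbers, using the ~2 GB scene descriptions mentioned in the RIB-size post above and typical year-2000 home connections; exact figures will vary:

        scene_bytes = 2 * 1024**3     # ~2 GB of scene data per frame
        for name, bits_per_sec in (("56k modem", 56e3), ("1.5 Mbit cable/DSL", 1.5e6)):
            hours = scene_bytes / (bits_per_sec / 8) / 3600
            print(name, round(hours, 1), "hours just to download one frame")
        # ~85 hours on a modem, ~3 hours on broadband, comparable to the render time itself.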

  • I noticed these comments in the story:
    "Linux is not bleeding edge stuff", which I guess means that Linux must be mainstream, and also:
    "We can't afford for the system to go down" - Why didn't they use Windows 2000 then - it's the "most stable windows yet".

    Oh, and I submitted this several days ago, but no-one was interested then.
    2000-08-22 09:28:12 Lord of the Rings to save money using GNU/Linux (articles,movies) (rejected)

"In my opinion, Richard Stallman wouldn't recognise terrorism if it came up and bit him on his Internet." -- Ross M. Greenberg

Working...