Digital Movie Projection: Can It Live Up To The Hype?

hobb writes "OK, so Roger Ebert's not a technical genius, but he's written an interesting piece on the future of digital movie projection (theatres, not home.) Read his essay here. Digital for home systems is great, but will 1280x1024 be good enough for theatres? That's about 10mm dot pitch, folks... "
  • stay a normal distance away.
    What I'm curious about is why HDTV's top resolution is almost 50% higher than this: HDTV goes up to 1920x1080, and this is significantly lower. But it's what Mr. Lucas is using for Episode 2. Oh well. At least it means no more scratches.
  • Forget about the laughable technical errors in Ebert's article; let's cut to the chase. Ebert makes the following sweeping statement:

    I have seen the future of the cinema, and it is not digital.

    This is absurd. Ebert sees a demonstration of the current implementation of a video projection system, doesn't like what he sees, and then jumps to the ludicrous conclusion that "digital" projection is inherently a Bad Thing.

    I take strong exception. Based on Ebert's review, I agree that the current implementation probably does have a ways to go. (I'll reserve final judgment until I've seen it for myself.) However, I will say for the record that the future of cinema is digital, and in a very big way. The digital projection system of the future will blow today's technology away -- and yes, that includes Ebert's precious MaxiVision48 system.

    This "future" may not be as close as the hype has been leading some of us to believe, but it is there, and it will be waiting for us once the technology matures. Anyone who doesn't see this as self-evident must be unbelievably myopic when it comes to technology.
  • but rather an evolution. Mr. Ebert seems to think the first digital projector will be the only one ever to come out and give the best possible results. But I figure he'll find out in a couple of years that you can't push old technology forever and that the digital projectors have come a long way by then. But the concept of competition is always good. This will make both Maxivision and TI work harder :)
  • It's the same as most billboards you see these days. View any of them up close and you can see a series of dots; depending on the intended viewing distance, these dots could be the size of your fist. However, walk back a few metres and suddenly the dots aren't as noticeable as they once were.

    So long as the minimum intended viewing distance (the closest anyone will be to the screen) is scaled in proportion to the dot pitch, there will be no problem. With most cinemas I've been to recently here (Australia), the screens are at least a good 2-3 metres above the ground, with the first row of seats at least 10 metres back from the screen. So I couldn't imagine any problems using the screens as suggested in this post in most modern theatres. Any closer than that, though, and things may need to be reviewed.

    What I can't understand, though, is why a director/producer/whatever would ONLY create the movie for this resolution. It seems far more logical to me to create it at a much higher resolution (well, slightly higher than HDTV will require) so that in situations where a higher resolution is possible, it can be fully utilised. I can think of nothing worse than having HDTV become widespread but being wasted because of a few shortsighted people in the film industry.

  • When I first heard of digital movie theatres I wondered how they managed it. It looks like they didn't. It seems they have a bit of a ways to go yet before it's realistic. Something tells me they will have to quadruple the resolution before they get good enough images. They also still have the data nightmare to solve.

    On the other hand, I'm intrigued by the MaxiVision48 mentioned. That sounds promising. It's still film, but it gets past the 24-frames-per-second flicker problem that keeps me out of the theatres.

  • Remember hard disks, and what we were told about them a long time ago: "Don't worry about that mechanical crap, we'll have ultrafast RAM storage in no time, and there's Moore's Law to prove it!"

    What happened? Analog kept improving. RAM mass storage does exist, but almost nobody can afford it.
  • It can project film at 48 frames per second, twice the existing 24-fps rate.

    NTSC does 60 fields per second when it's the evening news or a soap you're watching. Doesn't make much of a difference, does it?

    MV48 uses a new system to pull the film past the projector bulb without any jitter or bounce.

    Electronic systems don't jitter or bounce either. They also don't have scratches or dust, unlike film.

    The source of their signal is an array of 20 prerecorded 18-gigabyte hard drives, trucked to each theater. This array costs an additional $75,000, apart from the cost of trucking and installation.

    $75,000 for a twenty-disk RAID array? That's pretty damn expensive. And has he never heard of tape? A new movie could fit on a handful of 70GB DLT tapes; no need to "truck" a new RAID array in.
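The arithmetic here is easy to sketch. The drive figures come from the article (twenty 18GB drives); the tape capacity is an assumption based on roughly-70GB-compressed DLT cartridges of the day:

```python
import math

# Figures quoted in the article: a RAID array of twenty 18 GB drives.
drives = 20
drive_gb = 18
array_gb = drives * drive_gb
print(array_gb)            # 360 GB of raw capacity

# Hypothetical DLT cartridge holding ~70 GB compressed.
tape_gb = 70
tapes_needed = math.ceil(array_gb / tape_gb)
print(tapes_needed)        # 6 tapes, if the movie filled the whole array
```

So even in the worst case where the feature fills the entire array, it ships on half a dozen tapes, not a second $75,000 array.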
  • 1280x1024 is about 10mm dot pitch over what area? I mean, if you project 1280x1024 over a screen the size of North America, that's about a TWO MILE dot pitch.

    I understand that I'm abusing the concept of 'dot pitch' a bit to make my example, but saying that a resolution necessarily equals a dot pitch is just incorrect.


  • I saw the Digital Edition of Phantom Menace at a Pacific theatre in LA County.

    Like Ebert says, it was better than the analog print, but not by much. The CGI effects were sharper but I couldn't say that any of the live action looked better. The improvements were pretty subtle.

    One thing Ebert doesn't mention is that the digital version won't get marred by dust and scratches as the film version would. So in this manner, digital does give you a better picture than film.

    I'd like to see the MaxiVision48 system in action to compare it to digital projection. If it's only 50% or 100% better than regular film, it's still leaps and bounds over the current state of digital projection.

    Its being cheaper and compatible with existing formats is definitely a plus. You've got to wonder how much a truly superior digital projection system will cost.

    Right now, some theatres (in LA) have raised ticket prices to $9! How much do you think they'll raise prices to recoup the costs of a digital projection system? If MaxiVision48 works as they say it does, a theatre will only need one projector system to play both regular and 'enhanced' movies. With a digital system, they'll need to keep a regular film projector for any analog-only movies they get! I'm not going to be willing to pay a lot more (say, $15 or $20) for a barely noticeable improvement in picture quality.

    Remember, digital does not always mean better quality. Digital solutions often beat analog solutions on convenience, price, or consistency, not on higher quality!

  • I've heard a lot about this in the past year or so, and I don't think that digital movies will really ever make it. One thing is that there is a large risk of theft with the movies stored in digital form. It would be a lot easier to obtain an "advance copy" by simply duping the drive with the film on it.

    Second, some people have invented a method of filming where you remove the old analog sound strip from the side of the film and make each individual frame longer. For ease of discussion I will call this "wide standard" (by no means is it the official name, though). I've heard that this method results in no film scratches, bubbles, etc., and there is no piracy risk there. Finally, if you are building a new theatre, the least expensive of these three (standard, digital, and wide standard) is the wide standard, with digital projectors being by far the most expensive.

    Quite frankly, I think that if we are gonna see any new technology in the theaters, it will be the aforementioned wide standard.

  • Roger mentioned 4:1 compression; I'm assuming that is lossless compression. Wouldn't it make sense to use MPEG2 or even the brand spanking new MPEG4?
    No, I don't think D-Movies will go at that resolution... and speaking of MPEG2 and resolution, does anyone know if there are plans to make DVD at HDTV resolutions? It seems like THE issue no one is talking about.
  • by 97jaz ( 33263 ) on Sunday December 19, 1999 @12:39AM (#1461993)
    Digital video has crap resolution. Why? Because to achieve the same sort of resolution you get with film, you need to store enormous amounts of data.

    You can't afford the required amount of memory/storage. So you use compression.

    Which means you lose quality (because you certainly aren't using *lossless* compression -- not when you need huge compression ratios), which was what you were trying to gain in the first place.

    Until multi-terabyte storage is fast, cheap, and small, film will continue to be superior. As it so clearly is now.
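A rough back-of-the-envelope calculation shows why "multi-terabyte" is the right order of magnitude. All figures are illustrative: 24-bit colour (3 bytes/pixel), 24 fps, a two-hour feature, and a 4K scan as a stand-in for film resolution:

```python
# Raw (uncompressed) storage for a two-hour feature at a given resolution.
def raw_terabytes(width, height, fps=24, seconds=2 * 60 * 60):
    """Total terabytes at 3 bytes/pixel, no compression."""
    bytes_total = width * height * 3 * fps * seconds
    return bytes_total / 1e12

print(raw_terabytes(1280, 1024))   # the TI projector: ~0.68 TB
print(raw_terabytes(4096, 3072))   # a film-like 4K scan: ~6.5 TB
```

Even at the projector's modest resolution the raw stream is most of a terabyte, which is why aggressive (and therefore lossy) compression is unavoidable with 1999-era storage.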
  • They could grab the signal from the satellite and try to break the encryption (as DVD encryption has just been broken).

    This is not a fair comparison. With DVD, the method to decode the signal must be somewhere on the user's (hence the potential pirate's) device in order to watch the movie. No such restriction is required when transmitting movies to theaters. The key does not have to be made available in any way, shape or form to potential line sniffers.

    You simply give the theaters the decryption key, and send everything through the encrypted pipe. In the unlikely event that somebody does crack the key, simply switch keys and issue all the theaters a new one. (In fact, it would probably be a good idea to switch keys from time to time anyway, just to be safe. That's not possible with DVD, because it would break all the current players.)

    Of course, he still has a valid point when it comes to bribing projectionists. Depending on how much access they have, that could present a risk of the key getting out too. But assuming they can trust the theaters to keep the keys safe, there is virtually no risk of piracy.

    To be even more secure, give each theater a different key and encode a custom stream for each one. If one key gets compromised, the rest are still secure. The cost would be more processor time to encrypt a new stream for each target, and increased bandwidth usage because multicasting becomes impossible with this method. Probably overkill, but you know how paranoid the movie industry gets.
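The per-theatre scheme described above can be sketched as a toy key registry. All names and structure here are illustrative only; a real system would wrap these keys in actual transport encryption and secure hardware:

```python
import secrets

class KeyRegistry:
    """Toy model: one independent key per theatre, rotatable at will."""

    def __init__(self):
        self._keys = {}  # theatre id -> current decryption key

    def issue(self, theatre_id):
        # A fresh 256-bit key; compromising one theatre's key
        # reveals nothing about any other theatre's stream.
        key = secrets.token_hex(32)
        self._keys[theatre_id] = key
        return key

    def rotate(self, theatre_id):
        # Swapping keys invalidates any leaked copy without
        # touching the rest of the fleet.
        return self.issue(theatre_id)

    def key_for(self, theatre_id):
        return self._keys[theatre_id]

registry = KeyRegistry()
old_key = registry.issue("theatre-17")
new_key = registry.rotate("theatre-17")
assert new_key != old_key  # the compromised key is now useless
```

The trade-off is exactly the one the post names: per-theatre keys mean per-theatre encrypted streams, which rules out multicasting.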

  • The CGI effects were sharper but I couldn't say that any of the live action looked better.
    I'd be surprised if the live action didn't look worse. Of course, it depends on how it was originally filmed, but the resolution of film is considerably higher than any digital video system in use today.
  • by Nigel Bree ( 45256 ) on Sunday December 19, 1999 @12:43AM (#1461996) Homepage
    HDTV resolution? The first thing to remember is that HDTV is not a single resolution and frame rate, but rather a wide collection of different ones. There are a lot of standards to choose from. :-)

    The reason for the current resolution limit for the digital theatre projectors is a simple one of physical manufacture - the various systems such as the light valve are not CRTs. As with devices such as LCDs or the CCDs used in video cameras, these devices need to be made a certain size, and with that size comes the problem of yield. You might be able to tolerate a stuck pixel on your notebook PC, but in a theatre?

    Another thing to remember is that resolution numbers are not the be-all and end-all. Budget and independent filmmakers are taking up "prosumer" equipment based on DV, and shooting in progressive-mode PAL (25fps) for transfer to 35mm. And when professionally transferred to film and shown in theatres, those digital-video images can look pretty damn good.

    Ebert's enthusiasm for a true 48fps film process is understandable. Some people in the TV world gush every bit as much about moving to, for instance, a 720-line 60fps progressive scan mode instead of 1080 interlaced. There are aesthetic judgements to be made here which are very important.

    The other thing to bear in mind is that the quality of the movie you are watching is every bit as important as the gee-whiz technical aspects of getting it to you. Digital video cameras - DV-style, not even the HDTV ones - are already enabling independents to make films for inconceivably low budgets. The real battle is in getting the art distributed to where people can see it.

  • 1280x1024 sounds like an odd choice because of its unusual aspect ratio (unusual for cinema, that is). Most movies won't fit into this, meaning that they will end up using those damn black bars above and below the picture that we all know from watching widescreen movies on a TV. 1280x1024 is bad, but the real usable resolutions will be even less than that. Sounds like a bad deal to me.

    I disagree with Ebert: the future of cinema is digital. If this system won't cut it, someone will make one with 10k by 5k resolution, 60 fps, 12 sound channels and so forth. Digital "film" offers some undisputed benefits over real film, and because of the almost infinite flexibility of digital technology, all the benefits of celluloid can be copied into the new systems.

  • > It would ba a lot easier to obtain an "advanced
    > copy" by simply dupin the drive with the film
    > on it.

    Which is worse? Duplicating a digital movie, or physically stealing the actual reels themselves?
    At least you've saved some dough. :)
  • You could equally ridicule (for example, let's not get sidetracked) a suggestion that solar powered cars are the transport of the future, because obviously we'll be using cold fusion powered flying machines 'in the future'.

    However, if you mean 'the near future' then from his description it sounds like this MaxiVision48 gear is pretty sweet and relatively inexpensive and worthy of promotion. I know I certainly don't want to be paying x times as much at the cinema just to see a little "Digital" logo at the beginning of the flick.

    Physical film will obviously have its limitations (scratching etc.) and I agree that we will without a doubt see a decent digital solution as the expense of digital comes down.
    In the meantime though (10-15 years?), it sounds like I'll be more than happy to sit down in front of one of those MaxiVision48 systems and pay a reasonable price for the privilege.

    Don't go cuckoo over the word 'digital', digital technologies should have to earn their place like any other.
  • But compression which loses information degrades quality. That is precisely what you're trying to avoid.

    Everyone is led astray by how much better digital video is than analog video. It is much better. But it's markedly inferior to film. Your DVD copy of movie X looks better than your VHS copy, no doubt. But the DVD is still using a compression algorithm which loses information, and if you were to project it on a big screen and compare it to the film version, you'd notice the difference right away.

  • Question: how exactly is improvement measured here? Ebert keeps making reference to one type of projection or another being better by 80% or 500% or whatever... is the number simply a ratio of resolutions? Because I would be perfectly happy if the current resolution were kept but the medium were digital, keeping the movie cleaner.

    When I saw Episode One the day it came out, the projection at the theater I went to (one of the best New Jersey theaters) was great... but when I saw it again a few weeks ago when my college was playing it (obviously a used reel), it was downright grainy. If the movie were digital, degradation wouldn't be a problem, and I think that improving resolution isn't so important. Quality would be great if you could just get rid of imperfections in film by going digital.
  • I remember when a 64MB hard disk was huge. Now most home computers come with that much or more ram. Compared to what existed 10 years ago, we do have RAM mass storage.
  • by 97jaz ( 33263 ) on Sunday December 19, 1999 @12:56AM (#1462003)
    Film's resolution is considerably higher than HDTV's. This post is just plain false.
  • Yeah, but now we all want to store more. It makes no difference that 64MB of RAM used to be mass storage. It isn't now.
    Mass storage today is still expensive.
  • I agree with you. It was Ebert's very sloppy sweeping use of "the future" that I was protesting against in the first place.

    Indeed, for all we know, MaxiVision48 could very well be the projection system of the (near) future. However, for Ebert to say flat-out that "the future" does not hold any place for digital projection systems is just silly.

    Ebert might possibly have made some sense if he had qualified the phrase "the future", as in: "Digital projection systems are just not going to cut it in the near future, because the technology employed in the current digital system is not yet mature; meanwhile, there are cheaper better analog alternatives, such as the MaxiVision48".

    But alas, he didn't.
  • Over a standard cinema screen, possibly?
  • Think about it. The distributor FedExes a DLT tape to the theater, which then copies it onto their RAID before the show. Just like they have to do now when they put the reels onto a single platter, except copying a tape is faster and easier. Ebert seemed to think that a new $75,000 RAID array needed to be trucked to the theater for each new show.
  • > Mass storage today is still expensive.

    More or less by definition. If it's inexpensive then everyone will have it and it will be considered normal storage :)
  • Film is clearly better at this stage.

    Eventually however, it should be possible to have higher resolution and higher frame rates with digital than with analog film.

    If digital storage space is a problem for higher resolution/frame rate digital movies, one could always store the compressed digital data on ... film.

    When digital storage and transmission technologies catch up, use them in the film's place.

    There's no reason one has to go all digital, all the way, right now.
  • [tongue in cheek]
    you mean that it's not storage for the masses...hence affordable?
    [/tongue in cheek]
  • In between the arts-student verbiage (...film creates reverie, video creates hypnosis...) he does make one interesting point that hasn't really been discussed very much in improving picture quality in Film/TV. The improved analog system he discusses increases the frame rate to 48 fps. I'm interested to see how much this increases perceived picture quality.

    We all know that, for games, 50-odd fps is way better than 25-odd fps (and many people are prepared to spend thousands of dollars on fast CPU's and graphics cards to achieve it). In addition, 8-mm film is generally shot at 18 fps and this definitely looks jerky.

    Has anyone seen this system, and does the increased frame rate make a real difference in smoothness?

  • All the DVDs I've personally seen aren't all that much better than VHS. (Better than VHS on NTSC, perhaps? I wouldn't know, but VHS on PAL isn't noticeably worse than DVD.) Wear and tear on tapes is the worst factor of VHS, but as the geeky kind of nerd that I am, NOTHING irritates me more than seeing those icky digital compression artefacts on DVD.

    And DVD is still bog-standard, lo-fi, non-HDTV.. DVD could have been the medium to push HDTV (which I'd like; definition is SO much more important than wide screens). But it isn't. Ouch.

    DVD is here to stay though, digital television is, too.. But for film? 1280*1024 is hardly impressive if you're lucky enough to have a 21" monitor.. Let alone the big screen..

    I can think of one benefit: no wear and tear on the film copies they play at theaters.. Because it seems that unless you get to the absolute first showing, there are scratches and blots and stuff all over the picture.. Sigh..

    I guess things will never be perfect..

  • Why would 1280x1024 be the limit for cinema projections because of bandwidth limitations, but 1920x1080 be possible for home? That doesn't make sense at all.

    All these bandwidth and storage problems will be solved in the very near future. Film has had 100 years to evolve, digital projection will take much much less.
  • 1280x1024 is about 10mm dot pitch over what area? I mean, if you project 1280x1024 over a screen the size of North America, that's about a TWO MILE dot pitch.
    I understand that I'm abusing the concept of 'dot pitch' a bit to make my example, but saying that a resolution necessarily equals a dot pitch is just incorrect.


    He's obviously talking about the area of a theatre screen. That seems to be about the right order of magnitude, and it's a good point. Of course, that's for what essentially amounts to a proof-of-concept system from TI. Quite nearsighted of Ebert to judge the future of a technology based on one of the very first (real-world) implementations of that technology.

  • The problem with the situation is that all the savings of digital filmmaking accrue to the producers while all of the costs of equipping digital theatres are borne by the exhibitors. And the initial costs are quite high. And a couple of years later, the first generation projectors will be as outdated as my old XT sitting in the corner of my living room (which my wife is currently using to write a screenplay BTW) and will need to be upgraded.

    While the studios' and distributors' costs fall, the exhibitors' costs will rise dramatically, and they will have to raise ticket prices to stay profitable. This will lead to a reduction in attendance and thus lower returns to the studios.

    Also, since all theatres likely will not change over simultaneously, patrons may have to choose between seeing the digital version of the movie for $20 or seeing the analog version for $8. This will further affect the short-term profitability of the early-adopting theatres.

    I suppose they could strategize around this by releasing some movies as "digital only" and others as "analog only" during the transition phase, or by prereleasing the digital version a couple of weeks before the analog version. However, the other problem of digital theatres raising their prices to cover the equipment costs will likely be solved only if the studios subsidize the deployment.

  • One big advantage of digital, for those of us who don't live in the same country as Hollywood, would be the potential for simultaneous world-wide releases of movies. The production cost of rolls of film is such that the studios wait until US cinemas have finished with the reels before releasing them overseas. This is the cause of the hated DVD regional encoding (to stop the home release version being mail-ordered from the US by countries still viewing the celluloid version!). I would very much like that situation to end.

    OTOH I want the best cinema experience possible, and if physical film offers that, then that's what I want.

    Interestingly, the sale of APS cameras (the new still film in the smaller cartridge, with the small data store for picture information) vastly outstrips the sale of digital ones. Analogue has won here, despite the enormous digital camera hype (remember that?), so there's no reason why it shouldn't do so for motion film too.
  • I'm sure if this takes off well, M$ will want in on it all. Windows for Cinema anyone? I'm dreaming of a cinema sized BSOD :)

    OK, so it'll probably be mostly hardware based but I thought it was a [not so very] amusing thought.

  • No chance. 1280x1024 can be displayed successfully on a standard 0.25mm-dot-pitch 17" monitor.

    Throw 1280 pixels across a 30-foot (roughly 10 meter) screen, and that's 128 pixels per meter, or a pixel pitch just under 8mm -- the same order of magnitude as the 10mm figure quoted, and that's in the best of circumstances.

    And many large theater screens are much much larger than 30 feet -- I'm just thinking in terms of monster many-small-screen cineplexes in the above example.
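The pitch arithmetic above is easy to generalise; it depends only on screen width and horizontal pixel count (the screen widths below are illustrative):

```python
# Pixel pitch = screen width / horizontal resolution.
def pitch_mm(screen_width_m, horizontal_pixels=1280):
    """Centre-to-centre pixel spacing, in millimetres."""
    return screen_width_m * 1000 / horizontal_pixels

print(pitch_mm(0.32))   # 17" monitor, ~0.32 m wide -> 0.25 mm
print(pitch_mm(10.0))   # 10 m multiplex screen     -> ~7.8 mm
print(pitch_mm(20.0))   # 20 m large-house screen   -> ~15.6 mm
```

This is why "1280x1024 is about 10mm dot pitch" is only meaningful once you fix a screen size: the same signal is razor-sharp on a monitor and coarse on a big screen.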




  • I'm not sure exactly what you mean.

    Are you saying that one could use film as a digital medium? (I don't think this is what you're saying, but I'm unsure.) In that case, you would have the exact same problem you have with RAM/disk space: you'd need enormous amounts of film stock -- truly unwieldy amounts.

    But, if you mean that one could record the movie using some very high resolution digital scheme and then convert the frames to actual analog film...well, it's interesting, but there are two issues:

    1. The filmmaker still needs absurd, expensive amounts of digital storage.
    2. The D-A conversion has to be superb, otherwise there is no point to it. And (good) D-A conversion is hard.

  • As an audio/visual technician, I get to see a lot of the cool sound and projection equipment long before J. Random End-User. As of right now, the majority /. argument is correct: resolution from a digital projector is crap (I use them a lot to project full-motion video in my line of work, as well as projecting /. onto the exterior walls of my hotel -- I promise to post pictures soon). They can handle the needed frame rates that I saw one or two posts worry about; however, the image is pixelisious. The other problem with these projectors is that they have a tendency to 'wash' the image, which means that every time a new movie is run, the settings must be re-calibrated to each particular screen (a very tedious process).

    On the bright side of things:
    As I am also a theatrical sound designer/tech and audio engineering student, I was able to get into 'LDI' this year. It's an annual trade show for professional lighting and sound equipment. Anyone on /. who was able to attend it here in Orlando, FL knows that digital movie projection is just over the horizon. I don't like to endorse items early, but at the moment, based on a demonstration of new projection technology by Electrohome, they are pretty close to delivering what the public will expect: a crisp, clean, bright image.
  • Well, I'd suggest that in general if someone is talking about 'the future' they mean 'the near future' as it scarcely makes any sense to talk about anything else.

    I mean, direct sensory simulation of the brain is obviously 'the future', but that doesn't make me slag off people who say that digital projectors are 'the future' ;)
  • "the possibility of simulaneous worldwide distribution"

    Don't be silly. You think it takes three months for a film to cross the Atlantic from the US to Ireland? I can ship books by boat faster than that. I can ship *me* by plane faster.

    They delay release for marketing and legal reasons. Censors in each country need to see it, and you need movie stars to wander around each country's talk show circuit to hype the flick.

    Now, digital actors... those might solve the latter problem...
  • NTSC does 60 fields per second

    This is true; however, there are 2 fields per frame, thus only 30 frames per second, only slightly better than film projection.

  • by rlk ( 1089 ) on Sunday December 19, 1999 @01:47AM (#1462030)
    I'm an (amateur, but serious) photographer. I do indeed like using Fuji Velvia and other fine grained, high resolution films wherever practical. I use a tripod whenever I can. So why do I think that a resolution that would be unacceptable for me for serious photography would be just fine for movies?

    Last night I printed one of our wedding photographs on my Epson Stylus Photo EX (and before anyone starts commenting on this gross violation of copyright, not to mention photographic etiquette, I'll point out that our deal with the photographer included throwing in the negatives and all that). This was a fairly low resolution scan (1280x1024, I did it for a screen background). The print is on an 11x17 piece of glossy film. It's using the Gimp's print plugin that Michael Sweet originally wrote that I've enhanced (URL below). I can see the pixelation -- if I look at it carefully from less than 12" away with my left eye, which has unusually acute close-in vision. Even then I can only see the pixelation in sharp transitions, such as between my tuxedo jacket and my shirt where the line is only about 15 degrees off parallel. It's obvious if I look reasonably carefully that it's not as sharp as a good quality photographic print, but it doesn't look pixelized.

    And the point is? A movie theatre is not an optimal location for spotting imperfections. For one, it's in constant motion, so it's usually impossible to focus on any one spot for long enough to see any artifacts. Secondly, if the projector is even slightly out of focus, any pixelation will be blurred out of existence.

    I'm not an expert on motion picture film, but the resolution enhancement over normal 35 mm film is not as great as the 70 mm format would lead one to believe. Taking into account the sprocket holes and the soundtrack, I'd be surprised if the actual frame width on 70 mm movie film is greater than 50 mm or thereabouts. If it's 50 mm wide, the length of a frame should be about 27 mm (at least if the depiction at http://www.theatres.sre.sony.com/imax/film.html is reasonably accurate -- the long side of the film stretches across the width, rather than the length). 35 mm still film is 24x36. So the movie frame is bigger than the 35 mm frame, but not spectacularly so (it's smaller than the smallest "medium format" photographic format, which is nominally 60x45 mm but actually a bit less). High end consumer digital cameras are currently in the range of 1800x1200 pixels, and they produce quite satisfactory non-critical prints.

    2 MP resolution might not be sufficient for Imax (very large format, with a huge screen, particularly at theatres such as the Omnimax at the Boston Science Museum), but I suspect that for most motion picture purposes, it's quite adequate.
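The frame-area comparison in the post above works out as follows. Note the 70mm movie-frame dimensions are the poster's own estimate, not an official spec:

```python
# Negative areas, in mm^2, from the figures quoted in the post.
movie_70mm = 50 * 27   # estimated usable 70 mm movie frame
still_35mm = 36 * 24   # standard 35 mm still frame
medium_645 = 60 * 45   # nominal "medium format" 6x4.5 frame

print(movie_70mm, still_35mm, medium_645)   # 1350 864 2700
print(movie_70mm / still_35mm)              # ~1.56x the 35 mm still frame
```

So the 70mm movie frame is only about half again the area of a 35mm still frame, and smaller than the smallest medium format -- which is the point being made.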
  • I mean, direct sensory simulation of the brain is obviously 'the future', but that doesn't make me slag off people who say that digital projectors are 'the future' ;)

    I'm slagging Ebert off not because he said that something is the future, but because he said that something is not the future.

    What if I said to you that direct sensory simulation of the brain is not "the future" because the future is digital projectors?

    I'm betting you would say something to the effect of, "Well, I'm sure that digital projectors will figure in somewhere in the near future, but please don't entirely discount the possibility of direct sensory stimulation of the brain because of that."
  • by kevin lyda ( 4803 ) on Sunday December 19, 1999 @01:49AM (#1462032) Homepage
    I'm surprised at you people!

    One of his comments struck a nerve with me -- the fact that Hollywood "suits" don't care about the technology. They just follow the hype like dogs in heat.

    Sound familiar?

    Our little clique isn't the only one that has shoddy solutions foisted upon it by clueless "suits." It sounds to me like Ebert, a film geek of sorts, is pointing out a case just like this. He touches on the technical problems, the emotional ones, and how the solution works in practice -- all arguments that one of us might use against a PHB advocating a 100-box NT cluster using VB scripts and MSSQL as a web solution for a site getting 1/100 the traffic of /. with 80% static content.

    Digital film will probably win in the end, but there's no reason to start hiking my ticket prices for the crappy quality we'll get now.
  • Analog works best for an analog medium. We all know this. Something is always lost in the translation of light moving through colored celluloid to a string of 1's and 0's. This has been the case with CD-ROMs. The good thing about CDs is that a cheap CD player is better than a cheap record player. The media is more convenient, more durable, and can be more reliably mass produced. These qualities make it perfect for home electronics.

    The theatre business however will always tend towards whatever delivers the best picture quality, and that will continue to be film. Some of the higher end projectors that are coming out, one of which is mentioned in Ebert's article, will, as promised, maintain the preeminence of film as the distribution medium. 10 years ago they were talking about Beta taking the place of film even in the theatres. It was not to be.

    Beyond the technical hurdles, there is the romanticism still inherent in hollywood that demands that film be used. This will not die. Film has an inexplicable quality of tone in every image. Certain films for certain scenes will be a mainstay of directors who know the look of film.

    ~Jason Maggard
  • NTSC does 60 fields per second when it's the evening news or a soap you're watching. Doesn't make much of a difference, does it?

    60 interlaced fields equates to 30 full frames.
    30fps vs 48fps, I'd say that could be a substantial difference.

  • You said it yourself:
    High end consumer digital cameras are currently in the range of 1800x1200 pixels, and they produce quite satisfactory non-critical prints.
    Non-critical? Do you imagine that filmmakers consider their frames to be non-critical?

    Take a look, sometime, at Chris Marker's "La Jetée." All but a couple of seconds of this film are composed of still shots.

    The point is: I would notice the difference, just as you notice the difference when you use your digital camera. Some people take film rather seriously.

    >Ebert seemed to think that a new $75,000 RAID
    >array needed to be trucked to the theater for
    >each new show.

    IIRC he explicitly noted that trucking them about was only part of the demonstration. Anyway, I'd suggest that $75,000 per projector, plus however many extras they'd need to cope with changing a projector to a different picture in a timely manner, is far from inexpensive.
  • Well, let's see. When was the last time you had your monitor in interlaced mode, if ever?
    Perceptually, there is a huge difference.
  • Interestingly, the sale of APS (the new still film in the smaller cartridge, with the small data store for picture information) cameras vastly outstrips the sale of digital ones. Analogue has won here, despite the enormous digital camera hype (remember that?) so there's no reason why it shouldn't do so for motion film too.

    Uh, seems to me that you're making the same mistake Ebert has made. APS is about as far as consumer analog photography is going to go, but digital has just begun. The infinite flexibility of digital is avoided at this point because of real, but short term, problems like resolution and the cost of media. In any digital versus analog comparison, it only takes time and research to improve digital to the point of surpassing analog or at least being indistinguishable by the general public. For example, CDs are in some ways arguably lower quality than vinyl, but most people can't tell and appreciate the benefits of digital. Any media where analog and digital compete will eventually go digital (at least mostly.. some purists may stick to vinyl or film or whatever) for that reason.

    --
  • by rogerbo ( 74443 ) on Sunday December 19, 1999 @02:15AM (#1462043)
    The system used for the recent Phantom Menace digital projections was the Texas Instruments DLP system. The specs are here:

    http://www.ti.com/dlp/products/cinema/specs_starwars.shtml

    Or here for more on the system:

    http://www.ti.com/dlp/products/cinema/

    Yes, it only has a resolution of 1280 by 1024. HD systems at home do have more resolution than this, but the home HD systems are cathode-ray tubes, not projectors. It's much harder to make a projection system very high resolution than a tube system.

    But the resolution will get even higher. Hughes already has a system, the ILA-12K (http://www.hjt.com/products/ila12k.html), that does 2000 by 1280. It will keep increasing.

    The effective resolution of film (i.e. the analog messy strip of celluloid) is around 4000 by 3000 pixels. Digital special effects that are mixed with live film footage are rendered at anywhere from 2048x1550 to the above 4K resolution.

    But the advantage of digital is that the colour reproduction is much more accurate and when you project film, the film is moving at high speed and jitters from side to side so you get blurring.

    I imagine only films that have a large proportion of their content created digitally will go with digital projection in the near future. Then there is a real advantage for the director: he knows the colours he sees on the computer screens when they are creating the effects are exactly what will be projected. When you shoot to film there are a huge array of issues with film stock, look-up tables and gamma curves, and the only way to know what your colours will actually look like is to go out to film and do a test screening (expensive).

    Digital projection is the future but the current systems will improve a lot before it becomes the only system used.
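
    To get a rough sense of why compression comes up so often in this debate, here's a back-of-the-envelope sketch of uncompressed data rates at the resolutions mentioned above. The 24 fps and 3-bytes-per-pixel (8-bit RGB) figures are my own assumptions for illustration, not numbers from the post:

```python
# Back-of-the-envelope uncompressed video data rates. The 24 fps and
# 3 bytes/pixel (8-bit RGB) figures are illustrative assumptions.
def uncompressed_rate_mb_s(width, height, fps=24, bytes_per_pixel=3):
    """Uncompressed data rate in megabytes per second."""
    return width * height * bytes_per_pixel * fps / 1e6

for name, (w, h) in {
    "TI DLP (1280x1024)": (1280, 1024),
    "2K effects renders (2048x1550)": (2048, 1550),
    "film-equivalent (4000x3000)": (4000, 3000),
}.items():
    print(f"{name}: {uncompressed_rate_mb_s(w, h):.0f} MB/s")
```

    At film-equivalent resolution that's on the order of 800+ MB/s, so a two-hour feature would run to several terabytes uncompressed -- hence the interest in heavy compression for any distribution scheme.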
  • * The TI systems in the demo theaters bear no relationship to the real world. They're custom installations that do not address the problem of how a real film would get to a real theater. The source of their signal is an array of 20 prerecorded 18-gigabyte hard drives, trucked to each theater. This array costs an additional $75,000, apart from the cost of trucking and installation.

    18 Gig is about $300 (at least here in the UK), giving a total cost of $6,000 for 360 Gig. These prices are also falling continually.

    If you wanted to change the film once a week then you would need a 5Mbit/s connection to do it. Fast, but not that fast.
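
    The 5 Mbit/s figure above checks out; a quick sketch of the arithmetic (360 decimal gigabytes spread over seven days):

```python
# Sustained bandwidth needed to move a given number of gigabytes in a
# given number of days (decimal gigabytes, as in the "360 Gig" above).
def required_mbit_s(gigabytes, days):
    bits = gigabytes * 1e9 * 8
    seconds = days * 24 * 3600
    return bits / seconds / 1e6

print(f"{required_mbit_s(360, 7):.1f} Mbit/s")
```

    That comes out to roughly 4.8 Mbit/s sustained, which rounds up to the 5 Mbit/s quoted.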

    * Even so, a movie is so memory-intensive that these arrays must compress the digital signal by a ratio of 4-1. At a recent seminar at the Directors' Guild in Los Angeles, digital projection spokesmen said that in the real world, satellite downlinked movies would require 40-1 data compression. This level of compression in movie delivery has never been demonstrated publicly, by TI or anyone else.

    MPEG-2 should be able to get close to this if you allow lossy compression. (I think; I'm not an expert.)

    * The picture on the screen would not be as good as the HDTV television sets now on sale in consumer electronics outlets! TI's MDD chip has specs of 1280 by 1024, while HDTV clocks at 1920 by 1080. For the first time in history, consumers could see a better picture at home than in a movie theater. A higher-quality digital picture would involve even more cost, compression and transmission challenges.

    Now this is just plain stupid. If HDTV can be delivered at high resolution on the fly by cable, satellite or aerial, then why can't digital films? Distribution is just not a problem. If the figure of 360 Gig is correct then this is just a pack of DVD disks, total cost $100?

    1280x1024 sounds like it's a bit too low a resolution, but this figure will be improved on. 2000x1000 or 4000x2000 may be required, but I doubt this is too far into the future. Look how far LCD displays have come (and come down in price).

    * One advantage of a film print is that the director and cinematographer can "time" the print to be sure the colors and visual elements are right. In a digital theater, the projectionist would be free to adjust the color, tint and contrast according to his whims. Since many projectionists do not even know how to properly frame a picture or set the correct lamp brightness, this is a frightening prospect.

    One disadvantage of film is that you can't adjust color, tint, contrast etc. Sure, projectionists may not be the correct people to make these adjustments but if the adjustments were made when the projector was installed and then locked into the machine, what is the problem?

    * How much would the digital projection specialist be paid? The technicians operating the TI demo installations are paid more than the managers of most theaters. Hollywood is happy to save money, but are exhibitors happy to spend it?

    Digital projector specialists would be paid lots. However, how many would you need? Apart from a fan, there are no moving parts in a digital projector, so the maintenance requirements should be far less. Also, as you would not need a projectionist, the overall cost to the exhibitor should be less.

    * What about piracy? Movies will be downloaded just once, then stored in each theater. Thieves could try two approaches. They could grab the signal from the satellite and try to break the encryption (as DVD encryption has just been broken). But there is a more obvious security gap: At some point before it reaches the projector, the encrypted signal has to be decoded. Pirates could bribe a projectionist to let them intercept the decoded signal. Result: a perfect digital copy of the new movie. When the next "Star Wars" movie opens in 4,000 theaters, how many armed guards will 20th Century Fox have to assign to the projection booths?

    Piracy would be a problem. I would guess the best way around it would be to put the decoding into the LCD drivers. This would make it very difficult to tap into, though not impossible.

    * Film is harder to pirate than digital video because a physical film print must be stolen and copied. An MV48 print would be even harder to pirate than current films; it would not fit the equipment in any pirate lab. Those fly-by-night operations, which use ancient equipment cannibalized over the decades, would have to find expensive new machines.

    This would only occur in the short term whilst MV48 was non standard. Once it became a standard the pirates would be able to get hold of cheap second hand equipment.

    The same logic goes for the digital equipment. To start with, pirates would have difficulty in getting hold of projection equipment to hack. Once it became a standard, though, this would be a lot easier for them.

    However, one of the main advantages of digital is that you do not need multiple (read expensive) copies of the film to distribute. Therefore, the film could be distributed to the whole world in one go rather than the current mess of US first, then Europe, then the rest of the world. This means the current market in Europe for pirated US films would shrink, as the Europeans would be able to see the film at the same time as the US. They would have no need to wait for the film, and it might get rid of the DVD region encoding as well. Sure, there might be reasons why Hollywood would not want to distribute to the whole world in one go, but at least digital gives them the option.

    Colin

  • The reason DVD (and most DV formats) look better than the analog equivalent is because of noise. It is the same reason that your compact disc sounds "better" than your LPs. The noise floor is SO much lower on a digital signal. If you do critical viewing on just about any consumer digital video format, you will see motion and color artifacts all over the place.
  • You can make any resolution fit any aspect ratio depending on the shape of your pixels (ie. non square pixels)

    Check this page:

    http://www.ti.com/dlp/products/cinema/specs_starwars.shtml

    The pixels are "anamorphically sampled". What this means is that even though the mirrors that project the pixels are square they use a lens that distorts the rectangular picture to a square one when they digitise the movie. Then they do the opposite when projecting. So on screen your pixels end up being wider than they are tall.

    Note, because they used anamorphic sampling when they digitised the movie you do not end up with the people being stretched out.

    But the image fills the cinema screen with no black bars (thankfully).
  • There are so many ill-informed claims and errors in this post , I don't know where to begin.

    1. The fastest tapes are not around 1MB/s. In the low-medium price range, there is DDS-4, AIT, and other tape technologies: size up to 40GB, 3MB/s (sustained, uncompressed). Let's spec out a DDS-4 solution: 20GB, 3MB/s native. Street price $1300. Hook up 20 of those in a library and you have 400GB capacity with 60MB/s throughput.

    2. In the same price range as the RAID array, there are tape drives (AMPEX) that go to 15MB/s (sustained, uncompressed) with sizes up to 330GB.

    3. You are getting 50MB/s from an EIDE drive? Last time I looked, a high-end SCSI drive had only 300+ Mbit/s internal transfer rate with sustained transfers of around 20MB/s. And you have an EIDE drive that goes up to 400Mbit/s internal?????
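
    For what it's worth, the DDS-4 library arithmetic in point 1 works out as claimed -- this is just the post's own figures multiplied out:

```python
# The hypothetical 20-drive DDS-4 library from point 1, multiplied out.
drives = 20
capacity_gb = 20      # native capacity per DDS-4 tape, per the post
rate_mb_s = 3         # sustained uncompressed rate per drive, per the post
price_usd = 1300      # quoted street price per drive

print(drives * capacity_gb, "GB total capacity")
print(drives * rate_mb_s, "MB/s aggregate throughput")
print("$" + str(drives * price_usd), "in drives")
```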


  • by Paul Johnson ( 33553 ) on Sunday December 19, 1999 @02:52AM (#1462050) Homepage
    What counts is not the dot pitch, but the angular resolution of the system as seen by the audience.

    What I know of this comes from still photography, but it's also at 35mm (i.e. a negative 24x36mm), so I can say something intelligent.

    If you do the sums for a 35mm still, it is considered "sharp" if a single point on the object maps to a circle of diameter less than 0.004 inches on the negative (known as the "circle of confusion"). That corresponds to a digital resolution of around 3000x2000. Of course you can go finer. But that is roughly the best performance you can expect from a 35mm film.

    Now, whether this makes any difference depends on whether you can see such a small object. The question is: given two small dots in the scene, can you see whether there is one dot or two in the projected image? The point at which the two dots merge into one is the resolution, and the angle subtended by the two dots is the angular resolution. I'll dodge the difference between angle for the camera and angle for the viewer: projection systems are designed so that the middle seats get the right perspective.

    The angular resolution of a good human eye is 1/60th of a degree (1 arc minute). So an ideal cinema screen would need to match that with around 60 pixels per degree. Right now I'm wearing spectacles, and without moving my head they put a frame on my vision about 80 degrees wide. I haven't measured a cinema screen from the centre seat, but I'd expect something nearer 40 degrees. 40 degrees times 60 pixels per degree gives 2400 pixels. Which is not too far off what 35mm film gives (at its theoretical best).

    So current XVGA systems are not up to the job of replacing film, but give us a 3000x2000 pixel screen and it will look better. And Moore's Law suggests that we will be able to do that fairly soon.
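
    The estimate above can be sketched in a couple of lines; the 60 pixels/degree figure is just the 1-arc-minute acuity assumption restated:

```python
# Pixels needed across a screen subtending a given viewing angle,
# assuming ~1 arc minute of visual acuity (60 pixels per degree).
def pixels_needed(screen_angle_deg, pixels_per_degree=60):
    return screen_angle_deg * pixels_per_degree

print(pixels_needed(40))  # a typical cinema screen from a centre seat
print(pixels_needed(80))  # a screen filling most of the visual field
```

    So a 40-degree screen wants about 2400 pixels across, and a screen filling the whole visual field nearly twice that.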

    Of course there are other issues. As others have noted, you have the problems of physical wear and dirt getting onto film, and the costs of printing, versus the 100% reproducibility of digital and the costs of piping all that data around. But you can bet that the studios have looked at these numbers and figured that the lifecycle costs look interesting. And no doubt someone has told them of Moore's Law too.

    I remember the same argument in the early days of digital audio. The first CD players sounded harsh in the high treble thanks to the steep filters required. Analogue purists declared that digital would never replace analogue. But where is analogue now? A niche split between rich die-hards and poor elderly people who can't afford to replace their existing LPs. Physical analogue film will go the same way.

    Paul.

  • What if the tapes were smaller, but each contained a complete image of one disk? Then the tapes could be copied in parallel, and the disks reassembled as a RAID. 18 HDs would cut the copying time by a factor of 18, at the cost of adding more tape drives, copying boxes and switching SCSI buses between the RAID adapter and the copying box -- still cheap compared to the rest of the equipment involved.
  • Duping umpteen gigabytes of data is not that trivial. Duping a film might actually be easier. To copy the data you need to get hold of the storage medium for long enough to read all the data off it. To copy the film you need to get hold of it for long enough to run it through a duplicator (which is basically four reels and a gizmo to hold the original and print together as they run under a light). Both can be done faster than "real time". Developing the copy can be done later of course.

    And on top of this you can encrypt the digital version, so it only gets decrypted in the projector. It's actually fairly simple to produce a tamper-proof box that won't project the film if the seal is broken. This isn't totally secure of course, but it does increase the cost and difficulty of making an illegal copy.

    The best way for our putative copying mafia to proceed would be to pay for a private showing in front of some cameras. Alternatively I can imagine some fairly simple optics that would redirect a small fraction of the light from the projector and focus it onto a conventional CCD. It wouldn't be cinema standard, but it would do just fine for an illegal DVD master.

    Paul.

  • A film consisting of continuous still shots is another matter. I still think that for most mass market films this resolution would be quite adequate. And it wouldn't have other problems, such as dust on the print and all that...
  • http://www.tiac.net/users/rlk/print-3.0.2.tar.gz

    Particularly if you have an Epson Stylus printer (vintage Photo EX or older), and you're running the Gimp 1.1, you might want to give this a try.
  • Ebert also mentions Douglas Trumbull's Showscan process, which if I remember rightly is another faster-than-24fps film system. As I remember, Trumbull was working from research that showed that just on the high side of 30fps is a perceptual threshold for human vision. Once you move past that threshold (as TV almost does), images are often perceived as being much clearer and more realistic. With a 48fps system you get that additional clarity, along with the steadiness of the (alleged - I have not seen it yet) better film transport system and the slight additional sharpness required by the higher "shutter" speeds of 48fps in some situations. There may be something here.
  • I think Roger Ebert is wrong on this.

    There is a simple reason why: a digitally-encoded movie for theater projection will easily fit on a 12" (305 mm) optical disk (if we're using the same pit size as those on DVD discs). It's a LOT easier to handle a single 12" optical disc than several big cans of 35 mm or 70 mm film.

    Also, because the disc is 305 mm in diameter, you can easily encode things like multi-language spoken audio tracks separate from the rest of the soundtrack (and even more, multi-language subtitling on the same disc!). Let's take for example the second Star Wars "prequel" movie due in May 2002. For the European market, the optical disc for digital projection systems will have separate spoken soundtracks in English, French, Castilian Spanish, German, Italian, and Portuguese, plus subtitling in Catalan, Arabic, Czech, Slovak, Polish, Serbo-Croatian, Greek, Hungarian, Romanian, and Bulgarian. It's vastly cheaper to do it ONCE so the movie can be released to just about all of Europe at the same time.

    Also, with digital projection, the complete lack of film "jittering," the total lack of scratches and dust, and superb color saturation means visual quality will of course be superb.

    In short, once the cost of digital projectors starts coming down (and given the rapid development in the computer industry, they will come down very rapidly), future movie theaters will no longer need the extra space to store the large bulk of film; it'll all be reduced to 305 mm optical discs.
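
    As a very rough sanity check on what such a disc could hold: scaling a single-layer DVD's 4.7 GB (on a 120 mm disc) by disc area gives a ballpark figure. This scaling is my own assumption; it ignores the hub, lead-in area, and format overhead, so treat it as an optimistic upper bound:

```python
# Rough capacity of a 305 mm disc at single-layer DVD areal density.
# Scales 4.7 GB (120 mm DVD) by the ratio of disc areas; ignores the
# hub and lead-in, so this is an optimistic upper bound.
dvd_capacity_gb = 4.7
dvd_diameter_mm = 120
big_diameter_mm = 305

capacity_gb = dvd_capacity_gb * (big_diameter_mm / dvd_diameter_mm) ** 2
print(f"~{capacity_gb:.0f} GB")
```

    Roughly 30 GB, then -- enough for a heavily compressed feature plus extra audio tracks, though nowhere near the 360 GB demo array.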
  • A couple of things. It's sad to see digital movie projection being pushed at a laughable 1280x1024 resolution -- surely that is doomed to fail. A more accurate conclusion for Ebert would be that "I have seen the future of cinema, and it is not Texas Instruments."

    First of all, considering that most of new motion pictures are delivered with an aspect ratio of 2.35:1, the effective resolution would be 1280x545 (assuming square pixels). I've heard of other up-and-coming digital projection systems, and they have at least three times more pixels.
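
    The 1280x545 figure is just the aspect-ratio division, assuming square pixels as stated:

```python
# Active picture height when a 2.35:1 frame is letterboxed inside a
# 1280-pixel-wide raster with square pixels.
width = 1280
aspect = 2.35
height_used = round(width / aspect)
print(height_used)
```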

    Second, contrary to popular belief, most frames, if not yet all, of a modern cinematic experience have been stored digitally at some point on their path from the camera to the silver screen. The motion pictures are printed to the celluloid film from this digital master data.

    Needless to say, elimination of this last conversion phase is the holy grail of digital movie projection and IT WILL ARRIVE when the projectors get closer to the native resolution of the master data.

    Everything that can be digital will be. I'm betting my ass on it.
  • Anyone remember Dolby SR [dolby.com]? It claims to get better than CD quality out of a cassette tape. They were putting it in analog 2 track pro machines for a while, but I don't think it is in use much anymore.

    This technology is similar. Tape had a 50 year head start over digital recording, and look how long it took for digital to take over. Convenience, low noise floors and consistency will always win out over high resolution. Also, check out some of the articles in Broadcast Engineering [broadcastengineering.com] for what HDTV really means for most of the world. According to studies of visual acuity done years ago, most of us won't really notice much of a difference in HDTV's resolution (unless you have a wall-size projector, and you sit very close to it). But what we will notice is lower noise floors, and better color reproduction.

  • Theaters want digital for something other than movies. They want it for sporting events.

    A few examples:

    I was in Brazil during the '94 World Cup and you wouldn't believe the masses of people gathered at big screen televisions in every city park in the country. Literally millions of people went to see EVERY GAME that Brazil played on the big screen. I was in France during the cup in '98 and it was a similar scene, except they had stadiums built around the TVs and it was a high definition widescreen TV.

    I was also in Salt Lake the two times the Jazz played the Bulls for the NBA championship. Thousands of people gathered outside the stadium each game and watched the game on a big-screen TV.

    Now imagine that when you wanted to see a sold-out, important game, instead of fighting for space at your local park and seeing a poor image on a conventional big-screen, you could go to the local theater and watch the game sitting on a decent chair, seeing a crisp, unobstructed image, and getting great sound. Now lots of people would still prefer their local park, but you can bet that the theaters would sell a lot of tickets. I can think of some people who would actually prefer the theater to attending the event!

    I haven't even mentioned traditional pay-per-view events like boxing matches, concerts, or (gag) wrestling.

    Digital will take over, but for a lot of reasons that Roger hasn't even mentioned.

  • It's not an either-or proposition - consider the case of CRTs vs. flat-panel displays. The CRT (vacuum tube) technology has continued to improve, and remains cost-effective in all applications where cost is more important than bulk and weight (i.e. on my desktop; and probably yours). Either-or thinking is probably an artifact of the myth of "Progress" which is not the same thing as our actual experience of technological and social change.

    The data-transmission and storage requirements described by Ebert (he's the fat one, BTW) are extremely damning. The other points he made (piracy dangers, operator costs, lower res than in-home DVD, etc.) are also very strong - and the nature of the movie business is his area of expertise.

    All that aside, both digital and analog technologies will continue to develop; this isn't the last we will hear of this.
    --

  • This is where digital film projection--especially if they put the movie on 305 mm optical discs--become very useful.

    Let's take for example the sequel to Star Wars I: The Phantom Menace. For the Asian market, a single 305 mm disc for theater projection will have voice soundtracks (separate from the rest of the soundtrack) in English, Japanese, Mandarin dialect Chinese, Cantonese dialect Chinese, Malay, Filipino, Thai and Indonesian, plus subtitling in Vietnamese, Indian languages, etc. It's a LOT cheaper to do it that way.
  • "Toy Story"? Of course, you wouldn't notice: it's a computer-generated movie!

    You also would never mistake it for a photographic image of the real world. Unless you were legally blind.

    Also, the new film technology that Ebert wrote about eliminates jitter -- so there goes the bulk of your argument.

    Ultimately, though, I think you're right that the masses won't care. It's just very unfortunate.

  • However, with CCDs already at 6 megapixels, probably hitting 10 megapixels within 18 months (and likely 16 megapixels by 2004), not to mention the cost of hard disk storage dropping like crazy over the last four years, the era of digital recording for "epic" movies has pretty much arrived.

    In short, I expect within 5-6 years it'll be fairly common for everything to be done digitally: recording the movie, playing back the "daily rushes," editing, combining of special effects and audio, etc. Because it is ALL done digitally, the picture quality will stay excellent: as the movie is duplicated, there will be NO loss in picture quality.

    A LOT of movie studios are watching with great interest what Lucasfilm does with the sequel to Star Wars I: The Phantom Menace; if the results of Lucasfilm's all-digital process of filmmaking turn out to be superb, then the rush to all-digital filmmaking will happen like a tsunami.
  • by jalefkowit ( 101585 ) <jason AT jasonlefkowitz DOT com> on Sunday December 19, 1999 @03:50AM (#1462067) Homepage

    As much as I respect Roger Ebert, I have to say that he's just missing the point completely here.

    Ebert contends that film-based systems will be better than digital in the future because film will provide a better-quality image than digital can; and on this count, I think he's absolutely right. Even uncompressed DV lacks the "warmth" (for lack of a better term) of film, and the MaxiVision system he touts sounds like it provides an image that nobody in the DV world can hope to match.

    The problem is, image quality is unimportant. Now, before everyone gets up in arms here, think for a second. Who is clamoring for image quality that is better than today's films? General audiences? Nope, they are happy with the cruddy image from a poorly set up projector in a shoebox theater in their shopping mall. Theater owners? Nope, they make more money by dividing their space into multiple small, low-tech screens rather than lovingly setting up one beautiful screen and cutting the number of movies they can show by 11/12ths. Studios? Nope, they know what makes them money: formulaic movies with name stars presented on as many screens as possible. If they could make money presenting more striking images, they'd all be doing IMAX films by now.

    So where it counts -- money -- MaxiVision & other advanced film systems are irrelevant, because nobody wants them badly enough to pay for them. Digital, however, is a different story. Digital offers a big money benefit to one of these players -- the studios -- because it dramatically cuts one of the biggest costs in distributing a film: prints & advertising.

    P&A is one of the biggest line items on a film's budget, running into millions of dollars. Each theater which is going to show your film needs a "print" (an analog dupe of the film) to run through their projector. In fact, they need more than one, because prints wear out or get scratched or otherwise start to die after a while. When you consider that each print is absurdly expensive, and that a movie that "opens wide" goes to 2,500+ theaters, you can see how this gets expensive quick.

    Digital changes all this. Suddenly you can stop sending reels of film around (which are expensive) and start sending around magnetic disks (which aren't). Even better, you could conceivably ship the image via a fiber optic cable or satellite connection and avoid "prints" altogether. Then "P&A" just becomes "A" and you've just saved millions, which to a Hollywood executive means that his project is that much more likely to be profitable and thus advance his career.
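
    As a purely hypothetical illustration of the scale involved -- the per-print cost and the replacement allowance here are my own assumptions, not figures from the post:

```python
# Hypothetical release-print cost for a wide opening. The per-print
# cost and the replacement factor are illustrative assumptions only;
# the 2,500-theater figure is the "opens wide" number from the post.
cost_per_print_usd = 1500   # assumed cost of one 35 mm release print
theaters = 2500             # a typical wide opening
prints_per_theater = 1.5    # assumed allowance for worn/damaged prints

total_usd = cost_per_print_usd * theaters * prints_per_theater
print(f"${total_usd:,.0f}")
```

    Even with modest assumptions, the "P" in P&A lands in the millions per wide release -- which is exactly the money digital distribution promises to save.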

    So, while I understand Ebert's position and wish that we lived in a world where he was right, where the quality of the experience was the prime factor, we don't, and he's wrong. Digital will overtake film, not because it's better, but because it's cheaper -- and even the most beautiful MaxiVision 48 images won't convince the Hollywood moneymen to ignore that math.


    -- Jason A. Lefkowitz

  • Keep in mind that once a digital system, any digital system, is established in Hollywood, it will probably be about as long 'til the next enhancement as it's been since scope 35mm prints (i.e. since the 50s... or 50 years). Hollywood moves very slowly, so it's very important that they not adopt a crappy digital standard, like the TI or the Hughes one that Ebert rips apart in this article.

    Also, some of the problems he cites are inherently true of any digital system - the current distribution system of a bunch of 18G hard drives trucked to the theaters is infeasible. To use satellites at the unacceptable 1280x1024 resolution would take 10x more compression than the system they demonstrated.

    Also, going to digital projection systems will have one incredibly bad side effect - cutting smaller theaters totally out of the loop. We're talking $100k upgrades here, places like the Brattle Theater and Coolidge Corner (in Boston) will no longer be able to show anything remotely new. This hurts the Indie film industry, since those theaters are where things like Princess Mononoke [not exactly indie, but] and Dogma premiere and have their test marketing done.

    Having been an officer of MIT's Lecture Series Committee, I can say that a move to an all-digital Hollywood would likely kill any small 35mm projection operation. And that would truly be a shame.

  • I haven't seen the system Ebert mentions, but it's very much the case that improved frame rate (i.e., increased temporal resolution) really does change things a bunch. The need for more temporal resolution is a major part of why your 30 (or 25, for me) frames of video a second is split into interlaced fields.

    720-line, 60fps Hi-Def video is breathtaking. I shoot a lot of sports footage on DV, and my Canon camcorder can shoot either 25fps progressive-scan (no interlacing) or regular interlaced. The "look" of the two is noticeably different when viewed on a TV - the motion strobing of the lower frame rate isn't really noticeable by itself, but flicking between modes the interlaced footage is noticeably smoother. It's subtle, but it's there.

    One of the very interesting perceptual phenomena that regularly appear on the DV-L mailing list is that progressive-scan video shot on camcorders like the Canon XL1 actually looks, according to viewers, like it originated on film. Why isn't 100% clear; perhaps it's just that the lower frame rate of film is something we're trained to notice. See here for more on the film-look phenomenon [adamwilt.com]. The one aspect of film look that digital video cameras generally can't provide is shallow depth of field; that takes a large imaging area - compare 35mm film versus a 1/3" CCD, and you'll see why film wins.

    In fact, 24fps isn't going away despite the move to higher frame rates for HDTV systems. See here for Sony's blurb on 24fps progressive [sony-europe.com].

  • The future of the cinema is cinema! Arguing about different formats and technologies is generally fruitless and useless.

    Besides, the chase for more clarity and better resolution is a bit absurd. The thing is, celluloid film (with all its pros and cons) is an integral part of cinema, and the things it brings to the screen (apart from the actual moving pictures) are a big part of the movie experience, just like brushstrokes and the pattern of the canvas are part of experiencing a painting. Imagine watching Chaplin's "The Gold Rush" in a computer-colorized, digitally cleansed version where no signs of aging can be observed in the picture on the screen. Kind of takes away the atmosphere, doesn't it? "The Gold Rush" is an old movie, made with celluloid that ages just like any other earthly material. That gives the movie an earthbound connection, telling us it was made at a given point in time quite a long time ago; it gives us a sense of its history, so to speak.

    Digital of course takes that away. Not necessarily a bad thing in itself; it just depends on what the moviemaker wants to get across. This is a bit like the movement in art that came (I think) out of the pop-art movement, where painters started to paint pictures that looked like photographs with amazing clarity, so that it takes laymen a while to determine whether they are looking at an actual painting. That method never caught on, and hardly anybody paints that way anymore, because the material (canvas, paint, brushes) is as much a part of a painting as the actual picture. Some painters go through a long and painstaking process in their work to get the effect of an aging Rembrandt painting. Another example would be all the computer-colorized versions of "Casablanca" and various other movies that were all the craze in the eighties; no one even thinks twice about doing this anymore. That was an experiment in taking old movies and trying to recreate them the way their directors supposedly would have wanted to see them, had they had colour film to begin with (colour was already an option at the time Casablanca was made, I think). By doing that, their connection with time and space was altered and skewed; these versions never did seem right and never caught on. And the colours were a bit shitty, too.

    The thing I'm trying to get across is that the material is an inseparable part of the film. Why do so many independent filmmakers still use 8mm or 16mm celluloid film (although they may use digital for editing and sound)? And look at the video revolution. It isn't until now that you see videotape used for making movies, and that is because filmmakers have learned how to use the material to their advantage. In films like "The Blair Witch Project" or the Danish Dogme movies ("The Celebration", "The Idiots"), video was used most of the time, and the directors did not try to hide technical faults but used them to their advantage ("The Blair WP" is quite an interesting example in the way it uses videotape for realism but distances the audience from the action with b/w film). If you want to see an interesting example of the use of videotape, then try to find Lars von Trier's "The Kingdom". Filmed on video, transferred to 16mm film and then transferred back to video! Trying to eliminate the feel of the material from the moviegoing experience is a bit like producing tasteless popcorn so it won't interrupt the audience while it's watching the movie.

    Digital screening will eventually catch on. It serves its purpose, esp. (as Ebert points out) in computer-generated s.fx. scenes. Imax hasn't caught on, but it serves its purpose in showing dazzling movies about the wonders of the pyramids or life and death in the Serengeti. But the question is: when is digital ready for the big time? I say it's ready when one can seamlessly switch between film and digital projectors at screening time. Or when digital tech can actually imitate exactly all the faults of celluloid film or videotape.

    Or I may be totally wrong:-)
  • It's not just compression that creates problems. Current video projection technology has some pretty serious problems - it's expensive, unreliable and has problems with image quality and reproducing some colours.

    Film stock may be expensive, but the equipment required to show it is exceptionally reliable and fault-tolerant. You can get results from thirty-year-old projectors that currently available video systems can't compare with. It will certainly be possible for digital technology to replace analogue (it's already doing that on TV), but I wouldn't hold your breath.
  • That compression ratio is almost certainly from some lossy compression such as MPEG2. That's not a big deal, by the way, because a final digital "print" will be able to use inter-frame compression (motion estimation and so forth) to achieve that ratio.

    Existing digital camcorders compress their signals right from the word go, to manage their consumption of tape and help keep the transfer bandwidth over IEEE1394 reasonable. However, to keep the data stream easily editable, only intra-frame compression is used; DV uses a fixed 5:1 DCT-based compression, some "pro" formats use a fixed 3.3:1 compression ratio. The MPEG-based HiDef cameras do the same.
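    As a sanity check on those ratios, here's a rough calculation. The 720x480, 4:1:1, 29.97 fps figures below are the published NTSC consumer DV parameters; treat the arithmetic as a back-of-the-envelope sketch, not a spec quote:

```python
# Rough check of DV's fixed 5:1 intra-frame compression.
# Assumed parameters: 720x480 luma, 4:1:1 chroma subsampling,
# 8-bit samples, 29.97 fps (NTSC consumer DV).

width, height, fps, bits = 720, 480, 29.97, 8

luma = width * height          # one Y sample per pixel
chroma = luma // 2             # Cb + Cr, each at 1/4 the luma rate
raw_bps = (luma + chroma) * bits * fps

dv_bps = raw_bps / 5           # the fixed ~5:1 DCT compression

print(f"raw: {raw_bps / 1e6:.0f} Mbit/s")
print(f"DV:  {dv_bps / 1e6:.0f} Mbit/s")
```

    The compressed figure lands right around DV's nominal 25 Mbit/s video stream, which is why the 5:1 ratio was chosen.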

    The 5:1 compression used in DV is just visible (see here for the SMPTE analysis [adamwilt.com]), but for most people the only really noticeable effect of the chroma subsampling used alongside the DCT-based compression is that it makes chroma-keying (e.g. bluescreening) much more difficult because of the averaging - the trick there is to use green as your key colour, because DV camcorders typically retain more information about green than about other colours.

    Remember that one of the things we gain from using digital is less generation loss along the path to the finished product - it's often staggering how many generations a film print can be from the original negative, and none of that loss applies to digital. The end-stage compression is a non-issue.

  • The AMC 30 theater in South Barrington, Illinois is testing the TI system in theater 17. (Bicentennial Man is playing 5 theaters now, but only 17 is Digital).

    I saw it Friday night. If they hadn't made a big deal of passing around survey cards asking what we thought of the picture quality, I *might* not have noticed the difference. Yes, it was brighter, and things were much clearer (especially while moving).

    What bothered me were visible compression artifacts, especially in smooth gradients. One scene with a sunset in the background was particularly noisy. Anything with large amounts of bright blue also seemed full of digital noise. The people I was with didn't notice at all, but perhaps since my job is deeply involved in digital video, I can't help looking for it. :)

    There were also some specks that appeared that *looked* like film specks. The intro TI had at the beginning of the movie said that it was pure digital, but perhaps some scenes were done on film?

    The biggest thing I noticed was that bright objects didn't bleed all around them. Everything always stayed sharp. It wasn't an enormous change like I was expecting, though.

    Kevin

  • When Phantom Menace came out, it was released in two theatres with digital projectors. I happened to be near the Paramus, NJ theatre doing so, and we went to see it. Before the show, there was a THX guy who came out with a microphone to talk about the technology. I think they said it was on a 350 GB RAID array, though maybe it was 700 GB. Anyway, the technology was very similar to watching a DVD on a projection TV--you could look back and see the separate red, green, and blue lights. Of course, it was in far higher resolution than you would get with DVD--the image was perfect. Normally with film, the print wears and you get a scratch here and there on the screen, or the audio will hiss if the print is old. With digital, it is like always having a perfect print.

    Sure, the technology is expensive now, but the quality is there. So in a few years when the next highly-anticipated movie comes out, instead of just putting restrictions on the sound systems, they may require digital projectors.

    I, for one, give all my thumbs up to digital projection in theatres.
  • Okay, could someone explain what the difference is between "10mm dot pitch" and "1 pixel per cm"? 10mm is 1cm, after all, but maybe I don't understand dot pitch.
    --
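    For what it's worth, they are the same measurement, just quoted in different units. A quick sketch of the arithmetic (the 13 m screen width is a hypothetical figure, not one from the article):

```python
# "Dot pitch" here is just screen width divided by horizontal
# pixel count; a 10 mm pitch and 1 pixel per cm are the same
# thing. The 13 m screen width below is a hypothetical example.

screen_width_mm = 13_000   # assumed ~13 m wide theatre screen
pixels_across = 1280       # resolution quoted in the story

pitch_mm = screen_width_mm / pixels_across
print(f"{pitch_mm:.1f} mm per pixel = {pitch_mm / 10:.2f} cm per pixel")
```

    On a screen that size the pitch works out to roughly 10 mm, i.e. about a centimetre per pixel.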
  • I haven't seen a digital movie in the theatre, so I can't comment on that, but I think that what we have now is "good enough" quality-wise until digital is a cheaper and better alternative.

    What I don't like about going to the theatre is the frame rate (something I've never liked). I believe (correct me if I'm wrong) that today's movies run at 24fps in theatres. While that's fine for most things, when a movie has a very fast action scene or pans over very fast, it's hard to tell what's going on. There's just not enough data there for your eyes to catch up with. To me, the real achievement would be to get this number up to 60fps, which seems to be a standard for truly fluid, realistic movement (at least in video games). I can see the limitation now with film, because the reels would have to be about two and a half times the size, and we'd need different projectors (probably), but with digital, as it progresses, I can see this becoming a reality. It would probably make for some truly breathtaking films.
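    The reel-size problem scales linearly with frame rate: standard 4-perf 35mm carries 16 frames to the foot, so a quick sketch shows the footage consumed per minute at each rate:

```python
# Footage consumed per minute of running time on standard
# 4-perf 35mm film, which carries 16 frames per foot.

FRAMES_PER_FOOT = 16

for fps in (24, 48, 60):
    feet_per_minute = fps * 60 / FRAMES_PER_FOOT
    print(f"{fps:>2} fps -> {feet_per_minute:.0f} ft/min")
```

    At 24 fps that's the familiar 90 ft/min; at 60 fps it becomes 225 ft/min, hence the much larger reels.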
  • Read the article and take note: Here is a man who knows the difference between "its" and "it's".
    --
  • I like how you bash Ebert's argument, calling him "unbelievably myopic when it comes to technology", and then give exactly (hmmm, let me read it over a few more times to make sure) zero reasons why you are right and he is wrong. Why not tell us about these "laughable technical errors" and about how "The digital projection system of the future will blow today's technology away", instead of simply brushing off analog technology and a respected film critic with a wave of your hand?

    I think analog will always be an alternative that people will choose. Here is why: if I have analog film, I can hold it up to the light and see what is on it. If my projector breaks or the film has a flaw in it, I can rig up a new projector from basic parts or use tape to mend the film. Digital will always be higher-tech than analog and therefore harder to access directly, which is why analog keeps the advantage here.
  • One thing I haven't seen mentioned much in any of the discussions is: what standards will digital projection be based on? Until digital projection is standard everywhere, there is still time to improve the baseline standard for what you'll be projecting.

    That's where Ebert's vision of the future is compelling - a technology that works at a higher frame rate (48 fps instead of 24). Sure, the detail about jitter-free projection doesn't apply to a digital projector, which doesn't need to move film past an element.

    I think a great comparison here is APS vs. digital cameras. Sure, you can get some nice digital cameras now, but if you want a really good-quality photograph that exceeds a digital version, APS makes a great intermediate step until digital cameras really become the equal of film-based cameras in terms of resolution and storage.

    Similarly, MaxiVision48 makes a great intermediate step to use before it's practical to go with digital projection, and may as a side effect raise the bar for what we see from now on.
  • by garagekubrick ( 121058 ) on Sunday December 19, 1999 @09:02AM (#1462155) Homepage

    Finally, a topic on /. I actually know something about. I was hoping this article of Ebert's would turn up, cause it pissed me off and thrilled me at the same time. And please, if any you know Ebert's public email (he does have one at compuserve) address, please post it.

    First I'd like to discuss Ebert's misconceptions.

    They were computer-generated in the first place, so they arrived at the screen without stepping down a generation to film. And because they depicted imaginary places, it was impossible to judge them on the basis of how we know the real world looks.

    This is not true at all. The film was scanned into the digital medium for digital projection from the film (analogue) negative. Even the effects shots. See American Cinematographer, Sep 1999. In fact, one shot in the film - a non-effects shot - was shot on a prototype digital camera. It's when Anakin talks to Qui-Gon outside his house in the desert at night. No effects in that shot, just an unpublicized ruse to see if anyone would notice.

    * It can project film at 48 frames per second, twice the existing 24-fps rate. At 48 frames, it uses 50 percent more film than at present. But MV48 also has an "economy mode" that offers low-budget filmmakers savings of up to 25 percent on film.

    He doesn't detail this savings claim. The fact is, for independent filmmakers, the prohibitive cost is film: the raw stock plus processing and development of negative. It's why so many have turned to digital. You can shoot an hour on Mini DV for ten U.S. dollars; shooting an hour on 35mm film at 24 fps works out to roughly $4000. At 48 fps, this cost doubles. So what's the "economy" mode here? As well, he completely skirts the fact that the system's vibration-free tech would need to be implemented on nearly every camera in Hollywood. The main mechanism for filming today is the claw / registration pin system. Filming in the analogue sense is really a matter of taking 24 still photographs per second: a claw pulls the film down one perforation, aligning it in the picture gate, while a registration pin holds the perf so the film stays in register with the gate. 24 fps is the standard, and running at higher speeds often requires serious maintenance and reengineering. As well, running at higher speeds means more light: the faster your frame rate, the more light you need. This increases costs further.
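    The two penalties described above, stock cost and light, can be sketched together. The $4000/hour figure is this poster's rough estimate, not an industry quote, and the exposure math assumes a standard 180-degree shutter:

```python
import math

# Doubling the frame rate doubles stock consumption and, with a
# 180-degree shutter (exposure time = 1 / (2 * fps)), halves the
# exposure time, costing one full stop of light. The $4000/hour
# base figure at 24 fps is a rough estimate, not an industry quote.

BASE_COST = 4000  # USD per hour of 35mm stock+processing at 24 fps

for fps in (24, 48):
    cost = BASE_COST * fps / 24
    exposure_denom = 2 * fps              # exposure = 1/denom seconds
    extra_stops = math.log2(fps / 24)
    print(f"{fps} fps: ~${cost:.0f}/hr, 1/{exposure_denom}s exposure, "
          f"+{extra_stops:.0f} stop(s) of light needed")
```

    So going from 24 to 48 fps doubles the stock bill and demands one extra stop of light on set.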

    And it can handle any existing 35mm film format--unlike digital projection, which would obsolete a century of old prints.

    And how well have those prints been maintained? The fact is, such classic, immortal films as Vertigo needed extensive digital restoration work in order to be presented as they were seen in their year of release. Thousands of films have been lost forever. Earlier this year Ebert even mentioned that there might not be an existing print of Robert Altman's Nashville - a film from the 70s. Classic films are now safer as digital masters.

    One advantage of a film print is that the director and cinematographer can "time" the print to be sure the colors and visual elements are right. In a digital theater, the projectionist would be free to adjust the color, tint and contrast according to his whims. Since many projectionists do not even know how to properly frame a picture or set the correct lamp brightness, this is a frightening prospect.

    And at the same time, most films shown in cinemas fall short of the cinematographer's wishes. A recent popular technique, known by many names but basically silver retention, desaturates color and creates bolder contrast. These prints are more expensive, and as a result only a few prints are shipped utilizing this technique - which the film was shot for - so the majority of viewers never see the film as intended. Seven is a particular case of this.

    Add to this the fact that Kodak themselves - and Martin Scorsese has campaigned against this - have found that most cinemas dim their projector bulbs under the misguided idea that it extends the bulb's life. It flat out doesn't. It just leads to a muddy, darker picture. Kodak sent technicians to several theaters armed with light meters and found most films projected a full stop or two under their proper foot-candle level. Add to this variations between print reels - and films are on several different ones - and you have a subpar projection process. Many cinematographers love the process of approving a DVD transfer because they can properly time the entire film - and a helluva lot of color timing is done digitally now. The cinematographer of The Full Monty shot a film called Hideous Kinky in Morocco, and he told me that he could've turned shots taken in daylight in the desert into midnight blue using new digital color timing techniques. Digital projection could have a locked-down system approved by the filmmakers, with the projector rigged to show the film only at their chosen levels - thereby making sure there is an optimal standard for all showings of the film.

    * What about piracy? Movies will be downloaded just once, then stored in each theater. Thieves could try two approaches. They could grab the signal from the satellite and try to break the encryption (as DVD encryption has just been broken).

    Digital projection is not MPEG or MJPEG. The compression algorithm is 50:1, using adaptive rather than fixed block sizes. It compresses each frame without regard to the ones before or after it, whereas MPEG exploits the information that stays the same from one frame to the next - the inter-frame approach that leads to the picture artifacting the digital projection system does not have. In other words, it is a proprietary, high-storage medium with its own compression algorithm at a high cost. Pirates would need more than a simple telecine (transferring film to video) to pirate the film: first they'd have to break the encryption, which would be vastly superior to DVD's pathetic 40-bit scheme, then they'd need the extremely expensive tech to decode that signal to a lo-fi master for pirating. Good luck, pirates.
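    For scale, here is what a 50:1 ratio implies for storing a feature at the resolution quoted in the story. The two-hour running time and 24-bit colour are assumptions for illustration; the real system's parameters weren't published:

```python
# Hypothetical storage estimate for a two-hour feature at
# 1280x1024, 24-bit colour, 24 fps, compressed 50:1. Everything
# except the resolution and the 50:1 ratio is an illustrative guess.

width, height, fps = 1280, 1024, 24
bytes_per_pixel = 3            # 24-bit colour
runtime_s = 2 * 60 * 60
ratio = 50

raw = width * height * bytes_per_pixel * fps * runtime_s
print(f"raw: {raw / 1e9:.0f} GB, compressed: {raw / ratio / 1e9:.1f} GB")
```

    Even under these modest assumptions, the uncompressed stream runs to hundreds of gigabytes, which is why aggressive compression (and the sort of RAID arrays mentioned elsewhere in this thread) is unavoidable.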

    As for the image recording itself - we do not know what system it will utilize. Sony and Panavision have yet to elaborate on what test shots for Ep. 2 have turned out as - nor what compression or resolution etc. it will use. It will not be an existing format like DV or DigiBeta or MPEG.

    Hollywood has not spent a dime, for example, to research the intriguing question, do film and digital create different brain states? Some theoreticians believe that film creates reverie, video creates hypnosis; wouldn't it be ironic if digital audiences found they were missing an ineffable part of the moviegoing experience?

    Umm, which is why video rental is such a huge business? The fact is, for the most part, the audience just doesn't care. And I have experienced states of emotional reverie from movies watched on DVD rather than in a cinema. I went to see the IMAX film Everest, during which a hair ended up in the gate of the projector. The result - during an emotional moment, an enormous tentacle from space lashed out at our heroes, and continued to do so until the end of the film - was hugely annoying. I complained to the manager. He told me I was the first customer, and an indignant and rude one at that, to complain about their high standards. The amount of misconception that still exists about letterboxing is insane. Letterboxing means you see more of the picture, as intended. It's as simple as that. How many DVD users know what 16x9 anamorphic means, despite attempts by the DVD community to educate them?

    There are issues here. For instance, watching a projected film means that you spend, during a two-hour film, about an hour in darkness - maybe creating a dream-like state. Digital projection does remove this flicker effect. But this is esoteric, and I doubt audiences even care.
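    The "hour in darkness" figure follows from how a projector shutter works: the screen is blanked during each pulldown, and again mid-frame to push the flicker rate up to 48 Hz, leaving the screen dark roughly half the time. A trivial sketch, where the 50% dark fraction is the usual rough figure rather than an exact one:

```python
# Rough arithmetic behind "an hour in darkness": a film
# projector's shutter blanks the screen about half the time.

runtime_min = 120      # a two-hour film
dark_fraction = 0.5    # approximate; real shutter designs vary

print(f"~{runtime_min * dark_fraction:.0f} minutes of darkness")
```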

    As for questions raised here in slashdot:

    The resolution in the TI system doesn't fit the width of films shot in a widescreen aspect ratio.

    There are many different ways to give a film a wider picture. Super 35mm, for example, utilizes the area of the frame normally reserved for the optical soundtrack, creating a fuller-frame image. This is usually cropped down to 1.85:1 or 2.35:1 for a widescreen film, chopping off the top and bottom of the recorded image. This is great for effects people because they can reframe shots in post-production. There is also anamorphic, which uses lenses that squeeze a wider image onto a standard frame; projection then uses a lens which unsqueezes the image. Star Wars was projected in this manner - the raw image was compressed horizontally, and a lens was put on the projector which expanded it to a full widescreen image, no black bars.
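    The anamorphic arithmetic is simple enough to sketch: a 2x horizontal squeeze on a camera aperture of roughly 1.18:1 unsqueezes to the familiar 'scope frame (the 1.18:1 figure is an approximation):

```python
# Anamorphic projection: the lens doubles the horizontal extent
# of the recorded frame. The ~1.18:1 camera-aperture figure is an
# approximation of the 35mm anamorphic negative.

negative_aspect = 1.18
squeeze_factor = 2.0

projected = negative_aspect * squeeze_factor
print(f"projected aspect: {projected:.2f}:1")
```

    The result lands right around the 2.35:1 widescreen ratio mentioned above.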

    The resolution is nothing near that of analogue film

    Absolutely true, but it is improving. The TI system cannot be thought of as a dot-matrix field of pixels in the standard LCD projection or monitor sense. It uses a system of dichroic mirrors to relay each beam of light representing a pixel, so the resulting pixels do not have a stacked, square relation to one another. What it cannot reproduce is film's own peculiar kind of resolution, determined by the number of silver halide crystals in the emulsion. Those crystals are of random shape and size, and do not conform to pixels. It's messy, chaotic, and gorgeous. Picture grain (on analogue) is the result of seeing these crystals in the image, when a film is underexposed. It is true that digital projection cannot match this chaotic aspect of the film picture.

    HOWEVER - as much as you read about the digital revolution, I've seen it. I've seen effects technicians working on major Hollywood films. And the amount of work they're doing that is invisible and not for show - reframing shots, eliminating a modern car in a period film - is stunning. When these shots are printed to film and integrated into a picture that hasn't been touched digitally, at a 2000-line resolution, you do not notice. What gives away bad effects are lighting discrepancies and bad rendering or false, too-smooth movement. Think of Toy Story, which went from a 4k-line picture in the digital medium onto film, so it was sourced at those 4k lines. Did you notice it came from a pixelated source? No. Bottom line: you are already viewing in your cinema images that have less resolution than real film.

    Wow! Higher framerate for film. Just as good as getting 60 fps in Q3A rather than 30!

    This gets into theoretical doctrine, which is messy. Film has been, for the past 70 years, a 24 fps medium. No one has complained that Lawrence of Arabia sucks because of 24 fps. 60 fps is more important in virtual point-of-view exercises because it better replicates real vision and the substitution of mouse scrolling for your viewpoint. Undoubtedly, I agree with Ebert, the MaxiVision system must look great. However, when a filmmaker doesn't shoot at the standard shutter speed of roughly 1/48th of a second at 24 fps, the result is noticeable and unusual for audiences - such as the battle scenes in Saving Private Ryan, which were shot at a faster shutter speed, removing motion blur from the images.

    The projection of film is an optical illusion utilizing persistence of vision. Most individual film frames containing movement have motion blur; present these frames one after the next, and the brain interprets them as movement. This is a huge debate about perception and so on, which I shouldn't get into here - but the fact is, cinema is so old as a standard that 24 fps is what people almost expect when they see a movie.

    The digital revolution is on, and it's gonna crush film; Ebert is a Luddite - or - Hollywood is just hung up on buzzwords and trends, hence this system.

    Which is why I was so stunned at his article. He is anything but a Luddite. He was one of the first critics to use the Net, and often writes about tech issues. Yes, the digital revolution is on, and there are going to be huge problems. James Cameron, who does know a helluva lot about this, has said the problem is that Hollywood will go for the cheapest, and therefore nastiest, system - whatever system lets them maintain even greater control of distribution. Imagine if the studios were hooked up to every projector and ran them from their HQ - yeah, it's not a pleasant idea, is it? Two weeks into a film's release a scene is causing a media uproar - HQ deletes it from every projector in the world. Etc, etc.

    Likewise, the digital revolution is on - and it's not just a buzzword. The fact is, 90% of all films go through some form of digital process now, be it editing, corrective opticals (traditionally done with an optical printer), or FX. This often entails painful procedures to get film to match the frame rate of video systems (30 fps / 60 Hz NTSC) - 48 fps will make it even worse. So much money is squandered getting film from an analogue medium into a digital one and back to analogue that in the long run it's more effective for all parties concerned to move entirely digital. I'm telling you, here at ground zero, as a film student who has managed to see the new tech, that filmmaking in the traditional sense is undergoing a massive change - and it is unstoppable. What astonishes me is butting heads with traditionalists who believe everything must be done to stop filmmaking going digital - but haven't realized it already has - and that this new tech is liberating in that the real indie filmmaker can really make something for cheap, really cheap. Films that would never get made otherwise have been done on DV.

    We are getting to the point of: if you can imagine it, you can show it. Which I find personally liberating. Especially if I can do it faster and cheaper - or if a kid in Kansas in his basement can. I own, in my PC, for less than a really cheap car, the equivalent of a mid-90s TV station's image processing and editing capabilities. At the same time, too much content is now being produced - too many crap webcam soap operas, the Truman Show made real but in an almost more craven manner. The many-headed Hydra that digital has brought to image capture and editing has only just appeared, and none of us knows where it will really take us, or what the future of filmmaking will be. But it's better to be informed of the truth of the situation than to give in to preferences for more familiar formats because of some notion of "purity". Filmmaking is the manipulation of time, space, and emotion. It is an optical illusion. Nothing more, nothing less.

  • don't be silly. you think it takes three months for a film to cross the atlantic from the us to ireland? i can ship books by boat faster than that. i can ship *me* by plane faster.
    The film doesn't spend three months on a boat before getting to Ireland. It spends three months in US theaters.

    Duplicating a film print is an extremely expensive and time-consuming process, and for that matter, you can only make so many copies before destroying the original. So instead of making a copy for every theater, studios make a limited number of copies, and force the theaters to take turns. (Usually by staggering international premieres.)

    Digital "film" would solve this problem, by allowing unlimited lossless duplication.
  • This is a pretty ignorant comment.

    DPs will hardly be made obsolete by WYSIWYG filmmaking. DPs today are responsible for almost painting with light; composition, given the worship of the director, has become the realm of the director and camera operator. To light on any sort of video system, and do it well, is much more difficult than on film. Gaffers do not just set lights where they feel they should go or where they seem necessary. They do it under the supervision of the DP, who knows that this light will fill out the key light coming from the overhead 10k, which has been diffused to make it feel more like early evening. By using a Kino Flo as the fill light, they'll get a clean, soft white light, leading to a more radiant face in close-up. But if they tilt the Kino Flo a certain way, which the gaffer doesn't know, they'll put the actor's left eye into shadow, etc. None of this will change on video.

    Just because a person can see what they're filming does not mean they have a trained or intuitive eye for subconsciously shaping a picture with light - where you put darkness and light, which colors will pop out and which won't, where the depth of field lies and where that draws the viewer's eye. Look at your average television soap opera shot on video. Now look at the video segments of Run Lola Run. Both shot on WYSIWYG video, but entirely different in tone and mood. Because of the DOP.

    Having worked with DPs, and met and talked with some of the greatest in the world, this tech will not make them obsolete. It will make them even more vital. What's difficult for them is adjusting to the fact that most of the time they are dictating, and sometimes hoping, that what they imagine will end up on film. But there is no way a gaffer can match say, the images in The Thin Red Line on video, without the eye of John Toll. Much of what you think is natural filming, where you just point the camera and shoot, is a carefully maintained illusion. There are diffusion nets and silks up and maybe even a 10k HMI in broad daylight.

    I'm sorry, having worked on films, I have to say you're just plain wrong. The reason films look different, and feel different, and have different textures, and tones, and emotions is often the result of light, and of the DOP who understood that combining this light would produce that result. And the reason Sony Imageworks has leading DPs come to lecture the effects designers is so that they understand this process of painting with light when they do effects.

    As for dailies, an HD monitor does not for one moment replicate what it's like to see dailies on a cinema screen. Having edited in analog and digital formats, I can say a good Steenbeck or Avid with the best monitor system will not show you what something is like when it's blown up ten feet high. Ever seen the Pixar crew watching their all-digital films at sessions on lil monitors, everyone huddled around? No, they all sit in a traditional cinema.

  • Expect to see reflective LCD display technology come out in the near future. Reflective LCD is not at all like your standard LCD, but is basically an LCD built over a semiconductor. Reflective LCDs can handle higher resolutions than DLP and should be cheaper to manufacture. In addition, they don't suffer from the annoying artifacts of DLP (i.e. color separation).

    Film won't be used much in 10 years. HDTV actually has higher resolution than 35mm film and can have a higher frame rate. The 24fps HDTV standard came about not only to show films on HDTV, but for producers to record directly at 24fps. 24fps suffers during high motion scenes, but many film directors insist that they must have the properties of film (i.e. 24fps, grain, etc.)

    One other advantage of using all digital is that currently there is degradation when digitizing film to add effects and going back to film.

    HDTV also has a higher dynamic range than film. The new CCDs can handle darker material better.

    In addition, within the next few years it will be less expensive to use tape instead of film. There's virtually no time spent on development, and if a shot is messed up, just rewind and start again.
  • All you will need is the time to download (for a price) a one-view copy of the movie to your home computer (the amount of time will vary by your bandwidth, obviously).

    Yes, just like everyone uses Divx now. ANY kind of attempt to create a one-time-viewable movie will fail. Any format can be cracked; any file can be copied and saved. This is not an exaggeration.

    Also, I do not want to watch movies in my house. First, no matter how big my TV is, it will not be as big as a movie screen, because there are no walls in my house as big as a movie screen. No matter how big my TV is, it will always be projected from behind the screen, not sent over my head from a projector (or if it were, not only would the system be even more ungainly, but any move I made would get in the way of the projection, again because there is not enough room in my house for it). Both of these detract from the quality side of the experience. Looking at it another way, going to a theatre with crowds of people around you also seeing the movie is an important part of movie viewing. Star Wars (the special edition, that is) would not have been an exciting experience if we were just isolated small groups watching it on the "big" screen for the first time in 20 years (i.e. the first time ever for us youngens). And then there is the problem of having to invite a girl over to your house to sit in a dark room alone for the first date...

    So basically, no, theatres are not going to go out of style.

  • I saw Toy Story 2 in the theatre in Orlando a couple weeks ago and I had the same reservations everyone here is complaining about: "Low resolution, artifacting from compression, just a gimmicky technology, etc."

    I've done a lot of digital video stuff, so I have been well "trained" in noticing all the various artifacting that can be caused with lossy compression and all the other things that go along with digital video. I'm also a movie snob, so I'm also well "trained" in noticing all the little niggly things that can screw up a movie for me that the "normal" movie viewer may not notice.

    I was pleasantly surprised.

    I noticed *no* pixelation, I noticed *no* artifacting of any sort. The two major concerns I had with image quality simply failed to materialize.

    Now, what I *did* get was the best color I've ever seen on a movie screen. Bright blues, deep greens, vivid reds, etc. If you've seen any Pixar movies (TS, ABL, TS2...) you know their opening scene with the rendered Disney castle against the sky blue background, well, I've never seen that blue sharper and the image look more crisp than I did in the DLP theatre in Orlando.

    PLUS, as an added bonus, since there were none of the typical film flaws or frame jitter normally associated with 35mm film projection, the film was overall more *enjoyable* to watch because there were no distracting 1/30sec bits of dust or scratches or annoying little blips on the screen to have to tune out and the image was ROCK STEADY for the entire film.

    So, all I have to say: don't even try to judge the technology until you have actually SEEN it in action.

    -=-=-=-=-

  • But where is analogue now? A niche split between rich die-hards and poor elderly people who can't afford to replace their existing LPs

    Hey, I resent that. I'm only 37 (that's hardly elderly) and I'm not exactly poor either. But I like my old LPs. And even in those cases where I've bought the CD version of a particular album which I already owned on vinyl, I still use the LP when I can because it sounds better, even on my cheap, 15-year-old Technics linear tracking direct drive turntable (i.e. the argument that you can only get quality analogue reproduction at very high cost doesn't correspond with my experience).

    This isn't, for me, a matter of techno-ideology. I just know what kind of sound I like.

    Digital systems by their very nature cannot reproduce a real-world analogue signal perfectly any more than analogue equipment can - digital systems are not perfect, they just distort the signal differently than analogue. So a lot of the statistics you might quote in defence of digital aren't necessarily relevant. Human vision and hearing are *not* digital, nor do they respond to stimuli in a linear fashion. So the distortion inherent in analogue recording and playback might well be less destructive to the perceptual qualities of the medium than a digital process.

    Consciousness is not what it thinks it is
    Thought exists only as an abstraction
  • there were no distracting 1/30sec bits of dust or

    Oop, my bad. Of course, I meant to say "1/24 sec" which is film frame rate...

    -=-=-=-=-

  • by WNight ( 23683 ) on Sunday December 19, 1999 @12:26PM (#1462207) Homepage
    Ebert didn't get much right. The big problem is that he compares prototype, proof-of-concept systems, with the best of the film systems, and assumes that nothing can get better.

    I'll address his points one by one, pointing out the errors.

    But how good is digital projection?

    Here's the first and probably biggest mistake. He doesn't realize that technology is changing. The proper question is, "How good is digital projection NOW?" In a few years, it'll be bigger, better, and cheaper.

    its inventors claim, 500 percent better.

    One wonders how they came up with this number. Is watching a movie five times better on their system?

    And it can handle any existing 35mm film format--unlike digital projection, which would obsolete a century of old prints.

    Here's a pointless and barely accurate statement. How often does a theatre show a print that's twenty years old? Very rarely, because by that age they're brittle and faded. This is if the prints happen to be lying around. It's very unlikely there'd be many old films for a projector to display.

    Estimates for the Texas Instruments digital projector range from $110,000 to $150,000 per screen.

    Sure, the upfront cost of a new system is higher. So we should never upgrade anything by this logic. I mean, fixing an old car is almost always cheaper than getting a new one, even if it'll break down sooner and cost tons more to run.

    Digital systems are prototypes now, and are thus more expensive, but they'll get cheaper, and promise free or very nearly free delivery. The updated film system requires more film, making shipping even costlier and doesn't offer a future reduction in cost, like digital does.

    The source of their signal is an array of 20 prerecorded 18-gigabyte hard drives, trucked to each theater. This array costs an additional $75,000, apart from the cost of trucking and installation.

    A 400GB array costs $75k? Well, even assuming this was true, the cost would come down drastically over the next few years. Twenty drives and a server will end up being two drives, easily shipped.

    Even so, a movie is so memory-intensive that these arrays must compress the digital signal by a ratio of 4-1.

    And the problem with compressing the signal is?? 4:1 compression with a decent algorithm is barely noticeable, especially if you don't have a hard limit of 1/4 the uncompressed bandwidth to stay under. (If the film is 100MB/sec, 25MB/sec is trivial to attain if you can hit peaks of 50MB/sec... If 25MB/sec is a hard limit, as in downloading over a link offering only that much bandwidth, it's a little bit harder.)
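    As a rough sanity check, the arithmetic behind that point can be sketched like this (the 100 MB/sec and 50 MB/sec figures are just the hypothetical numbers from the comment above, not real specs):

```python
# Rough sanity check of the compression arithmetic above.
# The 100 MB/s uncompressed rate is hypothetical, taken from the example
# in the comment, not a real system spec.

uncompressed_mb_per_sec = 100.0   # hypothetical raw video data rate
compression_ratio = 4.0           # the 4:1 ratio quoted in the article

avg_mb_per_sec = uncompressed_mb_per_sec / compression_ratio
print(f"average compressed rate: {avg_mb_per_sec:.0f} MB/s")

# A variable-bitrate codec can spend more bits on busy scenes as long as
# the long-run average stays at 25 MB/s; a hard 25 MB/s channel cannot,
# which is why a hard bandwidth cap is the more difficult case.
peak_mb_per_sec = 50.0            # momentary peak a VBR stream might hit
print(f"peak allowed with buffering: {peak_mb_per_sec:.0f} MB/s")
```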

    digital projection spokesmen said that in the real world, satellite downlinked movies would require 40-1 data compression.

    This actually seems fairly accurate, but I don't imagine they'd use satellite downlinks, it doesn't make sense when they could simply run fiber to the theatre for a higher upfront, but negligible ongoing cost.

    The picture on the screen would not be as good as the HDTV television sets now on sale in consumer electronics outlets! TI's MDD chip has specs of 1280 by 1024, while HDTV clocks at 1920 by 1080.

    Here he compares the prototype systems with the HDTV of the future, and the best HDTV of the future at that: 1920 is the highest of the resolutions, not the one most broadcasts will be in.

    And then he misses the obvious point... If you can broadcast this HDTV signal in higher quality than the digital projection, you'll probably have the technology for higher-resolution digital projection too.

    One advantage of a film print is that the director and cinematographer can "time" the print to be sure the colors and visual elements are right. In a digital theater, the projectionist would be free to adjust the color, tint and contrast according to his whims. Since many projectionists do not even know how to properly frame a picture or set the correct lamp brightness, this is a frightening prospect.

    And here we have the famous "Customizability is bad, because you're not as smart as we are, and if we say it's best this way, then don't fiddle."

    Sure, some projectionists are probably color-blind, but there are two things he missed. One is that the digital projector and the digital signal don't degrade or change, so you won't have to constantly fiddle to keep it in focus and bright enough. And if there are color controls, what's to keep them from sticking a sensor behind the screen to read the displayed colors and making the adjustments automatically?
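    That sensor-behind-the-screen idea is just a feedback loop. Here's a toy sketch of the kind of closed-loop calibration being imagined, with entirely made-up numbers (target level, starting gain, and correction factor are all illustrative assumptions):

```python
# Toy sketch of the automatic calibration loop described above:
# a sensor behind the screen reads the displayed level and the projector
# nudges its gain toward a reference target. All numbers are invented.

target = 1.0          # reference white level the studio intended
gain = 0.7            # projector starts out miscalibrated
k = 0.5               # proportional correction factor

for _ in range(20):
    measured = gain * target          # what the sensor would read
    error = target - measured
    gain += k * error                 # nudge the gain toward the target

print(f"converged gain: {gain:.3f}")  # settles at the reference level
```

Each pass halves the remaining error, so after a few frames the projector matches the reference no matter how it was (mis)set by hand.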

    A technology isn't bad if it can be misused, his "tweaks are bad, because people have less taste than me" argument is like saying cars are bad because you can get them painted in ugly colors.

    How much would the digital projection specialist be paid? The technicians operating the TI demo installations are paid more than the managers of most theaters.

    No, really? The engineers travelling with the prototype systems are more highly paid than a young kid? Sheesh.

    This assumes that the system need be so complex to operate that it requires a trained engineer. I can't imagine it being more complex than modern home theatre... "Press this button to start it, and use these controls to tweak it. Hit this button to stop it if the bulb burns out." Do TVs require electrical engineers specializing in antenna theory to operate them?

    If it does have any complex theatre-servicable parts, one technician could service the whole theatre, and would probably do something closer to swapping out a dead unit for later repair, than on-site service.

    This also ignores the benefits of having only one projectionist instead of one per machine. When you have to fiddle with film, and be on hand to fix problems that crop up, you need one person per machine. When you simply press 'Start' and watch the screen on a video pickup watching for problems, you don't need to be right there, and can hit 'Start' on many movies at the same time.

    One 'projectionist' (VJ?) and one tech would have to be cheaper than eight-ten projectionists as are required now.

    They could grab the signal from the satellite and try to break the encryption (as DVD encryption has just been broken).

    This shows he doesn't understand the technology. DVD encryption was fundamentally flawed because it relied on untrusted (and untrustable) hardware to decrypt the DVD. It was only a matter of time before a key was grabbed; the Xing accident only made it easier.

    A digital projector on the other hand, being manufactured by the movie industry, could be 'trusted', because it's the last step in the chain before shining the movie on the screen, and because they could use crypto in the only way it can really work, from one trusted and secure machine to another.


    Actually, no. The projector itself would probably be the decryptor, and would be a sealed black-box (basically) given to the theatre by the movie companies, with which the transmission systems would communicate and agree upon a session key with public-key crypto. I doubt these would have an output labelled 'Dub pirate copy to disk'. And it's unlikely a trained tech, let alone a projectionist, could jury-rig one.
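    The "agree upon a session key with public-key crypto" step can be sketched with a toy Diffie-Hellman exchange. The modulus below is toy-sized and purely illustrative; a real distributor-to-projector system would use far larger, authenticated parameters:

```python
# Toy Diffie-Hellman key agreement illustrating how two trusted endpoints
# (distributor and sealed projector) could derive a shared session key
# without ever transmitting it. The small prime is for illustration only;
# real systems use much larger parameters plus authentication.
import secrets

p = 0xFFFFFFFB  # small prime modulus (toy-sized, NOT secure)
g = 5           # generator

a = secrets.randbelow(p - 2) + 1   # distributor's secret exponent
b = secrets.randbelow(p - 2) + 1   # projector's secret exponent

A = pow(g, a, p)   # distributor sends A over the open channel
B = pow(g, b, p)   # projector sends B over the open channel

key_distributor = pow(B, a, p)     # distributor computes g^(ab) mod p
key_projector = pow(A, b, p)       # projector computes the same value
assert key_distributor == key_projector  # both ends now share a key
```

An eavesdropper on the satellite link sees only A and B, never the shared key, which is exactly what makes trusted-endpoint crypto different from the DVD scheme.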

    Pirates could bribe a projectionist to let them intercept the decoded signal.

    Didn't he just finish telling us how you had to store this on a $75k system of 20 18GB HDs?

    Either he expects the average pirate to carry around these huge $75k disk systems, or he expects the storage to get cheaper.

    An MV48 print would be even harder to pirate than current films; it would not fit the equipment in any pirate lab.

    So, by this logic, a digital system would be perfect. With trusted machines at both ends, with huge storage requirements, and with no similarity to custom hardware, the digital system should be much more resistant to piracy.

    They set aside the aesthetic advantage that MaxiVision48 has over digital.

    Wow, a refinement of an old technology, using special film, gives better quality than a prototype of a new technology. I'm in shock.

    It's actually interesting to note that he does subscribe to the "24fps is only good enough, not great" school of thought. Someone should transfer this to the undying "How many FPS are enough?" threads...

    When they hear the magical term "digital" and are told their movies will whiz to theaters via satellite, they assume it's all part of the computer revolution and don't ask more questions.

    Could it be because the costs of film reproduction and distribution are so high that avoiding them is well worth subsidising the theatres' purchase of new hardware?

    Wouldn't it be great if first-run movies came out across the world at the same time, instead of other continents having to wait for North America to be done with the film before getting it, and even then, getting the used and scratched film, after months of use? Actually, this might partially solve the DVD region code problem, if movies could reasonably be played worldwide at the same time, they wouldn't need to restrict region 1 DVDs from working in foreign players just to artificially create an audience for the big-screen version.

    Rainforest Cafes could put you in the jungle. NikeTown could put you on the court with Michael Jordan. No more million-dollar walls of video screens, but a $10,000 projector and a wall-sized picture.

    This assumes that these companies can spare the space for a projection system, which requires an unobstructed area between the projector and the wall... and that they can afford the film costs, with a projectionist to run the whole thing...

    A wall of LCD screens will soon be incredibly cheap by comparison, especially because this application doesn't have problems with small join marks between screens, or with a higher number of dead pixels than would be salable on a laptop.

    But, if you accept the word of a technology pundit with no technology skills, who urges you to buy into a dying system with incredibly high upkeep costs instead of looking to the future...
  • damn so-and-sos, I submitted this story last week. Anyway, the REASON I submitted it last week was that I think Ebert hit the nail on the head. Digital projection is NOT all it's cracked up to be. It's got pretty crappy resolution and is limited by the 24fps framerate, which is a POS. Anyone who plays any game knows that the higher the framerate, the smoother the motion looks. 24fps was originally used because it was the LOWEST framerate that could fool the brain into perceiving motion, to save money on film prints. People have just stuck with it. One of the biggest problems with digital is the cost. I mean, how many of your local 4-screen theaters are going to fork over 150k for a new digital projector when their old film one works fine? Digital movie projectors are just glorified presentation projectors whose resolution isn't the greatest in the world. Or so it goes.
  • And who's with me that if making the film gets cheaper, ticket prices will simply continue to rise, outpacing even inflation? The way entertainment industry stuff works these days is that the industry sets a standard "price" at the maximum people will pay for it, then spends all that money on advertising costs. If the cost of production goes down, they don't lower prices; they pour the extra money into more advertising. As I've argued before, advertising, by its very definition of spending tons of economic capital just on trying to change people's preferences, is a terrible thing for an economy: a ridiculous waste, and a ridiculous distorter of prices.
  • Two points here: one is that didn't I hear something about plasma screens being on their way for home use? I know they're incredibly flat and can do HDTV specs easily, but what resolution do they normally function at?
    The other point: blur is not always a bad thing. A friend of mine saw The Phantom Menace on film and then on digital projection, and he said that the special effects looked more "fake" on digital, because it was sharp enough to see that they weren't "perfectly" blended into their scenes.
  • I've been working in 'digital film' for 22 years, from the New York Institute of Technology's Computer Graphics Lab until today, at Hammerhead Productions.

    First, let me say that I agree with Roger Ebert that high-temporal-fidelity film is great. The comment that it looks '3D' is what everybody says; it's that different from normal 24 fps film. Douglass Trumbull, of early special effects (2001, Silent Running) fame has spent the last 15 years trying to get a technology called Showscan off the ground. Showscan used 60 fps film, and can be seen at a couple of Las Vegas 'ride films'.

    Unfortunately for Trumbull, MaxiVision48, and other advocates of high-frame-rate film, people see high-frame-rate moving pictures every day; TV is 60 fps. Now, most people think that TV is 30 frames per second, but each frame is made from two fields, each offset by 1/60th of a second, so the net result truly is 60 fields per second. The motion of TV is incredibly smooth compared to film. Anybody can tell this difference, although few people know what they are seeing. As an example, look at a daytime soap opera compared to a prime-time show. One of the biggest differences that you see is the frame rate, as all prime-time shows are shot on film and then transferred to video. This distance from reality, filtering the time more coarsely, is what we see as 'film look', and is perceived as higher quality.

    At my previous company, Pacific Data Images, I was a strong proponent of doing animation at 60 fps. We did mostly 'broadcast' animation, things like show titles. But when we started doing commercials, we found that we had to work at 24 or 30 fps; as that was what was expected. You could talk until you were blue in the face that 60 fps was 'better', and you couldn't sway anybody -- because it was perceived as worse by viewers.

    I've seen the digital projections of both Star Wars and Toy Story. I went to Star Wars in a very dubious frame of mind, based on my previous experience that 'better' was seen as 'worse'. I thought that people would miss the flicker, and would even miss the grain and scratches. I left the theater completely convinced that this would succeed, though.

    Digital projection mimics film in many ways; but it really does seem that just the annoying flaws are removed. Nobody really likes scratches or splices or even film-gate jitter. I did perceive the loss of flicker -- a film projection is completely black half the time; and the digital projection isn't. Not yet, anyway. This lack of flicker grabs your eyeballs in a different way; I'm not sure how to describe it...but it's a little more immediate; a little less separated from reality.

    As for the other parts of Ebert's article -- he saw a prototype. The resolution will go up, soon to 1920x1024. The problem of transporting the data will go away with better technology (for instance, the transparent fluorescent CD-size disks pointed at by Slashdot recently). The ability to 'color time' films is actually a huge win for digital projection; you will be able to do much more powerful color manipulation digitally than you ever could striking prints from analog film. The piracy issue can be avoided somewhat -- I agree with earlier posters that the decryption can happen in the light-modulator itself, so that you wouldn't ever have a decrypted signal.

    Roger also claims that studio people don't care about technology. I completely disagree; we have a thriving technology-of-film community here in LA, and we have been discussing all of the issues that he has brought up here, in wretchedly thorough detail, for the last few years.

    thad

  • All that means is that they're not using the right storage medium to store the digital information. It sounds like this is the "first cut" product.

    All the "Maxi" product does is increase the end goal - 48 frames/second plus increased resolution.

    I suspect that if they used something like an optical tape system (which can store terabytes per reel), they wouldn't have to slap together a kludge like an array of hard disks to "play back" a movie.

    Of course, the ultimate goal is so that they DON'T have to transfer physical media around - they want to be able to stream the movie directly into the theaters (and to whoever else wants to pay for it) w/o bothering w/physical transport.

    Can anybody do the numbers to figure out how fat the data pipe would have to be to match or beat the "Maxi" frame rate & resolution (assuming lossless or no compression)?
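    A back-of-the-envelope answer to that question, assuming (hypothetically) a 4000x2000 frame at 24-bit color and MaxiVision48's 48 fps -- the resolution and color depth are assumptions, not actual MaxiVision specs:

```python
# Back-of-the-envelope data-pipe calculation for the question above.
# The 4000x2000 resolution and 24-bit color depth are assumptions,
# not MaxiVision specifications; only the 48 fps figure is from the name.

width, height = 4000, 2000   # assumed scan resolution
bytes_per_pixel = 3          # 24-bit color
fps = 48                     # MaxiVision48 frame rate

bytes_per_sec = width * height * bytes_per_pixel * fps
gbits_per_sec = bytes_per_sec * 8 / 1e9

print(f"uncompressed: {bytes_per_sec / 1e6:.0f} MB/s "
      f"= {gbits_per_sec:.1f} Gbit/s")

# A two-hour feature at that rate, with no compression at all:
feature_bytes = bytes_per_sec * 2 * 3600
print(f"two-hour feature: {feature_bytes / 1e12:.1f} TB uncompressed")
```

Under those assumptions the lossless pipe comes out around 9 Gbit/s sustained, which is why every practical proposal involves at least some compression or physical media.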
  • MUCH fewer moving parts? The TI projectors have over *three million* independently moving parts, there's a moving mirror for each pixel for each color. I'd wager that the TI projectors have more moving parts than any other machine ever made in human history.

    OK, they're small parts...true. And the reliability is astounding. But still...

    thad

  • It is true that a negative is higher resolution than HDTV. HOWEVER, after all the mastering is done and the film is duplicated, the resolution is lost. My information comes from SMPTE (the Society of Motion Picture and Television Engineers), at a presentation done a few weeks ago at Sony in San Jose. Yes, film is higher resolution, but once it has been processed the advantage is lost. HDTV maintains its resolution throughout. Also, a CCD has higher resolution at low light levels than film, since fast (grainier) film must be used in low light. In addition, the CCD has a higher dynamic range, i.e. there is much more low-level detail in dark areas with a CCD than is possible with film.

    As far as grain, there currently is work progressing on digitally adding "film" grain to HDTV recorded video.

    According to the SMPTE presentation, once the film is scanned to digital and goes back to film, the resolution is lost. The best film scanners cannot match what a good CCD can do at this time.

    It's like photocopying a 1200 dpi grey-scale laser-printer image with a 1200 dpi scanner. Since the dots likely won't match up exactly, the scanned image will not look as sharp or as good as the original. The same sort of thing happens with film. Let's face it, there are multiple generations of film-to-film transfers done before the film reaches the theaters. My guess is that there are a minimum of 3-4 film-to-film transfers. The first would be from the original film copied to a new film, since it has to be cut and edited. This would go from a negative to a positive. Now, this golden master isn't about to be used to make duplicates for the theaters (unless somebody is really stupid), so this positive is duplicated again into one or more negatives. These negatives are then used to copy to the film that goes out to the theater. Now, that's four copies made. If the original resolution was 4000x2000, you're now probably down to 1500x800. If digital processing is done between the original film and the film that goes out to the theater, then the resolution is probably higher, more likely around 2000x1000. This isn't far from HDTV, which is 1920x1080. Not only that, but HDTV can easily be recorded at 30FPS progressive or 60FPS interlaced. For the die-hard film buffs it can also be recorded at 24FPS progressive.

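    The generation-loss argument can be modeled crudely: each analog film-to-film transfer multiplies the effective resolution by a factor below 1. The 0.78 loss factor below is invented purely to illustrate the shape of the argument; the real figure depends on stocks, printers, and focus:

```python
# Crude model of the generation-loss argument above: each analog
# film-to-film transfer multiplies effective resolution by a factor
# below 1. The 0.78 loss factor is invented for illustration only.

original = (4000, 2000)   # assumed camera-negative resolution
loss_per_generation = 0.78
generations = 4           # neg -> edited positive -> dupe negs -> print

factor = loss_per_generation ** generations
effective = (round(original[0] * factor), round(original[1] * factor))
print(f"effective release-print resolution: {effective[0]}x{effective[1]}")
```

With that (assumed) per-generation factor, four copies land in roughly the 1500x750 range, which is the same ballpark as the estimate in the comment above.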
    At the SMPTE presentation some A/B footage was shown with footage taken with an HDTV 24FPS camera and a 35mm film camera. The HDTV looked very close, except where there was a lot of contrast in which case it looked BETTER since there was more detail in the dark areas.
  • This assumes that Moore's Law applies [...]. In fact, screens improve rather slower than Moore's Law. 5+ yrs ago, I had a screen that could do 800x600.

    Moore's Law applies to transistors on a chip, not pixels on a screen.

    But transistors, or at least devices, on a chip are exactly what we are talking about. CRTs have not followed Moore's Law because the costs are dominated by the physical manufacture of high-precision glass bottles. Projectors use either LCD or digital mirror shutters, and these have been improving much faster than glass. Not as fast as pure silicon: I'd put the Moore Constant for them at around 24 months instead of 18.
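    A 24-month "Moore constant" implies pixel counts doubling every two years. Projecting from the TI prototype's 1280x1024 toward an assumed 4000x2000 film-equivalent target (the target is an illustrative assumption, not a spec):

```python
# Sketch of the "Moore constant of 24 months" claim above: if projector
# pixel counts double every two years, how long from 1280x1024 until
# they pass an assumed 4000x2000 film-scan-equivalent target?

start_pixels = 1280 * 1024     # the TI prototype's resolution
target_pixels = 4000 * 2000    # assumed film-equivalent target
doubling_months = 24

months = 0
pixels = start_pixels
while pixels < target_pixels:
    pixels *= 2
    months += doubling_months

print(f"~{months // 12} years to reach {target_pixels:,} pixels")
```

Under those assumptions the prototype catches the target in three doublings, i.e. about six years, which is the scale of "the future isn't as close as the hype" being argued in this thread.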

    Paul.
