Posted
by
CmdrTaco
from the click-here-to-see-episode-2 dept.
the coose writes: "EETimes is running an interesting story describing the current turmoil over DTV standards. It involves many players, including, but not limited to, Microsoft, Sun, the ITU, and SMPTE."
This discussion has been archived.
No new comments can be posted.
"By the time things get to ITU, it's usually too late," said Jerry Bennington, the secretary of ATVEF and a consultant to CableLabs. "ITU does not necessarily resolve anything."
Actually, MPEG looks pretty crummy whenever it has to deal with slight changes in color. I was watching a DVD of Pulp Fiction and saw artifacts all over in one scene where the main characters were standing in front of blank off-white walls. Slight variations are a challenge for video coders, as many of their algorithms are based upon fairly large changes. Small changes just don't get noticed. -- Ski-U-Mah! Stop the MPAA [opendvd.org]
The blockiness you see in MPEG comes from the fixed block size used for motion estimation and transform coding; how visible those blocks are depends on the data rate you are recording at. If you record MPEG data at 1 Mb/s you get lots of blockiness. If you record MPEG data at 25 Mb/s you see NO blockiness at all.
The way to reduce artifacts is to increase the bandwidth used to record/transmit the video. Increased frame rate and resolution won't remove artifacts.
BTW, MPEG as seen on DirecTV averages to around 3-4 Mb/s. DVDs are like 6 Mb/s. Professional video recorded at 25, 50, and even 100 Mb/s has no perceptible artifacting.
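The bitrate effect is easy to see with a little arithmetic. Here's a rough back-of-envelope sketch in Python (the 720x480 frame size and 16x16 macroblocks are standard MPEG-2 numbers; everything else is just division, not anything from the spec):

```python
# Back-of-envelope: average coded bits available per 16x16 macroblock
# at different MPEG bitrates, assuming 720x480 @ 30 fps.

def bits_per_macroblock(bitrate_bps, width=720, height=480, fps=30):
    macroblocks_per_frame = (width // 16) * (height // 16)  # 45 * 30 = 1350
    bits_per_frame = bitrate_bps / fps
    return bits_per_frame / macroblocks_per_frame

for mbps in (1, 4, 6, 25):
    print(f"{mbps:>2} Mb/s -> {bits_per_macroblock(mbps * 1_000_000):7.1f} bits/macroblock")
```

At 1 Mb/s each macroblock gets only about 25 bits, so the quantizer throws away almost everything and the block grid shows; at 25 Mb/s each block gets over 600 bits and the artifacts vanish.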
I vote that from now on anyone who uses the pseudo-word "boxen" should be hit in the head with a sledgehammer. It was never funny. It was never clever. As a matter of fact, it's quite stupid. Please stop or I'll kill you.
AC--
Box phonetically equates to baw-ks. Boxes phonetically converts to baw-k-ses, a moderately more linguistically annoying phrase, since you've got to loop yourself from the s sound, to an "eh" sound, back to the "ess". The resulting pluralization became Boxen.
I doubt it was ever intended to be funny or clever. It's just easier to transition from "ks" to "en" than it is to transition from "ks", add a short "eh", then back to "es".
Good example--when going from "Rack" to the plural, you turn "Rack" to "Racks"/"Rax". You can't make the same conversion though for Box, because you've already got that ks/x sound in there. Thus you get something like Boxen.
If you think I've taken this analysis too far, you're the geek who threatened to, and I quote, "kill" somebody over your dislike for a five-letter word ending in N instead of S. Not that I took you seriously:-) It's all cool. Lots of people get worked up when they don't get enough seks.;-)
Yours Truly,
Dan Kaminsky DoxPara Research http://www.doxpara.com
Every company is going to push whatever standard they perceive will give them the greatest market advantage.
Personally, I don't really think it makes much difference which standard is decided as long as some standard is finally *laugh* agreed upon.
Does it really matter? DTV - even if the highest-quality, cheapest to implement, no privacy invading, open source solution is chosen - still won't replace the gaping hole left in your existence where a social life used to be. =)
Get out some more, don't worry about TV too much, it'll still be there when you get back. Who cares if it's missing a few lines of resolution? =-)
I don't know for sure, but I would assume one of the reasons that X-ray film (which is very large compared to 35mm optical film) could still only have about 2K resolution is that the grain size is so much larger than with regular optical film. Making a special film that responds to such high frequencies of light probably results in a very large grain size. But then again, I'm just speculating here. I am really surprised to hear that X-ray film can be replaced by 2K digital. I would have thought that more resolution would have been needed.
In any case, the grain size for film that responds to visible light is extremely small, and keeps getting smaller every year with the introduction of new film stocks. I've seen 70mm prints of movies from the 60's and 70's, such as "Lawrence of Arabia" and "West Side Story" that looked truly amazing, and yet today's 35mm film is getting to where it looks almost as sharp. That's because the grain size has decreased dramatically over the years, and will continue to do so in the future. That's a big advantage of sticking with film. You can always benefit from advances in film quality without having to buy a new projector. But once you switch to digital, you are stuck with the same pixel count forever.
If you're looking for a reference about 4K resolution being equivalent to film, here is an article [cinematographyworld.com] that talks about a panel discussion between film and digital proponents. The folks from Kodak talk about how 4K scans show the level of detail on 35mm film to be between 8 and 12 million pixels, which is much greater than the paltry 2 million pixels offered by 2K digital.
And if you really want to see just how good film can look, try checking out Technicolor's newly revived dye-transfer printing process. Articles talking about it can be found here [directorsworld.com] and here [mkpe.com]. It's pretty amazing. I've seen it first hand, and it blows digital projection away. The picture looks so sharp and clear it's as though you could reach right into the screen and touch it, and the color is absolutely incredible! If you live in the Bay Area, you can see a dye-transfer print for yourself if you go to the Century 25 [centurytheaters.com] in Union City. "Mission: Impossible 2" and "Shanghai Noon" are showing there with dye-transfer prints. It's the best film quality I've ever seen. If you can, go check it out!
The games are too slow, and the email screen could be laid out a lot better, agreed. The shopping system works as well as you could hope for a dumb home user on a TV system, though.
If no one goes and actually creates the systems, then nothing will happen. The games might be interpreted piles of junk (I assume that, or the system as a whole only has a Z80 controlling it), and I would prefer Tetris and other puzzle games (even some classic platformers) to the trash on there currently, but as a real example of a technology it is great.
When the next generation of digital boxes comes out, hopefully within the next couple of years, there will be a lot more power available to do stuff correctly. Hopefully they will also have ADSL support and a built-in DVD drive (why not use one set of MPEG-2 decoder chips instead of multiple sets?).
One of the rules of business is to get it out there quickly, whatever the faults. Sky has done this knowing it isn't the perfect solution, but they also know that if they didn't release anything then OnDigital (Terrestrial Digital TV) would be on the case. If only the Sky boxes allowed 2-way satellite transmission for IP traffic as well...:-)
referential connections (i.e. ensure a link is encoded within the content)
If you make the assumption that everything is a computer and that XML or Java engines will be embedded into every computer and consumer item known to mankind (and Mars, 127.0.0.1), then if at any point you insert your hacked/hand-made/sniffer device that removes those not-so-subtle control signals, you can kiss your replay features goodbye. Look at the technological trends: the Amiga/TAO/Sony announcement of being able to hook multiple IP cores within the same OS, and the support of big-name companies for selected XML "standards" (with convenient amnesia about certain inconvenient consumer-friendly functions). If each silicon slice supports the necessary control protocols, then when you play back an "interactive" movie, they will be able to tell who/when/what and be able to dictate the terms of availability, target bits to a specific machine, and eliminate second sale, redistribution, cross-transmission, and lending/borrowing privileges. Education and entertainment are going to be big money spinners and the players want to keep every penny of it. If you think DVD was bad, wait until every bitstream is encoded.
Take a careful look at how the protocol extensions are to be used in their examples:
Sorry guys, not only can (as McNealy put it) you kiss your privacy goodbye but you might as well forget about keeping any bits you buy/lease/rent whether software or content.
I thought the "film" look was because of the slower motion tracking at 24 frames per second, whereas video looks real due to the higher frame rate, making things look more like they do in real life... Also, you say first that people think video looks more real, but then go on to say that film is more similar to the way we see things; that doesn't really make sense...
Well, HDTV is many different formats (both interlaced and progressive). You can lower the resolution and squeeze more channels into your bandwidth, or push it up and have just one.
And the transition to HDTV so far has been extremely slow. Only one station in this market (92nd, Tri-Cities TN/VA) is even broadcasting HD, and I'm not sure if they're using any actual HD production equipment or just upconverting their NTSC signals. The reason? HD equipment is extremely expensive (a good HD overhaul can cost $1,000,000 and up, easily).
I've been waiting for a better format than NTSC ever since I started in this industry. HD sounds good, but it is extremely uncertain in 85% of the country or more. Plus there are a lot of issues with lowpower stations that will be displaced by HD broadcasts that seem to be holding it back somewhat. _______ Scott Jones Newscast Director / ABC19 WKPT Commodore 64 Democoder
I have been a much happier man since I abandoned the demon box four years ago. Now when I do get close to one of the cursed things I really start to notice how they grab your consciousness away from you. They have a scary sedating effect upon a person. I see so many of my friends whose lives revolve around "what's on TV." I won't say it's all bad programming (I like watching The Simpsons), but in general it's shit. TV exposure in my youth has contributed to my past attention span problems. I see it affecting my friends' capabilities, making them apathetic, unambitious people who have so much potential yet never use it. "Fuck DTV. Fuck TV."
In the UK, DigitalTV also has games, email, shopping etc as well, using Sky TV's Open platform.
As a whole, the platform bites ass. OK, a year from now it might be a bit more efficient, it might allow on-demand email, tailored information, good games, browsing (after all, it is a browser that you are using, I've coded bits of it). Good idea, bad execution.
ATVEF embeds HTML and URL information in the television signal. Your computer/WebTV/smartTV then sucks down the desired content from the net and displays it onscreen with the television program.
They're going to want to avoid allowing consumers to tape the content. One thing they're afraid of is that you'll record "Dharma and Greg," cut the advertising out (It's digital -- you can do that seamlessly) and repost it on the Internet. The horrors.
And yes, they'll be quite interested to know that YOU (And they will KNOW it's YOU) are watching the Live Goat Porn channel every night from 6 to 9:30. The up side of this is you'll get directed advertising for all that cool Live Goat Porn merchandise. You might even get honorable mention on the Live Goat Porn web page (http://www.livegoatporn.com) for being their best fan. Wouldn't that be cool?
I see a mental focus problem similar to M$'s -- M$ seems so focused on Windows that the OS is viewed as incidental undercarriage to be engineered mainly for Windows purposes (on *nix, the OS is the focus, and GUIs are optional payload, and competing GUIs can evolve, to everyone's benefit).
Similarly, the DTV folks seem to be very focused on TV, and seem to view the digital data-broadcasting function as incidental undercarriage instead of a general purpose infrastructure that can carry an optional HDTV payload.
The consequence is evaluating alternative modulations and encodings in terms of TV instead of in terms of delivery of data with various levels of integrity and timeliness.
E.g., IEEE-1394 ("Firewire" etc.) addresses the issues of sharing a channel between isochronous (guaranteed chunks/time) uses and asynchronous uses. Similar concerns should guide DTV broadcasting, if there is going to be a clean path to future digital mixes, which may include stuff not yet invented.
If the bottom layer is fixed and dedicated-purpose, it will force the invention of ugly tunneling kludges whenever someone wants to transmit something new (like doing things in the comments of HTML).
It is easy to imagine lots of interesting ways of using a flexibly designed digital channel with a fixed max capacity. Let's hope the bottom layer design doesn't do perverse things to interfere.
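To make the isochronous/asynchronous split concrete, here's a toy Python sketch of a 1394-style cycle scheduler: each fixed cycle reserves a capped share for guaranteed isochronous channels, and whatever is left carries asynchronous traffic. The capacity units, the 80% cap, and all the channel names are illustrative assumptions, not values from any spec:

```python
# Toy sketch of 1394-style bandwidth sharing within one fixed cycle.
# All names and numbers are made up for illustration.

CYCLE_CAPACITY = 1000   # arbitrary units per 125-microsecond cycle
ISO_CAP = 800           # at most ~80% may be reserved for isochronous use

def schedule_cycle(iso_requests, async_queue):
    """Grant isochronous reservations first (up to ISO_CAP), then fill
    the remainder of the cycle with asynchronous packets."""
    granted, iso_used = [], 0
    for chan, units in iso_requests:
        if iso_used + units <= ISO_CAP:
            granted.append((chan, units))
            iso_used += units
    remaining = CYCLE_CAPACITY - iso_used
    sent = []
    while async_queue and async_queue[0] <= remaining:
        pkt = async_queue.pop(0)
        sent.append(pkt)
        remaining -= pkt
    return granted, sent

iso = [("hdtv", 600), ("audio", 100), ("extra-video", 200)]  # 200 exceeds the cap
granted, sent = schedule_cycle(iso, [150, 100, 500])
print(granted)  # extra-video is refused its reservation this cycle
print(sent)     # async traffic soaks up the leftover capacity
```

The point is that the guaranteed streams never starve, yet leftover capacity is not wasted; a DTV bottom layer designed this way could carry HDTV as just one isochronous tenant among others.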
There will be plenty of pay-per-view content, but to build the market there will be content available for free or at a flat subscription rate like cable. And that content will be lucrative to provide, so it will continue.
I'm not talking about the availability of "free" or subscription-based content. I'm talking about putting the video rental market out of business. That's one of the goals of the studios/networks in transitioning to (H)DTV. Fewer intermediaries take a cut out of PPV revenues as compared to rental, which would benefit the content producers and cable companies (frequently one and the same) immensely.
Of course, this will restrict your choices for home viewing to the top dozen popular movies, but that's better for the studios anyhow - no pesky indie or foreign flicks to compete with their "Mission Impossible N"s or "Runaway Bride"s (or "Battlefield Earth"s).
Would somebody PLEASE just drop a nuke on Seattle and get it over with..
Hey! I live in Seattle. Bill Gates lives in Medina, on the other side of Lake Washington, where Redmond is.
Now, you won't get any argument from me about dropping a nuke on Redmond, which would take out most of the bad places and give us new lakefront property - they already chopped down all the good forests that used to be there.
But just broadcast a message on PBS and NPR and the local college and high school stations about 5 minutes before impact so we can get down in the basements, ok?
Everyone's perspective is limited by what they know. TV will look totally different in 20 years from what the standards committees expect.
So why do we still have CR and LF, the carriage return and line feed from the typewriter?
Personally, I think we should just announce the Open Source HDTV format is the European one, code it to run on Linux and BSD, and let the chips fall where they may.
Take some initiative people - choose the best format, not the ones the manufacturers want!
The current ATSC modulation standard (8-VSB) doesn't work in the real world. There have been vague promises of "magic chips" to fix the problems, but nobody has delivered on their promises. I have to conclude that 8-VSB is dead, even though many people claim that it is "just pining for the fjords".
COFDM and DVB-T work today, the hardware is in mass production, and it looks like it will become the international standard for DTV.
I'm for the HTML based plan like ATVEF, since it has all the advantages of an established standard. Plenty of creation tools, few intellectual property issues, and plenty of people that know how to code it.
Those proposing new display engines (usually based on Java) point to the limitations of HTML and XML as unsuitable for use on television.
HTML/XML is also more accessible to people with vision and cognitive disorders, since it's basically text that the client can interpret however they will. Pardon my language, but how the *fuck* do they expect to do large fonts, alternate colors, and speech synthesis on *Java*. If they put in all the hooks to do this well, they'll end up with the OS/360 of content development.
Plus, you can do Babelfish-style translation of the content with HTML/XML. Again pardon my language, but how the *fuck* is anybody supposed to translate Java bytecodes from English to Spanish. Sure, English is used a lot all over the world, but I imagine Hollywood still gets significant revenues from dubbing movies and TV programs into other languages. With Java, they have to translate the source and recompile for every language. Even if they wanted to bear the expense, the configuration management issues are formidable.
And how can Java be indexed by the search engines? Much of any media company's value is in their archives. By using Java, they lock their product up where people can't find it to buy. Java either costs sales, or you have to pay to recreate the information in a form that *is* indexable. In other words, they've gotta have a web page (or the equivalent) replicating the content.
Instead of using pure Java, I think they ought to use an HTML/Java combination. First, they would extend and/or clarify HTML so that it rendered consistently (standard table behavior and standard font sizes). The biggest problem with HTML is that basic things like tables are wildly different across existing browsers. Simply codifying how tables, images, fonts, and styles display for the TV Browser Standard would be the most important step. Most stuff, like National Geographic supplemental information and football player stats, would be well-handled by HTML.
They would use Java plug-ins for the truly unusual stuff. For instance, the zoomable-spinnable-interactive-3D-instant-sports-replay would be Java.
(Does this remind anybody else of the "Would you like to know more?" bits from the Starship Troopers movie?)
Besides the fact that it's old and was never very funny, I also suspect that you are ugly.
For future reference, I recommend you arrange your attempts at humor in such a manner which allows for variety and a continual modification of themes so that you don't bore everybody to a premature death.
Just a thought.
--Fantastic Lad, the most amazing script kiddie ever!
100% agreed. Just convince those politicians at the FCC to accept that they messed up. Most people here will know PAL and NTSC... do we want that to happen again?
The Chinese Lottery is essentially a crypto term referring to the following concept: each of China's billion residents is given a set top box for their TV. The boxen each receive a block to attack (this can be broadcast simply by continually outputting a stream of if(serial == "12345"){block="0xAABB1234"}). The cost of the attack is borne in small chunks by whatever the market will bear to pay for a set top box, plus each individual paying for the electricity (which is of less value, presumably, than the amortized value of each block being cracked).
Then whoever's box cracks the code displays a message, "You've just won a million dollars. Please call this number to claim it." The government collects the box and gives some sum to the winner. Then they take back most of that sum through taxation (OK, that's not part of the Chinese Lottery, but it's part of *every* lottery), and everyone's happy.
Now consider that most of these settop boxen will end up actually having phone connections to the media owners, so that Pay Per View can be implemented. Now the owner of the box doesn't need to be paid off or even informed that his machine was being used to crack some code, and that the cost of that cracking was being borne by his electricity bill. The result of the crack can be sent down the pipe at will.
All they need is the ability to execute code...
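For the curious, the scheme sketches out to something like this in Python. The hash comparison stands in for attacking a real cipher, and the serials, block size, and box count are made-up toy values:

```python
# Toy sketch of the "Chinese Lottery": the broadcaster carves the
# keyspace into blocks, maps one block to each box serial, and every
# box brute-forces only its own slice. A SHA-256 digest match stands
# in for a real cryptanalytic check.

import hashlib

BLOCK_SIZE = 1000

def assigned_block(serial, num_boxes):
    """Deterministically map a box serial to the start of its keyspace block."""
    return (serial % num_boxes) * BLOCK_SIZE

def search_block(start, target_digest):
    """Each box tries only the keys in its assigned block."""
    for key in range(start, start + BLOCK_SIZE):
        if hashlib.sha256(str(key).encode()).hexdigest() == target_digest:
            return key  # this box "wins the lottery"
    return None

# The secret is known here only so the demo can build its own target.
secret_key = 4321
target = hashlib.sha256(str(secret_key).encode()).hexdigest()

winner = None
for serial in range(10):  # ten boxes instead of a billion
    found = search_block(assigned_block(serial, 10), target)
    if found is not None:
        winner = (serial, found)
print(winner)
```

Scale the box count up by eight orders of magnitude and the per-box work stays trivial while the aggregate keysearch becomes enormous, which is exactly the parent's point about phone-home set-top boxes.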
Yours Truly,
Dan Kaminsky DoxPara Research http://www.doxpara.com
Try checking the Sigma Designs news server, at news.sigmadesigns.com [sigmadesigns.com]. I seem to recall seeing some mention of that, and that it was intended for computer monitor viewing, or something. --
Actually, the 1920x1080 24 fps progressive format is exactly the same format that is being used for Star Wars Episode II. That's right, it's being shot in the exact same resolution that is supposedly going to become the broadcast standard for TV. Why would studios be doing that? If they use the same resolution for movies as is going to be used for TV, why would people go to theaters anymore? It doesn't make sense.
I am under the impression that the filming is done at a higher rez (1900 some lines), but will be downsampled to 1280 lines. The same article (in Home Theater) said that while each individual frame is capable of yielding higher resolution than that, there is a "shutter gate error" or something like that that reduces the effective location and resolution accuracy of film.
"The animal is inside-out." *boom!* "And it exploded!" - Galaxy Quest -
Digital video is where it's going, and bloody fast. Set the standard at 48 or 72fps, and use every second or third frame if cutting back to film (or double or triple the frames if going from film to television).
That's been one of the ongoing topics at the Seattle International Film Fest, which wraps up this weekend. Most of the directors have said they're going to digital, partially because you can film for under $10,000 instead of $250,000.
So expect this to be the "new story" next year, when all these films start coming out of the can (ok, I guess that's outmoded too).
And let's start seeing the wide format more in use. Most of our world is lived more horizontally spread-out than vertically spread-out, New York and Hong Kong being the notable exceptions.
Also a discernable trend, as more people start filming for their own countries, partially due to the drop in cost factor. So, we should expect to see way more films in wide format.
A humorous anecdote - at one point in the film fest a whole bunch of us had to tell the theatre manager (a nice woman who we like) that the aspect ratios were way off, when they persisted in showing a French film (Girl on the Bridge) at the Cinerama with the wrong settings. Of course, this was just after they had completed showing How The West Was Won using the Cinerama tri-screen, so they had to totally rework the whole theatre. It took us five minutes to explain why those of us in the audience knew how the film should be prepped.
<i>Then they take back most of that sum through taxation(OK that's not part of the Chinese Lottery, but it's part of *every* lottery), and everyone's happy.</i>
Neither the Canadian national lottery nor the local one here in British Columbia is taxed. Different approaches to paperwork, I guess.
The studio production standards we've cited are not to be confused with the broadcast standards listed below.
Actually, the 1920x1080 24 fps progressive format is exactly the same format that is being used for Star Wars Episode II. That's right, it's being shot in the exact same resolution that is supposedly going to become the broadcast standard for TV. Why would studios be doing that? If they use the same resolution for movies as is going to be used for TV, why would people go to theaters anymore? It doesn't make sense.
Because film is going digital. All the directors at the Seattle International Film Fest, which is much more of a director's fest than Cannes will ever be, have said they're going digital for at least half their upcoming films.
And the argument about theaters was used during the release of color TV. Theater attendance is way up since then. If I'm watching Gladiator, I want to see it at the Cinerama, not at home. Sure, there are a lot of made-for-TV movies that aren't any better on the big screen, but try telling me that The Matrix is the same on HDTV as the big screen and I'll ROFL immediately.
Unless of course, they have no intention of ever giving us the HD resolution at home. And since all the major TV networks are now owned by movie studios, they can do that.
Actually, they have no choice, the legislation requires them to broadcast in HDTV by 2003 (or was it 2004, can't recall).
So, here's what's going to happen. HDTV resolution will be used for theaters, which will be a step down in picture quality, and then regular resolution digital TV will be used for the home, and the extra bandwidth from the digital broadcasting licenses that were given away by the U.S. Congress will be used for a bunch of stupid "interactive" content. Isn't that nice?
No, you'll see HDTV resolution at the mini-malls most likely, and have to go to the city to see higher res versions on a "true" big screen, where sound and experience will become more of the big picture.
Now if they could just issue candy in non-plastic wrappers to avoid the crinkling sound, I might start buying theatre candy again...
Actually, they have no choice, the legislation requires them to broadcast in HDTV by 2003 (or was it 2004, can't recall).
The FCC requires them to broadcast digitally by that time, but does not require them to broadcast H/DTV (1080i or 720p). And a lot of the broadcasters plan to show 4 480p channels - 1 of the same crap they've been dishing all along, and 3 additional channels of shopping.
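The arithmetic behind the 4x480p claim is straightforward. A hedged Python sketch, where the per-channel bitrates and the overhead figure are ballpark guesses rather than numbers from the ATSC documents (only the ~19.39 Mb/s channel payload is the commonly quoted figure):

```python
# Rough multiplex arithmetic: an ATSC broadcast channel carries about
# 19.39 Mb/s. Per-subchannel bitrates below are ballpark figures.

ATSC_PAYLOAD_MBPS = 19.39

def subchannels_that_fit(per_channel_mbps, overhead_mbps=0.5):
    # overhead_mbps is a guess at tables/audio/padding overhead
    usable = ATSC_PAYLOAD_MBPS - overhead_mbps
    return int(usable // per_channel_mbps)

print(subchannels_that_fit(4.5))   # roughly four 480p SD subchannels
print(subchannels_that_fit(18.0))  # a single 1080i channel eats it all
```

So a broadcaster really does face a choice between one HD channel and a handful of SD ones, which is why the shopping-channel scenario above is plausible.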
Do you realize how much bandwidth that would require? Even with a highly compressed video stream you still need several Mbps, and since you're not special that means everyone needs a pipe that size. No nationwide network can provide that amount of data yet. The quip about booting to Windows is also pretty stupid; the point of open source is that if you don't like something you fucking write your own. Show some initiative man, complaining gets nothing done.
Vision and cognitive disorders? Aren't these the same people who can't watch TV because of this? Should I put fucking Braille on my keyboard just in case a blind person comes over to my house and decides he wants to type up an email? You ENTIRELY missed the guy's point with Java. They want to use Java as the actual compiled code that handles processing of such things as text and video. There is no need to translate the display code into different languages. The text the display code would be handling could be in any damn language people wanted. It'd be awfully easy to have one large XML file with text in several languages; the client merely picks out the tag that says English or Spanish. HTML would be a pretty fucking stupid markup language to use because it tells the rendering program how to view the data. Contextual info needs to be just raw data that a parsing program can figure out how to display according to a set of preferences. I shouldn't have to worry about HTML formatting on my television.
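The multilingual-XML idea described above could look something like this Python sketch. The tag names, attributes, and sample strings are all made up for illustration; the point is only that the client, not the broadcaster, decides which language to render:

```python
# Sketch: one XML document carries every language; the client picks
# the element matching the viewer's preference. Tag names are invented.

import xml.etree.ElementTree as ET

DOC = """
<caption>
  <text lang="en">The home team takes the lead.</text>
  <text lang="es">El equipo local toma la delantera.</text>
</caption>
"""

def pick_text(xml_doc, preferred_lang, fallback="en"):
    root = ET.fromstring(xml_doc)
    by_lang = {t.get("lang"): t.text for t in root.findall("text")}
    return by_lang.get(preferred_lang, by_lang.get(fallback))

print(pick_text(DOC, "es"))
print(pick_text(DOC, "fr"))  # no French entry, so it falls back to English
```

Nothing like this is possible with compiled bytecode, which is the whole argument against Java-only content.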
We're seeing arguments over television, one of the horsemen of the technological apocalypse. Interactive television is a misnomer: you're not interacting with Buffy the Vampire Slayer or Carson Daly. You're merely getting to choose which camera angle you view them from, or look at their bio on the side of the screen. It reminds me so much of Fahrenheit 451, where his wife thought she was interacting with the people on the television because they paused while she read her lines. That's actually exactly what people are asking for. It's funny, because we stare at the glowing picture on the television like prehistoric man sat and stared at his fire. It's amazing to think how far we haven't come since then. There's an alternative (a rather cheap one) to the DTV troubles. Before virtual reality was really feasible we used this thing called real reality. Instead of needing fancy goggles and a body suit, all you need is some shorts and a tank top. There's tons of applications for real reality too. Some of my favourites include the beach and driving. Quake is fun and all, but nothing beats running around playing paintball until you're too tired to drive home. Oh well, some people like sitting on their couch enough to put an ass print in it.
So-called e-cinema or electronic cinematography seems to be moving toward two HDTV standards: 24p (24-frame progressive) and 60i (60-field interlaced). Those who are after the "film look" prefer 24p, especially since the progressive approach results in fewer artifacts (aberrations) and higher resolution. The 24 fps speed is also the same as film, even though the extra sharpness of video sets it apart from film. However, compared to 24p, 60i (30 fps) does a better job of tracking motion, which means that zooms and camera movements, especially when done rapidly, appear smoother. (These must be done slower with film or 24p video.)

Either of these standards can be converted to film. With the proper equipment and electronic setup, either of these standards becomes almost indistinguishable from film when projected. (If there is any "fault" with digital video it's that it seems "too sharp and clear" compared to film. Of course, I guess we can all get used to better quality if we really have to!)

At the same time we are seeing the beginning of a move to e-theaters, or movie theaters that use video projection equipment. A dozen or so major theaters around the country use digital projection now. Several digital feature "films" are in the works now. The studio production standards we've cited are not to be confused with the broadcast standards listed below.
More information on the DTV Standards [cybercollege.com] is available.
They're after an encrypted signal path from the studio to the screen. This is the single biggest hangup preventing (H)DTV from hitting the market, and is the reason why cable systems aren't carrying HD signals.
Forget video rental, it's pay-per-view only w/ (H)DTV.
The new "interactive features" will be used to track your viewing habits with alarming granularity.
I was thinking about the parallels between this and the various TV, videotape, and more recently HDTV standards. And the world has changed. The hardware necessary is cheap enough now that it will happen in software. New codecs can and will be developed. New viewing software will be downloadable. That wasn't possible for TVs or VCRs and was only on the horizon when the HDTV debate began. We are likely to see incremental improvements rather than needing a legislated consensus to get manufacturers to shift to a single new standard. Sometimes life is good.
The implementation of the receiver programming system is a very minor issue compared to the issue of getting the signal to the receiver in the first place. The big question is: will the present 8-VSB system work? Why not COFDM or even some other technology? There are numerous sources I can point you to, but a Google search [google.com] can get you lots of useful information easily.
What's happening with (H)DTV in other countries? IIRC in Japan they were pursuing a high-def analog format. Anyone know what's going on with high-def in Europe? It also seems as if the television makers are trying to keep DTV-capable TV prices at a premium as long as they can. For example, Sony makes a Wega 34" 16:9 HDTV-capable set which sells for ~$6000. In America, this and some of their high-end XBR projection screens are the only progressive-scan models, whereas in Japan, they offer 29" and 34" 4:3 progressive-scan models at seemingly reasonable prices. AFAIK, when tuner boxes for DTV come down in price, it will only make a big difference for sets that support progressive scan, which should be able to display 480p at least.
from the proliferation of standards i can see two trends. first: that they can't agree because each party wants to control the spec. standard operating procedure for large tech companies. second, that no one is exactly sure what features are really going to be needed in the future.
i think the answer is that we simply don't know and that there will always be new features needed. so how do you upgrade consumer devices in the field? it has to be as easy as clicking a button to upgrade and having the change done immediately, transparently, and free. anything less will be unacceptable to consumers and then they simply won't upgrade. imagine if all web pages were still written using HTML 1.1.
The "ideal" resolution would be a resolution which was backwards-compatible with what exists today. This would be (PAL x NTSC)/(Highest Common Factor).
Frame rate should be calculated the same way. 25f for PAL, 30f for NTSC. Therefore, the "ideal" frame rate would be 150f. Yes, this is a lot higher than existing systems go, but it makes transferring existing NTSC or PAL footage a doddle. You don't lose any information, and you don't need to do any interpolation.
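The "(PAL x NTSC)/(Highest Common Factor)" arithmetic the poster describes is just the least common multiple. A quick sketch (the 25/30 frame rates are from the post; the line counts are illustrative):

```python
from math import gcd

def lcm(a, b):
    """Least common multiple: (a * b) / gcd(a, b)."""
    return a * b // gcd(a, b)

# Frame rates: 25 fps (PAL) and 30 fps (NTSC).
# The smallest rate both divide into evenly is the "ideal" 150f above.
print(lcm(25, 30))    # 150

# The same idea applied to active line counts (576 PAL vs. 480 NTSC).
print(lcm(576, 480))  # 2880
```

The same function applied to horizontal resolutions would give the "ideal" backwards-compatible picture size the poster has in mind.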
Such a resolution and frame rate should also reduce the horrible artifacts produced by MPEG. (MPEG cannot handle low light well. It looks like a JPEG reduced to a 16-colour palette.) The only way to remove artifacts is to INcrease the resolution & speed, not decrease them. If you decrease them, the artifacts are still there, they're just not so "visible". Which means exactly nothing, since you don't watch a single frame, but a whole movie.
At least in Europe we have had trouble-free digital TV for 2 years now. There is nothing wrong with it quality-wise: widescreen is excellent, the audio is excellent, and the picture quality can be excellent.
Since no one has a TV capable of showing a 1920x1080 picture, or even half of that, I don't think that the lack of a high-resolution DTV signal is that bad. That capability can be added on later, when there are more satellites providing more bandwidth.
Of course, HDTV looks amazing, so when it becomes affordable it could be an interesting option to purchase. Unfortunately, there is nothing on TV to actually watch, so I don't currently have a TV and I don't intend to get one.
The last thing I want is to have a box with "Microsoft Media Player" logos on it. Looks like that is the way America wants to go, with MS-MPEG4, etc. A TV is meant to perform a function: show TV. In the UK, digital TV also has games, email, shopping, etc., using Sky TV's Open platform. You just can't browse the web using the box, but the next generation of DTV boxes in Europe will be able to do that as well. Hopefully Mozilla, with its extreme customisability, will be chosen for these boxes ahead of PocketIE or whatever.
Next: People to incorporate ReplayTV and DVD players into the DigitalTV's themselves.
There's a lot of behind-the-scenes dealmaking going on to try and insinuate copy protection methods into DTV signals. They're trying to prevent people from recording programming off the air (since that would prevent you from buying the home video version of whatever it is).
Pay very close attention to this. Look for copy protection systems in the upcoming standards, and call attention to them. It's very important that it not come to pass.
Reading this article, I am reminded of a paragraph appearing in the 2 June 2000 issue of Science (Vol 288, p. 1597).
New technologies often initially attempt to reproduce the previous technology in form and function. For example, when word processors were first introduced, many were designed to emulate typewriters: characters were entered along one line at the bottom of the screen, and white "electronic paper" had to be scrolled up and down past the "writing head".
Interactive television strikes me as the same sort of revolution. Everyone's perspective is limited by what they know. TV will look totally different in 20 years from what the standards committees expect.
Is it that they are not going to allow it, or do we not have the means to (inexpensively) record HDTV? What kind of storage would you need to record 2 hours of 1080x1920 @ 30fps? DVD is 480x704, and an MPEG-2 compressed movie takes up 4-8 gigs. To record HDTV we are going to need a TiVo-like box, but with a much larger drive. I'm sure when that type of tech becomes cheaper we will have recordable HDTV.
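As a back-of-the-envelope check on the storage question: the ~19.4 Mb/s figure below is an assumption based on the nominal ATSC payload rate, and 24-bit RGB is assumed for the raw case.

```python
# Two hours of 1920x1080 video at 30 frames per second.
seconds = 2 * 60 * 60

# Uncompressed, assuming 3 bytes per pixel (24-bit RGB).
raw_bytes = 1920 * 1080 * 3 * 30 * seconds
print(f"raw: {raw_bytes / 1e9:.0f} GB")      # ~1344 GB

# Compressed at an assumed ~19.4 Mb/s MPEG-2 payload rate.
atsc_bytes = 19.4e6 / 8 * seconds
print(f"MPEG-2: {atsc_bytes / 1e9:.1f} GB")  # ~17.5 GB
```

So even compressed, two hours of HD is several times a DVD's capacity, which is exactly why an HD TiVo-like box would need a much larger drive.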
Forget video rental, it's pay-per-view only w/ (H)DTV.
That's funny. Then why do I find plenty of examples in nearly every medium of people willing to give me content for free and sell advertising space to sponsors. I find this on the radio, TV, the Web, some smaller newspapers, checkout aisle swapsheets and low budget newsprint computer magazines. And people who buy the latest home entertainment technology are a really enticing market niche to be able to sell advertising time in front of. They have disposable income and are willing to part with it for new gadgets. They are likely to be younger and novaphiles, so they may become repeat customers.
There will be plenty of pay-per-view content, but to build the market there will be content available for free or at a flat subscription rate like cable. And that content will be lucrative to provide, so it will continue.
This reminds me of that Simpsons episode where Homer doesn't allow Bart to go see the Itchy & Scratchy movie. Homer tells Bart to watch TV instead, and Bart, who is upset at Homer for not letting him go, says something like "TV sucks".
Homer got really upset and said (in a growling voice) "Bart, I know you're upset, but don't say anything you don't mean" or something like that... great episode!
You know what I want? I want to go to a web site for my favorite shows at any time of day or night and select any current episode for playback. Ideally I could get this at a nice high resolution and a reasonable frame rate without having to boot my computer to Windows (Which is difficult since I don't have Windows on my computer.)
Somehow I don't think the content providers would be fond of this idea, despite it being the logical next step for content delivery services.
It's unutterably stupid to choose a video standard based on 24fps simply because that's what film uses.
Digital video is where it's going, and bloody fast. Set the standard at 48 or 72fps, and use every second or third frame if cutting back to film (or double or triple the frames if going from film to television).
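When the rates are integer multiples like that, the conversion really is trivial: duplicate frames going from film to video, keep every nth frame going back. A toy sketch of the idea:

```python
def film_to_tv(frames, factor=3):
    """24 fps film -> 72 fps video by repeating each frame three times."""
    return [f for frame in frames for f in [frame] * factor]

def tv_to_film(frames, factor=3):
    """72 fps video -> 24 fps film by keeping every third frame."""
    return frames[::factor]

film = list(range(24))         # one second of film frames
tv = film_to_tv(film)          # 72 video frames
assert tv_to_film(tv) == film  # round-trips with no interpolation
```

No information is lost in either direction, which is exactly the argument for a 48 or 72 fps standard.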
And let's start seeing the wide format more in use. Most of our world is lived more horizontally spread-out than vertically spread-out, New York and Hong Kong being the notable exceptions.
Though, of course, all this is moot when considering broadcast television. That shite ain't worth watching no-how. I just want my DVDs to look good...:-)
Just put a synchronization timestamp/token in the VBI, and that's it. (not enough bits to do much else, especially if you have to make room for closed-captioning)
If you want to see a show, go to the show's website, and the magical little box on top of your TV will access the appropriate channel for your area. Then the magic box picks up timestamp/synchronization info from the VBI for the SMIL metadocument to synch to.
I wouldn't be surprised if this could be done today with entirely existing technologies, appropriate software, and a TV tuner card.
The funny thing is, 4K can be done. Arri's latest film recorder does so.
Yes, you're right. It can be done, but far too many special effects and post-production houses just don't do it. They think 2K is "good enough" and just don't bother with 4K. I really hope that 4K becomes the standard for movie resolution, but it seems that many people just don't care. They hear the word "digital" and shut their brains off, thinking that as long as they have that buzzword going for them, they're getting high quality.
I noticed that Industrial Light and Magic is not one of the listed installations for the new Arri scanner, even though lots of other major special effects houses are using it, including Digital Domain (co-founded by James Cameron and used for movies like "Titanic"). Apparently George Lucas is among those people who think that 2K resolution is "enough". Can't he tell the difference? Is he blind? I, for one, thought that "Star Wars Episode I" looked blurry, and that's because pretty much the entire movie was scanned, processed, and printed at 2K resolution, which is just not good enough to match film resolution. Give most people a side-by-side comparison, and they'll agree that 2K digital just isn't as good as film.
I have to disagree about the 8-VSB/COFDM thing. In the SF Bay Area, I saw no trouble with 8-VSB; my friend now receives the 7 or 8 DTV channels currently on air just fine with an indoor antenna. I saw great results with low-power transmission in testing, and one of the big arguments for COFDM has to do with mobile reception, which is irrelevant for home use.
In any case, you are right about the manufacturers - it will cost too much and kill the entire HD format to convert. I'm sure that Sinclair would love this.
The problem with the ATVEF is the IP aspect - everyone wants a closed format of their own, so they can rake in the $. In this case, the format MUST be totally open, or it will die.
I am very interested in your comment about 4K resolution being the appropriate resolution for 35mm film. However, I'd like to point out that the standard for digital radiography "x-rays" for telemedicine was 2K (a few years ago, when I was heavily involved with medical informatics). This number was determined after studies showed it was essentially as good as film for diagnostic purposes. Indeed hospitals are moving away from film altogether, due to cost, record-keeping, and other constraints.
Of course, film is a visual art that may benefit esthetically from 4K, while reading an x-ray is an art that is accustomed to working around a somewhat fuzzy image, but very little film (except the Zapruder film or a UFO/Bigfoot sighting) gets the kind of up-close per-frame scrutiny that an x-ray gets. And an x-ray print is hundreds of times the size of a 35mm frame (digital x-rays are usually displayed on at least high-res 21" monitors).
These comments, such as "it looks so real" - come on, people, just go outside where it IS real.
Can't you see how bad TVs have become? A whole bunch of people, government, and big business are trying to ensure that everybody goes to work and comes back home to watch TV, partly so they can keep you "safe" off the streets, where so much crime is happening every day, everywhere, that you feel you're lucky to get home.
Fight the TV. It's too late for us, but do it for our children's sake.
>Dear MPAA: F*ck your intellectual property
God he's getting good, hard to tell from the real thing. Kinda pointless for the real sig to even post anymore, dontcha think?
"Standards are set by markets, not by standards bodies," Skip Pizzi, technical manager of worldwide television standards and strategy at Microsoft, said at the World Television Forum here in early June.
Would somebody PLEASE just drop a nuke on Seattle and get it over with... -
I wonder if they aren't approaching this from the wrong direction.
We already have the interactive technologies in development -- the web stuff, SMIL, and so on, and we certainly have the TV infrastructure already in place.
Why not just develop close synchronization between conventionally-delivered web and conventionally-delivered TV content in some fashion, instead of deciding to cram everything conceivable down the broadcast channel?
The main thing it'd require would be the maturation of SMIL and a common schema/domain for "TV" URNs, and a little bit of hardware to pick up on synchronization info in the vertical blanking interval[1], but that's about it.
That'd leave you three options[2]:
1. watch TV on your TV while interacting with the synchronized web site on your PC
2. have the TV broadcast embedded in the synchronized website itself (courtesy of your TV card)
3. a WebTV-like set-top box that handles the web side of things, either doing as for #2, or compositing the website with the TV image in some other fashion
The technology for this kind of approach is here today, and it doesn't require pie-in-the-sky home bandwidth capabilities, nor does it require significant changes to the existing broadcast infrastructure.
I wonder why everyone seems so anxious to push their own proprietary solution, instead? I mean, Open Standards allow users control over their experience of the content, and that's a good thing, right?
---
[1] Maybe even current TV tuner cards could be pressed into service, depending on exactly what they offer; I don't know enough about them to say.
[2] Maybe more than that; these are the three I can think of off the top of my head
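As a concrete sketch of the synchronization idea above, here is a toy version of the set-top logic. The "url|t=seconds" trigger format is entirely hypothetical (the real ATVEF trigger syntax differs); the point is only that a token in the VBI maps a broadcast timestamp onto the SMIL timeline:

```python
def parse_trigger(vbi_line: str) -> dict:
    """Parse a hypothetical 'url|t=seconds' token out of a VBI text line."""
    url, _, t = vbi_line.partition("|t=")
    return {"url": url, "offset": float(t)}

def smil_seek_offset(trigger: dict, program_start: float, now: float) -> float:
    """Where the SMIL presentation should seek to, accounting for the
    wall-clock time elapsed since the program started."""
    return trigger["offset"] + (now - program_start)

trig = parse_trigger("tv://wxyz/ftroop|t=312.0")
print(smil_seek_offset(trig, program_start=1000.0, now=1005.0))  # 317.0
```

A real implementation would of course also have to handle lost triggers, channel changes, and closed-captioning sharing the same VBI lines.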
I was thinking...you know, all I want from my TV is to be able to watch TV programs. I want my F-Troop reruns, not a bio of Larry Storch. You know? I don't want HTML or Java. I want to watch the F1 race at Monte Carlo or Suzuka. Is any of this fighting over "interactive" content really needed for TV?
Besides, there's nothing interactive about staring at a screen, no matter what's on it, no matter whether you've chosen it or not. You don't interact with a TV show or a movie. You watch it.
What's really important is this: when they're pushing ads at me over the damn DTV line, do I get to turn them off? I seem to recall that Winston Smith couldn't turn his screen off at all. Is Big Brother going to be watching us?
For one worrying moment I thought you suggested going to the lowest common denominator, which clearly shows I wasn't really thinking. But your suggestion looks sound. And if they'd only make the standard the all-comprising superset of the existing standards named in the article, I think we'd be cruising to Z'ha'dum and back in a jiffy, without even a scratch or a fall. :-) Let's call it OpenTV, shall we? IETF, are you listening? Let's RFC!
As a television engineer who was at the founding meetings of ATVEF and the ATSC committee meetings, I can tell you this is a very thorny subject.
Basically it boils down to two camps. One camp wants to invent new languages/scripting to be used for television. The other camp wants to incorporate and amplify IETF standards and utilize them.
I'm for the HTML based plan like ATVEF, since it has all the advantages of an established standard. Plenty of creation tools, few intellectual property issues, and plenty of people that know how to code it.
Those proposing new display engines (usually based on Java) point to the limitations of HTML and XML as unsuitable for use on television. The IP questions about using Java with a 'clean room' version are mind-blowing. No one wants to touch this issue except the electronics manufacturers, since all they do is buy a chip and don't deal with any other licensing issues.
The real issue is what the content providers will choose to support. No content provider wants to code several versions of an enhancement to play on differing systems. They want one unified standard to code to all the time. I think most of the content providers have lined up with the ATVEF plans, since they allow simple reuse of web pages and tools to make enhanced television.
Contrary to the above posts, the debate isn't about scan rate and format any more, it's about data enhancement.
In the US there is debate that the approved VSB transmission method is far worse than the COFDM method used in Europe. I can tell you as a fact that COFDM is a better method than VSB, but the consumer electronics manufacturers will never allow a switch at this point.
ATVEF encodes the network identifier. It's up to the EPG (electronic program guide) to scan the tuner card to establish the channel order. Once that's done, javascript on the web page can request the tv: tag (something like that, the coder guys know the details, I'm the broadcaster guy).
Intel used to have a product called Intercast that did this and listen for triggers to pre-cache and pull up web data on command.
I just bought a Hollywood Plus MPEG decoder card, and did a little hacking to get it to work with NTSC (just had to change a few register values, nothing fancy). Reading over the documentation for the NTSC/PAL encoder, I see that interlacing can be turned on and off. Doing a search on the 'net reveals that people are talking about progressive scan NTSC, but I can't seem to find any information about sets that actually understand it. I've twiddled the bits on my H+, but my TV still displays in interlaced mode (not that I really expected it to suddenly do progressive scan...)
Anyone have more info? -- Ski-U-Mah! Stop the MPAA [opendvd.org]
Categorically, every single attempt at copy protection of digital media has been compromised and rendered inoperative. By the information's very nature, it must be decoded back into a format suitable for interpretation by humans. The primary outputs: audio, video, text. The universal decoding schemes:
audio: we have audio inputs at 44 kHz available on virtually every computer on the market now. Many have digital inputs.
video: We have monitors and modems. Some have faster digital connections.
text: Duh.
In short, every single scheme is vulnerable to decryption. If the companies succeed in encoding the over-the-air transmissions, I will simply crack open my television and physically link a video encoder directly over the CRT monitor and then oversample it at 4:1 ratio.
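"Oversampling at 4:1" just means capturing at four times the target rate and averaging back down, which suppresses capture noise. A toy illustration of the averaging step:

```python
def downsample_4to1(samples):
    """Average each group of four oversampled values down to one."""
    assert len(samples) % 4 == 0, "expects a multiple of 4 samples"
    return [sum(samples[i:i + 4]) / 4 for i in range(0, len(samples), 4)]

# Eight noisy captured values collapse to two cleaner ones.
print(downsample_4to1([10, 12, 11, 11, 40, 42, 41, 41]))  # [11.0, 41.0]
```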
Didn't make the first 40 (forty) minutes; didn't make the first (1st) 20 posts.
MOST IMPORTANT: you told them to mod it down, which creates an internal dilemma for moderators, since they want to mod all such comments UP! But you didn't have anything useful or redeeming or funny at ALL, so they want to mod it DOWN, but this conflicts, so they all just ignore it, like I did. I didn't..
IMHO, it would make sense if Americans would get a grip on reality and use the DVB standard that the rest of the world already uses. Then this pointless bickering over standards would have been resolved *years* ago.
The "shutter-gate error" can be a problem, but with the latest advances in film movements, most brand-new cameras and projectors provide absolutely rock-steady transport. The mechanics of that sort of thing have really improved in recent years.
I thought the "film" look was because of the slow motion tracking 30 frames per second, whereas video looks real due to the high frame rate, making things look more like they do in real life...
Film is at 24fps, and U.S. standard video is at 30fps (NTSC, the broadcast standard for North America, divides each frame into two interlaced fields, giving 60 fields per second, but still only 30 distinct frames per second). When film is transferred to video, the frames are lined up with a "pulldown" so that the frames from film are spread evenly across the slightly larger number of video frames over the same time period. Note that there are some systems in the works, such as MaxiVision [mv48.com], that double the frame rate of film to 48fps.
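The "pulldown" mentioned above is the 3:2 cadence: alternating film frames are held for two and three interlaced fields, so every 4 film frames fill 10 video fields and 24 fps comes out evenly at 60 fields per second. A minimal sketch (whether the cadence starts with 2 or 3 fields varies by implementation):

```python
def three_two_pulldown(frames):
    """Spread 24 fps film frames across 60 Hz interlaced fields."""
    fields = []
    for i, frame in enumerate(frames):
        # Alternate holding each frame for 2 fields, then 3.
        fields.extend([frame] * (2 if i % 2 == 0 else 3))
    return fields

one_second = list(range(24))            # one second of film
fields = three_two_pulldown(one_second)
print(len(fields))                      # 60 fields = 30 interlaced frames
```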
That said, it is more than just the frame rate that gives film and video different looks. Film samples light in a way that more closely matches the human eye. Imagine two side-by-side cameras shooting the same scene with the same lighting, one film and the other video. Now imagine that the film's frame rate had been adjusted to match video. Would the resulting images then look the same? No, because of the fact that the two mediums do not respond to light the same way. It probably will be possible sometime in the future to make a CCD or other electronic light-gathering device that works in a way more similar to film (and thus more similar to the human eye), but for right now, it's still a problem.
Actually, you say first that people think video looks more real, but then go on to say that film is more similar to the way we see things--that doesn't really make sense...
Okay, I should have been clearer about that. While I personally believe that film looks the most natural and real and that video looks less natural, when HD video was first being bandied about and shown to various people, some voiced complaints about the look of the image, saying that it looked "too real". I think by saying that, those people unwittingly gave advocates of HD video some ammunition to use in promoting it. What I was trying to do in my previous comment was interpret what those people were trying to say when they complained about video looking "too real". I think what they meant was that HD video still looks too much like the evening news (which is one of the few things on TV that is shot on video; most other things are shot on film and then transferred to video). Since we're all used to associating the video look with the news and other reality-based TV, anything that has that same video look evokes a reaction and triggers that association. I would imagine that whoever actually said that (I have no idea who or when this supposedly happened) was trying to explain that the video look was too much like TV news and not enough like theatrical film, but they kind of stumbled on their words and out came the unfortunate comment of "too real". I was merely trying to explain where I thought the comment came from. I was not agreeing with it.
I do not think that video looks more real than film. I actually think it looks less real. And the thing is, I'm not just making a subjective judgement here; there is an objective element to my criticism. Film captures light in a way that is much more like the human eye than video. (See this article [cinematographyworld.com] for some comments from a Kodak employee.) I think it's telling that advocates of video rely on subjective arguments (such as the oft-repeated comment about video looking "too real"), while advocates of film rely on objective arguments about the technical capabilities of film. It begs the question: if advocates of video really are pushing something that is "better", why can't they come up with any objective arguments to support their case? Why do they have to rely on subjective opinions? (Hint: maybe because it really isn't better!)
First off, the EE Times article is talking about the standards for interactive content, not the image specs. The image specs have already been set, as you pointed out. The current wrangling is over the set of APIs used to provide interactive "programming" for people's DTV sets and set-top boxes. It has nothing to do with the image specs. Read the article.
That said, there are a couple of incorrect statements in your post (actually, in the quote you have from the "CyberCollege" site). The first is this one:
If there is any "fault" with digital video it's that seems "too sharp and clear" compared to film. Of course, I guess we can all get used to better quality if we really have to!
First off, no matter what anyone tells you, the resolution of HDTV is not the same as film. Film is capable of resolving up to 4,000 lines, not 2,000. People see this all the time when scanning film in for the purpose of creating special effects. The current standard for scanning film in post-production is 2,000 lines ("2K" resolution). However, there is still scannable information left on the film that is not picked up at that resolution. 4,000 lines ("4K" resolution) would be the proper resolution for scanning film. 4K is what is needed to really replace 35mm film without any loss, not 2K resolution.
If you look closely next time you're at a movie theater, you can see that special effects shots look blurry compared to the rest of the movie. That's because of the loss in resolution due to scanning at 2K. Hopefully 4K will become the standard for doing special effects, but I'm not going to hold my breath. Most people think that 2K is "enough", even though it really isn't.
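The rough arithmetic behind the 2K-versus-4K claim, assuming common full-aperture scan dimensions (the exact pixel dimensions here are an assumption, not from the post; actual scanner resolutions vary):

```python
# Commonly quoted full-aperture 35mm scan sizes (assumed for illustration).
k2 = 2048 * 1556   # "2K" scan
k4 = 4096 * 3112   # "4K" scan

print(f"2K: {k2 / 1e6:.1f} megapixels")  # ~3.2 MP
print(f"4K: {k4 / 1e6:.1f} megapixels")  # ~12.7 MP
print(k4 // k2)                          # 4: doubling both axes quadruples pixels
```

That factor of four is why a 4K scan picks up detail that a 2K scan leaves on the negative.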
As for HDTV looking "too sharp and clear", this is nonsense. No one ever says that. The common complaint is that HDTV looks "too real" compared to film. And what people mean when they say this is that it looks too much like the evening news. Film is used for everything you see on TV these days except for the news and other "reality" programming, and so when people mean to say that something has too much of the video "look", they end up saying that it looks "too real".
The reason film and video have different looks is that they respond to light differently. Film samples light in a logarithmic curve, which is very similar to the way the human eye sees light. That is why film looks so natural. Video cameras, on the other hand, sample light in a linear fashion, which is what gives it that particular look. The fact is, if you want an image that looks the most natural, film is still the way to go.
The second incorrect statement is this one:
The studio production standards we've cited are not to be confused with the broadcast standards listed below.
Actually, the 1920x1080 24 fps progressive format is exactly the same format that is being used for Star Wars Episode II. That's right, it's being shot in the exact same resolution that is supposedly going to become the broadcast standard for TV. Why would studios be doing that? If they use the same resolution for movies as is going to be used for TV, why would people go to theaters anymore? It doesn't make sense.
Unless of course, they have no intention of ever giving us the HD resolution at home. And since all the major TV networks are now owned by movie studios, they can do that.
So, here's what's going to happen: HDTV resolution will be used for theaters, which will be a step down in picture quality, and then regular-resolution digital TV will be used for the home, and the extra bandwidth from the digital broadcasting licenses that were given away by the U.S. Congress will be used for a bunch of stupid "interactive" content. Isn't that nice?
All broadcast manufacturers are heading toward doing top-end production on 1080 24P systems. It makes film people happy because 24P matches film. It makes TV people happy since you can downconvert to any resolution/scan format. It makes computer people happy because the nasty interlace demons are gone.
You will see television broadcast move to 720 60P in the next decade. Why? Because the coming large plasma displays and projection sets are all natively progressive. Watching interlaced content on a progressive screen is like listening to 8-track tapes on a high-end stereo system.
Re: holding back to film standards (Score:1)
So what you're saying is basically irrelevant.
Re:IMHO (Score:2)
--
Ski-U-Mah!
Stop the MPAA [opendvd.org]
The /. of DTV is TVIndustry.com (Score:2)
http://TVIndustry.com/Main.DigitalTV.htm
/oskar
Re:Read the article (Score:1)
Just takes a bit longer, though.
Nice idea, but wrong (Score:1)
The way to reduce artifacts is to increase the bandwidth used to record/transmit the video. Increased frame rate and resolution won't remove artifacts.
BTW, MPEG as seen on DirecTV averages to around 3-4 Mb/s. DVDs are like 6 Mb/s. Professional video recorded at 25, 50, and even 100 Mb/s has no perceptible artifacting.
Re:The Chinese Lottery (Score:2)
AC--
Box phonetically equivocates to baw-ks. Boxes phonetically converts to Baw-k-ses, a moderately more linguistically annoying phrase, since you've got to loop yourself from the s sound, to an "eh" sound, back to the "ess". The resolution pluralization became Boxen.
I doubt it was ever intended to be funny or clever. It's just easier to transition from "ks" to "en" than it is to transition from "ks", add a short "eh", then back to "es".
Good example--when going from "Rack" to the plural, you turn "Rack" to "Racks"/"Rax". You can't make the same conversion though for Box, because you've already got that ks/x sound in there. Thus you get something like Boxen.
If you think I've taken this analysis too far, you're the geek who threatened to, and I quote, "kill" somebody over your dislike for a five-letter word ending in N instead of S. Not that I took you seriously.
Yours Truly,
Dan Kaminsky
DoxPara Research
http://www.doxpara.com
Re:how do you upgrade such a system? (Score:1)
Yeah, wouldn't that be totally GREAT?
No <blink>, no <center>, no <font>, no frames, no nothing. CSS would thrive and websites would actually be readable on all platforms.
Nice dream, but I don't think it's gonna happen.
No such thing as analog progressive NTSC (Score:1)
Perhaps your decoder card is meant to feed TVs (interlaced) or computer monitors (progressive).
Re:Why (H)DTV will suck... (Score:1)
It's all about control (Score:1)
Personally, I don't really think it makes much difference which standard is decided as long as some standard is finally *laugh* agreed upon.
Does it really matter? DTV - even if the highest-quality, cheapest to implement, no privacy invading, open source solution is chosen - still won't replace the gaping hole left in your existence where a social life used to be. =)
Get out some more, don't worry about TV too much, it'll still be there when you get back. Who cares if it's missing a few lines of resolution? =-)
Re:Read the article (Score:1)
In any case, the grain size for film that responds to visible light is extremely small, and keeps getting smaller every year with the introduction of new film stocks. I've seen 70mm prints of movies from the 60's and 70's, such as "Lawrence of Arabia" and "West Side Story", that looked truly amazing, and yet today's 35mm film is getting to where it looks almost as sharp. That's because the grain size has decreased dramatically over the years, and will continue to do so in the future. That's a big advantage of sticking with film. You can always benefit from advances in film quality without having to buy a new projector. But once you switch to digital, you are stuck with the same pixel count forever.
If you're looking for a reference about 4K resolution being equivalent to film, here is an article [cinematographyworld.com] that talks about a panel discussion between film and digital proponents. The folks from Kodak talk about how 4K scans show the level of detail on 35mm film to be between 8 and 12 million pixels, which is much greater than the paltry 2 million pixels offered by 2K digital.
And if you really want to see just how good film can look, try checking out Technicolor's newly revived dye-transfer printing process. Articles talking about it can be found here [directorsworld.com] and here [mkpe.com]. It's pretty amazing. I've seen it first hand, and it blows digital projection away. The picture looks so sharp and clear it's as though you could reach right into the screen and touch it, and the color is absolutely incredible! If you live in the Bay Area, you can see a dye-transfer print for yourself if you go to the Century 25 [centurytheaters.com] in Union City. "Mission: Impossible 2" and "Shanghai Noon" are showing there with dye-transfer prints. It's the best film quality I've ever seen. If you can, go check it out!
Re:DTV (Score:2)
The games are too slow, and the email screen could be laid out a lot better, agreed. The shopping system works about as well as you could hope for a dumb home user on a TV system, though.
If no one goes and actually creates the systems, then nothing will happen. The games might be interpreted piles of junk (I assume that, or the system as a whole only has a Z80 controlling it), and I would prefer Tetris and other puzzle games (even some classic platformers) to the trash on there currently, but as a real example of a technology it is great.
When the next generation Digital Boxes come out, hopefully within the next couple of years, then there will be a lot more power available to do stuff correctly. Hopefully they will also have ADSL support, and a built in DVD drive (why not use the one set of MPEG-2 decoder chips instead of multiple sets?).
One of the rules of business is to get it out there quickly, whatever the faults. Sky has done this knowing it isn't the perfect solution, but they also know that if they didn't release anything then OnDigital (terrestrial digital TV) would be on the case. If only the Sky boxes allowed 2-way satellite transmission for IP traffic as well... :-)
Re:Pay Attention to Copy Protection (Score:3)
Take a look at HTTP Extension Framework [isi.edu]. Note a couple of key aspects:
Take a careful look at how the protocol extensions are to be used in their examples:
LL
Re:Read the article (Score:1)
Re:Frankly, my dear... (Score:1)
And the transition to HDTV so far has been extremely slow. Only one station in this market (92nd, Tri-Cities TN/VA) is even broadcasting HD, and I'm not sure if they're using any actual HD production equipment or just upconverting their NTSC signals. The reason? HD equipment is extremely expensive (a good HD overhaul can cost $1,000,000 and up, easily).
I've been waiting for a better format than NTSC ever since I started in this industry. HD sounds good, but it is extremely uncertain in 85% of the country or more. Plus there are a lot of issues with lowpower stations that will be displaced by HD broadcasts that seem to be holding it back somewhat.
_______
Scott Jones
Newscast Director / ABC19 WKPT
Commodore 64 Democoder
A-men brother, Off-topic slightly (Score:1)
A-men brother,
I have been a much happier man since I abandoned the demon box four years ago. Now when I do get close to one of the cursed things I really start to notice how they grab your consciousness away from you. They have a scary sedating effect on a person. I see so many of my friends whose lives revolve around "what's on TV." I won't say it's all bad programming (I like watching the Simpsons), but in general it's shit. TV exposure in my youth contributed to my past attention span problems. I see it affecting my friends' capabilities, making them apathetic, unambitious people who have so much potential yet never use it. "Fuck DTV. Fuck TV."
Random Task
Re:DTV (Score:1)
What you ask for is what ATVEF does now... (Score:1)
Yep (Score:2)
And yes, they'll be quite interested to know that YOU (And they will KNOW it's YOU) are watching the Live Goat Porn channel every night from 6 to 9:30. The up side of this is you'll get directed advertising for all that cool Live Goat Porn merchandise. You might even get honorable mention on the Live Goat Porn web page (http://www.livegoatporn.com) for being their best fan. Wouldn't that be cool?
Digital means data, not necessarily "DTV" (Score:1)
Similarly, the DTV folks seem to be very focused on TV, and seem to view the digital data-broadcasting function as incidental undercarriage instead of a general purpose infrastructure that can carry an optional HDTV payload.
The consequence is evaluating alternative modulations and encodings in terms of TV instead of in terms of delivery of data with various levels of integrity and timeliness.
E.g., IEEE-1394 ("Firewire" etc.) addresses the issues of sharing a channel between isochronous (guaranteed chunks/time) uses and asynchronous uses. Similar concerns should guide DTV broadcasting, if there is going to be a clean path to future digital mixes, which may include stuff not yet invented.
If the bottom layer is fixed and dedicated-purpose, it will force the invention of ugly tunneling kludges whenever someone wants to transmit something new (like doing things in the comments of HTML).
It is easy to imagine lots of interesting ways of using a flexibly designed digital channel with a fixed max capacity. Let's hope the bottom layer design doesn't do perverse things to interfere.
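To make the isochronous/asynchronous split concrete, here is a toy sketch in the spirit of 1394's cycle-based arbitration. Everything here is illustrative: the cycle capacity, stream names, and function shape are my assumptions, not anything from a spec.

```python
# Toy 1394-style cycle scheduler (illustrative only): isochronous
# streams get a guaranteed slice of each cycle, async traffic gets
# whatever capacity is left over.

CYCLE_BYTES = 6000  # hypothetical capacity of one transmission cycle

def schedule_cycle(iso_reservations, async_queue):
    # Guaranteed chunks/time for isochronous streams come first.
    iso_total = sum(iso_reservations.values())
    if iso_total > CYCLE_BYTES:
        raise ValueError("isochronous reservations exceed cycle capacity")
    leftover = CYCLE_BYTES - iso_total
    sent, used = [], 0
    for size in async_queue:
        if used + size > leftover:
            break               # the rest waits for the next cycle
        sent.append(size)
        used += size
    return iso_total, sent

iso, sent = schedule_cycle({"hdtv": 4000, "audio": 500}, [1000, 1000, 1000])
print(iso, sent)  # 4500 [1000] -- only 1500 bytes were left for async
```

The point is that a flexible bottom layer can guarantee the TV payload its bandwidth while still carrying arbitrary data in the slack, with no tunneling kludges needed.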
Frankly, my dear... (Score:1)
We've been waiting, what?, more than a DECADE for everyone to get their act together and make a decision.
Time to shit or get off the pot.
--
I'm talking home video rental. (Score:2)
I'm not talking about the availability of "free" or subscription-based content. I'm talking about putting the video rental market out of business. That's one of the goals of the studios/networks in transitioning to (H)DTV. Fewer intermediaries take a cut out of PPV revenues as compared to rental, which would benefit the content producers and cable companies (frequently one and the same) immensely.
Of course, this will restrict your choices for home viewing to the top dozen popular movies, but that's better for the studios anyhow - no pesky indie or foreign flicks to compete with their "Mission Impossible N"s or "Runaway Bride"s (or "Battlefield Earth"s).
-Isaac
Re:Check out the Quote from Microsoft... (Score:1)
Hey! I live in Seattle. Bill Gates lives in Medina, on the other side of Lake Washington, where Redmond is.
Now, you won't get any argument from me about dropping a nuke on Redmond, which would take out most of the bad places and give us new lakefront property - they already chopped down all the good forests that used to be there.
But just broadcast a message on PBS and NPR and the local college and high school stations about 5 minutes before impact so we can get down in the basements, ok?
... make lemonade (Score:1)
So why do we still have CR and LF - the carriage return on a typewriter and the line feed on a typewriter?
Personally, I think we should just announce the Open Source HDTV format is the European one, code it to run on Linux and BSD, and let the chips fall where they may.
Take some initiative people - choose the best format, not the ones the manufacturers want!
Re:This is no use unless it can be received! (Score:2)
COFDM and DVB-T work today, the hardware is in mass production, and it looks like it will become the international standard for DTV.
Re:It's about data standards for interactivity (Score:2)
HTML/XML is also more accessible to people with vision and cognitive disorders, since it's basically text that the client can interpret however they will. Pardon my language, but how the *fuck* do they expect to do large fonts, alternate colors, and speech synthesis on *Java*. If they put in all the hooks to do this well, they'll end up with the OS/360 of content development.
Plus, you can do Babelfish-style translation of the content with HTML/XML. Again pardon my language, but how the *fuck* is anybody supposed to translate Java bytecodes from English to Spanish. Sure, English is used a lot all over the world, but I imagine Hollywood still gets significant revenues from dubbing movies and TV programs into other languages. With Java, they have to translate the source and recompile for every language. Even if they wanted to bear the expense, the configuration management issues are formidable.
And how can Java be indexed by the search engines? Much of any media company's value is in their archives. By using Java, they lock their product up where people can't find it to buy. Java either costs sales, or you have to pay to recreate the information in a form that *is* indexable. In other words, they've gotta have a web page (or the equivalent) replicating the content.
Instead of using pure Java, I think they ought to use an HTML/Java combination. First, they would extend and/or clarify HTML so that it rendered consistently (standard table behavior and standard font sizes). The biggest problem with HTML is that basic things like tables are wildly different across existing browsers. Simply codifying how tables, images, fonts, and styles display for the TV Browser Standard would be the most important step. Most stuff, like National Geographic supplemental information and football player stats, would be well-handled by HTML.
They would use Java plug-ins for the truly unusual stuff. For instance, the zoomable-spinnable-interactive-3D-instant-sports-replay would be Java.
(Does this remind anybody else of the "Would you like to know more?" bits from the Starship Troopers movie?)
Tired. So tired. (Score:1)
Besides the fact that it's old and was never very funny, I also suspect that you are ugly.
For future reference, I recommend you arrange your attempts at humor in such a manner which allows for variety and a continual modification of themes so that you don't bore everybody to a premature death.
Just a thought.
--Fantastic Lad, the most amazing script kiddie ever!
"Argh! Somebody put shit in my pants!" --JTHM
Re:This is no use unless it can be received! (Score:1)
The Chinese Lottery (Score:3)
The Chinese Lottery is essentially a crypto term referring to the following concept: Each of China's billion residents is given a set top box for their TV. The boxen each receive a block to attack (this can be broadcast simply by continually outputting a stream of if(serial == "12345"){block="0xAABB1234"}). The cost of the attack is borne in small chunks by whatever the market will bear to pay for a set top box, plus each individual paying for the electricity (which is of less value, presumably, than the amortized value of each block being cracked).
Then whoever's box cracks the code displays a message, "You've just won a million dollars. Please call this number to claim it." The government collects the box and gives some sum to the winner. Then they take back most of that sum through taxation(OK that's not part of the Chinese Lottery, but it's part of *every* lottery), and everyone's happy.
Now consider that most of these settop boxen will end up actually having phone connections to the media owners, so that Pay Per View can be implemented. Now the owner of the box doesn't need to be paid off or even informed that his machine was being used to crack some code, and that the cost of that cracking was being borne by his electricity bill. The result of the crack can be sent down the pipe at will.
All they need is the ability to execute code...
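A minimal sketch of the broadcast keysearch idea, with every name and the stand-in cipher being hypothetical (a real attack would target an actual cipher like DES):

```python
# Toy sketch of the Chinese Lottery keysearch. Each box derives its
# keyspace slice from its serial number and tests keys against a
# known plaintext/ciphertext pair broadcast to everyone.

def encrypt(key, plaintext):
    # Stand-in 16-bit "cipher" for illustration only -- NOT a real cipher.
    return (plaintext ^ key) & 0xFFFF

def search_slice(serial, slice_size, plaintext, ciphertext):
    start = serial * slice_size
    for key in range(start, start + slice_size):
        if encrypt(key, plaintext) == ciphertext:
            return key          # this box displays "You've just won..."
    return None                 # some other box holds the winning slice

target = encrypt(0x1234, 0xBEEF)
# The box with serial 0x12 owns keys 0x1200..0x12FF and finds the winner:
print(hex(search_slice(0x12, 0x100, 0xBEEF, target)))  # 0x1234
```

The work is embarrassingly parallel, which is why a million weak set top boxes can rival a supercomputer.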
Yours Truly,
Dan Kaminsky
DoxPara Research
http://www.doxpara.com
Re:Progressive scan NTSC? (Score:2)
--
Re:Read the article (Score:1)
I am under the impression that the filming is done at a higher rez (1900 some lines), but will be downsampled to 1280 lines. The same article (in Home Theater) said that while each individual frame is capable of yielding higher resolution than that, there is a "shutter gate error" or something like that that reduces the effective location and resolution accuracy of film.
"The animal is inside-out."
*boom!*
"And it exploded!"
- Galaxy Quest -
Where film is going (based on reality) (Score:2)
That's been one of the ongoing topics at the Seattle International Film Fest, which wraps up this weekend. Most of the directors have said they're going to digital, partially because you can film for under $10,000 instead of $250,000.
So expect this to be the "new story" next year, when all these films start coming out of the can (ok, I guess that's outmoded too).
And let's start seeing the wide format more in use. Most of our world is lived more horizontally spread-out than vertically spread-out, New York and Hong Kong being the notable exceptions.
Also a discernable trend, as more people start filming for their own countries, partially due to the drop in cost factor. So, we should expect to see way more films in wide format.
A humorous anecdote - at one point in the film fest a whole bunch of us had to tell the theatre manager (a nice woman whom we like) that the aspect ratios were way off, when they persisted in showing a French film (Girl on the Bridge) at the Cinerama with the wrong settings. Of course, this was just after they had completed showing How The West Was Won using the Cinerama tri-screen, so they had to totally rework the whole theatre. It took us five minutes to explain why those of us in the audience knew how the film should be prepped.
Re:The Chinese Lottery (Score:1)
Neither the Canadian national lottery nor the local one here in British Columbia is taxed. Different approaches to paperwork, I guess.
What it all really means (second look) (Score:2)
The studio production standards we've cited are not to be confused with the broadcast standards listed below.
Actually, the 1920x1080 24 fps progressive format is exactly the same format that is being used for Star Wars Episode II. That's right, it's being shot in the exact same resolution that is supposedly going to become the broadcast standard for TV. Why would studios be doing that? If they use the same resolution for movies as is going to be used for TV, why would people go to theaters anymore? It doesn't make sense.
Because film is going digital. All the directors at the Seattle International Film Fest, which is much more of a director's fest than Cannes will ever be, have said they're going digital for at least half their upcoming films.
And the argument about theaters was used during the release of color TV. Theater attendance is way up since then. If I'm watching Gladiator, I want to see it at the Cinerama, not at home. Sure, there are a lot of made-for-TV movies that aren't any better on the big screen, but try telling me that The Matrix is the same on HDTV as the big screen and I'll ROFL immediately.
Unless of course, they have no intention of ever giving us the HD resolution at home. And since all the major TV networks are now owned by movie studios, they can do that.
Actually, they have no choice, the legislation requires them to broadcast in HDTV by 2003 (or was it 2004, can't recall).
So, here's what's going to happen. HDTV resolution will be used for theaters, which will be a step down in picture quality, and then regular resolution digital TV will be used for the home, and the extra bandwidth from the digital broadcasting licenses that were given away by the U.S. Congress will be used for a bunch of stupid "interactive" content. Isn't that nice?
No, you'll see HDTV resolution at the mini-malls most likely, and have to go to the city to see higher res versions on a "true" big screen, where sound and experience will become more of the big picture.
Now if they could just issue candy in non-plastic wrappers to avoid the crinkling sound, I might start buying theatre candy again.
Re:A-men brother, Off-topic slightly (Score:1)
(This was in Minneapolis -- maybe it was a thing from another area that got imported here..)
Re:What it all really means (second look) (Score:2)
Re: Voxing Boxes (Score:1)
Bawks? What kind of strangulated phonetics are these?
Boxes is
Re:Another thing you won't get (Score:2)
Re:It's about data standards for interactivity (Score:2)
DTV and the laziness of America (Score:2)
United States DTV Standards Technical Details (Score:4)
More information on the DTV Standards [cybercollege.com] is available.
CmdrTaco: The first link to EETimes is broken.
Why (H)DTV will suck... (Score:4)
They're after an encrypted signal path from the studio to the screen. This is the single biggest hangup preventing (H)DTV from hitting the market, and is the reason why cable systems aren't carrying HD signals.
Forget video rental, it's pay-per-view only w/ (H)DTV.
The new "interactive features" will be used to track your viewing habits with alarming granularity.
Fuck DTV. Fuck TV.
-Isaac
Re: (Score:1)
A different world (Score:2)
This is no use unless it can be received! (Score:1)
Other countries? (Score:2)
how do you upgrade such a system? (Score:1)
i think the answer is that we simply don't know and that there will always be new features needed. so how do you upgrade consumer devices in the field? it has to be as easy as clicking a button to upgrade and having the change done immediately, transparently, and free. anything less will be unacceptable to consumers and then they simply won't upgrade. imagine if all web pages were still written using HTML 1.1.
joshy
IMHO (Score:5)
Frame rate should be calculated the same way. 25f for PAL, 30f for NTSC. Therefore, the "ideal" frame rate would be 150f. Yes, this is a lot higher than existing systems go, but it makes transferring existing NTSC or PAL footage a doddle. You don't lose any information, and you don't need to do any interpolation.
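The arithmetic behind that 150f figure is just the least common multiple of the two rates; a quick check in Python:

```python
from math import gcd

def lcm(a, b):
    # Least common multiple: the smallest rate both inputs divide evenly.
    return a * b // gcd(a, b)

# A master frame rate that both PAL (25f) and NTSC (30f) divide into:
master = lcm(25, 30)
print(master)  # 150 -- take every 6th frame for PAL, every 5th for NTSC
```

Since 150 is divisible by both 25 and 30, converting existing footage needs only frame selection or duplication, never interpolation.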
Such a resolution and frame-rate should also reduce the horrible artifacts produced by MPEG. (MPEG cannot handle low light well. It looks like a JPEG reduced to a 16 colour palette.) The only way to remove artifacts is to INcrease the resolution & speed, not decrease it. If you decrease it, the artifacts are still there, they're just not so "visible". Which means exactly nothing, since you don't watch a single frame, but a whole movie.
DTV (Score:3)
Since no one has a TV capable of showing a 1920x1080 picture, or even half of that, I don't think that the lack of a high resolution DTV signal is that bad. That capability can be added on later, when there are more satellites providing more bandwidth.
Of course, HDTV looks amazing, so when it becomes affordable it could be an interesting option to purchase. Unfortunately, there is nothing on TV to actually watch, so I don't currently have a TV and I don't intend to get one.
Last thing I want is to have to have a box with "Microsoft Media Player" logos on it. Looks like that is the way America wants to go, with MS-MPEG4, etc. A TV is meant to perform a function - show TV. In the UK, DigitalTV also has games, email, shopping etc as well, using Sky TV's Open platform. You just can't browse the web using the box, but the next generation of DTV boxes in Europe will be able to do that as well. Hopefully, Mozilla with its extreme customisability will be chosen for these boxes ahead of PocketIE or whatever.
Next: People to incorporate ReplayTV and DVD players into the DigitalTV's themselves.
Pay Attention to Copy Protection (Score:5)
There's a lot of behind-the-scenes dealmaking going on to try and insinuate copy protection methods into DTV signals. They're trying to prevent people from recording programming off the air (since that would prevent you from buying the home video version of whatever it is).
Pay very close attention to this. Look for copy protection systems in the upcoming standards, and call attention to them. It's very important that it not come to pass.
Schwab
When all you have is a hammer... (Score:3)
Reading this article, I am reminded of a paragraph appearing in the 2 June 2000 issue of Science (Vol 288, p. 1597).
Interactive television strikes me as the same sort of revolution. Everyone's perspective is limited by what they know. TV will look totally different in 20 years from what the standards committees expect.
Re:Why (H)DTV will suck... (Score:1)
Why it probably won't (Score:3)
That's funny. Then why do I find plenty of examples in nearly every medium of people willing to give me content for free and sell advertising space to sponsors. I find this on the radio, TV, the Web, some smaller newspapers, checkout aisle swapsheets and low budget newsprint computer magazines. And people who buy the latest home entertainment technology are a really enticing market niche to be able to sell advertising time in front of. They have disposable income and are willing to part with it for new gadgets. They are likely to be younger and novaphiles, so they may become repeat customers.
There will be plenty of pay-per-view content, but to build the market there will be content available for free or at a flat subscription rate like cable. And that content will be lucrative to provide, so it will continue.
Re:Why (H)DTV will suck... (Score:2)
This reminds me of that Simpsons episode where Homer doesn't allow Bart to go see that Itchy & Scratchy movie. Homer tells Bart to watch TV instead, and Bart, who is upset at Homer for not letting him go, says something like "TV Sucks".
Homer got really upset and said (in a growling voice) "Bart, I know you're upset, but don't say anything you don't mean" or something like that...great episode!
Another thing you won't get (Score:2)
Somehow I don't think the content providers would be fond of this idea, despite it being the logical next step for content delivery services.
Re: holding back to film standards (Score:3)
Digital video is where it's going, and bloody fast. Set the standard at 48 or 72fps, and use every second or third frame if cutting back to film (or double or triple the frames if going from film to television).
And let's start seeing the wide format more in use. Most of our world is lived more horizontally spread-out than vertically spread-out, New York and Hong Kong being the notable exceptions.
Though, of course, all this is moot when considering broadcast television. That shite ain't worth watching no-how. I just want my DVDs to look good...
--
No, not even that much. (Score:2)
If you want to see a show, go to the show's website, and the magical little box on top of your TV will access the appropriate channel for your area. Then the magic box picks up timestamp/synchronization info from the VBI for the SMIL metadocument to synch to.
I wouldn't be surprised if this could be done today with entirely existing technologies, appropriate software and a TV tuner card.
Re:United States DTV Standards Technical Details (Score:1)
I vote for Java, or more specifically "ABM" (Anything But Microsoft).
I love you too, man... (Score:1)
Re:Read the article (Score:2)
I noticed that Industrial Light and Magic is not one of the listed installations for the new Arri scanner, even though lots of other major special effects houses are using it, including Digital Domain (co-founded by James Cameron and used for movies like "Titanic"). Apparently George Lucas is among those people who think that 2K resolution is "enough". Can't he tell the difference? Is he blind? I, for one, thought that "Star Wars Episode I" looked blurry, and that's because pretty much the entire movie was scanned, processed, and printed at 2K resolution, which is just not good enough to match film resolution. Give most people a side-by-side comparison, and they'll agree that 2K digital just isn't as good as film.
Re: holding back to film standards (Score:1)
Though I still want my DVDs to look good.
--
Re:United States DTV Standards Technical Details (Score:1)
John Carmack
Re:It's about data standards for interactivity (Score:1)
In any case you are right about the manufacturers - it will cost too much and kill the entire HD format to convert. I'm sure that Sinclair would love this.
The problem with the ATVEF is the IP aspect - everyone wants a closed IP format of "their" own, so they can rake in the $. In this case, the format MUST be totally open, or it will die.
Re:Nice idea, but wrong (Score:1)
hard on the raid storage though
Re:Another thing you won't get (Score:2)
Even when we have the bandwidth, we still won't be able to do it. The content providers will see to that.
Re:Read the article (Score:1)
Of course, film is a visual art that may benefit esthetically from 4K, while reading an x-ray is an art that is accustomed to working around a somewhat fuzzy image, but very little film (except the Zapruder film or a UFO/bigfoot sighting) gets the kind of up-close per-frame scrutiny that an x-ray gets. And an X-ray print is hundreds of times the size of a 35mm frame (digital X-rays are usually displayed on at least high-res 21" monitors).
Re:Read the article (Score:1)
http://www.ise.imagica.co.jp/scanner/IDS/produc
Re:Frankly, my dear... (Score:1)
These comments, such as "it looks so real" - come on, people, just go outside where it IS real.
Can't you see how bad TVs have become? A whole bunch of people, government and big business, are trying to ensure that everybody goes to work and comes back home to watch TV, partly so they can keep you "safe" off the streets, where supposedly so much crime is happening every day, everywhere, that you feel lucky just to get home.
Fight the TV. It's too late for us, but do it for our children's sake.
Re:Pay Attention to Copy Protection (Score:1)
Re:The Chinese Lottery (Score:2)
That message would set off my scam alarm. To be sure of getting a report, it would need to be built into the boxen.
/.
Check out the Quote from Microsoft... (Score:1)
-
Honestly... (Score:3)
We already have the interactive technologies in development -- the web stuff, SMIL, and so on, and we certainly have the TV infrastructure already in place.
Why not just develop close synchronization between conventionally-delivered web and conventionally-delivered TV content in some fashion, instead of deciding to cram everything conceivable down the broadcast channel?
The main thing it'd require would be the maturation of SMIL and a common schema/domain for "TV" URNs, and a little bit of hardware to pick up on synchronization info in the vertical blanking interval[1], but that's about it.
That'd leave you three options[2]:
1. watch TV on your TV while interacting with the synchronized web site on your PC
2. have the TV broadcast embedded in the synchronized website itself (courtesy of your TV card)
3. a WebTV-like set-top box that handles the web side of things, either doing as for #2, or compositing the website with the TV image in some other fashion
The technology for this kind of approach is here today, and it doesn't require pie-in-the-sky home bandwidth capabilities, nor does it require significant changes to the existing broadcast infrastructure.
I wonder why everyone seems so anxious to push their own proprietary solution, instead? I mean, Open Standards allow users control over their experience of the content, and that's a good thing, right?
---
[1] Maybe even current TV tuner cards could be pressed into service, depending on exactly what they offer; I don't know enough about them to say.
[2] Maybe more than that; these are the three I can think of off the top of my head
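A purely hypothetical sketch of the sync loop behind options 1-3 (nothing here is a real API; the timeline class stands in for whatever a SMIL player would expose):

```python
# Hypothetical set-top/PC sync loop: read a timestamp out of the
# vertical blanking interval and keep a SMIL-style presentation
# timeline locked to the broadcast clock.

class Timeline:
    """Stand-in for a SMIL presentation clock (assumed interface)."""
    def __init__(self):
        self.t = 0.0
    def now(self):
        return self.t
    def seek(self, t):
        self.t = t

def sync_presentation(read_vbi_timestamp, timeline, tolerance=0.5):
    # If the web-side presentation drifts more than `tolerance`
    # seconds from the broadcast clock, snap it back into sync.
    drift = read_vbi_timestamp() - timeline.now()
    if abs(drift) > tolerance:
        timeline.seek(timeline.now() + drift)
    return drift

tl = Timeline()
print(sync_presentation(lambda: 12.3, tl))  # 12.3 -- and tl is resynced
```

The broadcast side only has to stamp the VBI; all the interactive content still travels over the ordinary web.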
Is this necessary? (Score:1)
Besides, there's nothing interactive about staring at a screen, no matter what's on it, no matter whether you've chosen it or not. You don't interact with a TV show or a movie. You watch it.
What's really important is this: when they're pushing ads at me over the damn DTV line, do I get to turn them off? I seem to recall that Winston Smith couldn't turn his screen off at all. Is Big Brother going to be watching us?
Will we care?
Re:IMHO (Score:1)
Stefan.
It's about data standards for interactivity (Score:4)
Basically it boils down to two camps. One camp wants to invent new languages/scripting to be used for television. The other camp wants to incorporate and amplify IETF standards and utilize them.
I'm for the HTML based plan like ATVEF, since it has all the advantages of an established standard. Plenty of creation tools, few intellectual property issues, and plenty of people that know how to code it.
Those proposing new display engines (usually based on Java) point to the limitations of HTML and XML as unsuitable for use on television. The IP questions about using Java with a 'clean room' version are mind blowing. No one wants to touch this issue except the electronics manufacturers, since all they do is buy a chip and don't deal with any other licensing issues.
The real issue is what will the content providers choose to support. No content provider wants to code several versions of an enhancement to play on differing systems. They want one unified standard to code to all the time. I think most of the content providers have lined up with the ATVEF plans since they allow simple reuse of web pages and tools to make enhanced television.
Contrary to the above posts, the debate isn't about scan rate and format any more, it's about data enhancement.
In the US there is debate that the approved VSB transmission method is far worse than the COFDM method used in Europe. I can tell you as a fact that COFDM is a better method than VSB, but the consumer electronics manufacturers will never allow a switch at this point.
Actually, this is a box issue not a broadcast one (Score:1)
Intel used to have a product called Intercast that did this and listen for triggers to pre-cache and pull up web data on command.
Progressive scan NTSC? (Score:2)
I just bought a Hollywood Plus MPEG decoder card, and did a little hacking to get it to work with NTSC (just had to change a few register values, nothing fancy). Reading over the documentation for the NTSC/PAL encoder, I see that interlacing can be turned on and off. Doing a search on the 'net reveals that people are talking about progressive scan NTSC, but I can't seem to find any information about sets that actually understand it. I've twiddled the bits on my H+, but my TV still displays in interlaced mode (not that I really expected it to suddenly do progressive scan...)
Anyone have more info?
--
Ski-U-Mah!
Stop the MPAA [opendvd.org]
Re:Pay Attention to Copy Protection (Score:1)
audio: we have audio inputs at 44khz available on virtually every computer on the market now. Many have digital inputs.
video: We have monitors and modems. Some have faster digital connections.
text: Duh.
In short, every single scheme is vulnerable to decryption. If the companies succeed in encrypting the over-the-air transmissions, I will simply crack open my television, physically attach a video encoder directly to the CRT, and then oversample it at a 4:1 ratio.
Dear MPAA: F*ck your intellectual property.
Microsoft's idea of a standard (Score:2)
Well, there you go. Microsoft must teach this meaning of "standard" to their people during orientation or something. It would sure explain a lot.
---
Please no VBScript. (Score:1)
Please god no VBScript on my HDTV. All I need is another I LOVE YOU.vbs virus.
-Brook Harty
Have you WAP'ed today? http://www.attws.com/personal/explore/pocketnet/index.hdml [attws.com]
Won't get modded down: 2 reasons (Score:1)
didn't make first (1st) 20 posts
MOSTIMPORTANT:
you told them to mod it down, which creates an internal dilemma for modders since they want to mod all such comments UP! But you didn't have anything useful or redeeming or funny at ALL, so they want to mod it DOWN, but this CONflicts, so they all just ignore it, like I did. I didn't
Oh it did get the mod.
My bad.
Mod this down too (2).
Please to be here, zeusjr!
Re:DTV (Score:1)
It should be around sometime in the next year or so.
Re:IMHO (Score:1)
Re:Read the article (Score:1)
Still, I bet that even though the machine is capable of 4K scans, most people don't bother to do them. Everyone seems stuck on 2K at the moment.
Re:Read the article (Score:1)
Re:Read the article (Score:1)
Film is at 24fps, and U.S. standard video is at 30fps (NTSC, the broadcast standard for North America, divides each frame into two interlaced fields, giving 60 fields per second, but still only 30 distinct frames per second). When film is transferred to video, the frames are lined up with a "pulldown" so that the frames from film are spread evenly across the slightly larger number of video frames over the same time period. Note that there are some systems in the works, such as MaxiVision [mv48.com], that double the frame rate of film to 48fps.
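The standard 3:2 pulldown can be sketched in a few lines: each film frame is held for alternately three and two video fields, so one second of film (24 frames) fills exactly one second of NTSC (60 fields).

```python
def three_two_pulldown(film_frames):
    # Hold each film frame for 3 fields, then the next for 2, and so
    # on, spreading 24 film frames evenly across 60 NTSC fields.
    fields = []
    for i, frame in enumerate(film_frames):
        hold = 3 if i % 2 == 0 else 2
        fields.extend([frame] * hold)
    return fields

fields = three_two_pulldown(list(range(24)))
print(len(fields))  # 60 fields == 30 interlaced NTSC frames
```

The arithmetic works out because 12 frames held for 3 fields plus 12 frames held for 2 fields gives 36 + 24 = 60.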
That said, it is more than just the frame rate that gives film and video different looks. Film samples light in a way that more closely matches the human eye. Imagine two side-by-side cameras shooting the same scene with the same lighting, one film and the other video. Now imagine that the film's frame rate had been adjusted to match video. Would the resulting images then look the same? No, because the two media do not respond to light the same way. It probably will be possible sometime in the future to make a CCD or other electronic light-gathering device that works in a way more similar to film (and thus more similar to the human eye), but for right now, it's still a problem.
Actually, you say first that people think video looks more real, but then go on to say that film is more similar to the way we see things--that doesn't really make sense...
Okay, I should have been clearer about that. While I personally believe that film looks the most natural and real and that video looks less natural, when HD video was first being bandied about and shown to various people, some voiced complaints about the look of the image, saying that it looked "too real". I think by saying that, those people unwittingly gave advocates of HD video some ammunition to use in promoting it. What I was trying to do in my previous comment was interpret what those people were trying to say when they complained about video looking "too real". I think what they meant was that HD video still looks too much like the evening news (which is one of the few things on TV that is shot on video; most other things are shot on film and then transferred to video). Since we're all used to associating the video look with the news and other reality based TV, anything that has that same video look evokes a reaction and triggers that association. I would imagine that whoever actually said that (I have no idea who or when this supposedly happened) was trying to explain that the video look was too much like TV news and not enough like theatrical film, but they kind of stumbled on their words and out came the unfortunate comment of "too real". I was merely trying to explain where I thought the comment came from. I was not agreeing with it.
I do not think that video looks more real than film. I actually think it looks less real. And the thing is, I'm not just making a subjective judgement here; there is an objective element to my criticism. Film captures light in a way that is much more like the human eye than video. (See this article [cinematographyworld.com] for some comments from a Kodak employee.) I think it's telling that advocates of video rely on subjective arguments (such as the oft-repeated comment about video looking "too real"), while advocates of film rely on objective arguments about the technical capabilities of film. It begs the question: if advocates of video really are pushing something that is "better", why can't they come up with any objective arguments to support their case? Why do they have to rely on subjective opinions? (Hint: maybe because it really isn't better!)
Cool! Is this my first /. death threat? (Score:1)
Read the article (Score:5)
That said, there are a couple of incorrect statements in your post (actually, in the quote you have from the "CyberCollege" site). The first is this one:
First off, no matter what anyone tells you, the resolution of HDTV is not the same as film. Film is capable of resolving up to 4,000 lines, not 2,000. People see this all the time when scanning film in for the purpose of creating special effects. The current standard for scanning film in post-production is 2,000 lines ("2K" resolution). However, there is still scannable information left on the film that is not picked up at that resolution. 4,000 lines ("4K" resolution) would be the proper resolution for scanning film. 4K is what is needed to really replace 35mm film without any loss, not 2K resolution.
If you look closely next time you're at a movie theater, you can see that special effects shots look blurry compared to the rest of the movie. That's because of the loss in resolution due to scanning at 2K. Hopefully 4K will become the standard for doing special effects, but I'm not going to hold my breath. Most people think that 2K is "enough", even though it really isn't.
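To put rough numbers on the resolution argument: a quick pixel-count comparison. (The scan dimensions below are the commonly cited full-aperture sizes, used here as an assumption for illustration; exact figures vary by scanner and film gate.)

```python
# Rough pixel-count comparison of common film-scan and broadcast formats.
# The scan dimensions are typical full-aperture sizes (an assumption for
# illustration; exact numbers vary by scanner and gate).
formats = {
    "2K film scan": (2048, 1556),
    "4K film scan": (4096, 3112),
    "HDTV 1080":    (1920, 1080),
}

for name, (w, h) in formats.items():
    print(f"{name:14s} {w}x{h} = {w * h / 1e6:5.1f} Mpixels")

# Doubling the line count quadruples the pixel count, so a 4K scan
# carries four times the data of a 2K scan of the same frame -- which
# is why detail visibly remains on the negative after a 2K pass.
```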
As for HDTV looking "too sharp and clear", this is nonsense. No one ever says that. The common complaint is that HDTV looks "too real" compared to film. And what people mean when they say this is that it looks too much like the evening news. Film is used for everything you see on TV these days except for the news and other "reality" programming, and so when people mean to say that something has too much of the video "look", they end up saying that it looks "too real".
The reason film and video have different looks is that they respond to light differently. Film samples light in a logarithmic curve, which is very similar to the way the human eye sees light. That is why film looks so natural. Video cameras, on the other hand, sample light in a linear fashion, which is what gives it that particular look. The fact is, if you want an image that looks the most natural, film is still the way to go.
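The log-versus-linear point can be sketched with a toy example. (The 8-bit depth, the black point, and the scene values below are all made up for illustration; this is not any real camera's or film stock's transfer curve.) At the same bit depth, a logarithmic curve spends far more code values on shadow tones, which is roughly why film's response is said to be closer to the eye's.

```python
import math

# Toy illustration of why logarithmic encoding (film-like) separates
# shadow tones better than linear encoding (video-like) at the same
# bit depth. All constants here are assumptions for illustration.
BITS = 8
LEVELS = 2 ** BITS

def encode_linear(x):
    """Map scene luminance (0..1) straight to integer code values."""
    return round(x * (LEVELS - 1))

def encode_log(x, black=0.001):
    """Map scene luminance onto a log curve, loosely how film density
    (and perceived brightness) responds to light."""
    lo, hi = math.log10(black), math.log10(1.0)
    t = (math.log10(max(x, black)) - lo) / (hi - lo)
    return round(t * (LEVELS - 1))

# Two nearby shadow tones: 1% and 2% scene luminance.
a, b = 0.01, 0.02
print("linear codes:", encode_linear(a), encode_linear(b))  # only a couple of levels apart
print("log codes:   ", encode_log(a), encode_log(b))        # dozens of levels apart
```

With linear coding the two shadow tones land almost on top of each other, so subtle shadow gradations band or clip; the log curve keeps them well separated.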
The second incorrect statement is this one:
Actually, the 1920x1080 24 fps progressive format is exactly the same format that is being used for Star Wars Episode II. That's right, it's being shot in the exact same resolution that is supposedly going to become the broadcast standard for TV. Why would studios be doing that? If they use the same resolution for movies as is going to be used for TV, why would people go to theaters anymore? It doesn't make sense.
Unless of course, they have no intention of ever giving us the HD resolution at home. And since all the major TV networks are now owned by movie studios, they can do that.
So, here's what's going to happen. HDTV resolution will be used for theaters, which will be a step down in picture quality, and then regular-resolution digital TV will be used for the home, and the extra bandwidth from the digital broadcasting licenses that were given away by the U.S. Congress will be used for a bunch of stupid "interactive" content. Isn't that nice?
It's all about 1080 24P, interlace is DOA (Score:2)
You will see television broadcast move to 720 60P in the next decade. Why? Because the coming large plasma displays and projection sets are all natively progressive. Watching interlaced content on a progressive screen is like listening to 8-track tapes on a high-end stereo system.
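The mismatch above can be sketched with a toy example (the four-line "fields" are made-up data): a progressive display must either weave two time-separated interlaced fields into one frame, which produces comb artifacts on motion, or line-double a single field, which halves vertical resolution.

```python
# Minimal sketch of why interlaced material is awkward on progressive
# displays. An interlaced "frame" is two fields captured 1/60 s apart;
# a progressive panel has to reconstruct full frames from them.
# The tiny 4-line "image" is made-up illustration data.

field_even = ["AAAA", "AAAA"]  # lines 0, 2 - captured at time t
field_odd  = ["BBBB", "BBBB"]  # lines 1, 3 - captured at time t + 1/60 s

def weave(even, odd):
    """Interleave two fields into one frame. Fine for static scenes,
    but motion between fields shows up as comb artifacts."""
    frame = []
    for e, o in zip(even, odd):
        frame.extend([e, o])
    return frame

def bob(field):
    """Line-double one field into a full frame. No combing, but the
    vertical resolution is halved."""
    frame = []
    for line in field:
        frame.extend([line, line])
    return frame

print(weave(field_even, field_odd))  # ['AAAA', 'BBBB', 'AAAA', 'BBBB']
print(bob(field_even))               # ['AAAA', 'AAAA', 'AAAA', 'AAAA']
```

Native progressive capture (24P or 60P) sidesteps the whole problem, which is the point the comment above is making.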
You are right, COFDM is much better (Score:1)
COFDM was not only receivable indoors; there were MOBILE, HANDHELD receivers that picked up clean transmissions.
To Hell With Them All (Score:2)