Kmart Drops Blu-Ray Players
Lord Byron II writes "K-mart has decided to stop selling Blu-Ray players in their stores, primarily because of the high cost of Blu-Ray compared to HD-DVD (now under $200). They will continue to sell the PS3 for the time being. Will lower prices speed the adoption of HD-DVD in the upcoming holiday shopping season?"
Re:No. (Score:5, Informative)
The backers of HD-DVD are being far more intelligent from a marketing standpoint than Sony+Blu-Ray: cheaper players, combo discs (standard DVD + HD-DVD in the same package), and better penetration into the markets that actually matter (Wal-Mart, for example).
WalMart has Toshiba HD A2 for $98.87 Nov 2nd (Score:5, Informative)
http://holiday.ri-walmart.com/?u1=433093-2-0-ARTICLE-0&section=secret&utm_source=Walmartcom [ri-walmart.com]
I believe they may include the 5 free HD DVDs deal, which alone is worth $100.
I'd say that is breaking the price barrier holding back acceptance!!
(I know I'm buying two, one for us, and one for my in-laws for Christmas)
Re:Kmart owned by Sears (Score:2, Informative)
Re:Is something better coming along? (Score:3, Informative)
Actually... (Score:5, Informative)
Toshiba HD-A2 HD DVD player: $100, this Friday, Wal-Mart
http://www.engadget.com/2007/11/01/toshiba-hd-a2-hd-dvd-player-100-this-friday-wal-mart/ [engadget.com]
Best Buy offers Toshiba HD-A2 for $100
http://www.engadget.com/2007/11/01/best-buy-offers-the-toshiba-hd-a2-for-100-too-and-other-hd-dv/ [engadget.com]
Re:Is something better coming along? (Score:1, Informative)
Re:Is something better coming along? (Score:3, Informative)
The biggest cause of undesirable blur is the 24fps shooting speed of movies. The new digital projection standard includes 2K at 48fps, and 4K at 24fps. I'm really hoping Hollywood saves the movie theaters from home cinema by embracing 2K at 48fps. The experience should be really impressively clear. Not as clear as 4K at 48, but clear.
Re:No. (Score:3, Informative)
Re:HD DVD = Beta max but inferior to competition (Score:1, Informative)
Where's the source? (Score:3, Informative)
Re: No Blue Light special on Blue Ray (Score:5, Informative)
I'm really getting tired of people who don't know what they're talking about making a big issue of 1080i vs. 1080p when it comes to a source device. Obviously, 1080i and 1080p are very different when it comes to a display. However, any 1080p display worth its purchase price is going to be able to convert from 1080i to 1080p effectively losslessly. From Wikipedia [wikipedia.org]: "Due to interlacing, 1080i has twice the frame-rate but half the resolution of a 1080p signal using the same bandwidth." In short, a 1080i signal and a 1080p signal contain the same data, just formatted differently. To go from 1080i to 1080p (this is simplified and doesn't account for various framerate differences), you take every two 1080i fields (540 lines each), weave them together, and you have a 1080p frame.
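The "weave" step the parent describes can be sketched in a few lines of Python. This is a toy model, not anything from the thread: scanlines are just list entries, and real video would of course use pixel arrays.

```python
# Toy sketch of weave deinterlacing: combine two 1080i fields
# (540 lines each in the real format) into one progressive frame
# by interleaving their scanlines.

def weave(top_field, bottom_field):
    """Interleave a top (even-line) field and a bottom (odd-line) field."""
    assert len(top_field) == len(bottom_field)
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)     # lines 0, 2, 4, ...
        frame.append(bottom_line)  # lines 1, 3, 5, ...
    return frame

# 4-line "frame" built from two 2-line fields:
top = ["line0", "line2"]
bottom = ["line1", "line3"]
print(weave(top, bottom))  # ['line0', 'line1', 'line2', 'line3']
```

As the parent notes, this only reconstructs the original frame cleanly when the two fields came from the same moment in time; the framerate caveats are where the replies below pick up.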
Re:Video On Demand Makes BluRay/HD-DVD Irrelevant (Score:3, Informative)
Re: No Blue Light special on Blue Ray (Score:3, Informative)
What frame rate are you assuming the 1080p content is in? Standard formats have only one frame rate for 1080i (30 frames/sec, 60 fields/sec, plus the 1000/1001 ratio rates) but have 3 choices for 1080p (24, 30, and 60, plus the 1000/1001 ratio rates). For content originating in 24 fps motion picture film, or its digital equivalent, encoding it as 24 fps onto the disc is best.
If you are converting 1080i30/60 to 1080p60, that works fine. But the source material may not be in that format; it might be 1080p24. Upconverting that to 1080i30/60 adds the motion-judder artifact. That would be easy to avoid if the upconversion were to a progressive frame rate, but fixing it after interlacing is next to impossible (the weaving-together method doesn't fix judder).
What we really need is a player that either leaves the content unconverted (e.g. sends it as 1080p24 to the display) or upconverts to a multiple of 24 fps (1080p48 or 1080p72 ... non-standard, unfortunately). More likely we'll see upconversion to 1080p60 from players in the future, and then TVs will have to have "judder correction" or "3:2 film correction". But it would be better to just pass the video from the player to the display in the 24 fps format. An LCD display can simply update pixels at that rate and you won't see flicker anyway; other display technologies would have to engage their conversion circuits.
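The 3:2 cadence that "judder correction" has to undo can be sketched as follows. This is illustrative toy code (not from the thread): each 24 fps film frame alternately yields three fields and then two, which is how 24 frames become 60 fields per second and why motion stutters.

```python
# Toy sketch of 3:2 pulldown: map 24 fps film frames onto 60
# interlaced fields per second. Frames alternately contribute
# 3 fields and 2 fields, producing the uneven "judder" cadence.

def pulldown_32(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2  # 3, 2, 3, 2, ...
        fields.extend([frame] * repeats)
    return fields

# 4 film frames -> 10 fields, i.e. 24 frames -> 60 fields per second:
print(pulldown_32(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

Sending 1080p24 straight to the display (or multiplying to 48/72) keeps every frame on screen for the same duration, which is exactly what this cadence fails to do.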
Motion Blur, not compression! (Score:3, Informative)
Re: No Blue Light special on Blue Ray (Score:3, Informative)
It would appear, sir, that it is you who does not understand the issues here.
1080i means the signal is interlaced. What is interlacing? Put briefly: back in the 1930s, you simply could not transmit much data to a television. You were very limited in what you could transmit reliably given the transmitters, receivers, and noisy equipment of the day. In modern language, we might say that bandwidth was very limited for television.
Like all forms of moving pictures, television requires a fairly high framerate to give the illusion of continuously moving images from what is just a sequence of still frames. But because of the restricted bandwidth, more frames per second means each frame must have less resolution. So the 1930s engineers were seemingly at an impasse.
Enter interlacing. Instead of transmitting ~25 full frames every second, you transmit ~50 half-frames (fields) every second: on one pass you draw the odd-numbered lines of pixels, on the next you draw the even lines, and so on. Because CRT televisions used glowing phosphors with a fade-out time, the two fields would meld into one without the viewer noticing. It was a good solution given the technology of the day, and it served the industry well for many years.
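The odd/even split described above is easy to picture in code. A toy sketch (not from the thread; scanlines are just list entries):

```python
# Toy sketch of interlacing: split one progressive frame into two
# fields. Even-numbered lines go in one field, odd-numbered lines
# in the other; each field is half the height of the full frame.

def split_into_fields(frame):
    even_field = frame[0::2]  # lines 0, 2, 4, ...
    odd_field = frame[1::2]   # lines 1, 3, 5, ...
    return even_field, odd_field

frame = ["line0", "line1", "line2", "line3"]
print(split_into_fields(frame))
# (['line0', 'line2'], ['line1', 'line3'])
```

Each field is transmitted on its own, and the receiver relies on the display (phosphor persistence then, deinterlacing circuits now) to fuse them back into something frame-like.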
So 1080i signals are inherently of a much, much lower quality than either 1080p signals, or even 720p signals. This is because they transmit half frames, and try as you might you're never, ever going to be able to mesh those frames into one another seamlessly. 1080i is already a lossy signal, so saying that it converts "losslessly" to 1080p is equivalent to saying that a 320x240 signal can be scaled "losslessly" to a 640x480 signal. It's true, but you're avoiding the main issue.
Yes, given the same bandwidth, a 1080i signal can transmit just as much data as a 1080p signal. So can any signal for that matter, regardless of format. But the reality is, 99.999% of 1080i signals will be transmitting at the same framerate as their 1080p equivalents, i.e. the 1080i signal will be transmitting less data and hence will be a lower quality one. Even if it transmits the same data, the signal will still have been put through an interlacing shredder, and will not be worth the money you're paying for it.
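The "less data at the same rate" point is quick back-of-the-envelope arithmetic. A toy check (my numbers, not from the thread), comparing 1080i at 60 fields per second against 1080p at 60 full frames per second:

```python
# Raw pixel-rate arithmetic: at the same 60 Hz rate, 1080i carries
# half-height fields (540 lines) while 1080p carries full frames,
# so 1080i moves half the raw pixels per second.

width, height = 1920, 1080

pixels_1080i60 = width * (height // 2) * 60  # 60 fields of 540 lines
pixels_1080p60 = width * height * 60         # 60 full 1080-line frames

print(pixels_1080i60)  # 62208000 pixels per second
print(pixels_1080p60)  # 124416000 pixels per second, i.e. double
```

Which is the parent's point in numbers: matched rate for rate, the interlaced signal simply contains half the raw picture data.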
We're now in the year 2007. Simply put, bandwidth costs next to nothing. On top of that, our newer televisions don't use CRTs anymore, meaning that interlacing tends to show up quite noticeably, making the picture look awful. So why then do we have 1080i as an HD option?
Hell if I know.
Interlacing was a smart idea in the 1930s. In 2007, with digital framebuffers, LCD TVs, and high-quality cabling, interlacing is simply an embarrassment. 1080i is simply a high-resolution embarrassment.