Movies Media The Almighty Buck Hardware

Kmart Drops Blu-Ray Players

Lord Byron II writes "K-mart has decided to stop selling Blu-Ray players in their stores, primarily because of the high cost of Blu-Ray compared to HD-DVD (now under $200). They will continue to sell the PS3 for the time being. Will lower prices speed the adoption of HD-DVD in the upcoming holiday shopping season?"
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Re:No. (Score:5, Informative)

    by Valafar ( 309028 ) on Thursday November 01, 2007 @11:38PM (#21206973)
    WTF are you talking about? There's plenty of "content"; just go to your local super electronics store and see for yourself. Every major studio release of the last 5 or 6 months is coming out on HD-DVD, Blu-Ray, or both. What's more, there's a world of difference in quality if you actually own an HD TV. An upconverting standard DVD player does a good job, but the difference with HD-DVD / Blu-Ray is definitely noticeable.

    The backers of HD-DVD are being far more intelligent from a marketing standpoint than Sony and Blu-Ray: cheaper players, combo discs (standard DVD + HD-DVD in the same package), and better penetration into the markets that actually matter (Wal-Mart, for example).
  • by jbridges ( 70118 ) on Thursday November 01, 2007 @11:40PM (#21206999)
    WalMart has the Toshiba HD A2 for $98.87 as of 8am on November 2nd 2007.

    http://holiday.ri-walmart.com/?u1=433093-2-0-ARTICLE-0&section=secret&utm_source=Walmartcom [ri-walmart.com]

    I believe they may include the 5 free HD DVDs deal, which alone is worth $100.

    I'd say that is breaking the price barrier holding back acceptance!!

    (I know I'm buying two: one for us, and one for my in-laws for Christmas.)
  • by Techogeek ( 1148745 ) on Thursday November 01, 2007 @11:47PM (#21207065)
    It's the other way around: Kmart owns Sears. http://www.msnbc.msn.com/id/6509683/ [msn.com]
  • by hung_himself ( 774451 ) on Friday November 02, 2007 @02:05AM (#21208039)
    He's talking about compression schemes that compress along the time dimension (which essentially all video compression schemes do). Think of it as a JPEG cube with time as the depth dimension. Compression schemes *cheat* in this dimension by not encoding the complete information, taking advantage of the fact that most of the time only small parts of the picture change from frame to frame. So if the picture is static (constant in time), or close to it, very little information is lost this way. You see distortion in places where the picture changes, and it is worst in a pan of a complex scene, because the entire picture changes *and* the eye knows what it *should* be seeing. In those cases what you notice is a stutter or blur, due to delays and distortions in updating the picture: the codec is displaying something based on the average picture within a time period, which in this particular case is a poor approximation. It's most noticeable in highly compressed, low-bitrate formats, but you can see it even in DVDs if you look for it.
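    A toy sketch of that idea (pure conditional replenishment, far cruder than what MPEG-2 or H.264 actually do; the block size, threshold, and function names are invented for illustration): only blocks that changed since the previous frame get re-encoded, which is why a static scene costs almost nothing and a pan of a complex scene touches every block.

```python
# Toy illustration of temporal (inter-frame) compression -- not any real codec.
# Assumes 8-bit grayscale frames as NumPy arrays of identical shape.
import numpy as np

BLOCK = 16         # illustrative macroblock size
THRESHOLD = 4.0    # mean absolute difference below which a block counts as "unchanged"

def encode_frame(prev, curr):
    """Return (y, x, block) tuples for the blocks that changed vs. the previous frame."""
    updates = []
    h, w = curr.shape
    for y in range(0, h, BLOCK):
        for x in range(0, w, BLOCK):
            p = prev[y:y+BLOCK, x:x+BLOCK].astype(np.int16)
            c = curr[y:y+BLOCK, x:x+BLOCK].astype(np.int16)
            if np.abs(c - p).mean() > THRESHOLD:
                updates.append((y, x, curr[y:y+BLOCK, x:x+BLOCK].copy()))
    return updates  # a static scene yields almost no updates; a pan touches every block

def decode_frame(prev, updates):
    """Rebuild the current frame by patching the changed blocks onto the previous one."""
    curr = prev.copy()
    for y, x, block in updates:
        curr[y:y+BLOCK, x:x+BLOCK] = block
    return curr
```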
  • Actually... (Score:5, Informative)

    by shirai ( 42309 ) on Friday November 02, 2007 @02:49AM (#21208241) Homepage
    Though they're specials, both Wal-Mart and Best Buy are offering HD DVD players for $100.

    Toshiba HD-A2 HD DVD player: $100, this Friday, Wal-Mart

    http://www.engadget.com/2007/11/01/toshiba-hd-a2-hd-dvd-player-100-this-friday-wal-mart/ [engadget.com]

    Best Buy offers Toshiba HD-A2 for $100

    http://www.engadget.com/2007/11/01/best-buy-offers-the-toshiba-hd-a2-for-100-too-and-other-hd-dv/ [engadget.com]
  • by Anonymous Coward on Friday November 02, 2007 @02:49AM (#21208245)
    I went shopping at Best Buy for a TV a couple of weeks ago, and this is what the salesperson told me: every screen in the place, whether it was 1080p-capable, 1080i, or 720p, was being fed a canned 720p loop. Because it is pre-processed, the feed looks better than real content will look in practice. Also, their HD vs. SDTV comparisons typically use a standard-definition base signal for "both" TVs, including the HDTV. This is mind-boggling to me, as I can't understand why they can't show good action sequences in 1080p on the showroom floor. Perhaps there are content restrictions getting in the way, but this is the year 2007, and within 2 years everyone will need a digital TV.

    How many people will be persuaded that a 720p TV is worth it because they "can't tell the difference between the two" in the store? And how many of them will be disappointed that they got a 720p set when they have to go back for another in a few years? Most of the TV-watching public isn't used to upgrading their TVs more than once every 5 years or so, and for sure by 2012 Full HD content will be mainstream. This stands to be a huge issue in a couple of years, and I smell a lawsuit if Best Buy and all the other TV retailers don't explain exactly what level of detail the content on the showroom floor is displaying.
  • by NeMon'ess ( 160583 ) * <flinxmid&yahoo,com> on Friday November 02, 2007 @03:51AM (#21208553) Homepage Journal
    h.264 has global motion compensation, which tracks the direction of a pan or an unsteady camera and encodes that movement so it can save bytes by reusing visual data that has merely shifted position. DVD uses MPEG-2, which does not do this.

    The biggest cause of undesirable blur is the 24fps shooting speed of movies. The new digital projection standard includes 2K at 48fps and 4K at 24fps. I'm really hoping Hollywood saves the movie theaters from home cinema by embracing 2K at 48fps. The experience should be impressively clear. Not as clear as 4K at 48, but clear.
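    For what that motion handling amounts to, here is a rough sketch of the global-shift idea the parent describes (a toy: real encoders work per block with sub-pixel precision, and the search range and function name here are invented for illustration):

```python
# Toy global-motion estimate for a pan: find the single (dy, dx) shift that best
# maps the previous frame onto the current one, so only the residual needs encoding.
import numpy as np

def estimate_global_shift(prev, curr, search=8):
    """Brute-force search for the integer shift with the smallest mean absolute error."""
    best, best_err = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            shifted = np.roll(prev.astype(np.int16), (dy, dx), axis=(0, 1))
            err = np.abs(curr.astype(np.int16) - shifted).mean()
            if err < best_err:
                best_err, best = err, (dy, dx)
    return best, best_err

# During a smooth pan the residual after the shift is tiny and compresses very well;
# plain MPEG-2 has to chase the same motion with per-macroblock vectors.
```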
  • Re:No. (Score:3, Informative)

    by the unbeliever ( 201915 ) <chris+slashdot&atlgeek,com> on Friday November 02, 2007 @04:00AM (#21208605) Homepage
    h.264 can compress this "20-40gb of data" into something that can fit on a dvd-9 without discernible loss of quality, sir.
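    For scale, a back-of-the-envelope check that the claim is at least plausible (the 8.5 GB capacity, two-hour runtime, and audio allowance below are assumptions):

```python
# Rough video bitrate budget for fitting a two-hour movie on a DVD-9.
DVD9_BYTES = 8.5e9        # nominal DVD-9 capacity (single-sided, dual-layer)
RUNTIME_S  = 2 * 3600     # assume a two-hour film
AUDIO_BPS  = 640e3        # assume one ~640 kbit/s audio track

video_bps = (DVD9_BYTES * 8) / RUNTIME_S - AUDIO_BPS
print(f"Video budget: {video_bps / 1e6:.1f} Mbit/s")  # roughly 8-9 Mbit/s
```

    Whether ~9 Mbit/s of h.264 is truly indistinguishable from a 25+ Mbit/s HD disc encode is exactly what's being argued in this thread, but the arithmetic shows the fit itself isn't crazy.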
  • by Anonymous Coward on Friday November 02, 2007 @07:18AM (#21209645)
    LG makes HD-DVD players too (both standalone and combo HD-DVD/Blu-ray players.)
  • Where's the source? (Score:3, Informative)

    by nschubach ( 922175 ) on Friday November 02, 2007 @07:36AM (#21209731) Journal
    Really, what's the source on this story? A blog post on some unknown site by someone named "Technology Expert"? Hold on a second while I create a blog, post that Walmart/Best Buy/Circuit City/etc. decided to drop HD-DVD, and then submit it here for the editors to forward on.
  • by pyite ( 140350 ) on Friday November 02, 2007 @08:20AM (#21210007)
    This is a 1080i player, not 1080p.

    I'm really getting tired of people who don't know what they're talking about making a big issue of 1080i vs. 1080p when it comes to a source device. Obviously, 1080i and 1080p are very different when it comes to a display. However, any 1080p display worth its purchase price is going to be able to convert from 1080i to 1080p effectively losslessly. From Wikipedia [wikipedia.org]: "Due to interlacing, 1080i has twice the frame-rate but half the resolution of a 1080p signal using the same bandwidth." In short, a 1080i signal and a 1080p signal contain the same data, just formatted differently. To go from 1080i to 1080p (this is simplified and doesn't account for various framerate differences), you take every two 1080i fields (540 lines each), weave them, and you have a 1080p frame.
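    The weave he's describing, as a minimal sketch (assuming two NumPy arrays of 540 lines each that came from the same original frame, top field first; the function name is just for illustration):

```python
import numpy as np

def weave(top_field, bottom_field):
    """Interleave two 540-line fields into one 1080-line progressive frame.
    Only valid when both fields were sampled from the same instant (e.g. film
    content); fields captured at different times need a real deinterlacer."""
    h, w = top_field.shape
    frame = np.empty((2 * h, w), dtype=top_field.dtype)
    frame[0::2] = top_field      # which field lands on odd vs. even lines depends on field order
    frame[1::2] = bottom_field
    return frame
```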

  • by papasui ( 567265 ) on Friday November 02, 2007 @08:51AM (#21210293) Homepage
    ~Little disclaimer: I'm a network engineer specializing in DOCSIS/cable/VoIP~ Here's a little secret for you. Each analog channel on their system occupies spectrum that can push about 38 Mbit/s at the current going rate of 256QAM. Once they get rid of those analog channels, or increase their plant capacity, or go to a higher-order QAM, they will have plenty of bandwidth. They are also very likely using MPEG-2 for their datastreams; more advanced codecs would significantly reduce the bandwidth needs. So while you may be right that it will require upgrades, 10 years is way too far out.
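    For anyone who wants the arithmetic behind that ~38 Mbit figure, a rough sketch (the symbol rate and overhead below are the usual North American cable ballpark numbers, so treat them as assumptions):

```python
# Back-of-the-envelope throughput of one 6 MHz cable channel slot at 256QAM.
SYMBOL_RATE    = 5.36e6   # symbols/s in a 6 MHz channel (ITU-T J.83 Annex B ballpark)
BITS_PER_SYM   = 8        # 256QAM carries 8 bits per symbol (2^8 constellation points)
FEC_EFFICIENCY = 0.90     # rough allowance for FEC and framing overhead

payload_bps = SYMBOL_RATE * BITS_PER_SYM * FEC_EFFICIENCY
print(f"~{payload_bps / 1e6:.0f} Mbit/s per channel slot")          # ~38-39 Mbit/s

# What roughly fits in that one slot:
print(int(payload_bps // 15e6), "MPEG-2 HD streams at ~15 Mbit/s")  # about 2
print(int(payload_bps // 8e6),  "H.264 HD streams at ~8 Mbit/s")    # about 4
```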
  • by Skapare ( 16644 ) on Friday November 02, 2007 @09:52AM (#21210925) Homepage

    What frame rate are you assuming the 1080p content is in? Standard formats have only one frame rate for 1080i (30 frames/sec, 60 fields/sec, plus the 1000/1001 ratio rates) but have 3 choices for 1080p (24, 30, and 60, plus the 1000/1001 ratio rates). For content originating in 24 fps motion picture film, or its digital equivalent, encoding it as 24 fps onto the disc is best.

    If you are converting 1080i30/60 to 1080p60, that works fine. But the source material may not be in that format. It might be in 1080p24. Upconverting that to 1080i30/60 would add the motion judder artifact. That can be easily fixed if the upconversion were to any progressive frame rate. Fixing it after interlacing is next to impossible (the weaving together method doesn't fix judder).

    What we really need is a player that either leaves the content unconverted (e.g. sends it as 1080p24 to the display) or upconverts it to a multiple of 24 fps (1080p48 or 1080p72 ... non-standard, unfortunately). More likely we'll see upconversion to 1080p60 from players in the future, and then TVs will need "judder correction" or "3:2 film correction". But it would be better to just pass the video from the player to the display in the 24 fps format. An LCD display can simply update its pixels at that rate and you won't see flicker anyway; other display technologies would have to engage their conversion circuits.
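    The judder he's describing falls straight out of the 3:2 pulldown pattern; a minimal sketch of why (pure arithmetic, nothing player-specific):

```python
# 3:2 pulldown: mapping 24 film frames/s onto 60 fields/s repeats frames unevenly,
# so on-screen durations alternate between 3/60 s and 2/60 s -- that's the judder.
def pulldown_durations(n_frames=8):
    fields_per_frame = [3 if i % 2 == 0 else 2 for i in range(n_frames)]  # 3,2,3,2,...
    return [f / 60 for f in fields_per_frame]                             # seconds on screen

print(pulldown_durations())  # [0.05, 0.033..., 0.05, 0.033..., ...]
# At 1080p24 (or 1080p48/72) every frame sits on screen for the same length of time,
# which is why leaving the content at a multiple of 24 fps avoids the artifact.
```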

  • by benwaggoner ( 513209 ) <ben.waggoner@mic ... t.com minus poet> on Friday November 02, 2007 @10:41AM (#21211649) Homepage
    No, you're seeing motion blur because film cameras have a 1/48th of a second exposure time. That same blur on frames with high motion was seen in the theater, and on the negative.
  • I'm really getting tired of people who don't know what they're talking about making a big issue of 1080i vs. 1080p when it comes to a source device.....


    It would appear, sir, that it is you who does not understand the issues here.

    1080i means the signal is interlaced. What is interlacing? Put briefly: back in the 1930s, you simply could not transmit much data to a television set. You were very limited in what you could transmit reliably given the transmitters, receivers, and noisy equipment of the day. In modern language, we would say that bandwidth was very limited for television.

    Like all forms of moving pictures, television requires a fairly high framerate to give the illusion of a continuously moving image from what is just a sequence of still frames. But because of the restricted bandwidth, more frames per second means each frame must have less resolution. So the 1930s engineers were seemingly at an impasse.

    Enter interlacing. Instead of transmitting ~25 full frames every second, you transmit ~50 half-frames (fields) every second. On one field you draw the odd-numbered lines of the picture, on the next you draw the even lines, and so on. Because CRT televisions used glowing phosphor with a fade-out time, the two fields would meld into one without the viewer noticing. It was a good solution given the technology of the day, and it served the industry well for many years.

    So 1080i signals are inherently of a much, much lower quality than either 1080p signals or even 720p signals. This is because they transmit half-frames, and try as you might, you're never, ever going to be able to mesh those fields into one another seamlessly. 1080i is already a lossy signal, so saying that it converts "losslessly" to 1080p is equivalent to saying that a 320x240 signal can be scaled "losslessly" to a 640x480 signal. It's true, but you're avoiding the main issue.

    Yes, given the same bandwidth, a 1080i signal can transmit just as much data as a 1080p signal. So can any signal for that matter, regardless of format. But the reality is, 99.999% of 1080i signals will be transmitting at the same framerate as their 1080p equivalents, i.e. the 1080i signal will be transmitting less data and hence will be a lower quality one. Even if it transmits the same data, the signal will still have been put through an interlacing shredder, and will not be worth the money you're paying for it.

    We're now in the year 2007. Simply put, bandwidth is cheap. On top of that, our newer televisions don't use CRTs anymore, which means interlacing tends to show up quite noticeably, making the picture look awful. So why then do we have 1080i as an HD option? .....

    Hell if I know.

    Interlacing was a smart idea in the 1930s. In 2007, with digital framebuffers, LCD TVs, and high-quality cabling, interlacing is simply an embarrassment. 1080i is a high-resolution embarrassment.
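    For what it's worth, the raw numbers behind the "same bandwidth" point being argued above (luma pixels only, before any compression; a quick sanity check rather than a quality claim):

```python
# Raw pixel rates (before compression) for a 1920x1080 picture.
W, H = 1920, 1080

rate_1080p60 = W * H * 60          # 60 full frames per second
rate_1080p24 = W * H * 24          # film-rate progressive
rate_1080i60 = W * (H // 2) * 60   # 60 fields per second, 540 lines each

for name, rate in [("1080p60", rate_1080p60), ("1080p24", rate_1080p24), ("1080i60", rate_1080i60)]:
    print(f"{name}: {rate / 1e6:.1f} Mpixel/s")
# 1080i60 moves half the raw pixels of 1080p60 (the same as 1080p30), which is the
# sense in which the "same bandwidth" claim holds; whether it looks as good after
# deinterlacing is the part this thread is arguing about.
```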
