Tech Shoppers in the UK Ditch Desktop PCs and DVD Players (ofcom.org.uk) 123

Brits are ditching DVD players and desktop PCs and are increasingly turning to newer technology such as smart TVs and smart watches, Ofcom research has found. From the research: Shoppers in the UK are predicted to spend billions of pounds again this year on Black Friday and Cyber Monday, and much of that is expected to be spent buying tech online. So, Ofcom has crunched the numbers on which tech devices people have been buying in recent years, and which ones they're getting rid of.

Ownership of digital devices such as smart TVs, smart watches and smartphones has grown significantly in recent years, as more people need a constant connection to the internet -- internet users say they spend an average of 24 hours a week online. By contrast, MP3 players, DVD players and desktop computers seem to be falling out of favour as smartphone use continues to grow, particularly for browsing and streaming. Meanwhile, the popularity of tablets and e-readers seems to have peaked. Ownership of both is significantly higher than it was seven years ago, but has levelled out in the last few years.

This discussion has been archived. No new comments can be posted.

  • by Dunbal ( 464142 ) * on Thursday November 22, 2018 @07:20PM (#57686260)
    Where have I heard this before? Ahh yes, when it was going to be completely "replaced" by the tablet...
    • Historically if you wanted to browse the web and check email, you plonked down for a desktop that was bottlenecked by ram before it left the factory. Plus you probably got sold Office "to open email attachments."

      So to some extent, the market for high-end desktop computers was the same as the market for functional computers. It was always a false market.

      • The CPU in mobile devices has its RAM bottlenecked before it's even encapsulated to be attached to a PCB.

        • A 600 EUR iPad Pro from last year, for the things it can do, basically runs circles around most 1000 EUR PCs of today.

          • by Cmdln Daco ( 1183119 ) on Thursday November 22, 2018 @09:04PM (#57686516)

            Except it only runs a toy operating system (not even real MacOS) and lacks a filesystem. So the things it can do don't really include real work.

            But if you qualify with 'the things it can do' you might be able to make a case.

            • Except it only runs a toy operating system (not even real MacOS) and lacks a filesystem. So the things it can do don't really include real work.

              The average user needs neither of those things, nor to do "real work" by your definition. Most of them spend 99.9% of their time in the browser, and an iPad with a crippled OS will do what they need to do.

            • Except it only runs a toy operating system (not even real MacOS) and lacks a filesystem. So the things it can do don't really include real work.

              But if you qualify with 'the things it can do' you might be able to make a case.

              You can run MS Office, and a version of Illustrator as well as Photoshop. What sort of "real work" that an average person needs at home are you talking about? Remember, I said "average" person, not some techie nerd who wants to run Linux.

          • Re: (Score:2, Informative)

            by Anonymous Coward

            They must sell really crappy PCs in Europe. These days a $1000 PC is outfitted with a gaming card and 16GB of RAM, which will run most games and professional applications just fine. I don't know what an iPad Pro can run; a $1000 PC can run any modern application without being locked into Apple's walled garden.

            • by Anonymous Coward

              Hell, a crap cobbled-together desktop PC is still far superior to either Android or iOS for getting real work done.

              Even as is, there is no real technical reason that Android or iOS has to suck for serious work, it's just that the mobile platforms are generally aimed at the lowest common denominator, and the user base tends to be mostly airheaded Generation Z types who feel the need to ruin every self portrait with lame vaguely anime-style superimposed cat ears, and drive each other to violence through nasty Facebook postings.

              • there is no real technical reason that Android or iOS has to suck for serious work,

                The technical reason is Google. Google is all-out maximising suckiness everywhere it goes. Look at today's new YouTube feature, "two ads for the price of one": now your ad break is twice as long, so you don't get as many in your movie!

                Hell, if it has an ad in it, I watch something else!

              • Hell, a crap cobbled-together desktop PC is still far superior to either Android or iOS for getting real work done.

                Even as is, there is no real technical reason that Android or iOS has to suck for serious work, it's just that the mobile platforms are generally aimed at the lowest common denominator, and the user base tends to be mostly airheaded Generation Z types who feel the need to ruin every self portrait with lame vaguely anime-style superimposed cat ears, and drive each other to violence through nasty Facebook postings.

                What sort of "real work" does an average person need to do at home? Today, many workplaces even offer virtual desktop environments, so you can simply remote into work on your iPad or Android tablet and check work stuff without a "real" computer. You can run MS Office and a bunch of other applications on tablets these days, and a lot of activity is performed on the web, from filing taxes to budgeting software.

          • by goose-incarnated ( 1145029 ) on Friday November 23, 2018 @01:17AM (#57687048) Journal

            A 600 EUR iPad Pro from last year, for the things it can do, basically runs circles around most 1000 EUR PCs of today.

            I doubt it. My dev environment alone eats up around 16GB, that's even assuming that gcc can run on iOS, and cross-compile for my target, or that eclipse/VS/qtcreator will run on iOS, or that various creation tools (gimp, etc) will run on iOS.

            Tablets are not toys because of the hardware, they are toys because they are content-consumption only devices. Those of us who produce more content than we consume won't use tablets to do the production.

            • Comment removed based on user account deletion
              • by goose-incarnated ( 1145029 ) on Friday November 23, 2018 @10:07AM (#57688294) Journal

                You also will not buy your PC at that store or Best Buy, and spend more than average on it.

                My desktop is a 2nd-gen i7 with 16GB of ram, cost around $250 when I bought it last year, and yet it will still do more than a top of the range tablet bought now, mostly because the tablet is almost incapable of being used to produce content.

                The tablet can't replace the desktop, because the desktop is used for a much more demanding class of functionality (content-production). The tablet is equivalent to a TV, the desktop is equivalent to a lathe.

            • A 600 EUR iPad Pro from last year, for the things it can do, basically runs circles around most 1000 EUR PCs of today.

              I doubt it. My dev environment alone eats up around 16GB, that's even assuming that gcc can run on iOS, and cross-compile for my target, or that eclipse/VS/qtcreator will run on iOS, or that various creation tools (gimp, etc) will run on iOS.

              Tablets are not toys because of the hardware, they are toys because they are content-consumption only devices. Those of us who produce more content than we consume won't use tablets to do the production.

              Cool story bro, but they are not talking about developers or other niche users. They are talking about average users who might need Office and a few other applications, and do a lot of their home finances online with their browser.

          • by Anonymous Coward

            Okay, here's an example of what I want to do.

            I googled and found a website with a PDF I want to send to my friend. Not the link, I want to send the PDF as an attachment. Easy as pie on a desktop. Even on Linux with Lynx and Pine it's no problem. Can't do it on iOS.

            For getting actual shit done on a website you need an actual PC, not a tablet, which is just an iPhone with a bigger screen. Sometimes it works -- if the website was specifically made for mobile -- sometimes it doe

      • by AHuxley ( 892839 )
        4K, 5K and 8K games will drive the need to buy more powerful desktop computer parts.
        New CPU and GPU products have to keep up with software that can support 4K.
        • That's a very niche market.

          • by AHuxley ( 892839 )
            It's a long upgrade path: to get beyond just supporting 4K at 30 and 60 fps, then to get the best quality at 4K with high refresh rates.
            The display can support that. The GPU will need work. That's another "desktop" part to buy.
            More RAM, more CPU. Then the software needs new hardware to look its best.
    • A resurgence in PC gaming has helped mask that a bit, fueled mostly by the insane popularity of Fortnite and PUBG, but folks are buying fewer PCs. They'll buy one or two for Junior to do homework on, but they don't usually upgrade them much.
      • Re: (Score:2, Insightful)

        by Anonymous Coward

        ...that's also because when you buy a PC? You don't need to buy another for several years, and just swapping in the newest no-extra-power-needed graphics card is a huge performance bump whenever the existing GPU gets sluggish. When a GPU that can be slapped into any existing WalMart beige box is under $200 and can play even the newest games decently? Folks aren't dropping big coin on a whole new system.

        - WolfWings, too lazy to login to /. in way too long.

      • A resurgence in PC gaming has helped mask that a bit, fueled mostly by the insane popularity of Fortnite and PUBG, but folks are buying fewer PCs. They'll buy one or two for Junior to do homework on, but they don't usually upgrade them much.

        You can play Fortnite on a Samsung S8.

    • by Anonymous Coward

      Desktop and laptop PC SALES have gone down because they have reached saturation point, and people don't need to replace them every year or two anymore! Today's computers offer very little in performance and features over computers that are 6-8 years old.

      Oh, and try and find a TV these days that's NOT a smart TV! It's almost impossible!! No, people don't necessarily need or want more devices that are connected to the Internet 24 hours a day; it is getting hard to find devices that don't have that capability!

      • by Dunbal ( 464142 ) *

        Desktop and laptop PC SALES have gone down because they have reached saturation point, and people don't need to replace them every year or two anymore!

        Exactly the same argument I heard in 2008.

    • Comment removed (Score:5, Insightful)

      by account_deleted ( 4530225 ) on Friday November 23, 2018 @04:20AM (#57687438)
      Comment removed based on user account deletion
      • by Dunbal ( 464142 ) *
        If you're buying a desktop retail, you are doing it wrong. They've always been subpar components cobbled together and loaded with malware and spyware at an extortionate price.
  • Were they buying DVD players in 2017?

    • by bosef1 ( 208943 )

      I was wondering about this as well. It's late, so I'm not going to RTFA. Did they mean video disc players in general, due to the rise of streaming video; or did they literally mean people aren't buying DVD players anymore, and the UK has finally discovered Blu-ray?

    • Yes, and according to the article DVDs were replaced by smart watches which satisfy the same need for round and shiny objects. I suppose this explains why so few people buy smart watches...

  • by Anonymous Coward

    Just not upgrading. These things have become commodities. No one ever complains microwave ovens are being ditched just because no one goes mad over buying them anymore. People are just spending money on the next new shiny things.

  • by Anonymous Coward on Thursday November 22, 2018 @07:56PM (#57686352)

    Because nowadays, there's barely a point in replacing them.
    A 6 year old PC can still play modern games. And games used to be the only thing left that required non-professionals to follow the expensive upgrade cycle.

    PCs just are what they always were: A tool for universal data processing.
    It's not their fault they were wasted on useless consumer blobs running fixed-function modules ("app[lication]s") to waste their lives.

    But of course the money media must keep up the state that anything but by-definition-unsustainable exponentially exponential growing growth is the devil, and the stable balance of infinitely recycling resources that all surviving things in the universe have in common literally means literal death for being Literally Hitler(TM). Literally. ... /s
    As that's the only way they can keep leeching on society, by making us work, without working themselves.

    • by Kjella ( 173770 )

      Because nowadays, there's barely a point in replacing them. A 6 year old PC can still play modern games.

      And even then a modern graphics card would bring it almost all the way up to speed. I mean it should have a Sandy Bridge+ generation CPU, DDR3 RAM and a PCI Express 3.0 x16 slot. And if they get a shop to swap it they should get an SSD too, if they don't already have one. By the way, just managed to get a 2TB SSD for $220 + VAT on a Black Friday sale which was <5x the cost/GB for the cheapest HDD. Last year it would have been well over double that, spinning rust has stagnated while SSDs are still falling
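      The cost-per-GB claim above can be sanity-checked with quick arithmetic. The SSD price is the commenter's own figure; the HDD price below is an assumed example for illustration, not from the comment:

```python
# Commenter's figure: $220 for a 2 TB SSD on a Black Friday sale (ex-VAT).
ssd_price, ssd_gb = 220, 2000

# Assumed figure for illustration: a cheap 4 TB HDD at $100 (not from the comment).
hdd_price, hdd_gb = 100, 4000

ssd_per_gb = ssd_price / ssd_gb   # 0.110 $/GB
hdd_per_gb = hdd_price / hdd_gb   # 0.025 $/GB

ratio = ssd_per_gb / hdd_per_gb
print(f"SSD costs {ratio:.1f}x as much per GB as the HDD")  # 4.4x
```

      Under that assumed HDD price, the "<5x the cost/GB" figure checks out.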

    • Because nowadays, there's barely a point in replacing them.

      Yes and no, it depends on what you do. I just upgraded my 6 year old PC not because I do something new with it, but because other devices forced my hand. My camera takes 50-megapixel images which load and process painfully slowly. My old camera didn't do that. My new phone takes 4K video footage, and Premiere just chugged when faced with that. That old 1TB drive was great in its day, but with a modern game pushing 70GB it is not only filling up quickly, but the wait between cutscenes was just a pain.

      Sure your o

      • Those tend to be niche areas though. Most people don't have Premiere. They take photos and videos and upload them straight to the internet.

        Gamers will spend a lot more on upgrades than on whole new PCs. Between 1994 and 2012, when I was more into games, I bought a total of 2 PCs. The second one was because I couldn't buy an AT motherboard with a modern CPU. I bought about 5 or 6 motherboards in that time, and a similar number of graphics cards, of course.
      • I still see little reason to replace my PC though it's 6 years old. I've upgraded the graphics card somewhat recently and put in an SSD. But the Core i7 processor, 16 GB of RAM, motherboard, and all the other bits are original from 2012 and still compare well to a new PC. The only real innovations seem to be increasing the core count, and some things like NVMe which this motherboard is too old to support.

        • The only real innovations

          Did you ever in the past upgrade on innovation, or because your computer was unable to do something?

          As I said, it depends on what *you*, and you alone, specifically *you* do with your computer. You see no sense in upgrading. I found it a necessity given my not that rare workload of playing with photos and videos.

          As for innovation, we just entered the world of raytracing. Expect an upgrade to be necessary in the coming year for quite an incredible jump in the graphic quality of games if you do that. Gaming a

  • by Anonymous Coward

    Because everyone already has a computer and doesn't feel the need to buy a new one every year!
    Other than gamers who want the latest hardware, most people are content with a computer from 2010... I should know, my PC is from 2010 and still more than usable after upgrading to an SSD and changing the GPU to keep up with games.

  • Fine, DVD players are probably a dying market, but what about BD and 4K BD players? Or does TFA mean any player for disc-based video?

  • by Anonymous Coward on Thursday November 22, 2018 @08:25PM (#57686404)

    The "Bang for buck" ratio has deteriorated.

    ie:

    8086 -> 80286 -> good BFBR "Bang For Buck Ratio"
    80286 -> 80386 -> good BFBR
    80386 -> 80486 -> excellent BFBR [ VESA local bus, faster ram ]
    80486 -> pentium -> good BFBR [ pci bus, more ram, much faster speeds, MMX ]
    Pentium -> Pentium 4 -> good BFBR [ pcie, more ram, much faster speeds, better video cards, etc ]
    Pentium 4 - > Quad Core or Core2 - > excellent BFBR [ pcie, next gen, DDR3/DDR2 memory, much better cores, and more of them ]

    Now, we are in the era where we:

    -Add slightly faster ram, at the cost of latency - shitty BFBR
    -Add slightly faster video cards, at a very large cost - shitty BFBR
    -Add slightly faster CPUs, at an obscene cost -- very shitty BFBR

    so the BFBR has decreased, where you can spend another $1000 to get 10-15% better performance, measured in "seconds" for most tasks, or less, or a few more FPS, where anything above 30 would be unnoticeable.

    spend $100, get an SSD, and make it feel like a new system.
    Spend $1000, get a slight performance boost.

    I'm still using a 2600K, with 16GB, with an HD 7770 video card; I see no reason to upgrade.
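    The comparison at the end can be put into rough numbers. The 10-15% gain and the $1000 and $100 prices are the commenter's own figures; the "feels 50% snappier" value for the SSD is a made-up illustration, not a measurement:

```python
# Performance percentage points gained per $100 spent -- a crude "BFBR" metric.
def bang_for_buck(perf_gain_pct: float, cost_usd: float) -> float:
    return perf_gain_pct / (cost_usd / 100)

# $1000 high-end upgrade, midpoint of the commenter's 10-15% estimate.
highend = bang_for_buck(12.5, 1000)
# $100 SSD; the 50% "feels snappier" figure is an assumption for illustration.
ssd = bang_for_buck(50, 100)

print(f"high-end upgrade: {highend:.2f} points per $100")  # 1.25
print(f"SSD upgrade:      {ssd:.2f} points per $100")      # 50.00
```

    On these assumed numbers the SSD delivers roughly 40x the perceived improvement per dollar, which is exactly the commenter's point.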

    • I see no reason to upgrade.

      You didn't mention what you're feeding in and out of your computer.

      I would see no reason to upgrade if I took happy snaps on my smartphone, or recorded video using a potato. I would see no reason to upgrade if I were sitting there with a shitty 1080p monitor, playing some games that also run fine on the Xbox 360.

      On the flip side, start editing 50-megapixel images (getting standard on new SLRs), or editing 4K video (getting standard on mobile phones), or try and drive a 4K display, with H

  • The problem is almost nobody is building high quality PC software these days. Most new development is out in the 'cloud', with the PC being treated like the old 'dumb terminals' from the mainframe era. The stuff running on the PC now is mostly front end client software that just sends commands to the servers in the cloud.

    Once upon a time, all the cool new software was running on a PC that was not connected at all to the internet. Local databases, spreadsheets, word processors, etc. were all designed to wor
  • by cheesybagel ( 670288 ) on Friday November 23, 2018 @12:14AM (#57686916)

    When Ryzen came out I thought it was time to upgrade and throw AMD a bone. So I bought a new computer system. Motherboard, CPU, but then... the DRAM and NAND were expensive like heck and the GPU prices were ludicrous because of those god damned coin miners. So I put an old GPU card on it and got lower end memory and NAND products.

    I ended up with an M.2 NAND drive which was not any larger than the SATA one I had in my old PC and cost about as much if not more... A couple of months passed, then Meltdown and Spectre came around. So basically I've left it on another floor collecting dust while I'm still working on my old PC. I can't be assed to transfer the file systems and applications from my "old" PC to the new one. Oh, and the case they got me had no 5 1/4" frontal drive bays whatsoever for my legacy discs, so I had to buy an external reader. At one point I thought I was better off with a laptop.

    I blame the memory cartel pricing, obscene GPU prices which are like 2x what they should be right now, and the CPU manufacturers for a) screwing it up b) Intel keeps spinning new revisions of the same shit over and over and calls it a new product.

    So it is little wonder few people want to upgrade. Also 4K just made everything more expensive and it is useless for gaming.

    • A couple months passed then Meltdown and Spectre came around. So basically I've left it in another floor collecting dust while I'm still working on my old PC.

      Your old PC (especially if it's Intel) will be vulnerable to those unless it's old enough to be single core (i.e. a P4 or older).

      I ended up with a M.2 NAND drive which was not any larger than the SATA one I had in my old PC and cost about as much if not more...

      M.2 has the capability to be faster, since it's a much faster bus.

      Also 4K just made everyt

      • My old PC has an AMD Piledriver processor. It doesn't do SMT. It does CMP. So it isn't as vulnerable to those exploits as say Ryzen or an Intel processor. It is vulnerable to out-of-order branch prediction attacks though. But then again so is nearly everything else including ARM.

        • My old PC has an AMD Piledriver processor. It doesn't do SMT. It does CMP. So it isn't as vulnerable to those exploits as say Ryzen or an Intel processor.

          Old i7 with hyperthreading here. Probably vulnerable a. f.

          It is vulnerable to out-of-order branch prediction attacks though. But then again so is nearly everything else including ARM.

          Phones, yes. I think the RPi is safe since it's an in-order processor. Though that's why it's not really very speedy and the 1.4GHz Pi3 is absolutely no match for this 1.7GHz

    • I ended up with a M.2 NAND drive which was not any larger than the SATA one I had in my old PC and cost about as much if not more...

      That alone should have netted you close to a 10x improvement in performance for identical cost, unless you bought an incredibly crap M.2 drive.

      obscene GPU prices which are like 2x what they should be right now

      GPUs are going for bargain basement prices right now. New prices are back below MSRP, and the second hand market is flooded with cards. NOW is the time to buy, and if you're anywhere close to that 2x mark it's time to pick a different store.

      and the CPU manufacturers for a) screwing it up

      In what way?

      So it is little wonder few people want to upgrade.

      It really isn't, but not for the reasons you cite. You didn't specify your workload. You're happy with an old GPU, can

      • The CPU manufacturers screwed it up by making buggy processors and, at least in the case of Intel, with little performance benefit over much older processors.

        I wanted to game with higher detail on and to compile things faster. I also occasionally encode video or audio. That's why I bought the AMD Ryzen so I would have more concurrent threads to improve compilation. I might buy a new GPU card and larger SSD and put that Ryzen computer to use eventually. However it might be a bad idea because AMD is currently

        • Other people agree that GPUs are still overpriced like heck.
          https://www.eetimes.com/docume... [eetimes.com]

          • That article says nothing about GPUs being overpriced. It's a bitching-and-moaning piece about the current top-of-the-line GPU being the most expensive one ever released, ignoring that there are plenty of high-performance parts with great prices available, that mark-ups have gone with the crypto bust, and that due to current market oversupply getting a high-performance GPU has never been cheaper. Bitching about the high cost of being an early adopter of a new technology that is currently sold out everywhere (implying it

        • The CPU manufacturers screwed it up by making buggy processors and, at least in the case of Intel, with little performance benefit over much older processors.

          Not at all. The CPU does exactly what it says on the box. They traded security that is almost irrelevant to general consumers in favour of performance, and as shown recently, that security can happily be regained through software. The CPUs currently on the market do not show any higher level of errata than in the past, and they still happily work like they did in the past.

  • DVD player? I haven't seen one in years. And desktops have only been found in libraries since tablets hit the market.

  • by Going_Digital ( 1485615 ) on Friday November 23, 2018 @06:10AM (#57687670)
    The technology in desktop computers is mature; it is no longer developing at the breakneck speed it once was. It wasn't long ago that once a system was 3 years old it seemed ancient and slow by comparison to the latest model. Today however your 3 year old system is still perfectly good and you would hardly notice any difference in performance if you replaced it today, so why bother?

    Everything has this cycle, where it gets to the point that what you already have is good enough, and further small tweaks do not justify the cost of replacing.

  • The sound of a D5 is generally Dolby Digital, with a bitrate of 384~448kbps. A D9 generally has DTS (as long as the feature is not too long), plus THX certification; the DTS bitrate is much higher than Dolby Digital's, from 768kbps at the lowest up to 1.5Mbps. And the DTS encoding capacity is also amazing. Sometimes a D9 can also carry LPCM lossless sound. Therefore, with a D9 disc, the sound effect is undoubted. https://www.newbecca.com/product/10075
