Tech Shoppers in the UK Ditch Desktop PCs and DVD Players (ofcom.org.uk) 123
Brits are ditching DVD players and desktop PCs and are increasingly turning to newer technology such as smart TVs and smart watches, Ofcom research has found. From the research: Shoppers in the UK are predicted to spend billions of pounds again this year on Black Friday and Cyber Monday, and much of that is expected to be spent buying tech online. So, Ofcom has crunched the numbers on which tech devices people have been buying in recent years, and which ones they're getting rid of.
Ownership of digital devices such as smart TVs, smart watches and smartphones has grown significantly in recent years, as more people need a constant connection to the internet -- internet users say they spend an average of 24 hours a week online. By contrast, MP3 players, DVD players and desktop computers seem to be falling out of favour as smartphone use continues to grow, particularly for browsing and streaming. Meanwhile, the popularity of tablets and e-readers seems to have peaked. Ownership of both is significantly higher than it was seven years ago, but has levelled out in the last few years.
The desktop is dying! (Score:5, Insightful)
Re: The desktop is dying! (Score:1)
Historically if you wanted to browse the web and check email, you plonked down for a desktop that was bottlenecked by ram before it left the factory. Plus you probably got sold Office "to open email attachments."
So to some extent, the market for high-end desktop computers was the same as the market for functional computers. It was always a false market.
Re: The desktop is dying! (Score:1)
The CPU in mobile devices has its RAM bottlenecked before it's even encapsulated to be attached to a PCB.
Re: The desktop is dying! (Score:2)
A 600 eur iPad Pro from last year, at the things it can do, basically runs circles around most 1000 eur PCs of today.
Re: The desktop is dying! (Score:5, Insightful)
Except it only runs a toy operating system (not even real MacOS) and lacks a filesystem. So the things it can do don't really include real work.
But if you qualify with 'the things it can do' you might be able to make a case.
Re: (Score:2)
Except it only runs a toy operating system (not even real MacOS) and lacks a filesystem. So the things it can do don't really include real work.
The average user needs neither of those things, nor to do "real work" by your definition. Most of them spend 99.9% of their time in the browser, and an iPad with a crippled OS will do what they need to do.
Re: (Score:1)
Except it only runs a toy operating system (not even real MacOS) and lacks a filesystem. So the things it can do don't really include real work.
But if you qualify with 'the things it can do' you might be able to make a case.
You can run MS Office, run a version of illustrator as well as photoshop. What sort of "real work" that an average person needs at home are you talking about. Remember, I said "average" person not some techie nerd who wants to run LINUX.
Re: (Score:1)
Real people don't worry about Apple Marketing Punctuation Guidelines.
Re: (Score:2, Informative)
They must sell really crappy PCs in Europe. These days a $1000 PC is outfitted with a gaming card and 16GB of RAM, which will run most games and professional applications just fine. I don't know what an iPad Pro can run; a $1000 PC can run any modern application without being locked into Apple's walled garden.
Re: The desktop is dying! (Score:1)
Hell, a crap cobbled-together desktop PC is still far superior to either Android or iOS for getting real work done.
Even as is, there is no real technical reason that Android or iOS has to suck for serious work, it's just that the mobile platforms are generally aimed at the lowest common denominator, and the user base tends to be mostly airheaded Generation Z types who feel the need to ruin every self portrait with lame vaguely anime-style superimposed cat ears, and drive each other to violence through nasty Facebook postings.
Re: (Score:2)
The technical reason is Google. Google is all-out maximising suckiness everywhere it goes. Look at today's new YouTube feature, "two ads for the price of one": now your ad break is twice as long, so you don't get as many in your movie!
Hell, if it has an ad in it, I watch something else!
Re: (Score:1)
Hell, a crap cobbled-together desktop PC is still far superior to either Android or iOS for getting real work done.
Even as is, there is no real technical reason that Android or iOS has to suck for serious work, it's just that the mobile platforms are generally aimed at the lowest common denominator, and the user base tends to be mostly airheaded Generation Z types who feel the need to ruin every self portrait with lame vaguely anime-style superimposed cat ears, and drive each other to violence through nasty Facebook postings.
What sort of "real work" does an average person need to do at home? Many workplaces now offer virtual desktop environments, so you can simply remote into work from your iPad or Android tablet and check work stuff without a "real" computer. You can run MS Office and a bunch of other applications on tablets these days, and a lot of activity, from filing taxes to budgeting, happens on the web.
Re: The desktop is dying! (Score:4, Insightful)
A 600 eur iPad Pro from last year, at the things it can do, basically runs circles around most 1000 eur PCs of today.
I doubt it. My dev environment alone eats up around 16GB, that's even assuming that gcc can run on iOS, and cross-compile for my target, or that eclipse/VS/qtcreator will run on iOS, or that various creation tools (gimp, etc) will run on iOS.
Tablets are not toys because of the hardware, they are toys because they are content-consumption only devices. Those of us who produce more content than we consume won't use tablets to do the production.
Re: (Score:3)
Re: The desktop is dying! (Score:4, Insightful)
You also will not buy your PC at that store or Best Buy, and you will spend more than average on it.
My desktop is a 2nd-gen i7 with 16GB of ram, cost around $250 when I bought it last year, and yet it will still do more than a top of the range tablet bought now, mostly because the tablet is almost incapable of being used to produce content.
The tablet can't replace the desktop, because the desktop is used for a much more demanding class of functionality (content-production). The tablet is equivalent to a TV, the desktop is equivalent to a lathe.
Re: (Score:1)
A 600 eur iPad Pro from last year, at the things it can do, basically runs circles around most 1000 eur PCs of today.
I doubt it. My dev environment alone eats up around 16GB, that's even assuming that gcc can run on iOS, and cross-compile for my target, or that eclipse/VS/qtcreator will run on iOS, or that various creation tools (gimp, etc) will run on iOS.
Tablets are not toys because of the hardware, they are toys because they are content-consumption only devices. Those of us who produce more content than we consume won't use tablets to do the production.
Cool story bro, but they are not talking about developers or other niche users. They are talking about average users who might need Office and a few other applications, and who do a lot of their home finances online in the browser.
Re: The desktop is dying! (Score:2, Informative)
Okay, here's an example of what I want to do.
I googled and found a website with a PDF I want to send to my friend. Not the link, I want to send the PDF as an attachment. Easy as pie on a desktop. Even on Linux with Lynx and Pine it's no problem. Can't do it on iOS.
For getting actual shit done on a website you need an actual PC, not a tablet, which is just an iPhone with a bigger screen. Sometimes it works -- if the website was specifically made for mobile -- sometimes it doesn't.
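The desktop workflow this commenter describes can be sketched in a few lines of Python. This is an illustrative sketch, not a real service: the sender, recipient, and PDF bytes are placeholders, and actually sending would need a reachable SMTP server. The point is that composing a real attachment from arbitrary bytes is trivial on a general-purpose OS.

```python
# Sketch: wrap a downloaded PDF in a MIME message as a true attachment.
# All addresses and data below are placeholders for illustration.
from email.message import EmailMessage

def build_mail_with_pdf(pdf_bytes: bytes, filename: str,
                        sender: str, recipient: str) -> EmailMessage:
    """Return a multipart message carrying pdf_bytes as an attachment."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = "The PDF you asked for"
    msg.set_content("PDF attached.")
    msg.add_attachment(pdf_bytes, maintype="application",
                       subtype="pdf", filename=filename)
    return msg

# In real use the bytes would come from urllib.request.urlopen(url).read()
# and the message would go out via smtplib.SMTP(...).send_message(msg).
msg = build_mail_with_pdf(b"%PDF-1.4 dummy", "paper.pdf",
                          "me@example.com", "friend@example.com")
print(msg.get_content_type())  # multipart/mixed
```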
Re: (Score:2)
New CPU and GPU products have to keep up with software that can support 4K.
Re: (Score:2)
Need a new CPU.
Software should be 4K ready.
Re: (Score:3)
Re: (Score:2)
That's a very niche market.
Re: (Score:2)
The display can support that. The GPU will need work. That's another "desktop" part to buy.
More Ram, more CPU. Then the software needs new hardware to look its best.
At home it mostly is (Score:3)
Re: (Score:2, Insightful)
...that's also because when you buy a PC? You don't need to buy another for several years, and just swapping in the newest no-extra-power-needed graphics card is a huge performance bump whenever the existing GPU gets sluggish. When a GPU that can be slapped into any existing WalMart beige box is under $200 and can play even the newest games decently? Folks aren't dropping big coin on a whole new system.
- WolfWings, too lazy to login to /. in way too long.
Re: (Score:1)
A resurgence in PC gaming has helped mask that a bit, fueled mostly by the insane popularity of Fortnite and PUBG, but folks are buying fewer and fewer PCs. They'll buy one or two for Junior to do the homework on, but they don't usually upgrade them much.
You can play Fortnite on a Samsung S8.
Re: (Score:1)
Desktop and laptop PC SALES have gone down because they have reached saturation point, and people don't need to replace them every year or two anymore! Today's computers offer very little in performance and features over computers that are 6-8 years old.
Oh, and try to find a TV these days that's NOT a smart TV! It's almost impossible! No, people don't necessarily need or want more devices that are connected to the Internet 24 hours a day; it is getting hard to find devices that don't have that capability!
Re: (Score:2)
Desktop and laptop PC SALES have gone down because they have reached saturation point, and people don't need to replace them every year or two anymore!
Exactly the same argument I heard in 2008.
Comment removed (Score:5, Insightful)
Re: (Score:2)
But it's a good machine actually for people afraid of the big bad Intel ME and AMD PSP.
No one gives a damn about this, and it is well and truly reflected in the article.
Re: (Score:2)
Re: (Score:3)
Re: (Score:3)
Re: (Score:2)
reputable vendors like HP
I rest my case.
Re: (Score:2)
until the power of the desktop fits inside the tablet form factor. Which is a long way to go. Processors may be getting there, but not yet the graphics and certainly not the flexibility of interchangeable parts & customization.
Not to mention heat. Processing billions and billions of instructions and vertices per second burns a lot of electricity, which is then converted to heat. All that heat has got to go somewhere. When I was young, back at the dawn of time and computing, a 250W power supply was more than enough for everything (including the monitor, whose power outlet plugged into the PC's power supply directly). Now you need 800W or up if you want to run any sort of decent graphics card -- and the monitor(s) also plug in separately.
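The 250W-then versus 800W-now point is just a power budget. A rough sketch of the arithmetic, where every wattage figure is an illustrative estimate (not a measurement of any particular system):

```python
# Rough PSU sizing arithmetic. All component draws are illustrative
# estimates for a modern gaming desktop, not measured values.
components = {
    "cpu": 125,             # high-end desktop CPU under load
    "gpu": 300,             # enthusiast graphics card
    "motherboard_ram": 60,  # board, RAM, chipset
    "drives_fans": 40,      # storage and cooling
}
draw = sum(components.values())  # peak draw in watts
headroom = 1.5                   # PSUs are sized well above peak load
recommended = draw * headroom
print(f"peak draw ~{draw} W, recommended PSU ~{recommended:.0f} W")
```

With these assumed figures the peak draw lands around 525 W, and a common 1.5x sizing margin puts the recommended supply near the 800 W the commenter mentions.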
Re: (Score:2)
Now you need 800W or up if you want to run any sort of decent graphics card
My PC has two video cards in it, peaks at about 400W, and will run any AAA game at least with decent settings.
Re: (Score:2)
The connector for the monitor was just a pass-through from the wall socket; the PSU itself did not drive the monitor.
Re: (Score:2)
"This or that is dying" is marketing gobbledygook for "no longer growing".
Indeed, it is because of the fixation on (potential) growth as a measure of company performance.
It's why Tesla is so highly valued - one day it could become a virtual monopolist supplier of electric vehicles and therefore sell the equivalent of the combined total current vehicle market all by itself...
DVD PLAYERS??? what year is this story from? (Score:1)
were they buying DVD players in 2017?
Re: (Score:2)
I was wondering about this as well. It's late, so I'm not going to RTFA. Did they mean video disc players in general, due to the rise of streaming video; or did they literally mean people aren't buying DVD players anymore, and the UK has finally discovered Blu-ray?
Re: (Score:2)
Yes, and according to the article DVDs were replaced by smart watches which satisfy the same need for round and shiny objects. I suppose this explains why so few people buy smart watches...
Not ditching. (Score:1)
Just not upgrading. These things have become commodities. No one complains that microwave ovens are being ditched just because no one goes mad over buying them anymore. People are just spending money on the next new shiny things.
Re: Building Their Own (Score:1)
It's really not. Most of my friends have quit building. I'll never build, nor buy, a desktop again. Everyone is buying laptops.
Re: (Score:1)
Re: (Score:1)
Re: (Score:2)
Re: (Score:1)
People are building their own desktop computers by purchasing the components they want to use. Demand is massive.
Yeah, no. The average Joe/Jane is not "assembling" their own PCs. Time is money for most people. Only enthusiasts are "building" PCs for themselves or their friends. Demand is not really that massive, and supply for a lot of components is constrained at times because of supply-chain issues.
No they don't "ditch" them! They keep them! (Score:5, Insightful)
Because nowadays, there's barely a point in replacing them.
A 6 year old PC can still play modern games. And games used to be the only thing left that required non-professionals to follow the expensive upgrade cycle.
PCs just are what they always were: A tool for universal data processing.
It's not their fault they were wasted on useless consumer blobs running fixed-function modules ("app[lication]s") to waste their lives.
But of course the money media must keep up the state that anything but by-definition-unsustainable exponentially exponential growing growth is the devil, and the stable balance of infinitely recycling resources that all surviving things in the universe have in common literally means literal death for being Literally Hitler(TM). Literally. ... /s
As that's the only way they can keep leeching on society, by making us work, without working themselves.
Re: (Score:2)
Because nowadays, there's barely a point in replacing them. A 6 year old PC can still play modern games.
And even then a modern graphics card would bring it almost all the way up to speed. I mean, it should have a Sandy Bridge+ generation CPU, DDR3 RAM and a PCI Express 3.0 x16 slot. And if they get a shop to swap it, they should get an SSD too, if they don't already have one. By the way, I just managed to get a 2TB SSD for $220 + VAT in a Black Friday sale, which was <5x the cost/GB of the cheapest HDD. Last year it would have been well over double that; spinning rust has stagnated while SSDs are still falling in price.
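The "<5x the cost/GB" claim above is easy to sanity-check. The SSD figure ($220 for 2 TB) is from the comment itself; the HDD price is an assumed Black Friday price, purely for illustration:

```python
# Cost-per-GB comparison. SSD price is the commenter's figure;
# the HDD price is an assumed value for illustration only.
ssd_price, ssd_gb = 220, 2000
hdd_price, hdd_gb = 90, 4000   # assumption: 4 TB HDD for $90

ssd_per_gb = ssd_price / ssd_gb    # $/GB for the SSD
hdd_per_gb = hdd_price / hdd_gb    # $/GB for the assumed HDD
ratio = ssd_per_gb / hdd_per_gb
print(f"SSD costs {ratio:.1f}x as much per GB as the HDD")
```

Under that assumed HDD price the ratio comes out just under 5x, consistent with the comment.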
Re: (Score:2)
Because nowadays, there's barely a point in replacing them.
Yes and no, it depends on what you do. I just upgraded my 6-year-old PC not because I do something new with it, but because other devices forced my hand. My camera takes 50-megapixel images which load and process painfully slowly. My old camera didn't do that. My new phone takes 4K video footage and Premiere just chugged when faced with that. That old 1TB drive was great in its day, but with a modern game pushing 70GB it is not only filling up quickly, the wait between cutscenes was just a pain.
Sure your o
Re: (Score:1)
Gamers will spend a lot more on upgrades than on whole PCs. Between 1994 and 2012, when I was more into games, I bought a total of 2 PCs. The second one was because I couldn't buy an AT motherboard with a modern CPU. I bought about 5 or 6 motherboards in that time, and a similar number of graphics cards, of course.
Re: (Score:2)
I still see little reason to replace my PC even though it's 6 years old. I've upgraded the graphics card somewhat recently and put in an SSD. But the Core i7 processor, 16 GB of RAM, motherboard, and all the other bits are original from 2012 and still compare well to a new PC. The only real innovations seem to be increasing the core count, and some things like NVMe which this motherboard is too old to support.
Re: (Score:2)
The only real innovations
Did you ever in the past upgrade on innovation, or because your computer was unable to do something?
As I said, it depends on what *you*, and you alone, specifically *you* do with your computer. You see no sense in upgrading. I found it a necessity given my not that rare workload of playing with photos and videos.
As for innovation, we just entered the world of raytracing. Expect an upgrade to be necessary in the coming year for quite an incredible jump in the graphic quality of games if you do that. Gaming a
PC is dying! (Score:1)
Because everyone already has a computer and doesn't feel the need to buy a new one every year!
Other than gamers who want the latest hardware, most people are content with a computer from 2010... I should know, my PC is from 2010 and still more than usable enough after upgrading to an SSD and changing the GPU to keep up with games.
DVD players?? (Score:1)
Fine, DVD players are probably a dying market, but what about BD and 4K BD players? Or does TFA mean any player for disc-based video?
Re: (Score:2)
Since those both play DVDs, I would assume they're already included in the numbers.
Upgrading Has Little "Bang For Buck" Ratio (Score:5, Insightful)
The "Bang for buck" ratio has deteriorated.
ie:
8086 -> 80286 -> good BFBR "Bang For Buck Ratio"
80286 -> 80386 -> good BFBR
80386 -> 80486 -> excellent BFBR [ VESA local bus, faster ram ]
80486 -> pentium -> good BFBR [ pci bus, more ram, much faster speeds, MMX ]
Pentium -> Pentium 4 -> good BFBR [ pcie, more ram, much faster speeds, better video cards, etc ]
Pentium 4 - > Quad Core or Core2 - > excellent BFBR [ pcie, next gen, DDR3/DDR2 memory, much better cores, and more of them ]
Now, we are in the era where we:
-Add slightly faster ram, at the cost of latency - shitty BFBR
-Add slightly faster video cards, at a very large cost - shitty BFBR
-Add slightly faster CPUs, at an obscene cost -- very shitty BFBR
So the BFBR has decreased to the point where you can spend another $1000 to get 10-15% better performance -- measured in seconds for most tasks -- or a few more FPS, where anything above 30 would be unnoticeable.
Spend $100, get an SSD, and it will feel like a new system.
Spend $1000, get a slight performance boost.
I'm still using a 2600k with 16GB of RAM and a Radeon HD 7770; I see no reason to upgrade.
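The comment's bang-for-buck argument reduces to dollars per percentage point of improvement. A tiny sketch using the comment's own rough figures (these are the commenter's estimates, not benchmark results):

```python
# Dollars per percentage point of perceived improvement, using the
# parent comment's rough figures (not measured benchmarks).
def dollars_per_percent(cost: float, percent_gain: float) -> float:
    """Cost efficiency of an upgrade: lower is better."""
    return cost / percent_gain

cpu_upgrade = dollars_per_percent(1000, 15)   # new CPU/platform, ~15% faster
ssd_upgrade = dollars_per_percent(100, 100)   # SSD "feels like a new system"
print(f"CPU path: ${cpu_upgrade:.0f}/%, SSD path: ${ssd_upgrade:.0f}/%")
```

On these assumptions the CPU route costs roughly $67 per percent gained versus $1 per percent for the SSD, which is the whole point being made.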
Re: (Score:2)
Re the 30fps comment: you are blind if you cannot see and feel the difference between 30 and 60.
I guess he is thinking of movies, rather than FPS games or desktops.
I did not like The Hobbit in 48fps, but maybe that was just the 3D making it bad.
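The 30-versus-60 disagreement is really about frame times, which is simple arithmetic:

```python
# Frame-time arithmetic behind the 30 vs 60 fps argument: the per-frame
# budget halves from ~33 ms to ~17 ms, a gap many people can feel.
for fps in (24, 30, 48, 60):
    frame_ms = 1000 / fps
    print(f"{fps:>2} fps -> {frame_ms:5.1f} ms per frame")
```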
Re: (Score:2)
I don't think it was the 3D or the frame rate, per se. I think it was that special effects toolchains were not set up to account for differences in motion perception - tl:;dr the VFX (even subtle ones you weren't supposed to notice) just looked like a video game. Need more motion blurring, for one.
Re: (Score:2)
I see no reason to upgrade.
You didn't mention what you're feeding in and out of your computer.
I would see no reason to upgrade if I took happy snaps on my smartphone, or recorded video using a potato. I would see no reason to upgrade if I were sitting there with a shitty 1080p monitor, playing some games that also run fine on the Xbox 360.
On the flip side, start editing 50-megapixel images (getting standard on new SLRs), or editing 4K video (getting standard on mobile phones), or try to drive a 4K display, with H
Re: (Score:1)
It doesn't make any sense whatsoever, especially as we don't even get a day off work out of it as they do in the States! It seems to have taken over from the traditional Boxing Day/January sales here in the UK, even though those sales were a place to pick up a bargain (in terms of unsold stock brought in for the Christmas rush).
If anything, it seems to have hastened the demise of our High Streets, as people now expect discounts (whether real or imagined) before Christmas rather than afterwards... right duri
Re: (Score:3)
We don't.
Shops have been trying to make it a thing, for about 2-3 years now.
After some footage of Black Friday Walmart rampages a few years back on the news, suddenly shops decided they wanted that and tried to induce it.
Pretty much nobody cares. To us, it's just a pre-Christmas sale when you spend most of December Christmas shopping anyway. And the price reductions are even more fake than other sales. At least "January sales" actually happen as shops sell off leftover stock. Not everywhere, but they do
Re: (Score:2)
Black Friday isn't even that old in the US. It's been around a long while in the sense it's known that the Friday after Thanksgiving is a popular day to go shopping, but it's really only the last 10-15 years or so when the retailers have been hyping it up into a major event. Though it seems to be dying the last couple of years for a variety of reasons - people buying online, the retailers pushing it to be a week and the entire month of November which just means people spread out their spending, and people
Compelling PC software is waning (Score:2)
Once upon a time, all the cool new software was running on a PC that was not connected at all to the internet. Local databases, spreadsheets, word processors, etc. were all designed to work offline.
I recently built a new PC (Score:3)
When Ryzen came out I thought it was time to upgrade and throw AMD a bone. So I bought a new computer system. Motherboard, CPU, but then... the DRAM and NAND were expensive as heck and the GPU prices were ludicrous because of those goddamned coin miners. So I put an old GPU card in it and got lower-end memory and NAND products.
I ended up with an M.2 NAND drive which was no larger than the SATA one in my old PC and cost about as much, if not more... A couple of months passed, then Meltdown and Spectre came around. So basically I've left it on another floor collecting dust while I'm still working on my old PC. I can't be assed to transfer the file systems and applications from my "old" PC to the new one. Oh, and the case they got me had no 5.25" front drive bays whatsoever for my legacy discs, so I had to buy an external reader. At one point I thought I was better off with a laptop.
I blame the memory cartel pricing, obscene GPU prices which are like 2x what they should be right now, and the CPU manufacturers for a) screwing it up, and b) Intel keeps spinning new revisions of the same shit over and over and calling it a new product.
So it is little wonder few people want to upgrade. Also 4K just made everything more expensive and it is useless for gaming.
Re: (Score:2)
A couple months passed then Meltdown and Spectre came around. So basically I've left it in another floor collecting dust while I'm still working on my old PC.
Your old PC (especially if it's intel) will be vulnerable to those unless it's old enough to be single core (i.e. a P4 or older).
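On Linux, the kernel actually reports this per machine: recent kernels expose mitigation status as small text files under `/sys/devices/system/cpu/vulnerabilities`. A minimal sketch for reading them, parameterised so it degrades gracefully on systems where the directory does not exist:

```python
# Read the kernel's own Meltdown/Spectre mitigation report on Linux.
# The sysfs path exists on kernels from roughly 4.15 onward; on other
# systems the function just returns an empty dict.
from pathlib import Path

def read_vulnerabilities(base="/sys/devices/system/cpu/vulnerabilities"):
    """Return {vulnerability_name: kernel status string}, empty if absent."""
    d = Path(base)
    if not d.is_dir():
        return {}
    return {f.name: f.read_text().strip() for f in sorted(d.iterdir())}

for name, status in read_vulnerabilities().items():
    print(f"{name}: {status}")
```

Typical entries are names like `meltdown` or `spectre_v2` with values such as "Mitigation: PTI" or "Not affected".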
I ended up with a M.2 NAND drive which was not any larger than the SATA one I had in my old PC and cost about as much if not more...
M.2 has the capability to be faster, since it's a much faster bus.
Also 4K just made everyt
Re: (Score:2)
A low-end M.2 drive can possibly be slower on writes than a slightly older, higher-end SATA drive. A new 128GB drive has fewer flash channels than an old 128GB drive.
Huh TIL.
I did try a benchmark for sequential reads on a modern high-end laptop. The internal (M.2) drive could hit 2.5 GB/s, versus mine, which saturates the SATA II port. That was a good laptop though, not a cheap M.2 drive.
btw for a laptop like yours it can be possible to add USB 3.0 ports on ExpressCard if you have the slot and need or want them.
It actuall
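A crude sequential-read check like the one described above can be sketched in a few lines. This is a ballpark tool, not a proper benchmark: the OS page cache will flatter any re-read file, so real tools (e.g. `fio`) drop caches or use direct I/O instead.

```python
# Time one sequential pass over a file and report throughput.
# Cached reads will inflate the number; treat it as a rough figure.
import time

def read_throughput(path: str, chunk: int = 1 << 20):
    """Return (bytes_read, MB_per_s) for one sequential pass over path."""
    total = 0
    start = time.perf_counter()
    with open(path, "rb") as f:
        while data := f.read(chunk):
            total += len(data)
    elapsed = time.perf_counter() - start
    return total, total / (1024 * 1024) / max(elapsed, 1e-9)
```

Pointing it at a large file on each drive gives a quick relative comparison between, say, a SATA and an M.2 device.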
Re: (Score:2)
My old PC has an AMD Piledriver processor. It doesn't do SMT. It does CMP. So it isn't as vulnerable to those exploits as say Ryzen or an Intel processor. It is vulnerable to out-of-order branch prediction attacks though. But then again so is nearly everything else including ARM.
Re: (Score:2)
My old PC has an AMD Piledriver processor. It doesn't do SMT. It does CMP. So it isn't as vulnerable to those exploits as say Ryzen or an Intel processor.
Old i7 with hyperthreading here. Probably vulnerable a. f.
It is vulnerable to out-of-order branch prediction attacks though. But then again so is nearly everything else including ARM.
Phones, yes. I think the RPi is safe since it's an in-order processor. Though that's why it's not really very speedy and the 1.4GHz Pi3 is absolutely no match for this 1.7GHz
Re: (Score:2)
I ended up with a M.2 NAND drive which was not any larger than the SATA one I had in my old PC and cost about as much if not more...
That alone should have netted you close to a 10x improvement in performance for identical cost, unless you bought an incredibly crap M.2 drive.
obscene GPU prices which are like 2x what they should be right now
GPUs are going for bargain basement prices right now. New prices are back below MSRP, and the second hand market is flooded with cards. NOW is the time to buy, and if you're anywhere close to that 2x mark it's time to pick a different store.
and the CPU manufacturers for a) screwing it up
In what way?
So it is little wonder few people want to upgrade.
It really isn't, but not for the reasons you cite. You didn't specify your workload. You're happy with an old GPU, can
Re: (Score:2)
The CPU manufacturers screwed it up by making buggy processors and, at least in the case of Intel, with little performance benefit over much older processors.
I wanted to game with higher detail on and to compile things faster. I also occasionally encode video or audio. That's why I bought the AMD Ryzen so I would have more concurrent threads to improve compilation. I might buy a new GPU card and larger SSD and put that Ryzen computer to use eventually. However it might be a bad idea because AMD is currently
Re: (Score:2)
Other people agree that GPUs are still overpriced like heck.
https://www.eetimes.com/docume... [eetimes.com]
Re: (Score:2)
That article says nothing about GPUs being overpriced. It's a bitching and moaning piece that the current top of the line GPU is the most expensive one ever released ignoring that there are plenty of high performance parts with great prices available, mark-ups have gone with the crypto-bust, and due to current market oversupply getting a high-performance GPU has never been cheaper. Bitching about the high cost of being an early adopter for a new technology that is currently sold out everywhere (implying it
Re: (Score:2)
The CPU manufacturers screwed it up by making buggy processors and, at least in the case of Intel, with little performance benefit over much older processors.
Not at all. The CPU does exactly what it says on the box. They traded security that is almost irrelevant to general consumers in favour of performance and as shown recently that security can happily be regained through software. The CPUs currently on the market do not show any higher level of errata than in the past and they still happily work like they did in the past.
What year is it? (Score:1)
DVD player? I haven't seen one in years. And desktops have only been found in libraries since tablets hit the market.
Not ditching, just not replacing (Score:4, Insightful)
Everything has this cycle, where it gets to the point that what you already have is good enough, and further small tweaks do not justify the cost of replacing.
The biggest difference between D9 and D5 (Score:1)