The Almighty Buck

Energy Supplier Counts Cost of Devices on Standby (bbc.com) 146

UK households could save an average of $183 per year by switching off so-called vampire devices, British Gas research suggests. From a report: These are electronics that drain power even when they are on standby. The figures are based on research conducted on appliances in 2019 but have been updated by British Gas to reflect recent price increases. The Energy Saving Trust (EST) said consumers need to consider which devices they leave switched on. It estimates households would save around $68.5 per year by switching off all their devices when not in use. The organisation, which promotes sustainability and energy efficiency, did not give exact details of how it came to this figure. "Stats or prices related to individual appliances depend on several factors, including model, functionality and individual usage," it said.
This discussion has been archived. No new comments can be posted.

Energy Supplier Counts Cost of Devices on Standby

Comments Filter:
  • by Echoez ( 562950 ) * on Thursday April 28, 2022 @02:33PM (#62487504)

    My guess is that they analyzed how much power is drawn by devices on standby, and assumed you could 100% turn those off in order to achieve these savings. Which means that in the real world, any such savings are far lower.

    When I think of the devices on "standby", I come up with: Wifi router, multiple Rokus, 6 device chargers, Ring video doorbell, alarm clocks, LED lights on devices like the microwave, XBox Series S, printer, etc. I can imagine that if I was super vigilant in unplugging those devices, it could yield some savings. BUT there is a cost: such a process is a giant pain in the ass.

    Honestly, I would pay the $15 a month to NOT care about this, and just continue my current usage pattern.

    It would be better to focus on swapping out our energy PRODUCTION to be clean and sustainable rather than nickel-and-diming our CONSUMPTION.
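
    As a rough sanity check, here's a back-of-the-envelope sum (the standby wattages below are illustrative guesses for the devices listed above, not measurements):

    # Hypothetical standby draws for the devices listed above (guesses, not measurements).
    standby_watts = {
        "wifi_router": 7.0,
        "roku_x2": 2 * 2.0,
        "chargers_x6": 6 * 0.3,
        "ring_doorbell": 4.0,
        "alarm_clocks_x2": 2 * 1.0,
        "microwave_clock": 0.5,
        "xbox_series_s": 10.0,
        "printer": 3.0,
    }
    total_w = sum(standby_watts.values())        # ~32 W continuous
    kwh_per_month = total_w * 24 * 30 / 1000
    cost = kwh_per_month * 0.30                  # assumed ~$0.30/kWh
    print(f"{total_w:.1f} W -> {kwh_per_month:.1f} kWh/month -> ${cost:.2f}/month")

    The total obviously swings with the guesses, but it shows how the per-device numbers add up.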

    • by AmiMoJo ( 196126 )

      They seem to have picked the worst examples too. Modern electronics are covered by EU rules on standby power.

      • Re: (Score:2, Informative)

        Indeed. Most devices in standby mode draw very little power.

        I have a Kill-a-Watt meter [amazon.com] and measured many of my devices. The clock on the microwave draws less than one watt. Same for the standby mode on my TV and laptop charger.

        A simple method to estimate power consumption is to put your hand on the device. If it isn't warm, it isn't drawing much power.

        If you want to save energy, turn down your thermostat.

        • More devices should have physical power switches. Unplugging the device can be inconvenient, depending on where the outlet is. Older TVs had it right.

          • If you're in the UK then CPC do some really good switched plugs. Big switch on the top and a small neon indicator to show they're on. I've fitted them to several devices which have come without an actual on/off switch (TV, Kitchen Radio, DAB tuner, printer):

            https://cpc.farnell.com/pro-el... [farnell.com]

            CPC are always doing offers too so when I bought mine they were around 1.90 GBP each. Well worth it for my use case.

            Obviously if you've got switched sockets you don't need them but I live in an old house and most of th

            • Not in the UK, but reaching for the outlet (which may be behind something etc) to switch it off is almost as inconvenient as just pulling the plug. Which is why, IMO, the switch should be on the device itself.

              • by jabuzz ( 182671 )

                Even more daft given that 99% of UK socket outlets are switched. Typically you only have unswitched sockets on something that is remotely switched and/or that you don't want accidentally turned off (think fridge/freezer).

                If you are plugging them into power strips then just get a power strip with switched outlets rather than chopping plugs off everything.

                • Really 99%? As in there are wall switches in every room to turn almost every outlet in your residence on or off? I don't recall that last time I was in the UK, but maybe I wasn't paying attention. I'm in the US and here in a room that might have 4 wall outlets, there might be a wall switch for one or two of those outlets, but certainly not all of them. And of course if there's a ceiling-mounted light or fan, that will be wall-switched.

                    • Where I live in .au, all power sockets are switched. The switch for each socket is mounted on the same plastic plate as the socket itself. I'm not an electrician but I assume it must be a standard here? Can anyone from .au/nz weigh in?

                    • Ah, yeah I have seen these in Europe as well, although not everywhere. I would say that if I'm reaching down to hit that switch I might as well just unplug it. My issue in my house is that so many outlets are behind furniture I really can't get to them, switch or no switch.

          • More devices should have physical power switches.

            Do you really need an off-switch for the clock on your microwave just to save 0.1 watts?

            Why not just buy a microwave without a clock? There are many available.

            More mechanical switches mean higher costs and more points of failure.

            • STBs (at least the ones I have) typically use more than 0.1W, judging by how warm they are even when "off".
              As for microwaves, the cheaper ones use a mechanical timer and it has a proper switch, just the more expensive ones have a clock and are never completely off.

              • As for microwaves, the cheaper ones use a mechanical timer and it has a proper switch, just the more expensive ones have a clock and are never completely off.

                That's because you've paid five times as much for a device that does exactly the same thing and they have to add all sorts of crap to it to make you think it's worth paying the extra amount.

                I have a (currently) 12-year-old mechanical-timer microwave that I paid $59 for. My neighbours have a $349 non-mechanical microwave of about the same age. Here's how cooking works with them:

                $59 microwave: [twist] Cooking starts.
                $349 microwave: [beep] [boop] [beep] [beep] [boop] [beep] [beep] [boop] argh dammit [boop]

                • My microwave also has a traditional heating element, which I prefer for something like a frozen pizza. The element has failed, so, yeah, now it is pretty much like a cheaper microwave; maybe the defrost function is still useful, though.

                • by piojo ( 995934 )

                  I'm glad that works for you, but I like consistency. 30 seconds is 30 seconds with a digital timer. (And I can do 25 seconds instead if I want a bit less heat than usual.)

                  And it sounds like your neighbors bought a shitty microwave. Shame on companies that make devices with bad user interfaces, and ship without thoroughly testing and iterating.

            • by dbialac ( 320955 )
              Everything you do adds up. Look at what I did in my first post on this thread.
          • by dbialac ( 320955 )
            Get a power strip and use the physical power switch on it. It works and it's easy.
            • Well, I guess I can duct tape the power strip to the device and use a power strip for each device I want to turn on or off. I could just attach a switch to the power cable, though; it would be more convenient than a power strip.
              Or, I can just keep it on and not bother with it, after all, it's not a lot of money.
              My point is that devices should have this type of switch built in so that more people would turn the devices off instead of only the ones who really care about doing it.

              Unplugging the device is inconv

      • In addition, I'm less interested in the parasitic usage and more interested in the difference between keeping them powered 24/7 (with sleep for most of that) vs having all of these fully switched off for a year. Power usage during startup can be higher than in normal operation for some devices, and during that startup time the user can't use the device at all.

        So for example I'd like to see the difference in energy usage over a whole day in 2 scenarios:
        1) TV/cable box has been sleeping for 23 hours and watch

        • by AmiMoJo ( 196126 )

          Cable boxes seem to be terrible for long boot times. Last time I saw a Virgin Media one it took a good 10 minutes to start up and start being responsive. It reminded me of Windows 95, where the desktop would appear but you had to wait another few minutes while stuff loaded before it became useful.

          The other thing to consider is the potential for damage if items are left turned on. If there is lightning or some other electrical issue that causes a voltage spike, and the device is in standby, it might get dama

    • by fermion ( 181285 )
      $6 a month. Most who use air conditioning would see that as an insignificant fluctuation. Rounding error.

      I buy energy star products. For a tv, that means half a watt in standby mode.

    • Comment removed based on user account deletion
      • We're also converting AC power to DC power dozens of places all over our homes. I would think there'd be too much power loss to convert the house to have dedicated 12v and 5v wiring to every room compared to the efficiency gains of only having one AC-DC converter.

        Just under my desk, I probably have 10 power bricks, half 5v and half 12v. Even if it was just one DC power adapter for an outlet, I don't think USB-C is good enough for power distribution yet and there might be more wasted power than the status

        • Re: (Score:2, Interesting)

          by Anonymous Coward

          Tried getting a powered USB hub and only plugging in its wall-wart, not the link to the computer?

          Back when those USB bitcoin mining sticks came out suddenly miners needed lots of USB ports with lots of power, so there were rather beefy 7 port USB hubs that took external power. Haven't seen them in a while, though, but might still be around.

          Distributing DC without needing signalling is not too difficult. If you can solder and do a bit of woodwork for the housing you can make your own. It's just that everyone u

          • How many USB hubs do you know that have multiple USB-C ports implementing the full USB-PD standard with variable voltage and amperage? This would be a new device by necessity.

            Cramming it into a power strip would require something very compact - and more expensive like a GaN circuit board.

          • Low voltage (5V) wiring needs bigger cables than 220V wiring for the same current. The reason is voltage drop. Losing 1V from a 220V line is less than 0.5%, losing 1V from a 5V line is 20%, so the 5V line needs thicker cable despite carrying much less power.
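
            To see the scale of it, a quick sketch (the cable resistance and load current are assumed numbers, purely for illustration):

            # Same copper, same current: the drop in volts is identical,
            # but as a fraction of the supply it is vastly worse at 5 V.
            resistance_ohms = 0.1   # assumed round-trip resistance of the run
            current_a = 2.0         # assumed load current
            drop_v = current_a * resistance_ohms
            for supply_v in (5.0, 220.0):
                pct = 100 * drop_v / supply_v
                print(f"{supply_v:>5.0f} V supply: {drop_v:.2f} V drop = {pct:.2f}% of supply")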

        • This gets me wondering if we should have a whole house inverter that changes 240 volts or 120 volts to something like 48 VDC, and have each outlet negotiate with the device on how much voltage it will get, similar to how USB-C can negotiate up to 48 volts to get a lot of charge on skinny little wires, and be smart enough to tell the difference between a smartphone versus a fork. Having one inverter might save more electricity than a ton of devices doing their own inverting. Plus, it would make battery

          • To avoid having wall-warts plugged into outlets, how about we build wall warts right into the outlets, even the unused ones? That's basically what it would look like. Also, it would require two sets of wiring and two sets of outlets (kettle for example would still need 220V). I think that regular wall-warts are cheaper.

            • Bad idea. Power converters fail more regularly than a power receptacle. After seeing some of the shoddy and non-code compliant work that some homeowners perform, that concept is ill-advised. Much better to have an externally mounted power brick with multiple USB and USB-C ports.
              • Yeah, my previous post was sarcastic. If having a power supply plugged into the wall with nothing connected to it is a problem (wasted power), then having it built into the outlet does not make the problem go away, it makes it worse, because now you can't just unplug it.

          • by tlhIngan ( 30335 ) <slashdot&worf,net> on Thursday April 28, 2022 @08:13PM (#62488170)

            This gets me wondering if we should have a whole house inverter that changes 240 volts or 120 volts to something like 48 VDC, and have each outlet negotiate with the device on how much voltage it will get, similar to how USB-C can negotiate up to 48 volts to get a lot of charge on skinny little wires, and be smart enough to tell the difference between a smartphone versus a fork. Having one inverter might save more electricity than a ton of devices doing their own inverting. Plus, it would make battery backups easier, although electrical switches would have to be beefier due to no zero crossings.

            Lower voltages lead to higher losses. They're called I²R losses: as the voltage goes down, the current goes up, and the losses go up with the square of that current.

            It's why transmission lines run at hundreds of kilovolts - you want to carry as little current as possible to keep the losses as low as possible. Doubling the voltage to get half the current means you lose 1/4 the power.

            The power lines going from the substation to the transformer (pole pig) only carry on the order of 30-50A. Compare that with your house, which may get 200A service. (The voltage to the transformer is typically 7200V phase to ground, 30 times the 240V single phase your house gets, which means 1/30th the current flows and 1/900th the power is lost compared with running at 240V.)

            It adds up.
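
            A minimal sketch of that square law (the line resistance and load are assumed numbers, just to show the ratio):

            # I^2 * R loss for delivering the same power at two voltages.
            line_resistance_ohms = 1.0      # assumed
            power_delivered_w = 48_000      # assumed load, same in both cases
            for volts in (240, 7_200):
                amps = power_delivered_w / volts
                loss_w = amps ** 2 * line_resistance_ohms
                print(f"{volts:>5} V: {amps:6.1f} A, line loss {loss_w:9.1f} W")
            # 30x the voltage -> 1/30th the current -> 1/900th the loss.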

      • They haven't invented a clock yet that only shows the time when you look at it.

        Sure they have- it's called a cell phone.
    • Yeah, apparently this claim has been debunked & speculation points to a tactic to blame consumers for excessive consumption instead of targeting big industrial consumers of electricity that can & should be adopting more efficient practices.
      • by Xenx ( 2211586 )
        The more cognizant the populace is, the more likely they are to encourage the same from businesses. It isn't perfect, but we shouldn't ignore a lesser problem for the presence of a greater. We should be working on both. That said, I do agree that they want this to keep attention off of them.
    • It would be better to focus on swapping out our energy PRODUCTION to be clean and sustainable rather than nickel-and-diming our CONSUMPTION.

      Are we talking just about idle coffee makers, or home insulation? The latter is by no means nickels or dimes, and is more cost-effective than producing more electricity.

      https://theclimateadvisor.com/... [theclimateadvisor.com]

    • I have pretty good branch circuit level metering on my house. Purely parasitic loads while sleeping across 7 Sonos speakers, one multi-port charger, a TV, a UPS, and 4 little wall warts last night came to 277Wh (slept in today, 9h sleep). That's about 100kWh/year.

      In contrast, my CCTV and network consume about 2.2MWh/year. (Cameras and NVR are about half of that.)

      There might be better things to focus on.
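
      For reference, the annualisation is just the following (assuming last night was typical):

      overnight_wh = 277                  # measured parasitic draw over the 9 h of sleep, per the post
      per_year_kwh = overnight_wh * 365 / 1000
      cctv_network_kwh = 2_200            # the 2.2 MWh/year CCTV + network figure above
      print(f"~{per_year_kwh:.0f} kWh/year parasitic, "
            f"{100 * per_year_kwh / cctv_network_kwh:.0f}% of the CCTV/network load")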

      • When most of the world cannot afford a clean source of cooking fuel, the concern about being nagged about parasitic loads that are a mere 5% of the energy use of a "CCTV and network" is a good problem to have.

    • by dbialac ( 320955 )
      I've actually done exactly this. I have LED lights that I got free from the power company and I have power strips that I turn off when I'm not using the devices (TVs, computers charging, etc.). I thought the power company was joking when they provided me with a graph showing my power consumption significantly below the average for the area. I recently had a saleswoman for solar tell me she'd never seen power consumption as low as mine ever and she wondered how I did it. I also knock down the Heat/AC if I'm
    • I have a whole house energy monitor and my data reveals that my vampire loads account for about 2% of my energy usage. I don't leave energy hungry devices plugged in (e.g. a Keurig) that are not in use regularly.

      In that vampire load, I also exclude the water heater and "home server", though I do account for them separately. I replaced my traditional tank water heater (it failed) with a heat pump (no gas hookup available and not enough electrical capacity at home for a tankless) and the energy usage is abo

  • TV and Blu-Ray (Score:5, Insightful)

    by JBMcB ( 73720 ) on Thursday April 28, 2022 @02:35PM (#62487510)

    We can turn off our TV and Blu-Ray player at the power strip. If you turn on the power strip, then turn them on, you have to wait between half a minute and a minute for them to "boot." So I'm not sure how much power you are saving by turning those off. According to our Kill-A-Watt, they use around half a watt-hour a day on standby.

    • Cable box. I believe that's the "damn, it's still booting up?" TV-side grill hardware that no one wants to say aloud for some reason.

      TFA says that consumers could save up to £55/year by switching devices off when not in use, and yet almost half of that estimated savings is coming from one device.

      There's your "vampire".

      And I would agree with you. Kill-A-Watts are great for this kind of stuff.

    • Oh no, a whole minute! Also if your Kill-A-Watt is saying you're using half a watt-hour a day on standby then you're not representative of the devices you find in people's houses. Heck my 2 year old cable box used 20W on "standby". I say past tense because we've been cable free for 2 months now.
      Our 10 year old TV uses about 7W in standby, my hifi about 20W.

      0.5W is what the remote controlled power strip uses.

      • by JBMcB ( 73720 )

        Oh no, a whole minute!

        Right, and for that whole minute the TV is using ~380W of power. So I'm using *more* power turning the TV off at the power strip, unless I'm going on vacation or something and not using the TV at all for a week.

        And, yes, our devices sip electricity on standby, as I put them into "sleep" mode in the settings, which adds a few seconds of startup time instead of a full minute. I'm unaware of a cable box that has a similar setting.
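
        Rough break-even for that trade-off, taking the ~380W boot draw and the ~0.5 Wh/day standby reading above at face value:

        startup_wh = 380 * (1 / 60)       # ~380 W for about a minute
        standby_wh_per_day = 0.5          # Kill-A-Watt standby reading
        days = startup_wh / standby_wh_per_day
        print(f"one cold start ~= {startup_wh:.1f} Wh ~= {days:.0f} days of standby")

        In other words, on those numbers, cutting the power only pays off for absences on the order of a couple of weeks.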

  • by Casandro ( 751346 ) on Thursday April 28, 2022 @02:48PM (#62487538)

    Essentially your device must only consume up to half a watt when in standby. However once it's providing some service (e.g. displaying a clock) it may consume more.

    However the days when a TV-set would consume 30 Watts in standby are long gone.

    • However the days when a TV-set would consume 30 Watts in standby are long gone.

      Those TVs however are not long-gone. We bought our TV in 2011 and there are no signs that it needs replacing anytime soon. I think my father's TV is 5 years older than that.

      The standby laws only came in in 2013. My brand new hifi has a "sleep mode"; the manual is very clear that it is not to be called "standby", precisely because it's not considered a standby mode under the EU regulations.

      That's the beauty of the EU regulations. You need to switch into a "low power mode", but only the ones you call "st

  • by devslash0 ( 4203435 ) on Thursday April 28, 2022 @02:53PM (#62487554)

    Firstly, the Watt rates per hour they quoted don't meet any modern standards.

    For example, most TVs are said to be anywhere between 1W and 3W on standby these days. Let's say that a TV runs on standby for the full year and consumes an average of 2W per hour. That is 2W * 24h * 365.25 days = 17.52 kWh. Current cap on the electricity unit rate in the UK is set at £0.2834. This gives us £4.97 for an average modern TV, not £24.61!
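
    The arithmetic, spelled out (using the 2W average and the quoted cap rate):

    standby_w = 2.0                                    # assumed average standby draw
    kwh_per_year = standby_w * 24 * 365.25 / 1000      # ~17.5 kWh
    price_gbp_per_kwh = 0.2834                         # UK cap rate quoted above
    print(f"{kwh_per_year:.2f} kWh/year -> GBP {kwh_per_year * price_gbp_per_kwh:.2f}/year")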

    They are talking completely out of their bums about overcharging and battery wear. Firstly, electronic controllers on modern phones and laptops stop the supply of electricity when full. Secondly, the logical battery capacity that the user sees is usually lower than the physical capacity, capped so that even charging the battery to the maximum doesn't put it in a state of maximum charge, which would cause excessive wear.

    • most TVs are said to be anywhere between 1W and 3W on standby these days

      My 4-year old LG 43-inch TV uses between 0.1 and 0.2W on standby (so little the meter struggles to settle). That's another order of magnitude smaller still.

      • Of course it does; the EU has mandated a maximum of 0.5W for a TV on standby since 2013. Problem is, not everyone has a 4 year old TV.

    • Not "watts per hour", but watts. Watts per hour is a measure of how fast your power consumption (or more commonly, generation) is changing.

      If your TV's power consumption increased at 2 watts per hour for a year it would be using 2W/h * 24h * 365d = 17.5 kW at the end of the year, which is a lot!

      • You multiplied three quantities with time units and got a result without a time unit?

        17kWh/year isn't that much, in the US that costs you 50c-$2 depending on where you live.

        • Oop, that's what I get for copying the GP's format. That should just be "1 year" (or "365 * 24h") rather than "24h * 365d".

          17.5 kWh/year is fine, but 17.5 kW for a TV is not. That's about how much two electric showers pull while running, or ten fan heaters. 17.5 kW for a month would be almost 13,000 kWh. It's a bit much for a TV in standby.
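
          The unit bookkeeping, for anyone following along (a sketch of the two different quantities being conflated):

          # A constant 2 W draw for a year is an amount of ENERGY:
          energy_kwh = 2.0 * 24 * 365 / 1000
          print(f"constant 2 W for a year = {energy_kwh:.1f} kWh")
          # A 2 W/h ramp for a year ends at an amount of POWER:
          final_kw = 2.0 * 24 * 365 / 1000
          print(f"ramping at 2 W/h for a year ends at {final_kw:.1f} kW")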

    • these days.

      What do you define as "these days" and have you taken a survey of how old some TVs are? TVs on sale in Europe are actually capped at 0.5W "these days" (and all days since 2013), but the reality is that there are many old TVs still in active use.

      • I have several older LCDs that predate any mandates about standby power and they fail to register on the watt meter I have, indicating that they must be below 1W. I haven't worried about standby draw since the days when CRTs kept the tube warm for a faster startup time. I have my doubts that there are many TVs in common usage that use meaningful amounts of electricity in standby. Set top boxes on the other hand I can believe, as they don't just stay "warm", they are in active use keeping track of the schedule,
  • These devices that use electricity turn it into heat. If you have thermostats in your house, that heat will reduce the energy consumed by your central heating. Of course if you have air conditioning it will increase the amount of energy used by that in summer, but almost no-one in the UK has air conditioning.
    • If you have perfect convection and don't have warm spots near those devices. I don't think that happens much in the real world, though.

    • If you have thermostats in your house, that heat will reduce the energy consumed by your central heating.

      That is a very inefficient way to heat a house. Carnot cycle and transmission losses mean you only get 40% or less of the heat energy in the gas burned to generate the electricity.
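
      In cost terms (which roughly bake in those generation and transmission losses), the gap looks something like this, with assumed UK-ish unit prices:

      elec_gbp_per_kwh = 0.28     # assumed electricity unit rate
      gas_gbp_per_kwh = 0.07      # assumed gas unit rate
      boiler_efficiency = 0.90    # assumed condensing boiler
      print(f"1 kWh of heat, resistive electric: GBP {elec_gbp_per_kwh / 1.0:.2f}")
      print(f"1 kWh of heat, gas boiler: GBP {gas_gbp_per_kwh / boiler_efficiency:.2f}")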

    • by jabuzz ( 182671 )

      We have the physics winner here. Same goes for energy saving bulbs. Mostly the bulbs are on when it's dark, which is mostly in winter, and in the UK you generally have your thermostatically controlled central heating on in winter. If you have gas central heating it might be cheaper, but then I am in Scotland, so 96% of all my electrons are low/zero carbon, which is better.

      The fuck wits that wrote this article need to go back and do some elementary thermodynamics first.

  • by Ormy ( 1430821 ) on Thursday April 28, 2022 @03:13PM (#62487590)
    My electricity prices have doubled in recent months so I've been going around finding out how much power my devices use when in use and in standby. I plugged my entire AV system through a single power meter which includes a 55" LG OLED, decent spec gaming PC, Denon AVR (from 2011) and subwoofer amplifier (750Wrms x4 channels class D), the latter two are usually plugged in to their own outlets. With the PC in sleep, and everything else on standby it was all together drawing 2.2W. What shocked me is how much the amplifiers were using when turned on but not playing any sound, nearly 50W each. And of course when gaming with the sound and the subs blasting it's drawing over a kW but it's going to good use. Wifi router is using around 7W, clock on the microwave less than 1W. Single data point I know but there it is.
    • What shocked me is how much the amplifiers were using when turned on but not playing any sound, nearly 50W each.

      That is shocking.... ly low for an amplifier. Be happy you don't have a Class-A amplifier design. My 100W/ch amplifiers pulled 800W each from the wall before I replaced them a few years back. High-end audio is overwhelmingly to this day typified by woefully inefficient designs.

      It's only recently (past 10 odd years) that highly efficient Class-D amplifiers started finding their way into the premium audio world.

      You'll find a lot of devices these days draw less than 0.5W on standby to meet EU regulations, but

    • by Jamlad ( 3436419 )
      I have two comments: why is your decent specced PC in sleep mode? Boot sequence is so fast these days that it's almost negligible.

      Also, personal preference, but I grew up at a time when it was advised to unplug devices during a thunderstorm to prevent electrical surges destroying them. I even remember the power going out a handful of times during a storm when I was a kid.

      Obviously, the grid is a lot more robust these days due to semiconductors but I still turn off all of my expensive toys with the power

  • Cable boxes were historically among the worst of devices. They would consume a good amount of power, and turning them off simply shut off the picture output without reducing power consumption at all. I'm sure there are still a ton of those out there, despite the shift towards streaming. I figure the big consumer in our house would be the computers that are left on. Even with everything "off," the house is still typically pulling 0.4kW.

    • One very widely deployed 1080 HD settop with DVR, from around 18 years ago, turned off the HDMI output and the front panel display when "Off". When off it was still running CPU, memory, tuner, two demodulators, and an upstream burst modulator, and drawing around 40 W. I don't recall if it powered the DVR hard drive down when not in use.
  • I know things went a little sideways when the UK left the EU but when did we switch our currency to dollars?
  • by Hans Lehmann ( 571625 ) on Thursday April 28, 2022 @05:24PM (#62487932)
    This "study" was done by a company with a vested interest in you using more gas and less electricity. I'm going to assume that every other word is a lie.
    • This "study" was done by a company with a vested interest in you using more gas and less electricity. I'm going to assume that every other word is a lie.

      Here's some factoids for you:
      a) Electronic components don't run on gas, so turning them off doesn't help your conspiracy to increase gas use.
      b) Gas makes up a significant portion of electricity generation, so turning devices off would outright contradict your conspiracy to increase gas use.
      c) There's a push to get massive reduction on gas use right now so that the electricity prices get under control again (because of the aforementioned link between gas and electricity). Companies like British Gas are losing money due to

    • by mjwx ( 966435 )

      This "study" was done by a company with a vested interest in you using more gas and less electricity. I'm going to assume that every other word is a lie.

      You do know that British Gas is also an electricity supplier?

      They will give you tips on saving gas too.

  • As many commenters have pointed out, there are some serious holes in the calculations made to derive this saving.
    They seem to be based on ancient appliances, rather than modern ones.

    The fact is, if you want to really save on energy use, there are just so many better ways to do it:

    1. Only run dishwasher when full
    2. Wash clothes less often, use the eco settings (even though they take longer, they use less power)
    3. Avoid the tumble drier - in winter, make use of radiators for drying
    4. Get into the habit of turni

    • 3. Avoid the tumble drier - in winter, make use of radiators for drying

      You may well end up spending all of your savings on mould and rot treatments if you do this too much.

        • Hey, it's the UK we're talking about. They use hot water bottles while sleeping, not to keep warm, but to weigh down the covers so they don't fly off in the cold draft. I don't think mould is going to be an issue.

  • It's very unfortunate that it quotes annual prices in British pounds, though. It really should have annual kWh, since people in different countries pay a different price. And even UK utility customers will not pay the same per kWh rate over time.

    I live in a very large home, with hundreds of plug-in devices, and dozens of hardwired appliances as well. I have been slowly measuring the power consumption for each plug-in device. Not using a Kill-a-watt, mind you, but using smartplugs with energy metering, speci
