The Energy Saved By Ditching DVDs Could Power 200,000 Homes

Daniel_Stuckey (2647775) writes "The environmental benefits of streaming a movie (or downloading it) rather than purchasing a DVD are staggering, according to a new U.S. government study by researchers at the Lawrence Berkeley National Laboratory. If all DVDs purchased in 2011 were streamed instead, the energy savings would have been enough to meet the electricity demands of roughly 200,000 households, and it would have cut roughly 2 billion kilograms of carbon emissions. According to the study, published in Environmental Research Letters, even when you take into account cloud storage, data servers, and the streaming device, streaming uses much less energy than purchasing a DVD. If, like me, you're thinking, 'who buys DVDs anymore, anyways?', the answer is 'a lot of people.'" The linked paper is available in full, too, not just an abstract behind a paywall.
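As a rough sanity check on the headline number: assuming an average U.S. household uses on the order of 11,000 kWh of electricity per year (an illustrative figure, not one taken from the study), the claimed savings work out to roughly

$$ 200{,}000 \ \text{households} \times 11{,}000 \ \frac{\text{kWh}}{\text{household} \cdot \text{yr}} \approx 2.2 \times 10^{9} \ \text{kWh/yr} \approx 2.2 \ \text{TWh/yr}. $$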
This discussion has been archived. No new comments can be posted.

  • by Geoffrey.landis ( 926948 ) on Thursday May 29, 2014 @03:39PM (#47122493) Homepage

    If you read the article in detail, the energy cost for a DVD rented or purchased by mail is pretty much identical to that of one streamed (figure 4).

    The purported energy cost difference between DVD and streaming is entirely due to the fact that they assume you drive to the store to buy or rent the DVD. (In fact, there is actually a tiny bit more carbon emitted if you stream instead of rent or buy by mail, if you look at the right image on figure 4).

    I assume that if you buy or rent from a store you were going to visit anyway, this difference vanishes.

  • by Jmc23 ( 2353706 ) on Thursday May 29, 2014 @03:41PM (#47122527) Journal
    Not quite. The only difference seen is with people driving cars to purchase the DVD.

    So all of the 'environmental benefits' boil down to the assumptions they make about those purchases.

    Perhaps it's just me, but I would lean more towards people already being at a store/mall for another purpose and picking up the DVD as an impulse buy. Non-impulse buys of DVDs would seem to more logically take place over the internet.

  • by Russ1642 ( 1087959 ) on Thursday May 29, 2014 @03:54PM (#47122659)

    Don't start all this "can't tell the difference" crap. Until internet lag and stutter are completely eliminated, we'll be able to tell the difference.

  • by wagnerrp ( 1305589 ) on Thursday May 29, 2014 @04:02PM (#47122731)
    All "clouds" must be over the internet. The whole point of "the cloud" is that it is located remotely, on someone else's hardware, managed by someone else's IT staff. Elsewise, it's nothing more than the same data center you had a decade ago.
  • by pla ( 258480 ) on Thursday May 29, 2014 @04:03PM (#47122743) Journal
    Not quite. The only difference seen is with people driving cars to purchase the DVD.

    This - THANK you, someone on Slashdot knows how to read! Hell, you don't even need to read, just look at the pretty chart.

    Physically dragging yourself to the store just to buy or rent a single DVD comes out to more energy used. Every other scenario comes out to less energy, including buying it and having it mailed to you. And if you ignore the salmon-colored portion of each bar (the part that goes toward driving) because, for example, you bought a DVD while out and already at the store getting other stuff, store-bought would actually come out as the most efficient.

    More suspiciously, I find it odd that they dropped the "client device operation" energy consumption by over half for streaming. I don't know about you, but my USB-powered DVD drive draws under 2.5W; my TV draws 80-90W. I'd love to ask the authors what part of streaming magically makes my TV 20x more energy efficient. (A quick back-of-the-envelope on the client-side numbers is sketched after this comment.)

    "This info-tisement brought to you by Netflix and Blockbuster, who really wish you'd quit insisting we stock all these damned physical discs; and by the MPAA, who would like to remind you that you only license the contents of your DVDs, they can still revoke that license any time they want."
  • by mmell ( 832646 ) on Thursday May 29, 2014 @04:14PM (#47122865)
    Most clouds I've worked with to date have been corporate clouds. No internet involved. Networks, yes; but no internet. Lag was never a problem for me in those environments.
  • by Anonymous Coward on Thursday May 29, 2014 @04:15PM (#47122881)

    All "clouds" must be over the internet. The whole point of "the cloud" is that it is located remotely, on someone else's hardware, managed by someone else's IT staff. Elsewise, it's nothing more than the same data center you had a decade ago.

    Not necessarily true. One aspect of the cloud is being able to rapidly expand capacity or relocate workloads based on application needs. "Located remotely, on someone else's hardware, managed by someone else's IT staff" is more like a definition of outsourcing. A cloud can be on my hardware, managed by my staff, and be migrated to or augmented by remote capacity during peak times or special circumstances.

  • by AaronLS ( 1804210 ) on Thursday May 29, 2014 @04:45PM (#47123153)

    You are both idiots for not knowing how to argue. Zeromous, perhaps your point is "Someone can be hosting a cloud locally to support a business/agency. So it can be available over 1gbit LAN with indiscernible latency, or be in a geographically close data center with an interconnect of equivalent bandwidth."

    But you didn't provide any supporting facts, so you're equally ridiculous.

    Zeromous of poor reading comprehension says: "I'm still trying to figure out what you exact beef here is." when Russ had just said "lags and stutters".

    Anyhow, what people host locally is usually not a cloud infrastructure: if you're not doing hosting for third parties, and only serve your own organization, your virtualization needs are met by a simpler cluster architecture. Some call it a cloud infrastructure, but usually it is just a virtualized cluster. Virtualization != cloud. Cloud involves virtualization, but not all virtualization is a cloud. Very much the not-all-black-birds-are-crows kind of thing.

  • by rev0lt ( 1950662 ) on Thursday May 29, 2014 @05:24PM (#47123595)

    Actual KW saved by not running directly on metal, and squeezing every possible resource out of a highly efficient and redundant server.

    On the other hand, many "cloud" services are actually grid services that run on many redundant small servers, in contrast to the blade centers HP and IBM try to shove down your throat. One example is Gmail and the assorted Google services. So, while I understand your point about virtualization, cloud and virtualization are two very different and distinct things.

    It means asset depreciation is much lower, so server churn is much lower (less carbon, less waste less garbage)

    It depends on how you measure it. In pure CPU-power-per-watt terms, 1U servers are way cheaper than an equivalent blade solution, easier to service, and will run cooler. They do take more space, but asset depreciation on a 50K blade cage vs. 30K worth of 1U servers is bigger for the blades.

    every watt is consumed rather than dissipated as heat

    Well, it's not, and this is one of the biggest fallacies of virtualization. It varies wildly according to the workload and your configuration. For small workloads, you may even spend more on hardware to provide proper virtualization than you would have paid for a bare-metal solution. You do gain flexibility, and yes, when well done, you may take more advantage of your hardware, but this is not a novel concept. When possible, solutions like Linux containers, Solaris zones and FreeBSD jails allow at least some level of flexibility with a smaller execution footprint.
    And regarding usage... well, most CPUs even implement an instruction that internally halts the CPU when it's not in use. CPU consumption varies according to the workload, and most specs list max consumption, not average consumption. It may even happen that your beefier setup actually spends more power per VM than single dedicated servers would (a rough sketch of this trade-off follows this comment).

    It means common parts for all servers which leads to less manufacturing waste.

    Yes, but is it cheaper? As an example, almost all industrial processes waste copious amounts of water, even when more sophisticated and reusable replacements are available. But water is cheaper. It's a bit like saying "this aluminium package is 20% smaller, so we can stop using cardboard packaging because it generates less waste". I would like to see proper metrics on that; I'm not sure it is that obvious.
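A minimal sketch of the consolidation trade-off described above; every wattage and utilization figure below is an illustrative assumption, not a measurement, and swapping in different numbers can easily flip which side wins, which is the point:

```python
# Rough power-per-VM comparison: N small dedicated servers vs. one larger
# virtualization host. All wattages and utilizations are illustrative
# assumptions; real numbers vary wildly with workload and configuration.

def server_power(idle_w: float, max_w: float, utilization: float) -> float:
    """Approximate draw, assuming power scales linearly from idle to max."""
    return idle_w + (max_w - idle_w) * utilization

N_WORKLOADS = 8

# Assumption: one small, mostly idle 1U box per workload.
dedicated_each = server_power(idle_w=40.0, max_w=120.0, utilization=0.10)
dedicated_total = N_WORKLOADS * dedicated_each

# Assumption: one beefier host running all 8 workloads as VMs.
host_total = server_power(idle_w=150.0, max_w=450.0, utilization=0.60)

print(f"Dedicated 1U boxes: {dedicated_total:.0f} W total, "
      f"{dedicated_total / N_WORKLOADS:.0f} W per workload")
print(f"Virtualized host:   {host_total:.0f} W total, "
      f"{host_total / N_WORKLOADS:.0f} W per VM")
```

With a mostly idle fleet and a host that idles high, the dedicated boxes can come out ahead; with well-packed workloads the consolidated host usually wins. Power per VM is a function of the workload and configuration, not of virtualization itself.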
