
The Cost of Distributed Client Computing?

ialbert asks: "I only recently decided to install SETI@home on my mostly idle home computer. It got me thinking, though: are those free processor cycles truly free? Has anyone had experience with processors dying prematurely due to a constant, heavy load, or is usage pretty inconsequential? What about other components, like hard drives? And how much does a 100% processor load increase your power bill versus a 1-2% idle load over the course of a year? It's easy to think of idle computers as an untapped computational resource, but what are the costs to the computer owners?"
  • I imagine that processors die mainly through being powered on and off repeatedly, rather than from being fully used. Same for hard drives, I expect.

    I don't imagine it's possible to "wear out" a processor by using it. Course I could be wrong...
    • Well, if you run a processor too hot its lifetime will certainly be reduced. To that extent you can wear it out.
      And since processors get hotter when they're used than when they're idle, you can wear one out by using it intensively. Of course, only if you don't have a good cooling solution (i.e. a well-proportioned cooler).
    • Chips can "wear out". There are physical effects that make atoms move when there is an electric current in a conductor. My understanding is that this is a known problem, and the traces are made thick enough that it will be many years before they start to fail.
      • Re:Wear Out (Score:5, Informative)

        by randyest ( 589159 ) on Wednesday October 15, 2003 @12:06PM (#7220505) Homepage
        Right, and the standard in the ASIC industry is a 40-year lifetime minimum before electromigration will lead to failure in normal use (which means you keep the chip in the allowed operating temperature range, regardless of whether it's overclocked or not). That's 40 years. What hardware were you using 40 years ago?

        Point is, even running chips hot, to a degree (pun not intended), doesn't reduce their lifetime enough to worry about.

        Some of the other points, such as increased power use and accelerated failure of mechanical components such as hard drives, are valid. But chip wear-out is a non-issue -- you'd have to heat your chip past the point of system stability to get the electromigration lifetime down low enough to care about it.
    • by sjwt ( 161428 )
      From what I understand, if you are using an overclocked Intel chip, then yes: as they change the clock speed to suit the load and heat, you may age the chip, but the ageing is only slight.

      AMD chips run the same whether under load or not, so there's no ageing there.

      Most of the damage to chips happens during boot-up, power-down, and spikes and surges.

      [overclockers.com]
      Overclocking's Impact on CPU Life
      • No, both chips run cooler when idle and hotter when loaded. I don't think you know what you're talking about. The Intel CPU throttling won't kick in if you have a heatsink with a fan.
        • by default luser ( 529332 ) on Wednesday October 15, 2003 @02:01PM (#7221650) Journal
          Yes, the grandparent post is incorrect.

          Pentium IV CPUs have an internal temperature diode, just like every Intel chip since the Pentium II Deschutes core (excluding early Celerons).

          Unlike all the chips before it, the Pentium IV will do more than just crash when overheating. It will dynamically reduce its own clock speed to reduce power consumption. But this feature only comes into play when the cooling solution is unable to keep up with the processor (i.e. a dead fan or an extremely hot room), and it will not affect performance under normal conditions.

          What the parent was referring to is the HLT instruction, which will cause the processor to do nothing and reduce power use. Most modern processors support it, and most modern operating systems ( including NT and Linux ) execute these instructions in an idle thread.

          This is basically the crux of this discussion: will your computer run hotter under load than when it's running idle HLT instructions?

          The answer is yes. What this means to you in terms of silicon lifetime is probably beyond the expertise of anyone here on Slashdot, so take every "insight" with a bag of salt.
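          For illustration, here is a minimal sketch of the kind of idle loop being described -- freestanding, kernel-style C with GCC inline assembly (the function name and the bare-metal context are assumptions, not anything from the thread; HLT is a privileged instruction, so only ring-0 code can execute it):

            /* Conceptual kernel idle loop: halt the core until the next
             * interrupt arrives instead of spinning.  Runs only in ring 0. */
            static void idle_loop(void)
            {
                for (;;)
                    __asm__ volatile ("hlt");
            }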
  • what are the costs to the computer owners?

    $4.23

    Next question?
  • Missing the point. (Score:4, Insightful)

    by sg_oneill ( 159032 ) on Wednesday October 15, 2003 @11:50AM (#7220255)
    While it is an interesting question, the reason you donate cycles to SETI/Golomb rulers/cancer research/whatever is that you love science and the progress of humanity.

    It's not about money.

    Or to put it another way: how many CPU cycles are wasted on Pr0n, and how does this help society? :)
    • How many CPU cycles are wasted on Pr0n, and how does this help society?

      You cannot waste CPU cycles on Pr0n.
    • Then I suppose you have no trouble endorsing your paycheck over to your favorite charity, since financial concerns pale in comparison to the humanitarian good that can be done. Or are you missing the point also?

      This person's not missing the point at all. He just doesn't want to donate something he might not be able to afford.

      RP
    • by GoofyBoy ( 44399 ) on Wednesday October 15, 2003 @12:23PM (#7220692) Journal
      >How many CPU cycles are wasted on Pr0n

      I have a computer and Internet connection specifically for pr0n, so my CPU cycles are not "wasted" but "performing their main function".

    • Yeah, but how about the 100,000s of idiots spending $1 million+ in electricity on cracking an idiotic 64-bit key to get a $5,000 prize?

      If they ever finish RC5-72, they will most likely have spent a few years' output of a nuclear power plant on it.

      I really think there are useful projects worth the electricity, but cracking keys where you can calculate exactly how many years you'll need, only to start on the next one, which needs 256 times longer, after finishing?
    • And when they finally find the cure for cancer, they'll suck your significant other DRY selling the drug to him/her. That's what concerns me. I want to know if the knowledge obtained by using free computing power will be free, too. Or at least affordable.
  • In the computer lab at my school, I am about to replace some P5s with brand-new machines, and those lowly P5s have been running Setiathome constantly for at least four years without a hiccup.

    The only way a CPU would die from being "overused" is if it didn't have sufficient cooling or if it was a bad chip in the first place.
    • In the computer lab at my school... have been running Setiathome constantly for at least four years without a hiccup.

      Just make sure you get written permission from someone higher up the organizational chart than you. Then, at least if someone else even higher up gets their nose out of joint for running "unauthorized" software it isn't your butt in a sling.

      And DO get it in writing, not verbal. Even if your boss doesn't backpedal, there is no guarantee they won't leave or retire. Then you are stuck.

    • Having supported ancient machinery, I've found that CPU fans and power supplies tend to go over time.

      But they seem to die based more on age or environmental conditions than on actual use. Assuming the worst, with an MTBF of about 3 years and a replacement cost of $40, the effect is still negligible.

      Seti@home doesn't use the hard drive (unless you are running it on a machine with so little RAM it drops into swap-hell). If your disk goes, it's normal wear and tear.

  • by Anonymous Coward on Wednesday October 15, 2003 @11:51AM (#7220264)
    I used to run a protein-folding application on a spare Athlon I had. I thought it would help advance humanity. Then I discovered that the daemon I was running was spinning my hard drive up and down all the time. Eventually the bearing gave out, and the disk platter came flying out of the case at high speed. It sliced through my cat and embedded itself in the opposite wall. The computer itself then caught fire when the drive motor overheated. It burnt down my entire house and all of its contents, including a twelve-thousand-page thesis I had been working on (that work is classified, so I can't tell you what it was about). I stubbed my toe escaping, and a firefighter died trying to put the fire out.

    Just don't bother is my advice.
    • It sliced through my cat and embedded itself in the opposite wall.

      And all this time I thought the disclaimers on some free software, such as "not responsible if this software causes data loss... erases your hard drive or kills your cat...", were just a joke. lol

  • 100% load (Score:2, Interesting)

    by grub ( 11606 )

    People don't buy a Cray or Origin cluster to have the CPUs sitting at 1% load; they're made to work. If a home PC is properly cooled, I'd hope that it should last to whatever lifetime is spec'd by the manufacturer.
  • No moving parts (Score:2, Insightful)

    by bunnyshooz ( 632195 )
    Since there aren't any moving parts inside the processor, processor load is unlikely to wear it out. It is more likely that a processor will fail due to issues with cooling and from being turned on and off frequently. So keep that Seti@Home going!
  • Power (Score:2, Insightful)

    by jak163 ( 666315 )
    I've noticed a significant difference in my electric bill if I don't use the suspend function on my computer. I don't have the bills in front of me, but maybe $10 a month. I'm using one of the early, power-hungry P-IIs, though.
    • My electric bill differences were much greater, but I live in a desert, had several extraneous machines on, and was using triplehead Xinerama on three older 17" monitors. At this point I'm down to my firewall, fileserver, and a friend's colocated computer as the machines that are always on, and I use my laptop for my computing. The electric bill has dropped by about a third.

      I can see why colocation facilities are popular, and if I had reliable enough hardware I'd probably do something like that too.
    • Re:Power (Score:5, Insightful)

      by Waffle Iron ( 339739 ) on Wednesday October 15, 2003 @12:29PM (#7220749)
      A while back I plugged a variety of appliances into an ammeter to see what they consumed. Here is what I got for a couple of computer systems:

      Dell PIII-550MHz:

      • Idle - 39W
      • Unreal Tournament - 57W
      • Compiler Build - 56W
      • Powered Down - 2W

      Athlon 1800+

      • Idle - 99W
      • Unreal Tournament - 118W
      • Powered Down - 5W

      So my computers seem to use about 20 extra watts under load compared to idle. That would amount to an extra $18/year if the app ran all the time compared to letting the machine idle all the time (@ $.10/kwh).

      However, I usually power my systems off when I'm not using them. If my athlon system is off an average of 16 hours per day vs. running under load, that saves $65 per year.

      My 17-inch CRT monitor used 74 watts. Turning off or suspending that would save a similar amount of money. Altogether, that would be about $10 per month, as you guessed.
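      As a rough cross-check of those figures, here is a minimal sketch of the arithmetic (the 20 W delta and the $0.10/kWh rate are just the assumptions from the post above):

        /* Sketch: annual cost of an extra delta_watts drawn around the clock. */
        #include <stdio.h>

        int main(void)
        {
            const double delta_watts   = 20.0;          /* load minus idle, from the measurements above */
            const double price_per_kwh = 0.10;          /* assumed electricity rate, USD */
            const double hours         = 24.0 * 365.0;  /* running all year */

            double kwh  = delta_watts * hours / 1000.0; /* ~175 kWh */
            double cost = kwh * price_per_kwh;          /* ~$17.52, i.e. roughly $18/year */
            printf("extra cost: $%.2f/year\n", cost);
            return 0;
        }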

  • the math (Score:5, Informative)

    by proj_2501 ( 78149 ) <mkb@ele.uri.edu> on Wednesday October 15, 2003 @11:51AM (#7220277) Journal
    Somebody worked this out when I started the e2 distributed.net team.

    the figures [everything2.com]
    • Wasteful
      Wrong. Back in the halcyon days of RC5-56 and the DES Challenges, computers didn't make a distinction between idling and crunching, so it was a great idea to use those spare cycles for something (remotely) productive. But this is no longer true: modern-day power-sucking CPUs do have circuitry that lets them idle and cool off when the processor is just running NOPs. Thus, keeping a number cruncher running 24 hours a day will stress your processor, requiring full ventilation and running up your power
  • At least not from doing what it's supposed to be doing. They died from excessive heat, or shorts, but never from doing their job...

    Hard drives can't be that stressed by the sort of work the SETI program adds. Not exactly a daily thrashing.
  • Power (Score:5, Informative)

    by DaHat ( 247651 ) on Wednesday October 15, 2003 @11:52AM (#7220284)
    I've found that on my laptop the cost of running seti@home cuts my battery life in half, so when I care about power I make sure to leave it off. However, whenever it's plugged in, it, like the rest of my boxes, is chugging away. When it comes to power costs I don't really care at the moment, as I don't pay for my electricity; it's included with my rent, and believe you me, I make good use of that.

    As for premature death of the CPU, being under heavy load should not hurt it; powering on and off often does far more 'wear and tear'.
    • I learned this the hard way. I had the United Devices cancer research agent running on my laptop. I unfortunately did not realize the power penalty of running this app in the background until I was on an airplane and ran out of battery in
      In terms of wear and tear on your hardware, I would suspect it would be minimal, if you compare it to leaving the machine on running idle.

      BTW, I know SETI paved the way for this technology, but I feel something like UD research [grid.org] has far more potential benefit to society.
      • BTW, I know SETI paved the way for this technology, ...

        SETI might be the best known of the early distributed computing efforts, but it was by no means the first. The DES challenges run by distributed.net came before SETI, as did the RC-48 and part of the RC-56 challenge. Distributed.net's technology was superior to SETI's in many ways back then, too. There were also many other, lesser known efforts underway, such as the Mersenne prime search.

        I think crude distributed computing efforts have a long his

    • Oh come on. The idea that turning your computer off every night hurts it is the argument that fuels our incredible energy waste. Who do you know who has had a computer fail because it was turned on and off too many times? I mean, for how many users would the on/off duty cycle actually have a significant impact?

      turn your computer off unless you have a good reason (like network server function) to keep it on.
    • Re:Power (Score:3, Interesting)

      by NorwBlue ( 711956 )
      Finally, a point where I have some numbers. I used to manage a network for a pharmaceutical company, and we used to run the machines like most people do: turning them on when we needed them and turning them off at night to save power. When we changed the SOP for computer use (on the theory that machines mainly die when being turned on) to keeping the computers permanently on, we had a decrease in service costs of 75%-80%. When these figures were held against the increase in the power bill, we saved a LOT of money (around 200 ma
  • But I wanted to praise the choice of article. I myself have been wondering the same thing. I know that it may not seem like a lot of extra usage, but the hidden costs may be substantial.

    Or my Tin Foil hat is on too tightly.
  • ram drive (Score:5, Interesting)

    by ih8apple ( 607271 ) on Wednesday October 15, 2003 @11:53AM (#7220301)
    Since I figured the cost of the processor running at 100% was insignificant compared to the cost of the hard drive constantly spinning instead of spinning down during downtime, I created a small RAM drive on my various computers where I ran seti@home so that the file access wouldn't affect hard drive usage. This worked equally well on linux and windoze. The only other thing to do was to create startup and shutdown scripts to create the ram drive, copy the files over, and start the process and then to copy off the files before shutdown.
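    For what it's worth, a hedged sketch of what such a startup step amounts to on Linux (the mount point /mnt/setiram and the 16 MB size are made-up examples; the original poster's scripts presumably used plain mount/copy commands, and the Windows side would use a separate RAM-disk driver):

      /* Sketch: create a small tmpfs "RAM drive" via the mount(2) syscall.
       * Equivalent to: mount -t tmpfs -o size=16m tmpfs /mnt/setiram
       * Requires root; Linux-specific. */
      #include <stdio.h>
      #include <sys/mount.h>

      int main(void)
      {
          if (mount("tmpfs", "/mnt/setiram", "tmpfs", 0, "size=16m") != 0) {
              perror("mount");
              return 1;
          }
          /* A startup script would then copy the seti@home state files in and
           * launch the client from /mnt/setiram; a shutdown script copies them
           * back out before unmounting. */
          return 0;
      }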
    • Since I figured the cost of the processor running at 100% was insignificant compared to the cost of the hard drive constantly spinning instead of spinning down during downtime...

      You figured wrong. A hard drive typically uses ~5 watts when spinning, but the difference between a processor idling and one running at 100% can easily be 20-50 watts, depending on the CPU.

    • A spinning HD uses 3-5 watts.
      A modern CPU at full power needs 60-100W; idling, 10-40W.
      Do the math.
  • by eaglebtc ( 303754 ) * on Wednesday October 15, 2003 @11:54AM (#7220316)
    I have a Pentium 4 @ 2.6GHz, overclocked to 3.2GHz. My power strip is plugged into a great little device: the Kill-A-Watt [ccrane.com] wattmeter. I can track my electricity usage over time by Volts, amps, Watts, VA, and it keeps a log of the kWh consumed by a particular device.

    When Folding@Home is turned off, my power consumption for the entire system is 140W. When I activate Folding@Home, the Wattmeter reading jumps to about 190-195W.

    So if you're concerned about electricity usage in your house, then yes, distributed computing sucks more power.

    • Yes, but unless you live in a warm area, you would probably have used a lot of that electricity to warm up your house anyway.

      Actually, if you live in a warm area you have to pay for the power used in the distributed computing twice: first in the computer, and then in the removal process, i.e. air conditioning.

      But most people don't live in an area where they need to run either air conditioning or some form of heating 24/7, so the balance is mixed.
      Distributed computing is not a very efficient use of power since many of the c

      • But the extra costs are distributed on so many individuals that it doesn't matter anyway.

        It is exactly this sort of thinking which leads to large-scale environmental problems, our tendency being to avoid responsibility when the blame is spread thin enough.
  • ...but what are the costs to the computer owners?"

    The costs will be a lot higher if we don't detect and defeat the alien hordes through SETI.

    I hate penny-pinching accountant types.

  • Honestly, at ~100W for a typical 19" CRT, your monitor is probably the highest-powered device in your computer. CPU power draw varies from about 10W or less when idle to 70W or so for a rigorous instruction mix (the Intel Itanium is somewhat anomalous at about 100W when fully exercised). So remembering to turn off your monitor, or at least selecting the low-power mode of your monitor for a screen saver rather than animating useless objects, will probably have the largest effect on your power bill.
  • by greymond ( 539980 ) on Wednesday October 15, 2003 @11:57AM (#7220366) Homepage Journal
    I've been using http://www.distributed.net/ [distributed.net] on and off for a few years now and I've never had a problem with any of my processors. However, I usually upgrade my CPU/MB every 3-4 years, so if you keep your systems longer I'd imagine any burnouts would be due to "just an old CPU" and not to the constant use. Then again, I don't plan on or expect my hardware lasting forever.

    As far as the power bill goes: I currently have a desktop, laptop, wireless router/hub and Zaurus going the majority of the day - at least, the systems are always on, since I am too lazy to turn them off and have no need to. I also live with my girlfriend, who runs the hairdryer every morning and must have every light on in the house to check her makeup. At the end of the month we get our power bill of $45-50, which in my opinion is not a lot. We're also in California, for the record.
  • You will throw out your CPU for being such a slow and out-of-date piece of junk long before it burns out.
  • As far as I can imagine, having the processor under full load all the time wouldn't be too damaging so long as you kept it cooled properly. Heat is the number one source of trouble for me when it comes to maintaining a stable system.

    As for the cost over the course of a year, it would depend on a few factors, namely the particular specifications of your unique system. If you took two identical computers, except you put in different CPUs, and ran both straight for one year in 2 different locations, you would
  • Processors shouldn't have a shorter life due to usage, unless it's because the cooling fan gets F'd. Having to replace the fan more often will cost you a few $ every couple of years.

    For a single (typical) PC, the difference in electrical usage will be a few dollars per year. Given the typical cost of electricity in the U.S., you're only talking somewhere in the $20 - $100 range.

    You're more likely to see it negatively affect your sanity, with those fans running at full output all the time.

  • Some Measurements. (Score:3, Informative)

    by taliver ( 174409 ) on Wednesday October 15, 2003 @11:58AM (#7220383)
    I'm kinda in a position to answer at least one part of this question.

    CPUs, when idle, can use as little as 2-5W. When fully utilized, up to 40-50W (depending on make/model/etc.). So let's assume you have a middle-of-the-road processor that has a difference of 25W between active and idle. (This is consistent with measurements on a PIII 800MHz, which is a little lower than middle of the road.)

    Now, 25 W * 24 hr * 365 days * (1 kW / 1000 W) * $0.10/kWh ≈ $22/year. Roughly $1/year per watt of additional power.

    As far as breaking components goes, as long as the system is cooled properly, I wouldn't think it would be a problem.
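    That "roughly $1/year per watt" rule of thumb generalizes directly with the electricity rate; a small sketch (the two rates are just example values):

      /* Sketch: dollars per year for each extra watt drawn continuously.
       * 8760 hours/year / 1000 W-per-kW = 8.76 kWh per watt-year. */
      #include <stdio.h>

      static double dollars_per_watt_year(double price_per_kwh)
      {
          return 8.76 * price_per_kwh;
      }

      int main(void)
      {
          printf("$%.2f per watt-year at $0.10/kWh\n", dollars_per_watt_year(0.10)); /* ~$0.88 */
          printf("$%.2f per watt-year at $0.15/kWh\n", dollars_per_watt_year(0.15)); /* ~$1.31 */
          return 0;
      }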
  • I run the distributed.net client on all the machines I have running. One has got to be 6 or 7 years old, and dnetc has been running the whole time. That machine is rarely shut down - and it's still going strong.

    I've got a couple of other machines, 1 and 2 years old, running some hot AMD processors; the first runs around 40-45 degrees C, the other 50-55. They also seem to run fine.

    I don't think it'll wear your hardware down unless you've got a really poor cooling solution. As far as hard drives go, the d
  • Energy costs (Score:5, Informative)

    by p7 ( 245321 ) on Wednesday October 15, 2003 @12:03PM (#7220446)
    Check this website for a breakdown of the energy costs.

    http://www.dslreports.com/faq/2404
  • A lot of distributed clients aren't that well behaved. An idle-priority process can still slow a system down immensely, unless it completely shuts itself off when there's activity. I haven't tried seti@home, but I messed with some of the DES & RC5 cracking clients long ago.
  • My APC Back-UPS RS reports how much power is being used. When I'm at 100% CPU vs. ~0%, it's an extra 17 watts.

    For a 19" monitor + P3 + hard drives etc., it's only about 220W total from the UPS (I'm sure much more at peak from a cold start).
  • It's about 50-70 watts on the latest 3GHz PCs. An idling 3GHz Pentium 4 takes about 20 watts and a fully loaded one about 70-90 watts. At 15 cents per kilowatt-hour (that's what NYC pays), that comes out to an extra 21.6 cents/day, or $79 per year, compared to leaving the computer idle all year long.

    So, yes, power is a substantial cost consideration. NYC power is also primarily generated with coal, so every joule of electricity used is that much more CO2 in the atmosphere. On the other hand, if the CPU cycl
  • Remember, thermodynamics tells us that all energy eventually ends up as heat. By using extra energy in your processor when it is under load, you are ultimately just generating more heat in your living space.

    In the winter, you want to heat your apartment. Any heat coming from your computer is less energy you need to use in your heaters. It's 100% "efficient" -- remember that thermodynamic efficiency is measured in terms of how much energy is wasted as heat. But in this case (and what a special case it is!)

  • I think you're grasping at straws here. Yeah, I suppose a processor's life would be shortened by constant use, but its JOB is to compute. I'd be more worried about proper cooling inside the case and specifically on the chip, and about not running the chip above specifications.

    If you're that worried about it, you may as well be running the OS with the LEAST processor usage. If you run Windows, that probably means running Windows 95 or 3.1. Is it that important? I don't think you'd get much more life
  • I've been running the Great Internet Mersenne Prime Search (GIMPS) for the past several years non-stop on several PCs. I leave the machines on 24 hours a day and only shut them down for hardware upgrades and an occasional cleaning.

    The power costs are negligible on a single machine. Run a farm and it can get expensive when you factor in cooling, which is the primary expense. Air conditioning running 24/7 or close to it in a house is far more costly than the consumption of a typical PC.

    The advantage of

  • I've been running dnet on an overclocked Celeron (550 MHz -> 850 MHz) for about two years... no problems. With Windows versions before 2000, your CPU doesn't really "idle" in the sense of shutting down inactive parts (that's why programs like "Rain" and "Waterfall" are popular for the 9x OSes).

    Most modern OSes use HLT instructions to power down inactive parts of the CPU. On such an OS, running a distributed worker like seti, folding, or dnet will make the chip run a little hotter, and probably

  • In the winter, that extra wattage generated by the CPU load will contribute to heating your home. One would hope that your heating system is more efficient than your computer at heating, but you never know.

    Conversely, during the summer if you have air conditioning, you'll have to take into account the extra work that your A/C unit has to do to counteract the heating generated by your computer.
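    To put a rough number on that: an air conditioner removes heat with a coefficient of performance (COP) of around 3, so each watt the PC dissipates costs roughly an extra third of a watt at the A/C. A minimal sketch (the 50 W load delta and COP of 3 are assumptions, not figures from this thread):

      /* Sketch: effective summer power cost of the PC's extra heat,
       * counting the A/C electricity needed to pump that heat back outside. */
      #include <stdio.h>

      int main(void)
      {
          const double pc_watts = 50.0; /* assumed extra draw under load */
          const double cop      = 3.0;  /* assumed A/C coefficient of performance */

          double total_watts = pc_watts * (1.0 + 1.0 / cop);
          printf("effective extra draw in summer: %.1f W\n", total_watts); /* ~66.7 W */
          return 0;
      }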
  • Yes, it costs something extra in electricity with modern processors. There are two ways to approach this for people who think the extra couple of bucks isn't worth the science:

    1. Turn off the outside light you leave on 24 hours a day instead and you'll still end up with a 15 watt savings.

    2. Set the computer to sleep, hibernate, or shut down when you haven't used it for a while. While you're boring the processor to death surfing the web, the machine will be consuming its normal amount of power anyway (for the most
  • Back when the PPro-200 was a neat machine but out of most budgets, I calculated that the cost in electricity alone to run enough 386s to do as many RC5-56 blocks in a year as a single PPro would be more than the cost of a new PPro machine. (Note that soon afterwards the PII came out; at the time, no CPU could touch the PPro for blocks done, though other CPUs were better on a per-clock calculation.)

    So if you have an old machine that you keep solely for the addition to your stats in some project, you may be be

  • Get a multimeter, or one of those units that plugs into the outlet and that your PC then plugs into. Find out what the actual change in wattage draw is. Then write it up, post it to an obscure news website, and submit the article to /. so we can all marvel at the cost of running distributed apps.

    Seriously though, it can be surprising how much different devices draw. My old 19" monitor at work pulled about 100 watts when a "typical" desktop is up (3-4 watts when in power-save mode.) My newer 17" LCD (nearly th

  • Tough one (Score:3, Funny)

    by Mannerism ( 188292 ) <keith-slashdotNO@SPAMspotsoftware.com> on Wednesday October 15, 2003 @02:22PM (#7221920)
    Has anyone had experience with processors dying prematurely due to a constant, heavy load, or is usage pretty inconsequential? What about other components, like hard drives? And how much does a 100% processor load increase your power bill versus a 1-2% idle load over the course of a year?

    Those are all surprisingly complex and computationally intensive questions. In order to find the answer, I'll soon be releasing "@home@home", a distributed application designed to calculate the true cost of itself.
  • by RhettLivingston ( 544140 ) on Wednesday October 15, 2003 @03:08PM (#7222445) Journal

    Many have pointed out that chips essentially don't wear out, but that's only true in a world where every motherboard has a perfect design. In reality, given any motherboard, there will be some bad parts of the design, and the lifetime may indeed be affected by how much it is stressed, especially on boards with a design error in heat dissipation, though under-spec'd drivers can be a big issue too. Also, many use capacitors whose values change after a few years due to chemicals cooking out of them. This is why many of the cheaper motherboards on the market will just stop working or become unreliable after about 3 years. If those motherboards are run hotter for a larger percentage of the time, there will certainly be a reduction in life.

    Even so, the cost amortized over time is still minor. If a motherboard goes bad after 2 years instead of 3, then you've "spent" 1/3 of the lifetime of a $100-or-so component on the task. So, maybe about 34-ish bucks split over 2 years, or 17-ish bucks a year. Not free, but not much money either.
