
Digging Into the Electrical Cost of PC Gaming

timothy posted more than 2 years ago | from the with-compound-interest-you'll-die-rich-in-nazeroth dept.


New submitter MBAFK writes "My coworker Geoff and I have been taking power meters home to see what the true cost of PC gaming is. Not just the outlay for hardware and software, but what the day-to-day costs really are. If you assume a 20-hour-a-week habit, and using $0.11 a kWh, actually playing costs Geoff $30.83 a year. If Geoff turns his PC off when he is not using it, he could save $66 a year."
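As a sanity check on those numbers, here is a minimal sketch of the arithmetic. The ~270 W draw is an assumption inferred from the quoted cost; the submission doesn't state the rig's actual wattage.

```python
# Back-of-the-envelope check of the summary's figures. The 270 W load is
# an assumption reverse-engineered from the stated $30.83/year.
RATE = 0.11               # dollars per kWh, from the summary
HOURS_PER_YEAR = 20 * 52  # a 20-hour-a-week habit

def annual_cost(watts, hours=HOURS_PER_YEAR, rate=RATE):
    """Dollars per year to run a load of `watts` for `hours` hours."""
    return watts / 1000 * hours * rate

# A rig drawing roughly 270 W while gaming lands near the quoted $30.83.
print(round(annual_cost(270), 2))
```

At this rate a full kilowatt run for those same hours would cost about $114 a year, which is why the idle-time savings ($66) can exceed the gaming cost itself.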


You know... (-1)

Anonymous Coward | more than 2 years ago | (#40141921)

Some idiot in Cali was complaining about his power rates. Mine are a third of what his are. He said that I have no right to complain. Mine are still high. Is he right?

What do you _mean_ by "too high"? (3, Insightful)

Anonymous Coward | more than 2 years ago | (#40142007)

Everyone always has a right to complain, but some people's complaints are silly and make me think they're idiots, or to put it nicely, their personality is generously infused with irony.

I can't say whether or not you're an idiot, though, because you merely said "too high" rather than explaining why you think your rates are "too high" -- you might have good reasons which expose corruption in your state's PRC, or you might have amazingly stupid and arrogant reasons, based on arbitrarily saying things without thinking hard about them, and where even those shallow thoughts are founded completely on a lack of information and evidence.

So who knows? You didn't even give numbers for "too high" (which wouldn't tell the whole story either, but would probably bias me one way or the other).

Re:You know... (-1, Offtopic)

cheekyjohnson (1873388) | more than 2 years ago | (#40142021)

Absolutely. After all, if your situation could be worse, that means that your situation isn't bad! For instance, if I were to punch you in the face, that wouldn't be a bad thing because getting murdered is far worse!

Money well spent (0)

Anonymous Coward | more than 2 years ago | (#40141935)

If that is all you are spending, then it is money well spent. There are more expensive alternatives.

Re:Money well spent (4, Funny)

HeLLFiRe1151 (743468) | more than 2 years ago | (#40141971)

Meanwhile, Geoff and his coworker discuss these types of atrocities at their daily meeting at Starbucks while sipping their third espresso.

Re:Money well spent (4, Funny)

Iniamyen (2440798) | more than 2 years ago | (#40142037)

If they were using Free Trade Watts, it wouldn't be an issue.

Re:Money well spent (1)

polar red (215081) | more than 2 years ago | (#40142403)

I suspect you mean Market Economy Watts. The term "market economy", as used in economic "science" (a term used very liberally here), is so far removed from reality that it does not apply.

PC gaming? (3, Insightful)

MsWhich (2640815) | more than 2 years ago | (#40141951)

I'm not sure how this has anything to do with the cost of PC gaming, considering that my mother, who only uses her computer for Facebook and TurboTax, could see the exact same benefits by doing the exact same things the article suggests.

Re:PC gaming? (4, Informative)

0123456 (636235) | more than 2 years ago | (#40141977)

Running PC games can easily take 300-500W with a high-end graphics card. Posting on Facebook probably uses 30-50W on a modern desktop PC (plus whatever the monitor uses in both cases).

Re:PC gaming? (3, Funny)

Anonymous Coward | more than 2 years ago | (#40142003)

I can't play video games anymore since I'm running a Bitcoin mining operation with my graphics card. It's pretty expensive to run.

Re:PC gaming? (3, Funny)

BigSlowTarget (325940) | more than 2 years ago | (#40143683)

Sure you can still play. You just feed in quarters at double or triple the rate of the rest of us.

Re:PC gaming? (0)

Anonymous Coward | more than 2 years ago | (#40142145)

Actually, a typical modern desktop uses more like 60-150 W while browsing. My desktop (a not-that-new Phenom II X4 rig) uses about 90 W (the computer itself only) when idling. Maybe a new Core i3 could get down to 50 W or maybe even 30 W, but that's not something everyone has around here. There are not that many Phenoms either, but something older. Laptops could easily be in the, let's say, 20-40 W category. But I'll wait for some Raspberry Pi owner to post his numbers.

BTW, that desktop of mine uses around 250 W while playing Minecraft. Some other 3D games I tried used similar amounts of power.
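Those figures (90 W idle, ~250 W in Minecraft) make it easy to put a dollar value on the gaming itself, separate from just having the box on. A rough sketch, assuming the same $0.11/kWh rate used in the summary:

```python
# Cost of the *difference* between idling and gaming, using the wattages
# quoted above (90 W idle, ~250 W in Minecraft). The $0.11/kWh rate is
# an assumption carried over from the story summary.
RATE = 0.11  # $/kWh, assumed

def yearly_delta(idle_w, load_w, hours_per_week=20, rate=RATE):
    """Extra dollars per year attributable to the load itself."""
    return (load_w - idle_w) / 1000 * hours_per_week * 52 * rate

print(round(yearly_delta(90, 250), 2))
```

That works out to roughly $18/year of marginal cost for the gaming; the other ~$10 of the quoted gaming cost is just the machine being on at all.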

Re:PC gaming? (1)

0123456 (636235) | more than 2 years ago | (#40142401)

Actually typical modern desktop uses more like 60-150 W while browsing.

Only if it's a Phenom or has a discrete graphics card. My Athlon X2 uses about 50W when CPU usage is low, as does my i5 (both using integrated GPUs).

Re:PC gaming? (-1, Flamebait)

Dishevel (1105119) | more than 2 years ago | (#40142611)

I thought integrated GPUs were for servers and third world countries.

Re:PC gaming? (0)

Anonymous Coward | more than 2 years ago | (#40143197)

and macs

Re:PC gaming? (3, Insightful)

Phreakiture (547094) | more than 2 years ago | (#40143535)

It's a pity there isn't a -1 Snobbery moderation option.

Re:PC gaming? (0)

Anonymous Coward | more than 2 years ago | (#40143579)

And I thought desktop computers were for poors and call centers.

Re:PC gaming? (1)

Phreakiture (547094) | more than 2 years ago | (#40143701)

Actual data that I've taken from my home server:

Idle: 53W.

One core at 100%: 73W

Both cores at 100%: 93W

These are measured at the AC plug.

I didn't measure differentials for optical disc activity (DVD burner was idle when testing) or for high levels of disc activity (disc was spinning, but not being actively used during testing) but the thing that stands out to me is that the background power usage of this machine is larger than the differential caused by CPU utilization.

I also can't help but notice that CPU clock scaling doesn't seem to contribute to energy savings on this particular machine, else the differential between idle and 1 core busy would be larger than the differential between 1 core and 2 cores busy, owing to the fact that the cores are clocked together.

The machine is an Athlon X2, 2600 MHz. The motherboard has an integrated nVidia GPU. 1.5 TB Seagate HDD (7200 RPM), cheap-ass Optiarc DVD burner (don't get one, seriously), 6 GiB of RAM. Power supply is a 450W Corsair.
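The numbers in the comment above make the poster's point directly: each busy core adds about 20 W, while the idle baseline is more than half of the full-load draw. A quick check:

```python
# Measurements quoted above: 53 W idle, 73 W with one core busy,
# 93 W with both cores busy, all taken at the AC plug.
measurements = {0: 53, 1: 73, 2: 93}  # busy cores -> watts at the wall

per_core = (measurements[2] - measurements[0]) / 2  # watts per busy core
baseline_share = measurements[0] / measurements[2]  # idle share of max draw

print(per_core, round(baseline_share, 2))
```

With a clean 20 W per core and the idle baseline at ~57% of the full-load figure, background draw really does dominate, as the poster observes.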

Many PC's are power pigs but they don't need to be (1)

Frank T. Lofaro Jr. (142215) | more than 2 years ago | (#40143783)

I've got a desktop which runs in the 30's for wattage while doing low CPU consuming tasks like browsing, and never reaches 70 even at full load, and gets down to the high 20's when completely idle.

Re:PC gaming? (1)

MsWhich (2640815) | more than 2 years ago | (#40142347)

Yes, but the article was specifically about saving money by turning your computer off when you're not using it. Sure, a high-end gaming system is going to draw more power even when idle than a crappy underpowered and outdated system (like my mom's) but I don't think the difference is going to be significant enough to make the claim that this is something that will specifically help PC gamers. Everyone can save money by turning the machine off when not in use, whether you're a gamer or not. (Although for my money -- pun intended -- the savings isn't enough to justify the annoyance in having to boot the machine every time I sit down to use it. I do put mine into sleep mode, but that's as far as I'm willing to go.)

Re:PC gaming? (3, Informative)

0123456 (636235) | more than 2 years ago | (#40142427)

Yes, but the article was specifically about saving money by turning your computer off when you're not using it.

A high-end gaming GPU might use 50-100W when rendering the desktop. Integrated graphics... don't.

Re:PC gaming? (1)

fuzzyfuzzyfungus (1223518) | more than 2 years ago | (#40143687)

There are definitely some offenders out there; but contemporary GPUs are increasingly good at cutting back when they aren't needed. Laptop OEMs won't touch an architecture if it will utterly toast the battery just to dump the desktop to the screen, and desktop cards(while their maximum draw seems to edge ever upward) have inherited a similarly parsimonious lower end.

Re:PC gaming? (4, Interesting)

MBAFK (769131) | more than 2 years ago | (#40142469)

Sleep on a modern machine is pretty damn good. On my main gaming PC if you turn off the monitor and sleep the system it uses 3.18 watts. If you turn the machine off rather than sleep you use 2.92 watts.
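Taking those measurements at face value, the sleep-versus-off gap is almost nothing over a year. A quick sketch, assuming $0.11/kWh:

```python
# Yearly cost of the 0.26 W gap between sleep (3.18 W) and off (2.92 W),
# at an assumed $0.11/kWh rate, left plugged in all year.
HOURS_PER_YEAR = 24 * 365

def standby_cost(watts, rate=0.11):
    """Dollars per year for a constant standby draw of `watts`."""
    return watts / 1000 * HOURS_PER_YEAR * rate

gap = standby_cost(3.18) - standby_cost(2.92)
print(round(gap, 2))
```

About a quarter per year, i.e. the choice between sleep and a full shutdown is essentially free on this hardware.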

Re:PC gaming? (1)

dc29A (636871) | more than 2 years ago | (#40142473)

Sure, a high-end gaming system is going to draw more power even when idle than a crappy underpowered and outdated system (like my mom's)

The current generation of video cards, even the high end, draws maybe 2-3 watts when idle. The Ivy Bridge CPUs are rated at 77 W TDP, and at idle they consume peanuts. With a good PSU and SSD, I seriously doubt these systems will draw more power at idle than a crappy outdated system.

Re:PC gaming? (0)

Anonymous Coward | more than 2 years ago | (#40143839)

Yeah, just like the joke:

X: I saved $ bucks today while running after the bus.
Y: You schmuck, you could have saved $$ bucks if you were running after a cab.

It's not money saved. It's merely preserving Earth's resources. Pull the plug on your desktop and try a console (the Wii, some say, is the eco-choice for the power bill).

Re:PC gaming? (1)

MBAFK (769131) | more than 2 years ago | (#40142383)

I actually looked at this when I had the power meter out. Playing Spider Solitaire takes about 102 watts on the same machine that needed 157 to play Dawn of War 2. That machine idles at 100 watts.

Re:PC gaming? (1)

gmack (197796) | more than 2 years ago | (#40143777)

It really would have been helpful to know what hardware you tested on. I get that the CPU and GPU both likely downclock when idle, but does it have a hard drive that spins down (WD Green and friends) when not in use? Does it have an SSD? What monitor did it have? Was it an LED backlight or one of the older ones that use more power?

I also can't imagine anyone not setting their monitor to power off when idle.

Re:PC gaming? (-1)

Anonymous Coward | more than 2 years ago | (#40142395)

Sure, if your PC sucks ass. 5760x1080 on 3 screens with SLI FTW (900 watts). Real gaming costs $103/year if you're cutting yourself back to 20 hours/week.

Re:PC gaming? (1)

Sloppy (14984) | more than 2 years ago | (#40142125)

If your mother only uses her computer for Facebook and Turbotax but draws 100W while idle, then your mother needs building advice. Nudge her into moving to Ivy Bridge Core i3 (and use the integrated graphics; don't add graphics card) when they come out in a couple months.

(Actually if that's all she does, maybe even an Atom or Bobcat system will be enough, but in 2012 I don't recommend going that way.)

Re:PC gaming? (1)

MagusSlurpy (592575) | more than 2 years ago | (#40142223)

So spend $500 on a new PC to save $40/year on electricity?

Re:PC gaming? (0)

Anonymous Coward | more than 2 years ago | (#40142913)

Can't believe you missed the obvious analogy about people who buy new cars because they have better MPG. You must be new here.

Re:PC gaming? (4, Interesting)

TheLink (130905) | more than 2 years ago | (#40143261)

Yeah every now and then Slashdot has these silly articles about PC power consumption, "kill a watt" etc.

The power consumption of modern PCs (post P4) has gone down to a level where most home users would usually be better off looking for savings in other areas. Driving more efficiently, not using as much cooling/heating (and making it more efficient - insulation, sealing etc).

As for gaming, sure a high powered gaming rig will use a few hundred watts (and usually less if you're not doing SLI). But that's far from the most energy hungry way of having fun. Your hobby could be drag racing, or hiking/rock climbing somewhere that requires a 1 hour drive, or even baking cakes. FWIW even cycling and other sports might be more energy hungry if you replace the calories burnt by eating more of stuff that requires a fair bit of energy to produce ( e.g. US corn fed beef).

From various sources:
1 pound of beef = 13-15 pounds of CO2
1 kWh = 2.3 pounds of CO2
so 1 pound of beef = 5.6-6.5 kWh

So if all that exercise makes you eat an additional half pound of beef (400kcal), that's about the equivalent of running a 300W gaming rig + monitor for 9 to 10 hours.

In contrast 1 pound of chicken = 1.1 pounds of CO2.

I've even seen many people here who say they still prefer to use incandescent lighting. It doesn't take that many bulbs to use as much as a gaming rig, even fewer for a facebook/browsing PC/notebook. A single fluorescent tube lamp uses about 40W already.
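The CO2 arithmetic in the comment above can be re-derived from its own quoted figures (the figures themselves are the commenter's, not verified here):

```python
# Re-deriving the comment's numbers: 13-15 lb CO2 per lb of (corn-fed)
# beef, 2.3 lb CO2 per kWh of electricity, and a 300 W gaming rig.
CO2_PER_KWH = 2.3  # pounds of CO2 per kWh, as quoted in the comment

def beef_kwh(pounds, co2_per_lb):
    """kWh whose generation emits as much CO2 as `pounds` of beef."""
    return pounds * co2_per_lb / CO2_PER_KWH

low, high = beef_kwh(0.5, 13), beef_kwh(0.5, 15)
rig_hours = (low / 0.3, high / 0.3)  # hours of a 300 W rig + monitor
print(round(low, 2), round(high, 2), [round(h, 1) for h in rig_hours])
```

Half a pound of beef comes out to roughly 2.8-3.3 kWh, or about 9-11 hours of the 300 W rig, which matches the comment's "9 to 10 hours" claim to within rounding.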

Re:PC gaming? (-1)

Anonymous Coward | more than 2 years ago | (#40143359)

> "building advice"

The facebook crowd switched over to laptops years ago. Please leave normal people out of your turbo-button beige clone dystopia.

Do you know what has really great power efficiency? An iPad.

Re:PC gaming? (0)

Anonymous Coward | more than 2 years ago | (#40143415)

If you just move into a cave, think how much money you will save. lol

Kill-a-watt meter (4, Interesting)

stevegee58 (1179505) | more than 2 years ago | (#40141965)

I bought a kill-a-watt meter a while back when I started dabbling in Bitcoin mining and it was a real eye-opener.

It's a very similar problem to OP's situation since Bitcoin mining and gaming both use high performance video cards.

Re:Kill-a-watt meter (1)

thePowerOfGrayskull (905905) | more than 2 years ago | (#40142011)

I bought a kill-a-watt meter a while back when I started dabbling in Bitcoin mining and it was a real eye-opener.

It's a very similar problem to OP's situation since Bitcoin mining and gaming both use high performance video cards.

You can't say that and leave us hanging - did it cost more in electricity than you gained by mining bitcoins?

Re:Kill-a-watt meter (0)

Anonymous Coward | more than 2 years ago | (#40142075)

>> did it cost more in electricity than you gained by mining bitcoins?

The power company charged a bit more.

Re:Kill-a-watt meter (1)

ArsenneLupin (766289) | more than 2 years ago | (#40143645)

But wouldn't that depend on the bitcoin exchange rate... which varied quite a bit during the last couple of months...

Re:Kill-a-watt meter (3, Interesting)

Anonymous Coward | more than 2 years ago | (#40142193)

Back when BTC was above $8 and you were using modern Radeon cards, it was roughly break-even. Now, if this is in a room that needs to be air conditioned, I would ballpark triple the energy costs. I decided it wasn't worth it unless it was winter.

Re:Kill-a-watt meter (1)

gumbi west (610122) | more than 2 years ago | (#40143825)

A/Cs have an efficiency of about 3 (3 watts of cooling for every watt of electricity), so you need only add about 30% to the figure when adding in A/C.
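That COP argument in one line: with a coefficient of performance of 3, the A/C adds about a third of the load's own consumption, not a tripling.

```python
# With a coefficient of performance (COP) of 3, the A/C moves 3 watts of
# heat per watt of electricity, so removing a load's heat costs load/3.
def total_with_ac(load_kwh, cop=3.0):
    """Electricity for the load plus the A/C needed to remove its heat."""
    return load_kwh * (1 + 1 / cop)

print(round(total_with_ac(1.0), 3))
```

So each kWh of computing in an air-conditioned room costs about 1.33 kWh total, the ~30% surcharge the comment describes.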

Re:Kill-a-watt meter (2)

stevegee58 (1179505) | more than 2 years ago | (#40143141)

Even at $5/BTC I'm still profitable with electricity at $0.07/kWh.
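For readers wanting to reproduce that kind of break-even claim, here is a minimal sketch. The coins-per-day figure below is purely hypothetical; real yield depends on hash rate and network difficulty, neither of which is given in the thread.

```python
# Minimal mining break-even sketch matching the shape of the claim above.
# The 0.15 BTC/day yield is a made-up illustration, NOT a real figure.
def mining_margin(coins_per_day, btc_price, watts, rate_per_kwh):
    """Dollars of profit per day: mining revenue minus electricity."""
    revenue = coins_per_day * btc_price
    power_cost = watts / 1000 * 24 * rate_per_kwh
    return revenue - power_cost

# e.g. a 300 W card earning a hypothetical 0.15 BTC/day at $5/BTC,
# on $0.07/kWh power:
print(round(mining_margin(0.15, 5.0, 300, 0.07), 3))
```

Under those illustrative assumptions the card nets about 25 cents a day; halve the coin price or double the power rate and it goes negative, which is why the grandparent's answer swings with the exchange rate.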

Re:Kill-a-watt meter (1)

Rogerborg (306625) | more than 2 years ago | (#40142217)

Did you factor the cost of the meter into your calculation? I'm not sure Geoff here did that.

Power (0)

Anonymous Coward | more than 2 years ago | (#40141967)

As a ballpark, for most regions I find the yearly cost of running an item 24/7 to be about $1 per watt.
A 13-watt CFL on 24/7 is $13/year ($4.33 for 8 hrs/day, etc.); my eMachines E725 laptop draws 24 watts, so $24/year (less when the screen is off, but it is 'on' 24/7).

It's a great tip when buying large TVs etc.: you can quickly figure out whether that plasma is worth buying or not (for me, not).
And of course everyone should buy a $15-25 Kill-A-Watt or equivalent to see what appliances might be leaking etc. (We found a fridge so wasteful that a new one was paid for by the power savings in 3-4 years.)
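The $1-per-watt-year rule checks out: one watt running 24/7 is 8.76 kWh over a year, so the rule is nearly exact at around $0.114/kWh.

```python
# One watt running 24/7 for a year consumes 24 * 365 / 1000 = 8.76 kWh.
def watt_year_cost(rate_per_kwh):
    """Dollars per year for a constant 1 W draw."""
    return 1 / 1000 * 24 * 365 * rate_per_kwh

# At $0.11/kWh the rule is already close; it is exact at ~$0.114/kWh.
print(round(watt_year_cost(0.11), 2))
```

The later comment putting it at 8.76 cents/kWh for a buck is slightly off (that gives ~$0.77/watt-year), but as a mental shortcut the $1/watt figure holds for typical US rates.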

Re:Power (2)

vlm (69642) | more than 2 years ago | (#40142113)

As a ballpark, for most regions I find calculating the yearly cost of an item on 24/7 to be about $1/watt.

Rephrased, at 8.76 cents per kilowatt hour, one watt year costs about a buck per year. Plus or minus leap years and leap seconds. After endless add on taxes, and fees, and fees disguised as taxes, and taxes disguised as fees, that's probably about what I'm paying when I write a check.

Re:Power (2)

Chrisq (894406) | more than 2 years ago | (#40142353)

Plus or minus leap years and leap seconds.

Probably less than the cost of the energy needed to calculate it.

Re:Power (0)

Anonymous Coward | more than 2 years ago | (#40143279)

A rule of thumb that's very close for people paying a bit more for power is that each 10 Watts used 24 hours a day costs $1 per month. (divide by three for eight hours, etc).

It certainly makes it obvious that keeping an old faster-clocked P4 machine around left on as a router, server, p2p box, or general use machine, isn't so cheap after all.

Are modern machines as good at throttling back consumption by the GPU as by the CPU when peak performance isn't needed? Considering that even tablets can decode H.264 fine with optimized low-power hardware, most of us should probably avoid powerful GPUs if they aren't good at throttling down consumption. Having extra GPU power "just in case" might be a costly choice. What's in the better laptops that manage long battery runtime?

Perhaps someone can post some detailed data on GPU power use?

Re:Power (3, Insightful)

TWX (665546) | more than 2 years ago | (#40142133)

I have a meter as well; one thing to consider with replacement appliances is the reliability and longevity of the appliance.

I have a 33 year old Sub Zero built-in refrigerator in my new house. It's so old that it has only one knob for temperature adjustment, and the refrigerator compartment on top is slave to the freezer setting. I've removed the cover to the compressor and coils to clean them, and I've found some indication that a service or two have been performed over the years, but compared to a friend's brand new LG unit that's had to be serviced twice in eight months and had cost them $1600 to purchase, I'm happy to use this fridge for the moment. Plus, a new built-in refrigerator will cost between $4000 and $8000 depending on what brand and features are chosen. This unit can run for a very, very long time for $4000 worth of electricity.

As for TVs, one doesn't necessarily have to use the fancy, big TV all of the time either. For many years I had a projector screen that could roll down in front of the entertainment center, blocking the 27" TV in it, so I could use my projector when I wanted to watch something of substance. Now, I have the projector in a different room from the TV we watch the news on, and we only use it when we actually want to watch a movie or some other thing where surround sound and a big image matter. Obviously the roll-down method won't work with a fixed TV, but putting the fancy home theatre TV into a different room would.

My current PC (an old Dual-Xeon box) has a hardware sleep switch that ties into some pins on the motherboard, and when pressed the computer drops down to a low power state. When I'm done using it I just put it to sleep, and when I want to use it again it comes back in about three seconds. Works well, keeps all of my programs running fine, and saves power.

There are lots of techniques that can be used to save power, but the consumer devices that everyone always panics about don't hold a candle to the biggest hogs in the house (HVAC, hot water heater, refrigerator, oven/range/cooktop). If you want the most bang for your buck, insulate your house. Change your windows. Plant some trees that increase shade on the structure. Turn your thermostat up a couple of degrees and install some high-efficiency ceiling fans to keep the air moving a little. Sure, turn off the electronics you're not using, but don't assume that doing so by itself will be earth-shattering for your power bills.

Re:Power (0)

Anonymous Coward | more than 2 years ago | (#40142429)

If you have a Sub-Zero "built-in", you don't care about a paltry $100 a year in savings... You waste more than that wiping your butt with old $20 bills in the morning.

"Dear, do you have any more crinkled twenties?"

"Nope, only a fistful of fives!"

"No way I'm wiping my golden ass with anything less than a ten! Tell Jeeves to come in here with the silk curtains from the prefunctuary!"

Components (3, Interesting)

SJHillman (1966756) | more than 2 years ago | (#40141981)

What about switching out power hungry gaming cards for newer, more efficient cards? This year's mid-end model may have comparable performance to last year's mid-high end model but might draw half the power. Over time, the lower power consumption adds up, not to mention you can get by with a smaller power supply. Likewise, trading in your hard drives for a solid state drive (maybe using a green HDD for extra storage)? And for old timers, switching out CRTs for LCDs? Overall, I think it'd be easier for people to upgrade to more energy efficient components than it would be for them to change their PC usage habits. Lowering the sleep/HDD shutoff/monitor shutoff timers can make a big difference too without having to remember to shut down your PC every day or waiting for it to reboot. Not an option for everyone, but gamers usually aren't on a shoe-string budget or else they wouldn't be able to afford the PC and the games in the first place.

Re:Components (1)

gl4ss (559668) | more than 2 years ago | (#40142229)

You'd be playing for a couple of years to justify the cost of upgrading just for that reason.

The whole debate is stupid; the ratio of time spent (presumably happily) to money for electricity is pretty much nothing compared to just about any hobby. Hell, even just buying sneakers is more expensive per year.

Not to mention the energy costs incurred when the equipment was made.

Just buy a phone and play with it? It uses much less energy. The games suck, though.

Re:Components (5, Funny)

Rogerborg (306625) | more than 2 years ago | (#40142245)

Indeed, I scrap my hardware every 2 months so that I can be absolutely sure that I'm saving money and preserving the environment.

Re:Components (1)

xdroop (4039) | more than 2 years ago | (#40142315)

This doesn't work for the same reason that virtualization rarely yields absolute savings. Instead of "doing the same with less", the pointy heads see all this newly-freed up hardware and decide to re-use it. You end up "doing even more with the same". So your costs-per-work-unit go down, but your absolute costs stay the same (or go up once virtualization costs are factored in).

The same goes for people buying hardware. We rarely say "oh, I can buy this computer that has A) the same performance and B) better energy consumption than my existing one, for less than I paid for it" -- we say "oh, I can buy one that is so much faster and more powerful (and usually more energy-hungry) than my existing one, for the same as I paid for the original".

Why spend more money to get what you already have, when you can spend more money to get -- more?

Re:Components (1)

Anonymous Coward | more than 2 years ago | (#40143357)

Because, somewhere along the line, you get over puberty and start valuing tools for their cost-benefit and not for their powaaaaaaaaaaaaa.

Because you don't need just the PC, but the PC, a car, a house, a dog, a wife and so on.

Sure, I did spend a grand on my desktop when I was on basement duty; now I have a perfectly fine recycled PC. It cost $60 for an HDD replacement (safety) and $90 for a GPU upgrade (performance), and it has plenty of computing power - it can run Shogun 2 maxed, at 1920x1080.

BTW, the monitor is shared with the TV, and the TV was a cast-off from a dude who purchased it for HD without noticing it supported HD only over HDMI and not from the TV tuner (duh).

Re:Components (1)

laffer1 (701823) | more than 2 years ago | (#40143387)

It really depends on the situation. For example, I build packages for my open source project. The computer science department donated 20 machines for use in a cluster while I was there. I could build around 2000 packages in 10 days. After I left the university, I had to do it with my own computing equipment. Today, I can build the same software in about 2 days with my desktop computer. If I were paying for electricity use to run 20 Dell optiplex systems with pentium 4 1.7Ghz-2.0Ghz + IDE disks to the Phenom II X6 1090t with raid 0 IDE + 2 SATA disks I use now, I'd save a lot of money with the new build.

Most people don't do what I do, but I think it does simulate a CPU- and disk-intensive workload one might see in some business settings. There are times that virtualization can save money on electricity. Of course, the cost of the disk subsystem to pull it off may far outweigh the savings. In my case, it wasn't that bad, and I don't use virtualization for the setup. The software is set up to run multiple jobs in parallel.

Back when I ran builds for SPARC, the two sun machines I had used about $30 in power including air conditioning.

I don't think buying new gaming hardware is going to save anything, because video cards seem to use more power now. I just bought a 750-watt power supply because I finally upgraded my aging video card, and between that, 6 hard drives and my Phenom II, the system was blue-screening during gaming. I've also got a Mac Pro that uses about 300 watts when idle, not counting the display. Conversely, my laptop uses about 32 watts (AMD A6 quad core, 1.4GHz). There's a massive difference between what consumers buy and what some of us run.

Re:Components (1)

L4t3r4lu5 (1216702) | more than 2 years ago | (#40142365)

I don't think the parent is suggesting that you buy components to replace fully functioning and useful parts just to save electricity. Potentially, though, you could save real, actual money by buying newer parts rather than upgrading your current, old hardware.

I ran an 8800GTX until it died, but around 6 months ago (before it failed) I had decided I needed an upgrade. If I had gone ahead with it, I would have paid £100 for the card, and another for a 1kW PSU to handle the draw. Those cards pull north of 320W under load! Thankfully (?) it failed before I upgraded, so I went with an AMD HD6950 instead, and haven't looked back. The performance improvement is wonderful, power draw is down 50%, and I didn't need to upgrade the PSU (meaning the £200 budget could go on the card).

However, having just upgraded the bare-bones too, I can safely say that the biggest power saving you'll make is upgrading to an SSD. Power draw isn't the issue; it's the fact that you can go from power-down (hibernate or cold start) to working in ~20 seconds. That makes sleep and low-power (but still working) states pointless, so you'll power off almost every time you leave the thing for any period of time. Again, only if you're looking to upgrade, but worth considering.

Re:Components (1)

L4t3r4lu5 (1216702) | more than 2 years ago | (#40142387)

I should have stipulated that the graphics card upgrade was going to be a second 8800GTX in SLI, meaning graphics alone would have drawn around 650W.

Re:Components (1)

SJHillman (1966756) | more than 2 years ago | (#40142477)

Yes, I was referring to regular upgrades you might do anyway. For example, the Radeon HD 7850 (this year's mid-end model) and the 6950 (last year's mid-high end model) have comparable performance, but the 7850 draws about 2/3rds the power or less depending on benchmarks. The 6950 sells for less, but the power consumption may make the total cost of ownership similar to or greater than the 7850.
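A total-cost-of-ownership sketch of that trade-off. The prices and wattages below are illustrative assumptions, not quotes for the actual cards:

```python
# TCO sketch for the cheaper-but-hungrier vs pricier-but-efficient choice
# discussed above. All prices and wattages here are made-up illustrations.
def tco(price, watts, hours_per_week=20, years=2, rate=0.11):
    """Purchase price plus electricity over the card's service life."""
    kwh = watts / 1000 * hours_per_week * 52 * years
    return price + kwh * rate

# A hypothetical older card: cheaper up front, 50% more power under load.
older = tco(price=220, watts=150)
newer = tco(price=250, watts=100)
print(round(older, 2), round(newer, 2))
```

Under these assumptions the electricity narrows the $30 purchase gap to about $19 over two years of 20-hour weeks; with higher rates, longer ownership, or more hours, the efficient card can come out ahead, which is the comment's point.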

Re:Components (1)

fuzzyfuzzyfungus (1223518) | more than 2 years ago | (#40143849)

Even with a lousy HDD-of-no-particular-importance, I find that the big timesuck on boot isn't the booting, but the problem of getting all the browser pages and documents and whatnot back to where I left them (yes, even in applications that support session restore, you still run into issues like webpages that have decided to nuke the contents of form fields and such).

For that reason alone, the only real choice is between suspend-to-RAM and suspend-to-disk. With your contemporary soft-off PSU burning a few watts anyway so that it can detect you waking it up, the difference between 'off' and 'suspend to RAM' is just the relatively low cost of keeping your RAM refreshed (unless you have absolute piles of the stuff).

Re:Components (1)

HideyoshiJP (1392619) | more than 2 years ago | (#40143727)

Waiting for it to reboot? I'm still running spinning disk on a Phenom II X4 system and it takes maybe two minutes at most to get to the desktop and finish login processes (Kaspersky is a different story). Turn it on and take a poop or something. Watching that Windows flag just makes you perceive it taking longer. Now, I'll go ahead and leave my work laptop with full disk encryption out of this. *That* is painful.

Those are some great savings (0)

Anonymous Coward | more than 2 years ago | (#40141993)

>If you assume a 20 hour a week habit, and using $0.11 a KWH, actually playing costs Geoff $30.83 a year.
>If Geoff turns his PC off when he is not using it, he could save $66 a year."

A $66 saving on a $30 bill? Sign me up!

Re:Those are some great savings (1)

SJHillman (1966756) | more than 2 years ago | (#40142051)

He's comparing apples and oranges.

Let X be the cost of normal, non-gaming usage
X + $30.83 = cost of gaming 20 hours a week in addition to (or in place of?) normal usage
X - $66.66 = cost of non-gaming usage if you shut down the PC when not using it

But what is his time worth? If I value myself at the same rate my employer values me, then the startup time of my computer costs about 15 cents. I use the PC in the morning before work and in the evening after work and throughout the day on weekends, so that's 30 cents a weekday plus 15 cents for each weekend day or $1.80/wk. 52 weeks in a year, so it costs about $93 of my time waiting for it to boot up. If you have downtime in which you'd be doing nothing otherwise, then it may be worth it. If your schedule is usually tight, then it's cheaper to leave it on all the time.
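The parent's time-value arithmetic reproduces cleanly (the 15-cents-per-boot figure is the poster's own estimate of his time, not a measured cost):

```python
# Reproducing the arithmetic above: 15 cents of the poster's time per
# boot, two boots per weekday, one per weekend day, 52 weeks a year.
BOOT_COST = 0.15  # dollars per boot, as stated by the poster

weekly = BOOT_COST * 2 * 5 + BOOT_COST * 1 * 2
yearly = weekly * 52
print(round(weekly, 2), round(yearly, 2))
```

That's $1.80/week and about $93.60/year, so the poster's "$93 of my time" checks out, and it is indeed larger than the ~$66 of electricity the article says shutting down would save.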

Re:Those are some great savings (1)

MBAFK (769131) | more than 2 years ago | (#40142583)

If you can't wait for it to boot you can sleep it. The difference between sleep and off is minimal on a modern machine.

Turning off something saves money? Really? (2)

Liquidretro (1590189) | more than 2 years ago | (#40141999)

Wow, earth-shattering news here: turning off your PC when you're not using it saves you a significant amount of money! What about factoring in cooling costs? High-end gaming machines put out a lot of heat too. Since many gamers are using SSDs these days, sleeping your computer is great; they resume so fast. It's just common sense. I make sure everyone in my house shuts down or sleeps their machines at night if there is no valid reason for them to be on. It really does help. The real problem with this is: where is the spec list? That dual- or triple-GPU machine that is water-cooled and has a huge overclock will use a ton more power than your i5, single-GPU machine. Finding an average gaming machine is tough to do.

Re:Turning off something saves money? Really? (0)

Anonymous Coward | more than 2 years ago | (#40142071)

Not that tough to do. Valve collects hardware statistics through Steam and makes that publicly available. You could easily find the most used hardware with a little work.

Re:Turning off something saves money? Really? (0)

Anonymous Coward | more than 2 years ago | (#40142151)

Everything on my computer, from Outlook to Chrome to Office, shits itself when I resume from either sleep or hibernation. Mapped network drives give me popups saying the server is unavailable (on top of the Explorer window where I'm browsing said "unavailable" server). Outlook either tells me my inbox is unavailable or sits stuck about halfway through send & receive until I close it, or it eventually errors out and then I have to close it anyway to get it to reconnect. I once made the mistake of leaving Photoshop open and minimized overnight; in addition to waking the machine from sleep, I had to wait about 5 minutes for Photoshop to finish unswapping and redrawing all of its tool palettes before I could close it. Chrome sometimes comes out OK, and sometimes does this thing where when I type in a URL, the tab's icon spins and spins, but nothing happens (the status bar doesn't even say "Looking for..." or "Waiting for...", and the mouse doesn't show the spinner either).

And people whine about the state of Linux's power saving? If I've got to close all my programs before I let the computer go to sleep, I might as well choose shut down (and I do.)

Re:Turning off something saves money? Really? (1)

Sentrion (964745) | more than 2 years ago | (#40142593)

I have always seen these types of problems with the so-called "hibernate" or "sleep" modes, and I always disable the feature the first chance I get. The ridiculous amount of time required for the boot process hasn't improved much since Windows 3.1, and the more software you use on a daily basis, the worse the problem gets. Say you like to keep track of your schedule with a PC-based organizer. If on a given weekend day you only need to update your schedule four times, at random intervals throughout the day, you are not going to be using your time efficiently if you wait 5, 10 or 15 minutes to boot up, log in, open your software, etc., especially if the task you want to perform only takes 30 seconds. So for most people who like to use their PC software throughout the day, but only for short intervals at a time, it is almost impractical unless you keep the machine on all day long. Of course, you could always turn off your machine and spend 3-4 hours each day doing things that don't involve your PC, but if you need to quickly look up something on your hard drive, or somebody calls and needs you to do something quickly for them, then keeping your machine on all day, even if mostly unused, is an unfortunate fact of life. Mobile apps might help liberate you from your machine, but that depends a lot on your particular situation. Saving $60 each year might be worth it for some people, but not for everybody.

Re:Turning off something saves money? Really? (1)

maitai (46370) | more than 2 years ago | (#40143411)

My laptop (Sony Vaio Z) cold boots into Windows 7 in ~9 seconds (fresh install of Windows 7 minus crapware helps a lot). But to be honest I keep my laptop on 24/7 when it's on AC power and use sleep mode whenever I pack it up to bring it somewhere else. I seldom if ever turn it off.

Re:Turning off something saves money? Really? (1)

Liquidretro (1590189) | more than 2 years ago | (#40143629)

I don't have this problem on Windows 7 on newer hardware with a fairly recent format. Some programs don't sleep well; understandably, Photoshop is probably one of them. But it sounds like you either have driver conflicts and/or are in need of a fresh install.

Hey Geoff.. (1)

MickyTheIdiot (1032226) | more than 2 years ago | (#40142009)

Now do a calculation of how much of your employer's time you wasted doing your calculation!

If you make all the bad assumptions the RIAA makes, I bet you can make it hit a cool million, easy!

True costs (1)

crakbone (860662) | more than 2 years ago | (#40142017)

True costs: where is the vitamin D deficiency, the light sensitivity, prices for Bawls and Red Bull, the price of pizza, radon exposure from your mom's basement, Depends for long raid nights, divorce costs, hardware costs and software licensing, and the general lowering of testosterone levels? Of course, the benefits are: water savings because of fewer baths, no social costs (coffee shops, movies, dates, video rentals, vacations, etc.), no expensive presents for friends, less electricity used in the house because no other lights are on, furniture reduction, lower vehicle maintenance costs, lower automotive fuel costs, and more leet gear.

C3 (1)

Cylix (55374) | more than 2 years ago | (#40142047)

I would suspect C3 sleep states are supported on a majority of systems by now. Perhaps I was just lucky when I picked up the hackintosh board a few years ago. Now, I simply use a reasonably long idle timer and the system goes to sleep/power off. It takes a few seconds to come back out of that state and wholly beats a cold start.

I guesstimate my home system gets about 3-4 hours of usage each weekday. In addition, there are plenty of other devices around the house which support other core services.

I don't know if it's so much about being green as it is the sensibility to turn a light switch off if it's not in use.

Lame (0, Flamebait)

vlm (69642) | more than 2 years ago | (#40142049)

Lame as heck.

Does he game in a pitch-black room? My basement (my basement, not my mom's basement) used to be lit by 800 watts of incandescent lights on tracks (which isn't all that bright for 30 x 30 feet), and as they burned out (which took a while) I replaced them with $50 LEDs that will pay for themselves in saved ecological and energy costs in a mere five million years of operation. Anyway, the point is that my basement lighting electrical budget was an order of magnitude greater than my video card power budget, and the capital costs are darn near as bad. The depreciation schedule is much longer, however.

Another issue is that my property tax is very roughly $2 per square foot of my house, and my desk (which is admittedly pretty luxuriously large) is about 20 sq ft, and my elaborate "executive reclining office chair" is probably at least 4 sq ft extra. Let's call that 25 sq ft * $2/sq ft, so you're looking at $50/yr rent to the city (aka property tax) simply to store the machinery, which is about twice the supposed electrical consumption. Now if you want to see high annual average power consumption per square foot, try the 4 sq ft where my clothes dryer resides... or my furnace, or air conditioner compressor...

Another way to look at it: a couple of years back I splurged and dumped $200 of environmental and energy degradation into purchasing my last video card. At $30/yr, the electrical cost won't approach the capital cost of the card until 2017, assuming the video card is the only consumer of energy (LOL). Of course the card will be functionally obsolete before 2017.

So if you can afford a gaming rig and can afford to upgrade it every five years or so, you can simply ignore the costs of operating it as a rounding error. Or the operational costs only matter if you stole the gear or got it as a gift.

Kind of like: if you can afford to buy a $60K SUV or pickup truck, then $4/gallon gasoline is merely a rounding error to be ignored. Or if you buy a $2M California McMansion, a $1K/month air conditioning bill is a rounding error compared to the mortgage (to say nothing of the decline in property values).

Re:Lame (3, Interesting)

Simon Brooke (45012) | more than 2 years ago | (#40142333)

My home has on average 100 watts of power available. I can use more in the short term, but doing so depletes the battery and means I'll have to use much less for some part of the week. The wind turbine which is my sole source of power is rated at 750 watts, but only generates that much in absolutely perfect conditions. So I know quite a bit about how to use power economically. I can light my whole house effectively with just 18 watts of LEDs. They're strategically placed, yes - but you can easily read more or less everywhere.

In this situation, the graphics card on my computer (Radeon HD 6850 at 127 watts TDP) is actually the biggest power drain I've got. Obviously, my gaming is limited to two or three hours a day... Power is worth thinking about.
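The numbers in this comment make the point concrete. A rough budget check, where the session length (2.5 hours) and lighting hours (5) are illustrative assumptions, not figures from the comment:

```python
# Rough check of the off-grid power budget described above. The 100 W
# average, 127 W GPU TDP, and 18 W of LEDs come from the comment; the
# session length and lighting hours are assumed for illustration.
BUDGET_W = 100
daily_budget_wh = BUDGET_W * 24        # 2400 Wh available per day

gpu_wh = 127 * 2.5                     # ~318 Wh for a 2.5 h gaming session
led_wh = 18 * 5                        # ~90 Wh for 5 h of whole-house lighting

gaming_share = gpu_wh / daily_budget_wh  # ~13% of the daily budget
```

So a single gaming session plausibly consumes more than three times what an entire evening of lighting does, which is why the GPU is the biggest drain in that house.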

...and? (1)

Sycraft-fu (314770) | more than 2 years ago | (#40143891)

Seriously, you live in a very unusual situation. While I'm not against conservation (indeed, I turn my PC off when I'm not home, because why use what isn't needed), you can't apply your situation to the population at large. 100 watts is NOT something I have to think about. My house has about 15,000 watts of power available to it at all times; 100 watts more or less is not noticeable, and is well within the margin of error I see depending on how the AC is run.

A drop in the bucket (1)

Iniamyen (2440798) | more than 2 years ago | (#40142083)

Considering that the gas portion of my energy bill utterly dwarfs the electricity portion (especially during the winter), I hardly even pay attention to how much electricity I use. For those who have electric heat, I am sorry.

Re:A drop in the bucket (1)

operagost (62405) | more than 2 years ago | (#40142197)

At least you don't heat with oil. Yikes. Nearly double your heating costs.

Re:A drop in the bucket (1)

MoldySpore (1280634) | more than 2 years ago | (#40142507)

The only thing that is gas in my house is the stove. With three full-size PCs (700W+ each) running 24/7/365, and heating/cooling the house with central air/heat 24/7/365, my electric bill is usually between $100-180. I'd be interested to know how that compares to someone using gas to heat and cool. I have a feeling that if you took out my PCs and the central air, my bill would be around $5 a month lol

But wait, what about winter heating costs? (0)

Anonymous Coward | more than 2 years ago | (#40142095)

Since they're so important when it comes to incandescent bulb discussions, they must be important here. How much will he lose by paying for extra heat that his PC isn't providing?

Millions, even billions of joules??

I think the clear solution is obvious.

60 watt incandescent bulbs EVERYWHERE! SAVE THE WORK OF GOD-KING EDISON!

SSD (0)

Anonymous Coward | more than 2 years ago | (#40142109)

Electrical costs are why I put an SSD in my home theater PC. Fast boot times mean a PC that was once on 24/7 is now only powered when in use.

Re:SSD (1)

avandesande (143899) | more than 2 years ago | (#40143303)

I was going to post this, but an AC beat me to it: with an SSD I am much more likely to shut off my machine, because it boots so quickly.

Peanuts, really. (2)

nashv (1479253) | more than 2 years ago | (#40142111)

All in all, that is really peanuts in terms of electricity bills. If you are spending roughly 2 hours a day gaming, a normal person with a full-time job and a family would have very little time left to do much else that can sink money.

Considering that yearly electricity bills routinely reach $1,000+ for a standard household, this added ~10% due to gaming is pretty insignificant compared to other hobbies, racing cars for example.

Sure, there may be cheaper hobbies, but I honestly don't think anyone well-settled enough to be practising a daily hobby and deriving enjoyment from it finds it a problem to spend 8 bucks 50 cents a month on their recreation.

Re:Peanuts, really. (0)

Anonymous Coward | more than 2 years ago | (#40142453)

Gasoline bill. Try driving 40+ miles a day to and from work every day. Living in the suburbs is not as cheap as everyone thinks it is. The *only* reason to live in the burbs is to raise children in a safe environment and provide proper public education. Other than that, you will pay for that lifestyle in the form of isolation costs from the city. It's where the jobs are.

$100 a year huh? (1)

JustAnotherIdiot (1980292) | more than 2 years ago | (#40142159)

If only my phone service was that cheap.

$67 a year, really? (1)

Anonymous Coward | more than 2 years ago | (#40142205)

That's nothing. Not even worth my time trying to save. When I can save $67 a month, post.

As compared to...? (3, Insightful)

geekmux (1040042) | more than 2 years ago | (#40142345)

I'm not sure what exactly the article is trying to convey here, as measuring electrical consumption is merely fine-tuning an existing expense related to a hobby, and an obscenely small amount of money at that (c'mon, ~$30/year? People will spend twice that much in a month on caffeine just to fuel said hobby).

Compare playing video games to spending money on cable TV, or going to the movies, or riding a bike outside. Discussing literally pennies of electrical savings per day seems rather pointless when you're spending considerably more to sustain that kind of hobby in the first place.

why do all such examples use the cheapest rates? (1)

0111 1110 (518466) | more than 2 years ago | (#40142413)

As of my last month's bill I am paying 28.8 cents per kWh. I'm not sure how much power my computer uses, but with my Nvidia GTX280 and an overclocked 4 Ghz dual core CPU I would assume at least 400 watts. Particularly while playing a game. So let's say 12 hours for a day of gaming. So 4.8 kWh or $1.38 per day of marathon gaming. If you assume 4 days per week that would be $22.12 per month or $265.42. Of course my computer may actually use 500 or 600 watts while gaming. What interests me more is how much power my computer uses when I'm not gaming. There have been times when I've just left my computer on all the time. I would suspend to RAM, but I usually run Windows and I have yet to find a version of Windows that will properly wake up from an STR properly. That's why I'm thinking of switching to Linux. Suspend to RAM works perfectly for me in Linux.
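The arithmetic in this comment reproduces cleanly under its stated assumptions (400 W draw, 12-hour sessions, 4 sessions a week, $0.288/kWh, and 4 weeks to a month):

```python
# Reproduce the marathon-gaming cost estimate above. All figures are
# the commenter's assumptions, not measurements.
WATTS = 400
HOURS_PER_SESSION = 12
RATE_PER_KWH = 0.288

kwh_per_session = WATTS * HOURS_PER_SESSION / 1000  # 4.8 kWh
cost_per_session = kwh_per_session * RATE_PER_KWH   # ~$1.38
cost_per_month = cost_per_session * 4 * 4           # 4 days/wk, 4 wk/mo -> ~$22.12
cost_per_year = cost_per_month * 12                 # ~$265.42
```

Note that the yearly figure implicitly assumes 48 gaming weeks (4 weeks x 12 months), not 52.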

Re:why do all such examples use the cheapest rates (0)

Anonymous Coward | more than 2 years ago | (#40143709)

As of my last month's bill I am paying 28.8 cents per kWh. I'm not sure how much power my computer uses

So maybe instead of blabbing crap about it, you do something about it:

1. Get a Kill A Watt meter - it's $20 on eBay.
2. Plug it in and see.

Guessing how much power you use is stupid without actual measurement.

Secondly, these are not the "cheapest rates". These are typical rates in the US. Where I live, I pay 6.5c/kWh, which is one of the cheapest rates.

1 W of continuous power usage => 8.76 kWh/year.

So for me, each 1 W reduction saves about $0.60 a year; 100 W saves about $60.

For you, a 1 W reduction in usage (e.g. standby draw) yields $2.52 a year. So maybe your use of suspend to RAM is not such a good thing: it could be costing you $10-$20/yr.
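The per-watt rule of thumb above (one watt running continuously for a year is 8.76 kWh) is easy to verify:

```python
# Annual cost of a constant electrical load, per the figures above.
HOURS_PER_YEAR = 24 * 365  # 8760 hours

def yearly_cost(watts, rate_per_kwh):
    """Annual cost in dollars of a constant load of `watts` at the given $/kWh rate."""
    return watts * (HOURS_PER_YEAR / 1000) * rate_per_kwh

cheap = yearly_cost(1, 0.065)    # ~$0.57/yr at 6.5 c/kWh
pricey = yearly_cost(1, 0.288)   # ~$2.52/yr at 28.8 c/kWh
hundred = yearly_cost(100, 0.065)  # ~$57/yr (the "$60 per 100 W" figure, rounded up)
```

The commenter's "$0.60" is the $0.57 figure rounded; the 28.8 c/kWh number works out to $2.52 exactly as stated.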

Get a laptop (0)

Anonymous Coward | more than 2 years ago | (#40142481)

At most, a high-end gaming laptop uses 250 watts, since that's what their AC adapters are rated for. Compare that to a desktop PC that draws 450+ watts.

You could also save more money if you only play consoles AND don't play online; you'd use less energy than a PC and wouldn't need a fast internet connection to get low latency.

Since no one has posted this yet (1)

Anonymous Coward | more than 2 years ago | (#40142655)

What is the cost of your time? I leave my computer on most of the time, either to avoid my aging system's startup "process" or to maintain program state between uses. Conversely, I shut the system off to force myself not to use it.

Anyway, spread the cost of the time spent waiting to boot across the year. This could easily be higher than $66 for those of us making a reasonable wage. As a simple exercise: 30 seconds per day for 365 days is about 3 hours. If you make $22/hr, then the savings is a wash.
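The "wash" claim above checks out under the commenter's example figures (30 seconds of boot wait per day, time valued at $22/hour):

```python
# Check the break-even claim above. Both inputs are the commenter's
# illustrative assumptions.
seconds_per_year = 30 * 365               # 10950 s of waiting
hours_per_year = seconds_per_year / 3600  # ~3.04 h
value_of_time = hours_per_year * 22       # ~$66.9, vs. the $66/yr saved
```

At that wage the time lost (~$67) almost exactly cancels the $66 electrical saving.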

Re:Since no one has posted this yet (2)

Dunbal (464142) | more than 2 years ago | (#40143187)

What is the cost of your time?

Zero. It took me 3 months in intensive care to figure that out. Now my priorities are different, and I am actually happy.

Re:Since no one has posted this yet (0)

Anonymous Coward | more than 2 years ago | (#40143589)

I leave my computer on most times in order to avoid the startup "process" my aging system has or to maintain program state between uses. Conversely, I shut off the system to force myself to not use it.

Get a modern operating system and hardware that support the sleep feature; it will pay for itself if you are leaving your equipment on at all times.

what is the true total cost? (0)

Anonymous Coward | more than 2 years ago | (#40142671)

This is only the immediate direct cost; the true total cost is higher, because there is an environmental cost that is not accounted for in the generation of electricity. We don't live in a static universe, so if we don't take the hidden costs into account, the full repercussions of our choices will not be apparent.

Also, assessing the cost simply in monetary terms has a feedback effect that keeps us unconscious of the actual impacts of our decisions: we stay focused on things like saving more money, making more money, etc. What I'd love to see is a holistic analysis in terms of money and something like (say you were using hydro power) the number of salmon that the dam prevented from spawning for each year of gaming, which would provide a less abstract context for decision making.

As someone who cares about the state of the world, IMHO, maintaining a holistic perspective on *everything* is a requirement for being a good steward (a responsibility we are all born with, whether we take it up or not). Whenever I fail to look at things holistically, I have failed as a steward, because we are all (humans, rocks, air, animals, etc.) interdependent.

another suggestion (1)

jcgam69 (994690) | more than 2 years ago | (#40142909)

Keep the inside of your computer clean. Clogged filters and fans consume more power to keep the computer cool.

Amdahl Shrugged (2)

drdrgivemethenews (1525877) | more than 2 years ago | (#40142985)

You're working on one of the smallest possible incremental changes in your house's electrical usage. What's the point?

The wall warts (AC adapters) scattered about your house almost certainly use and waste more electricity than your PC. The US EPA guesstimated in 2005 that around 200 gigawatts (6% of US total power) goes through these things, and a significant portion of that (30-50%) is wasted.

Getting all your wall warts onto centrally controlled power strips would seem like an interesting and money-saving challenge. If anyone has done that, I'd love to hear about it.


Dunbal (464142) | more than 2 years ago | (#40143153)

Turning off your computer saves electricity!

I mean seriously, wtf.

Reality sucks, eh? (3, Insightful)

TheSkepticalOptimist (898384) | more than 2 years ago | (#40143301)

Someone just moved out of his parents' house and realized that electricity actually costs money. Spoiler alert: 40-minute hot showers also cost a lot on the water and gas bills.

It's hilarious to me when teens / early twenty-somethings leave the protected isolation of their parents' nest or university dorm and suddenly get a good ol' dose of reality.

not just that... (1)

buddyglass (925859) | more than 2 years ago | (#40143575)

If he weren't interested in gaming he could likely make do with a much less powerful GPU and/or possibly a more power-efficient CPU. The combination of those two would reduce his power consumption even further during non-gaming-related computer usage (or idling).

Not a fair comparison (1)

HangingChad (677530) | more than 2 years ago | (#40143827)

To really figure the electrical cost of gaming, you have to figure out what else people would be doing if they weren't playing games. Some activities, like watching TV, would use as much or more power.

My guess is if we calculated the energy use of those other activities, gaming might be a net energy saving activity.
