Slashdot: News for Nerds


Is StarCraft II Killing Graphics Cards?

CmdrTaco posted more than 3 years ago | from the wish-i-had-a-copy dept.

Graphics 422

An anonymous reader writes "One of the more curious trends emerging from last week's StarCraft II launch is people alleging that the game kills graphics cards. The between-mission scenes aboard Jim Raynor's ship aren't framerate-capped. These are fairly static scenes that don't take much work for the graphics card to display. Because of this, the card renders the scene as quickly as possible, which taxes the card as it works at its full potential. As the pipelines within your graphics card work overtime, the card heats up, and if it can't cope with that heat it will crash."

422 comments

frosty pissy (-1, Flamebait)

Anonymous Coward | more than 3 years ago | (#33109334)

go fuck yourself.

Re:frosty pissy (-1, Flamebait)

Anonymous Coward | more than 3 years ago | (#33109676)

Hermaphrodite?

Ridiculous. (5, Insightful)

Mr EdgEy (983285) | more than 3 years ago | (#33109336)

How about timedemos for FPS games? Benchmarking your card? Tools used for overclocking that deliberately stress the card? These GPUs are designed to operate at max temp. Many games run with no FPS cap unless vsync is enabled. This is a complete non-issue.
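For the curious, the FPS cap being discussed is nothing exotic: at its core it is just a sleep inside the render loop that enforces a minimum time per frame. A minimal sketch in Python (the function and numbers are illustrative, not anyone's actual game code):

```python
import time

def run_capped(render_frame, cap_fps=60, frames=10):
    """Call render_frame `frames` times, sleeping so we never exceed cap_fps."""
    min_frame_time = 1.0 / cap_fps
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()  # stand-in for the real draw call
        elapsed = time.perf_counter() - start
        # Sleep off the rest of the frame budget. With vsync off and no
        # cap, this sleep never happens and the loop runs flat out --
        # which is exactly the "uncapped" state described above.
        if elapsed < min_frame_time:
            time.sleep(min_frame_time - elapsed)

# 10 frames capped at 100 fps should take at least ~0.1 s of wall time.
run_capped(lambda: None, cap_fps=100, frames=10)
```

When the scene is trivially cheap to draw (a near-static menu), nearly all of the frame budget is spent sleeping instead of heating the GPU, which is why a cap tames the overheating reports.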

Re:Ridiculous. (4, Informative)

Vectormatic (1759674) | more than 3 years ago | (#33109414)

I was thinking the same thing. Many games aren't FPS-capped anyway, and even in capped games, gamers will push the settings so high that the game won't run at the capped framerate all the time.

Graphics cards should be able to cope with it, although I do believe it's possible to load a GPU in such a way that more transistors are active at the same time than the manufacturer expected.

So unless there are reports of thousands of melted video cards, I call shenanigans.

Re:Ridiculous. (0)

Anonymous Coward | more than 3 years ago | (#33109576)

As far as I've seen, no one's claiming "melted" cards or anything like that; it's just a standard inflammatory Slashdot headline. Basically, the game crashes during those segments because the card overheats, and many prebuilt systems (particularly "gamer" systems from places like Dell and Systemax) will overheat if the graphics card runs at 100% for an extended period. I think we're seeing lots of reports from StarCraft players simply because it's such a popular game, particularly among the non-hardcore-gamer crowd, where those kinds of prebuilt systems are common.

The Fix (5, Informative)

Pawnn (1708484) | more than 3 years ago | (#33109770)

This 15-page thread has some people who say they've had melted cards. A lot of the problems seem to be with laptops. As a bonus, people are reporting that the "fix" also improves Alt+Tab speed, if anyone cares about that. http://www.gamespot.com/pc/strategy/starcraft2/show_msgs.php?topic_id=m-1-55785055&pid=939643&page=2 [gamespot.com]

Since I haven't seen anyone else post the fix, I will. Add the following lines to your "Documents\StarCraft II\variables.txt" file:

frameratecapglue=30
frameratecap=60

You can add them at the beginning, the end, or wherever; the game doesn't care.
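For anyone who'd rather script the fix than hand-edit the file, a minimal sketch in Python. It uses a local stand-in path so the snippet is self-contained; the real file lives under Documents\StarCraft II\ as the comment above notes:

```python
import os

# Stand-in for "Documents\StarCraft II\variables.txt"; adjust to taste.
variables_path = "variables.txt"

# The two frame-rate caps from the forum fix: "glue" screens (menus and
# the between-mission scenes) capped at 30 fps, everything else at 60.
caps = {"frameratecapglue": "30", "frameratecap": "60"}

# Read whatever is already there so we don't add duplicate entries.
existing = ""
if os.path.exists(variables_path):
    with open(variables_path) as f:
        existing = f.read()

# Append any cap that's missing; the game doesn't care where they go.
with open(variables_path, "a") as f:
    for key, value in caps.items():
        if key not in existing:
            f.write(f"{key}={value}\n")
```

Back up variables.txt first; Blizzard regenerates it if it's deleted, but there's no harm in being careful.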

Re:Ridiculous. (0, Interesting)

Anonymous Coward | more than 3 years ago | (#33109420)

Many games operate with no FPS cap unless vsync is enabled.

Yeah, and they're typically doing some serious work, which effectively caps the FPS. This ultra-fast rendering of "fairly static scenes," as the summary puts it, is more akin to putting your car in neutral and mashing the gas pedal to the floor for an extended period. Your engine might not explode or throw a rod right away, but it would really prefer to be idling.

This is a complete non-issue.

Unless your video card suddenly stops working.

Re:Ridiculous. (5, Insightful)

Anonymous Coward | more than 3 years ago | (#33109636)

If a graphics card can't survive a tight loop like that, then it was designed by an idiot. My Intel processor can survive that state. And no, it's not like putting the car in neutral and mashing the throttle; you're comparing it to a device with a rotating mass being sped up past its designed RPM limit, which wouldn't actually cause a problem anyway because the rev limiter would kick in.

Electronics, my uneducated friend, are different: there are no moving parts. This may surprise you.

I'm tired of electronics these days being designed for maximum profit rather than quality. The chip makers' engineers are complete fucking morons if this is really happening.

Re:Ridiculous. (1)

C0vardeAn0nim0 (232451) | more than 3 years ago | (#33109902)

I don't think it's so much the engineers as the beancounters doing "engineering" with an Excel spreadsheet.

And since those beancounters are usually higher in the company hierarchy, the engineers either obey or get shafted.

To use a car analogy (sorry!): engineering by marketing gave us the Ford Edsel; engineering by beancounting gave us the Ford Pinto.

Don't make car analogies if you don't understand (5, Informative)

Sycraft-fu (314770) | more than 3 years ago | (#33109642)

This is not putting your car in neutral and laying on the gas; it is a meaningless comparison. GPUs have no problem rendering excess frames, lots of excess frames, and simply not making any real use of them. This is no more a problem than having a CPU run a computationally intensive test that doesn't do anything useful. There is no difference from a heat or function standpoint between all the units being fully active rendering something simple quickly and all the units being active rendering something complex slowly. In either case, all the logic is active with lots of power flowing through, and thermal output is maxed. A component should be able to handle this, no problem. Whatever speed a CPU or GPU is rated for is not a temporary max; it is what it can run at full time. If there is a failure, it indicates a defect of some kind somewhere.

The most usual defect is inadequate airflow. People have a case with poor airflow, and reduce it further by not clearing dust buildup. As such the components can't cool themselves well enough.

As the GP said: This is a non-issue. If it happens to you, the game revealed a problem, it didn't cause it. Fix your system.

Re:Don't make car analogies if you don't understan (1)

cynyr (703126) | more than 3 years ago | (#33109830)

Well, to be honest, there are so few cases with good airflow, and then the GPU manufacturers skimp on the cooling power of the stock heatsinks. 12 volts at 10 amps is 120 watts, so yes, you will need some substantial cooling to keep temperatures in spec. Although cases like the SG07 have a dedicated cold-air intake for the GPU and a fair amount of cooling on the backside of the card as well.

In the end though, this really should be a non issue, with proper component selection, placement and cooling.

Re:Don't make car analogies if you don't understan (0, Troll)

BobMcD (601576) | more than 3 years ago | (#33109904)

Whatever a CPU or GPU is rated to for speed is not a temporary max, it is what it can run at full time. If there is a failure, it indicates a defect of some kind somewhere.

That's a wonderful world to live in, but it isn't exactly the real world. The chief advantage of the real world over the one you describe is that stuff here is a lot cheaper. This world has laptops that aren't necessarily 'toughbooks' and commodity hardware that is often imperfect, but 'good enough' for most uses until it becomes obsolete and you replace it anyway. None of the stuff bought in the here and now will handle running full-bore for very long unless you paid an exorbitant amount for it.

Re:Ridiculous. (0)

Anonymous Coward | more than 3 years ago | (#33109740)

You're making a logic error here. It seems to be an assumption that the video card will run infinitely fast unless frame-rate capped. This is wrong. We're not talking about an electrical circuit without a resistor.

Re:Ridiculous. (5, Insightful)

Andy Dodd (701) | more than 3 years ago | (#33109502)

There is a parameter used for most high-dissipation ICs (such as CPUs and GPUs) - It's called "thermal design power".

This is the absolute maximum amount of heat the card can dissipate under any circumstances (not counting overclocking). The nature and definition of TDP means it should be physically impossible for ANY software to ever cause the card to exceed TDP.

If you have a system that can't handle the card running at TDP, that's faulty design of your system, not whatever caused it to hit TDP.

Re:Ridiculous. (5, Informative)

bertok (226922) | more than 3 years ago | (#33109754)

There is a parameter used for most high-dissipation ICs (such as CPUs and GPUs) - It's called "thermal design power".

This is the absolute maximum amount of heat the card can dissipate under any circumstances (not counting overclocking). The nature and definition of TDP means it should be physically impossible for ANY software to ever cause the card to exceed TDP.

If you have a system that can't handle the card running at TDP, that's faulty design of your system, not whatever caused it to hit TDP.

Many video cards can exceed their TDP through certain sequences of instructions, and the drivers include code to prevent this from occurring. There have been issues in the past where this filter wasn't perfect and cards were destroyed, typically when executing GPU stress tests.

Re:Ridiculous. (0)

Cyberax (705495) | more than 3 years ago | (#33109804)

"This is the absolute maximum amount of heat the card can dissipate under any circumstances (not counting overclocking). The nature and definition of TDP means it should be physically impossible for ANY software to ever cause the card to exceed TDP."

In theory.

Re:Ridiculous. (1)

Lonewolf666 (259450) | more than 3 years ago | (#33109824)

On top of that, it has been possible to put heat sensors on the chip and throttle the clock in case of overheating for several years now. IIRC Intel introduced this with the Pentium 4, and in some PCs with poorly cooled 3GHz+ P4 models this throttling actually kicked in. Annoying for the users, but at least their systems did not die.

Maybe AMD/ATI and NVIDIA should copy that feature? (apologies for dissing them if they have actually done so).
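The throttling behavior described above amounts to a simple control policy: full clock below a threshold temperature, a clock ramp-down as the die heats up, and a hard cutoff before damage. A toy sketch (the thresholds and clock values are made up for illustration, not taken from any Intel, AMD, or NVIDIA datasheet):

```python
def throttle_clock(temp_c, base_mhz=700, t_throttle=95, t_shutdown=110):
    """Toy thermal-throttling policy of the kind described above.

    Returns the clock (MHz) to run at for a given die temperature:
    full speed below t_throttle, a linear ramp down between t_throttle
    and t_shutdown, and 0 (halt) at or above t_shutdown.
    """
    if temp_c >= t_shutdown:
        return 0  # emergency shutdown rather than silicon damage
    if temp_c <= t_throttle:
        return base_mhz  # cool enough: run at the rated clock
    # Linear ramp: full clock at t_throttle, zero at t_shutdown.
    fraction = (t_shutdown - temp_c) / (t_shutdown - t_throttle)
    return int(base_mhz * fraction)
```

The point of such a policy is exactly what the comment says: an annoying slowdown under a runaway load beats a dead card.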

Re:Ridiculous. (5, Funny)

V!NCENT (1105021) | more than 3 years ago | (#33109506)

$ glxgears
5791 frames in 5.0 seconds = 1158.177 FPS
7120 frames in 5.0 seconds = 1423.968 FPS
6801 frames in 5.0 seconds = 1360.132 FPS
7110 frames in 5.0 seconds = 1421.871 FPS

Nope. No meltdown. Totally BS...

wow! what card?, and then I realized (2, Informative)

spineboy (22918) | more than 3 years ago | (#33109936)

I too ran glxgears to check my framerate, and was pulling 3500 FPS on a six-month-old good card, and was wondering: "Holy fuct! What card do you have that runs that fast?"

And then I remembered you can shrink the window and get higher FPS
(makes the glxgears window tiny)

20,900 FPS
21,500 FPS

meh...

Re:Ridiculous. (4, Interesting)

Zeussy (868062) | more than 3 years ago | (#33109514)

The issue is quite simple; Stardock had the same problem with GalCiv 2. There are people playing SC2 who don't play games that fully tax the graphics card the way these scenes do, and who don't have well-ventilated cases, causing the cards to overheat and crash. The issue is solved with a simple frame-rate cap, or by the consumer adequately ventilating their case.

Re:Ridiculous. (0)

Anonymous Coward | more than 3 years ago | (#33109806)

galciv 2 had a joke of a graphics engine though. That was just "Brad Wardell and Stardock" style programming.

Re:Ridiculous. (5, Interesting)

Sycraft-fu (314770) | more than 3 years ago | (#33109566)

No kidding. SC2 may end up being more intense if it happens to strike just the right balance so that the ROPs, TMUs, and shaders all work near capacity, but same shit: if your card crashes, the problem is your setup, not the game. For a demo that'll kick the crap out of your card heat-wise, try FurMark. It is designed to run the chip at its absolute limits and thus at maximum power draw. If your system bombs, it isn't the demo that is wrong, it is your computer. Maybe you don't have enough power, maybe your ventilation is bad, maybe your GPU has a defect in it. Whatever the case, an intense load that causes a crash is revealing a problem, not causing it. Your system should handle any load given to it.

Re:Ridiculous. (1, Funny)

elrous0 (869638) | more than 3 years ago | (#33109594)

Yeah, but in the real-world, how many apps work your hardware to its max capacity for long periods of time? Considering how long Koreans are known to play Starcraft, I imagine there will be quite a rash of computer fires south of the 38th parallel, and a subsequent rash of suicides and shooting sprees.

Re:Ridiculous. (1)

egamma (572162) | more than 3 years ago | (#33109744)

Yeah, but in the real-world, how many apps work your hardware to its max capacity for long periods of time? Considering how long Koreans are known to play Starcraft, I imagine there will be quite a rash of computer fires south of the 38th parallel, and a subsequent rash of suicides and shooting sprees.

There was no 3D rendering in the original Starcraft. Or are you saying there are no news reports from Korea about SC2 issues?

Re:Ridiculous. (2, Interesting)

striker64 (256203) | more than 3 years ago | (#33109626)

Normally I would agree that the graphics card should be able to handle anything thrown at it, but there is something to this story. I have a Radeon 4850 with one of these Zalman coolers on it http://www.quietpcusa.com/images/vf1000-led.jpg and my case is big with lots of cooling. I have used this exact same configuration to play countless games over the last year, including MW2, BC2, etc., and never had a single crash. But now my system is crashing at the SC2 menus. My brother's machine is doing the exact same thing. Perhaps because the rendering is so simple, it's driving the card faster than the designers intended, causing extreme heat in one specific part of the pipeline. Anyhow, the fix mentioned in the article does solve the problem for me.

Not sure I get the reasoning here (2, Interesting)

Sockatume (732728) | more than 3 years ago | (#33109340)

Are most games framerate-capped? Wouldn't all games, at all times, be rendering as quickly as possible, operating to the graphics card's full potential?

Re:Not sure I get the reasoning here (5, Insightful)

Dragoniz3r (992309) | more than 3 years ago | (#33109366)

Yes, they do. It is quite standard practice for games to render uncapped. This story is just FUD and troll. I would've expected it to come from kdawson, but apparently I gave Taco too much credit.

To clarify my stance: This story is retarded, and all the time you look at it/think about it is time you won't get back.

Re:Not sure I get the reasoning here (5, Insightful)

crossmr (957846) | more than 3 years ago | (#33109584)

At this point I suspect "Kdawson" is a lot like "Alan Smithee". He just forgot to tick the box this time.

Re:Not sure I get the reasoning here (1)

Burnhard (1031106) | more than 3 years ago | (#33109666)

Crysis Warhead killed my ATI 4890. The fan bearings went (the card still worked and the fan kind-of cooled it, but the noise was unbearable). So yes, playing games can kill your graphics card, in much the same way that driving your car a long way can cause it to break down.

Re:Not sure I get the reasoning here (1)

grimJester (890090) | more than 3 years ago | (#33109862)

I'd be interested in seeing some of the more reputable hardware sites take up this story. Sure, running Furmark should stress your GPU far more than any game and unless you can run it without crashing your rig isn't stable.

I still wonder if there is something more to this story than a bunch of cards with insufficient cooling crashing. Few if any professionally assembled PCs should have bad enough cooling that a game could cause the GPU to overheat.

Re:Not sure I get the reasoning here (2, Interesting)

ShakaUVM (157947) | more than 3 years ago | (#33109868)

>>This story is just FUD and troll. I would've expected it to come from kdawson, but apparently I gave Taco too much credit.

It's news, because... it's about Starcraft 2? Kinda?

Why not run a story about how Quake 1 is killing modern computers? The last time I ran Quake it was somewhere above 300fps with vsync disabled.

Re:Not sure I get the reasoning here (1)

LWATCDR (28044) | more than 3 years ago | (#33109894)

I find it odd that people don't want the game's frame rate capped.
Why go past 60 FPS? Okay, maybe 120 if you're nuts.
What do you gain? I would rather not put out the heat and eat up the power.
Of course, the only game I really play is FSX, and I would love to see 60 FPS with all the eye candy turned on.

Design issue? (3, Insightful)

DaveV1.0 (203135) | more than 3 years ago | (#33109346)

This sounds more like a design issue with the cards than an issue with StarCraft 2. If the card can not handle performing at its full potential, then the card was under-engineered in the first place.

read: StarCraft will expose your crappy setup (5, Informative)

Anonymous Coward | more than 3 years ago | (#33109360)

Clearly StarCraft is not at fault here. No software should be capable of damaging your graphics card. But if the thermal design of your system is broken, then it's your fault, or the manufacturer's.

If your card breaks and there is nothing wrong with your cooling, then your card was already broken before you even fired up StarCraft.

Re:read: StarCraft will expose your crappy setup (1, Insightful)

Anonymous Coward | more than 3 years ago | (#33109596)

Clearly StarCraft is not at fault here. No software should be capable of damaging your graphics card. But if the thermal design of your system is broken, then it's your fault, or the manufacturer's.

If your card breaks and there is nothing wrong with your cooling, then your card was already broken before you even fired up StarCraft.

Why are you even assuming the story is correct?

"Reports of graphics problems" is a bit nebulous, to say the least.

My money is on SlashFUD at this point in time.

Re:read: StarCraft will expose your crappy setup (5, Insightful)

RogueyWon (735973) | more than 3 years ago | (#33109672)

Or a more developed version of the same argument:

Starcraft 2 has a pretty wide audience, by the standards of a PC/Mac game, and while it's certainly not a Crysis-style hardware-hog, it does have higher requirements than a lot of the usual mass-market PC games (eg. The Sims and its sequels). In addition, its prequel, which is 12 years old and was technically underwhelming by the standards of its own time (the graphically-far-superior Total Annihilation actually came out first) has a large hardcore fanbase, a lot of whom probably don't play much other than Starcraft.

So Starcraft 2 is released and is promptly installed on a lot of PCs that are not routinely used for gaming, or at least for playing games less than a decade old. A large chunk of these PCs have never run a high-end modern game before. When asked to do so, the less-than-stellar graphics cards in a good portion of them give up and fall over. No conspiracy, no fault in Starcraft 2, just a lot of crusty PCs being taken outside of their comfort zone and not faring so well.

Uhh... (5, Interesting)

The MAZZTer (911996) | more than 3 years ago | (#33109362)

You can uncap the framerate in lots of games, but we've never heard about this problem before. I don't think this is a problem, especially since you can easily make a GFX card run at full capacity and a low framerate simply by playing a game that's a little too new for it, something a lot of people trying to put off upgrades do. If your GFX card can't run at its maximum capacity without overheating, something is wrong with its cooling.

Correction (1)

The MAZZTer (911996) | more than 3 years ago | (#33109406)

s/I don't think this is a problem./This sounds like a red herring to me.

And the article talks about dust being the problem, which is exactly what I was thinking of when I said "something is wrong with its cooling". I've had that problem before with my old GPU: in Left 4 Dead 2 (but not in older games like TF2) I'd get big slowdowns every so often, and my GPU was running pretty hot. Turns out it was throttling itself to keep from getting even hotter. A heatsink cleanout fixed that right up.

Re:Uhh... (2, Insightful)

Delwin (599872) | more than 3 years ago | (#33109442)

That's the whole point of the article.

Re:Uhh... (2, Informative)

The MAZZTer (911996) | more than 3 years ago | (#33109462)

Yeah I ripped apart the summary before I moved on to the article. It does look like the article still tries to place part of the blame on Blizzard though, as the author expects it to be patched.

Re:Uhh... (1)

drinkypoo (153816) | more than 3 years ago | (#33109650)

Yeah I ripped apart the summary before I moved on to the article. It does look like the article still tries to place part of the blame on Blizzard though, as the author expects it to be patched.

It's pathetic for computer hardware to kill itself by overheating, but if you know that it can happen, you should still do your best not to overheat it.

Re:Uhh... (1)

TheLink (130905) | more than 3 years ago | (#33109756)

> but if you know that it can happen, you should still do your best not to overheat it.

Yeah, don't play games like Starcraft till you buy a proper graphics card that can handle it :).

If the cards were dying even when the fans were working fine, there wasn't much dust, and the ambient temps were within range, then I'd say the cards are faulty.

Might explain my crashes (5, Informative)

The Barking Dog (599515) | more than 3 years ago | (#33109364)

I'm playing Starcraft II on the last-gen iMac (purchased about four months ago) on OS X 10.6.3. The game is stable during gameplay, but it's crashed on me several times in cutscenes, onboard the Hyperion, or even in the main menu (ironically, while I was bringing up the menu to quit the game).

Re:Might explain my crashes (-1, Flamebait)

Idiomatick (976696) | more than 3 years ago | (#33109708)

"iMac" ... Found your problem.

Re:Might explain my crashes (0)

Anonymous Coward | more than 3 years ago | (#33109852)

Playing on a MacBook Pro from Aug 2009, with the discrete NVidia card. No crashes yet for me.

Re:Might explain my crashes (0)

Anonymous Coward | more than 3 years ago | (#33109856)

The iMac has no fan; the video card was designed to be actively cooled.

That's what you get when you buy form over function from a turtleneck-wearing nutjob with an obsession for removing fans.

Ever hear of the Lisa? What about Mac chimneys? How about the G5 and its lava-hot CPU with a Socket 7 cooler on top, or the constant issues with MacBooks where the GPU comes unsoldered?

StarCraft is not killing your computer; vanity about the brand of computer you chose is.

Re:Might explain my crashes (1)

grimJester (890090) | more than 3 years ago | (#33109900)

What graphics card does it have? I'd be surprised if an iMac doesn't have adequate cooling.

*Cough* (2, Insightful)

Lije Baley (88936) | more than 3 years ago | (#33109370)

Bullshit.

Re:*Cough* (0)

Anonymous Coward | more than 3 years ago | (#33109510)

Indeed, if this is killing your graphics card it's because IT WAS ALREADY BROKEN.

Software cannot kill Hardware (2, Insightful)

Anonymous Coward | more than 3 years ago | (#33109382)

Only lazy firmware developers for hardware can do that; the fault is not with any game, it's with the driver (or with the program, if it somehow turns off the fan).

Already dead (3, Insightful)

KirstuNael (467915) | more than 3 years ago | (#33109384)

Graphics card that can't handle working to its full potential is already dead (as designed).

Re:Already dead (1, Interesting)

drinkypoo (153816) | more than 3 years ago | (#33109488)

Graphics card that can't handle working to its full potential is already dead (as designed).

Amen! Starcraft II is not killing graphics cards; graphics cards are committing suicide when asked to perform their regular function. The hardware should always have thermal protection. The driver should always prevent runaway. If these things are not true, then the design was incompetent in some way. I am not a Blizzard fanboy (SC II can go piss up a rope; I don't pay for spyware if I can avoid it).

game was crashing for me (4, Informative)

j0nb0y (107699) | more than 3 years ago | (#33109386)

This may have been the problem I experienced. I had played in the (multiplayer only) beta with no problems. Once the game came out though, I kept crashing in single player in between levels. I cleaned the dust out of my computer and that solved the problem.

I wonder how many people experiencing this just have too much dust built up in their computers?

Re:game was crashing for me (2, Informative)

Anonymous Coward | more than 3 years ago | (#33109562)

Yes, this is a real problem that has been discussed on many sites, including Blizzard's forums. I expect it will get patched by Blizzard eventually.

FIX: Some systems may reach high temperatures and overheating conditions while running StarCraft II. This is mainly due to the video card rendering the screens at full speed. As a workaround, there are two lines that you can add to your Documents\StarCraft II Beta\variables.txt file to limit this behavior:

frameratecapglue=30
frameratecap=60

The frameratecapglue controls the framerate at the menu screens. The frameratecap controls the framerate on all other screens. You may adjust these numbers to your liking.

Seriously..? (5, Insightful)

Fusione (980444) | more than 3 years ago | (#33109408)

Story title should read: "Faulty video cards with inadequate cooling freeze when run at their full potential." This has nothing to do with StarCraft 2, other than that it's a video game that runs on a video card.

So it dies when used as intended? (2, Insightful)

RealGene (1025017) | more than 3 years ago | (#33109416)

Sounds like a design defect in the card, not the game.

What year is this? (5, Interesting)

Sir Lollerskates (1446145) | more than 3 years ago | (#33109424)

When graphics cards overheat, the worst thing that happens is a blue screen. On ATI cards, they just restart the card (it does a recovery-mode type of thing).

You can overclock any card to insane temperatures (90C+) without them even turning off, much less breaking them. There is simply no way that Starcraft 2 is killing any graphics cards.

There *was* one issue with an NVIDIA driver update a while back that actually did kill some graphics cards, but it was NVIDIA's fault, and they promptly fixed it.

This article is pure misinformation.

Re:What year is this? (1)

The MAZZTer (911996) | more than 3 years ago | (#33109494)

My old nVidia card would underclock itself when it started to overheat. Good thing too, when my heatsink got clogged with dust it probably saved itself when I was still clueless about it.

Even if its true... (5, Insightful)

Richard_at_work (517087) | more than 3 years ago | (#33109428)

Its hardly "Starcraft II Killing Graphics Cards", its "Shitty Graphics Cards Dying Because Of Lack Of Self Moderation When Running At Full Speed". But I guess the second version doesn't include a much hyped game in the title...

more than crash... damage (3, Interesting)

Speare (84249) | more than 3 years ago | (#33109430)

The summary says an overheated video card will crash. It will do more than crash. It can permanently damage the video hardware. This seems like a major hassle to swap out the video components on a big gaming rig, but it can be a lot worse for high-end laptops. I've had similar problems with 3D software running on a MacBook Pro -- plenty of performance, but the video card gets second priority in the heat-management.

In my MBP, there are separate temperature probes on the CPU, hard drive, battery, and chipset, but none on the dual video chip units, so the thermostat-controlled fan won't even kick in when either the "integrated" or the "high performance" video unit is the only stressed component.

Besides the hardware cooling problems, there's no reason for trying to draw more than 120 fps on most LCDs; software needs to get more responsible about heat and speed resource usage when given access to over-spec hardware. Limit the rendering loop to 90~120 fps, unless you're doing something purposely exotic such as driving stereoscopic displays or CAVEs (at 90~120 fps per camera).

Re:more than crash... damage (4, Insightful)

rotide (1015173) | more than 3 years ago | (#33109552)

I'm going to have to disagree here. It's not up to software developers to go around making sure hardware x and y won't just roll over and die during certain sections of their game.

It's up to hardware manufacturers to make sure their hardware works under all normal conditions. I mean, really, if you make hardware that can fry itself, maybe you're pushing it too far.

Gee whiz guys! We can render this game at 4839483 FPS! But don't do it for more than 2 seconds or it'll melt! Woot, time to release them en masse! The benchmarks will look awesome!

Pushing a card to its max should _never_ cause it to "crash", let alone get damaged.

Re:more than crash... damage (2, Interesting)

Greyfox (87712) | more than 3 years ago | (#33109720)

Apple is particularly bad for this. I had an older MacPro desktop that would display video artifacts and then crash in any 3D application. From what I was able to determine from research on the internet, the model I had actually had a firmware issue that would prevent the fans from spinning up as much as they needed to as the card got hotter. This problem seems to have been fixed in later models but if your fan vents get clogged with dust you'll still have problems. If you google around on "Mac Video Card Overheating" you'll find plenty of posts on the subject, but very little in the way of potential solutions. There are some huge threads on the subject on Blizzard's WoW forums. For a while their techs would actually refer you to Apple to get a video card replacement.

Software AND hardware need to be smarter about heat. Most computers these days have temperature probes all over the place, but nothing in the hardware or the OS will prevent your machine from destroying itself if you push the hardware at all. I used to build my own computers and started running into heat issues. Not wanting to get a degree in thermodynamic engineering, I started buying them pre-assembled. Now I see that those guys don't want to be bothered getting a degree in thermodynamic engineering either, so I guess I'm going to have to go back to building my own.

Re:more than crash... damage (1)

PitaBred (632671) | more than 3 years ago | (#33109816)

In my MBP, there are separate temperature probes on the CPU, hard drive, battery and chipset, but none on the dual video chip units, so the thermostat-controlled fan won't even kick in when either the "integrated" nor the "high performance" video units are the only stressed component.

Sounds like a hell of a design problem. Given what you had to have paid for it, I'd take it back. There's no excuse for that kind of incompetence.

Either that, or you just don't know what the hell you're talking about. I give it 50/50 odds.

No. (1)

ledow (319597) | more than 3 years ago | (#33109432)

No. Crappy cards that overheat when left running displaying ANYTHING (static images, top-end 3D, what does it matter?) are killing those graphics cards. CPUs (and likewise GPUs) should detect overheating, then throttle back or switch off as necessary. If that still causes a problem in 2010, you have bigger problems on your hands than how often a game decides to blit surfaces about - such as a potential fire. If your case is that dirty, your card should still cope anyway, even if that means it overheats a little, alerts you to the fact, then shuts down - or you notice it running really slowly while still ramping its fans up to maximum.

What it happens to be running when this happens is neither here nor there. This whole article just sounds like a way to scaremonger people into not buying StarCraft 2 (which I wouldn't be purchasing anyway, in case anyone wants to question why I debunk this crap). This is NOT a StarCraft-only problem. I could write a fifty-line bit of code with SDL and OpenGL that would tax a graphics card - it shouldn't make the card stop working or die unless you've done something very stupid like overclocking, disabling warnings, or ignoring alarms.

Short answer: No (5, Funny)

mike2R (721965) | more than 3 years ago | (#33109434)

Long answer: NOOOoooooooooooooooo!!!!!

Oblig. Car analogy (0)

Anonymous Coward | more than 3 years ago | (#33109440)

It's like putting your car into neutral and pushing the gas pedal all the way down. And holding it there.

The 'work' being done is simple. But you're doing it REAL fast.

..or... Are graphics cards poorly designed? (1)

popo (107611) | more than 3 years ago | (#33109444)

We're talking about a piece of hardware here which is capable of melting itself down with no internal cap on processing, and we're blaming the software?

IANAE (I am not an engineer) but it seems to me that the software designers should be able to throw whatever they like at the cards, and it's up to the hardware manufacturers to see to it that the hardware doesn't self destruct.

Orrin Hatch (1)

Stargoat (658863) | more than 3 years ago | (#33109450)

Is this Orrin Hatch's "Destroy the PCs" [bbc.co.uk] plan made manifest? It has taken 7 years, but what subtle, indeed Machiavellian implementation.

Maybe if the cards are overclocked. (1)

Seumas (6865) | more than 3 years ago | (#33109452)

If you're overclocking your card, I can see how running at full capacity could eventually cause thermal damage. If your card is stock, then how exactly is "taxing" it at 100% going to damage it? Does your CPU fry when it runs at 99% or 100%? Of course not - unless it is overclocked (too much) or otherwise improperly installed or configured.

If these cards are not overclocked and Star Craft truly IS killing them, it definitely has nothing to do with the cards running at full capacity and overheating.

People understand that these cards are tested off the line, right? They know what these cards can run at when they manufacture them. Not to mention, plenty of people also run various distributed projects on their GPUs without any problem whatsoever.

Re:Maybe if the cards are overclocked. (1)

The MAZZTer (911996) | more than 3 years ago | (#33109544)

If your overclocked card can't handle maximum load, you're doing it wrong. The whole POINT of overclocking is to increase the maximum load the card can handle. If the maximum is at a point where it will cause crashes or damage to the card when sustained for short or extended periods you've overclocked it too much.

DESTROY (0)

Anonymous Coward | more than 3 years ago | (#33109456)

Bringing destruction to more than fictitious characters!

Issue was fixed in patch 14 of Beta (2, Interesting)

Anonymous Coward | more than 3 years ago | (#33109464)

God /. you are WAY behind here. This was an issue 5 months ago in the Beta. There IS a hard cap in menus now.

BS (1)

zlogic (892404) | more than 3 years ago | (#33109466)

What about OpenCL/CUDA? These frameworks use the card's full potential, and so far nobody has reported any issues. If the card has cooling problems, it's clearly faulty hardware. The only downside is slightly more heat and noise from the video card than there should be during these scenes. This is not a car, where revving the engine in neutral really does stress the engine.

Mindless, driveling sensationalism (3, Insightful)

Nimey (114278) | more than 3 years ago | (#33109476)

OMG NEW HIGHLY ANTICIPATED TITLE KILLZ0RZ YOUR COMPUTAR!!!

No, if your machine is crappy, this exposes that you've got cooling or power problems, or both. You should see to fixing them.

In '94 I had a 486SX-25 that would choke and die when playing Doom in multi-player from time to time. It wasn't that the game KILLZ0RED MY COMPUTAR, it was that the CPU couldn't keep up with everything. Sticking a DX2-50 Overdrive into the socket solved that problem.

My guess? Users need to STFU (5, Insightful)

Sycraft-fu (314770) | more than 3 years ago | (#33109482)

I fail to see how rendering a scene at a high framerate would be any more challenging than rendering a complex scene at a lower frame rate. Remember that the hardware either is or is not in use. The ROPs, the shaders, etc. It isn't like there is some magic thing about a simple scene that makes a card work extra hard or something.

So my bet is you have users that have one or more things happening:

1) They are overclocking their cards. This is always a potential source of problems. When you push something past its spec, you may find it has problems in some cases.

2) Their airflow sucks. They have inadequate ventilation in their case for their card.

3) Their PSU is inadequate for their card. High end graphics cards draw a lot of current on the 12V rail. If you have one that can't handle it, well then maybe you have some problems in intense games.

Really, this sounds no different than the people who OC their processor and claim it is "perfectly stable" but then claim that Prime95 or LinX "break it." No, that means it is NOT perfectly stable, that means you have a problem. Doesn't mean the problem manifests with everything, but it means that you do have a problem that'll show up sometimes.

I'm betting it is the same thing here. It isn't that SC2 is "killing" their card; it's that their card has a problem and SC2 is one of the things that can reveal it. There are probably others too.

So if your system is crashing in SC2, disable any overclocking, make sure you've got good ventilation (which may mean a new case) and make sure you have a PSU that supports your graphics card, including providing the dedicated PCIe power connectors it needs. Don't blame the software for revealing a flaw in your system.

Re:My guess? Users need to STFU (2, Interesting)

Zeussy (868062) | more than 3 years ago | (#33109658)

Someone who is right on the money. The cards are crashing due to inadequate case ventilation. Stardock hit the same issue with GalCiv 2.

I fail to see how rendering a scene at a high framerate would be any more challenging than rendering a complex scene at a lower frame rate. Remember that the hardware either is or is not in use. The ROPs, the shaders, etc. It isn't like there is some magic thing about a simple scene that makes a card work extra hard or something.

Games nowadays are highly threaded, with game logic and rendering happening in parallel and in lock step (each waiting for the other to finish). The difference between a complex scene and a simple one is that the render thread has less to update and spends more of its time on draw calls. If there is little or no animation to update (whether computed in the game logic and pushed across to the render thread, or updated in the render thread itself), no complex scene culling or management, no new assets to upload to video memory, and no game logic for the render thread to wait on, then a simple scene just turns into a solid stream of draw calls with hardly any pauses in which the GPU can idle.
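The lock-step loop described above can be mimicked in a few lines. This is a toy sketch, not Blizzard's code - the costs, function name, and numbers are invented - but it shows why a near-free logic step makes the render step run back to back with no idle time:

```python
import time

def run_loop(logic_cost_s, render_cost_s, frames):
    """Simulate `frames` loop iterations; return the achieved frames per second."""
    start = time.perf_counter()
    for _ in range(frames):
        time.sleep(logic_cost_s)   # game logic step (AI, physics, pathfinding)
        time.sleep(render_cost_s)  # render step: draw calls submitted to the GPU
    return frames / (time.perf_counter() - start)

# A static menu scene has near-zero logic cost, so the loop spins as fast
# as the cheap render step allows -- far beyond any useful frame rate.
busy_fps = run_loop(logic_cost_s=0.0,   render_cost_s=0.001, frames=50)
game_fps = run_loop(logic_cost_s=0.015, render_cost_s=0.001, frames=50)
print(f"menu: {busy_fps:.0f} fps, in-game: {game_fps:.0f} fps")
```

With real game logic in the loop the frame rate settles near 60; with the logic step removed, the same loop runs an order of magnitude faster and the "render" stage never rests.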

Re:My guess? Users need to STFU (1)

CrashandDie (1114135) | more than 3 years ago | (#33109774)

Open case, put desktop fan roughly inside, full blast. If it stops crashing, get better ventilation in your case.

The cards were already broken (2, Insightful)

19thNervousBreakdown (768619) | more than 3 years ago | (#33109484)

If a process, like a webserver, could erase itself from a hard drive by benign input, it would be a bug. This is no different.

My graphics card, a GTX 275, was factory locked to a 40% duty cycle on the fan, no matter how hot it got. I had to resort to RivaTuner to make the fan auto-adjust its speed based on temperature. Since there's no speed limit at which rendering too many frames per second puts people's lives at risk, nor any other reasonable reason to limit the amount of work a card can do before it destroys itself when the hardware is perfectly capable of doing that work without destroying itself, the only conclusion is that the card is defective.

That said, anything that doesn't use vsync is stupid, period, always, (unless you're benchmarking or trying to warm a cold room). Spending that extra processing power on a proper motion blur would have a far greater effect on perceived smoothness.

Re:The cards were already broken (1)

tibit (1762298) | more than 3 years ago | (#33109746)

I agree that the card you mention had an issue. But the main problem is that die temperature sensing is such a simple thing to do. Power chips (switchers, regulators, bridges) that sometimes sell for $0.10 apiece can have die temperature sensors and can turn themselves off to cool down. Why the heck graphics chip makers don't put in temperature-controlled power management (clock scaling, unit cycling) is beyond me. It's not like it's rocket science. If you know your engineering, you should even be able to build a controller with adaptive coefficients that learns the thermal mass(es) and thermal sinks attached to the chip and maintains a constant peak temperature, avoiding thermally induced cyclic stresses.
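The temperature-controlled clock scaling described here can be sketched as a trivial proportional controller. Everything below is illustrative - the function name, target temperature, gain, and floor are made up, and real GPU firmware is far more involved:

```python
def throttle_clock(temp_c, base_clock_mhz, target_c=90.0, gain=0.05):
    """Proportional throttle: shave 5% of the base clock per degree C over
    the target temperature, but never drop below a 20% floor."""
    over = max(0.0, temp_c - target_c)
    scale = max(0.2, 1.0 - gain * over)
    return base_clock_mhz * scale

print(throttle_clock(70, 700))   # cool die: full 700 MHz
print(throttle_clock(95, 700))   # 5 C over target: clock scaled down
print(throttle_clock(130, 700))  # runaway heat: clamped at the 20% floor
```

The adaptive version the parent suggests would replace the fixed `gain` with one learned from how fast the die heats and cools, but even this dumb loop would stop a card from cooking itself.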

My naive assumption... (1)

hort_wort (1401963) | more than 3 years ago | (#33109520)

I've been assuming all this time that the drivers would limit my framerate to be equal to my monitor refresh rate. I have a 60 Hz monitor, is there any reason to get more than 60 fps? It seems like having drivers to do that for you would save a bit of electricity and be good PR for the company that made them. Edumacate me, please o_o
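For what it's worth, the cap being asked about is easy to sketch: render a frame, then sleep away whatever is left of the 1/60 s slot so the GPU idles instead of spinning. Illustrative Python only - real drivers implement this via vsync in hardware, not a sleep loop:

```python
import time

def capped_loop(render, frames, cap_hz=60):
    """Run `render` at most cap_hz times per second; return the achieved fps."""
    period = 1.0 / cap_hz
    start = time.perf_counter()
    for _ in range(frames):
        t0 = time.perf_counter()
        render()                   # draw the frame
        spare = period - (time.perf_counter() - t0)
        if spare > 0:
            time.sleep(spare)      # CPU and GPU sit idle here, saving power
    return frames / (time.perf_counter() - start)

fps = capped_loop(lambda: None, frames=30)  # a trivially cheap "scene"
print(f"{fps:.1f} fps")
```

Even with a scene that costs essentially nothing to render, the loop holds at or just under the cap instead of running flat out.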

Re:My naive assumption... (1)

gman003 (1693318) | more than 3 years ago | (#33109766)

Some game engines tie the physics calculations to the framerate, so having higher FPS in those does make the movement more realistic.

Additionally, most engines poll the input devices only once per frame. If you have a high-end mouse or gamepad that actually responds more than 60 times a second, having higher framerates will make input more fluid.

Finally, I like having a buffer zone. I usually tune my games to render about 80 fps, so if I run into a scene a bit more complex than normal, I still see 60 frames a second.

Re:My naive assumption... (1)

tibit (1762298) | more than 3 years ago | (#33109780)

Because graphics cards are used for many other things besides driving monitors. Think off-screen rendering (done by any modern OS behind your back), scientific calculations, etc.

Wait, why is this not considered a hardware flaw? (1)

NotSoHeavyD3 (1400425) | more than 3 years ago | (#33109524)

I mean, when I code I never think "well, my code needs to check whether it might damage the hardware," since I try to keep my code agnostic to the system it's on (admittedly it was financial software at my last job). StarCraft 2 uses either DirectX or OpenGL, so I'd expect it to be hardware agnostic as well. (Sorry, I'm not a graphics guy, so I might be talking out of my butt.) Seriously, if I remember correctly there were systems in the early '80s that you could damage if certain code was executed in a certain way - but didn't people consider that a hardware flaw? (Really, it's no different from expecting them to write code in the game to check for a temperature spike because a fan fell off the card. How would they know what a bad temp is without writing card-specific code?)

What year is this? (1)

damn_registrars (1103043) | more than 3 years ago | (#33109558)

The first Command & Conquer game was released in 1995, with full-motion video cutscenes. Those scenes did not destroy any graphics cards that met the system requirements for the game. Why would video scenes start doing this to modern video cards?

Re:What year is this? (1)

PrescriptionWarning (932687) | more than 3 years ago | (#33109640)

FMV scenes generally play back at only 23 to 25 FPS because they are pre-rendered. In SC2 the scenes are rendered by the GPU in real time, not pre-rendered FMV, and so generally need a constant 30+ FPS to not look stuttery - about the same load you'd get from playing something like Gears of War or Crysis or really any recent good-looking 3D game.

performance evaluation tools don't kill graphics.. (1)

gl4ss (559668) | more than 3 years ago | (#33109568)

during the single-player campaign, the nvidia drivers I have did the dance of restarting themselves once, without a hitch (the game didn't crash either; just a blink of the screen and a note waiting on the desktop after quitting).

but saying that running your graphics card at full juice kills it is essentially saying that PCMark and similar programs would kill it as well. there's an issue with the card already if games are killing it. the real problem with the game is that it hasn't evolved at all from StarCraft 1: the units intentionally need more micromanagement than feels right, you can't zoom the camera out far enough to keep an eye on two spots at once, and so forth.

btw spoiler: bases make surprisingly good elements to use as walls to herd the zerg into killzones(in last mission anyhow).

BULLSHIT (0)

Anonymous Coward | more than 3 years ago | (#33109586)

I've been playing StarCraft II on a Vostro 1400 for hours and hours every day since I bought it, and I've never encountered any issues with my graphics card at all....

JUST IN: Using your computer can reduce its life (1)

cbreaker (561297) | more than 3 years ago | (#33109606)

The more time passes, the less people understand anything about their computers, and unfortunately this includes most kids these days..

This is because StarCraft II is correctly written. (2, Insightful)

Maarx (1794262) | more than 3 years ago | (#33109614)

StarCraft II is exposing shoddy thermal engineering in video cards because, unlike most games on the market, StarCraft II correctly utilizes your video card to its fullest potential.

Say what you will about SC2 game balance, say what you will about Battle.NET 2.0's crappy interface, say what you will about how cheesy Jim Raynor is. I wouldn't disagree with you.

But when it comes to writing engines, Blizzard is the best of the best. Hands down. Everything they write runs smooth as silk, and they have a genuine talent for squeezing jaw-dropping performance out of even mediocre computers. StarCraft II contains correctly written code, and it will utilize your hardware to its fullest potential. If you bought a bargain computer, put it together yourself, and skimped on the cooling, you're going to get burned.

Pun intended.

I have as much hate as the next guy for how StarCraft II was cannibalized in the name of profit, but this article? This is a non-issue. This is not Blizzard's/StarCraft's fault.

Re:This is because StarCraft II is correctly writt (0)

Anonymous Coward | more than 3 years ago | (#33109776)

I don't disagree that SC2 ISN'T causing the cards to die, but I'd like you to take a trip down memory lane that was Diablo 2 Direct3D performance :)

Re:This is because StarCraft II is correctly writt (1)

Maarx (1794262) | more than 3 years ago | (#33109874)

I don't disagree that SC2 ISN'T causing the cards to die, but I'd like you to take a trip down memory lane that was Diablo 2 Direct3D performance :)

touché

Re:This is because StarCraft II is correctly writt (1)

tibit (1762298) | more than 3 years ago | (#33109808)

I agree. Thermal engineering bites even big names in the ass. Repeatedly. MacBook Air, anyone?

Re:This is because StarCraft II is correctly writt (3, Informative)

19thNervousBreakdown (768619) | more than 3 years ago | (#33109832)

Uh, no, eating as much GPU power as possible to render a static scene hundreds of times a second on a display that can only probably display 60 frames per second is not an example of properly-written software. In fact, it's just plain stupid, and nearly as wrong as you can possibly be.

That said, it shouldn't have any effect on graphics cards other than making less resources available to other concurrently-running programs, and Blizzard should in no way be blamed for breaking people's cards.

Old graphics cards (1)

DeanLearner (1639959) | more than 3 years ago | (#33109638)

I've played lots of games that have been too intense for my graphics card to run at 60fps. That is, the game was pushing the card to 100% of what it was capable of. I don't remember it CATCHING ON FIRE because of that.

Poor story? I think so.

Evil Giant Killer Dust Bunnies From Hell (4, Informative)

davidwr (791652) | more than 3 years ago | (#33109682)

The summary should say that it's the Evil Giant Killer Dust Bunnies From Hell, not Starcraft, that are shutting down the cards.

The breaking test (1)

Tei (520358) | more than 3 years ago | (#33109732)

Some buildings do this: on weekends, they cycle the power to the whole building off and on. This serves to force bulbs that are about to fail to fail sooner. Maybe SC2, as one of the most popular games released in years, is working as an unintended "break test". But it would be good if Blizzard added some caps *anyway*.

Weird (1)

Shaltenn (1031884) | more than 3 years ago | (#33109760)

I left the game running frequently (as I'm lazy) at these cut-scenes for almost 4 days straight, and I had no problems.

software killing hardware (0)

Anonymous Coward | more than 3 years ago | (#33109822)

I actually have seen an example of software killing some hardware in the past (aside from something 'common' like a bios flash gone bad).

This was in the mid to late 90s, when I was a very young teenager working in a local computer shop. A lady brings in a Packard Bell running 95 and the motherboard is all jacked. We sell her on one of those early (somewhat decent for their time) PCChips motherboards (then labeled simply "pc-100", which caused much confusion in the tech world) that had onboard sound/video/modem, which was pretty impressive for the time - although still rather low quality. So anyway, I will try to say this as quickly as I can.

Every single time someone put a motherboard in this thing the floppy drive stopped working. The drive wasn't the problem, the cable wasn't the problem, and to make matters worse the board wasn't even the problem because from the very get go from firing the new board up we would boot off the very same floppy drive that would cease to work once the computer was restarted and windows booted. They went through 5+ motherboards before I decided to just have a process of elimination. I killed another 3 boards figuring it out, I was sure I had figured it out after the 2nd one, the 3rd one was just to make sure.

That early PNP driver that loaded from config.sys for those awful Aztech sound/modem combo cards that Packard Bell put in everything was the culprit. None of the techs had gotten around to removing the unneeded old drivers from config.sys or autoexec.bat; every single time that PNP software loaded from the hdd in the woman's computer, it fried the onboard floppy disk drive controller on that particular "pc-100" motherboard model.

Air Duster (1)

StarWreck (695075) | more than 3 years ago | (#33109854)

Here's an idea! Grab a $3 can of canned air/air duster from Office Depot/Staples/Office Max/Fry's Electronics/Best Buy, open up the side of your computer, and then spray out all the dust that's accumulated in your graphics card fan and heatsink since the first StarCraft came out.

College, Careers, Marriages (2, Funny)

PowerEdge (648673) | more than 3 years ago | (#33109896)

If it is anything like the original it is killing college aspirations, careers and marriages and the nation of South Korea. Graphics cards should be the least of our concerns!!! I say this as a survivor of SC. Oh and Total Annihilation was the better game!!