
CPUs Do Affect Gaming Performance, After All

timothy posted more than 2 years ago | from the that's-just-what-jesus-said dept.


crookedvulture writes "For years, PC hardware sites have maintained that CPUs have little impact on gaming performance; all you need is a decent graphics card. That position is largely supported by FPS averages, but the FPS metric doesn't tell the whole story. Examining individual frame latencies better exposes the brief moments of stuttering that can disrupt otherwise smooth gameplay. Those methods have now been used to quantify the gaming performance of 18 CPUs spanning three generations. The results illustrate a clear advantage for Intel, whose CPUs enjoy lower frame latencies than comparable offerings from AMD. While the newer Intel processors perform better than their predecessors, the opposite tends to be true for the latest AMD chips. Turns out AMD's Phenom II X4 980, which is over a year old, offers lower frame latencies than the most recent FX processors."
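
To make the frame-latency method concrete, here is a minimal sketch of that kind of analysis (not the article's actual tooling; the 16.7 ms threshold and the assumption of a FRAPS-style frame-time log in milliseconds are mine):

    # Minimal sketch, not the article's scripts: summarize a capture of per-frame
    # render times (milliseconds), e.g. exported from a FRAPS frametimes log.
    def frame_latency_stats(frame_times_ms, threshold_ms=16.7):
        n = len(frame_times_ms)
        avg_fps = 1000.0 * n / sum(frame_times_ms)
        # 99th-percentile frame time: all but the worst 1% of frames beat this.
        p99 = sorted(frame_times_ms)[min(n - 1, int(0.99 * n))]
        # Total time spent on frames slower than the threshold (16.7 ms ~ 60 Hz).
        beyond = sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)
        return avg_fps, p99, beyond

    # 59 quick frames plus one 200 ms hitch still averages ~62 FPS, but the
    # 99th-percentile and "time beyond 16.7 ms" figures expose the stutter.
    print(frame_latency_stats([13.0] * 59 + [200.0]))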


It's not all graphics (5, Insightful)

hammeraxe (1635169) | more than 2 years ago | (#41102227)

Try cranking up the difficulty of an RTS on a not-so-good computer and you'll immediately notice how things slow down

Re:It's not all graphics (3, Interesting)

locopuyo (1433631) | more than 2 years ago | (#41102273)

In StarCraft 2 my CPU is the bottleneck.

Re:It's not all graphics (2, Informative)

hairyfeet (841228) | more than 2 years ago | (#41103527)

This also shows what many of us have been saying: Bulldozer is AMD's Netburst. I've stuck with the Phenoms because you get great bang for the buck and because it was obvious the "half core" design of BD was crap. Now we have it in B&W, with the much older Phenom spanking the latest AMD chips, which cost on average 35-45% more. Let's just hope that recent hire of the former Apple chip designer at AMD can right the ship, because otherwise when I can't score X4s and X6s anymore I'll have no choice but to go Intel.

Re:It's not all graphics (2)

Pubstar (2525396) | more than 2 years ago | (#41104785)

This is why I have the 4 "second" cores shut off and run the rest at their base turbo boost frequency. The thermals of my system are about on par with my old 965 (non-OC'd), and the 8150FX provides a massive difference in gaming response (vs. the 4.2GHz OC on the 965). I thought that everyone knew that it was the sensible thing to do unless you are running something that taxes more than 4 cores at once.

Re:It's not all graphics (1)

Pubstar (2525396) | more than 2 years ago | (#41104791)

Forgot to add, the base TB clock for my CPU is 4.2GHz. Running at sub-40°C currently under a moderate load.

Err (5, Interesting)

bhcompy (1877290) | more than 2 years ago | (#41102307)

Which idiot made that claim? Pretty much every hardware review site has CPU and GPU dependent games in their reviews when they review GPUs, CPUs, and OTB rigs.

Re:Err (3, Funny)

Anonymous Coward | more than 2 years ago | (#41102463)

For years, absolutely nobody has maintained that CPUs have little impact on gaming performance; all you need is a god-tier video card setup, and a game engine that magically handles everything via GPU.

There, I fixed it.

Seriously, this has to be the most nonsensical Slashdot summary I've read all day. CPU hasn't been a minor factor in gaming for several gaming aeons now, and there is no shortage of games that are critically dependent on it (Hi, Skyrim!).

Re:Err (4, Informative)

snemarch (1086057) | more than 2 years ago | (#41102649)

*shrug*

I've been running a Q6600 for several years, and only replaced it last month. That's a July 2006 CPU. It didn't really seem strained until the very most recent crop of games... and yes, sure, it's a quadcore, but game CPU logic hasn't been heavily parallelized yet, so a fast dualcore will still be better for most gamers than a quadcore - and the Q6600 is pretty slow by today's standards (2.4GHz, and with a less efficient microarchitecture than the current breed of Core CPUs).

Sure, CPUs matter, but it's not even near a case of "you need the latest generation of CPU to run the latest generation of games!" anymore. Upgrading to an i7-3770 did smooth out a few games somewhat, but I'm seeing far larger improvements when transcoding FLAC albums to MP3 for my portable MP3 player, or compiling large codebases :)

Re:Err (2)

Verunks (1000826) | more than 2 years ago | (#41102871)

I had a dualcore (E6600) for 5 years, and pretty much every new game in the past 3 years can use two or more cores. Even if it's just two, you have to consider the other programs running in the background. For example, in Bad Company 2, PunkBuster had a bug where after a few minutes it would use 20-30% of the CPU; the game itself uses ~90% of the CPU, and because of PunkBuster there was a lot of stuttering. Now I have a sixcore (3930K), and yeah, maybe six cores are too much for games, but some of them like BF3 can already use them, and I think pretty soon most games will use at least 4 cores. We'll just have to wait for the PlayStation and Xbox to go "next gen" once again.

Re:Err (3, Interesting)

snemarch (1086057) | more than 2 years ago | (#41103181)

PunkBuster spiking to 20-30% CPU is, as you mentioned, a bug - it is not the norm. And while people won't be shutting down every background process to play a game, they don't tend to run anything heavy while gaming. And all the regular stuff (web browser with a zillion tabs loaded, email client, IM client, torrent client, ...) is pretty negligible CPU-wise.

I personally haven't run into games that can utilize more than two cores (please let me know if they're out there!), and even then there have usually been synchronization issues that have kept the game from reaching 100% core utilization, even on the slower cores. Parallelizing stuff is hard, and outside of the core graphics pipeline (which runs mostly on the GPU), there's so much stuff that needs to run in strict order in a game engine. I sure do hope clever programmers will think of improvements in the future, though, since we'll hit the GHz ceiling sooner or later - and then we need to scale on number of cores.

As things are right now, I'd still say a faster dualcore is more bang for the buck than a slower quadcore, gamewise - but that might change before long. And considering that the current crop of CPUs can turbo-boost a couple of cores when the other cores are inactive, it's obviously better to shop for a quadcore than a dualcore - but with the current crop of games, you'd effectively be using the CPU as a faster dualcore when not running intensive background stuff :-)

You can't really compare the consoles directly to x86 CPUs, btw; the architecture is radically different - more so on the PlayStation side than the Xbox (and let's ignore the original Xbox here, for obvious reasons :)). I wonder if Sony is going to keep up their "OK, this is pretty wacky compared to the commodity multicore stuff you're used to, but it's really cool!" approach, or if they'll settle for something "saner".

Re:Err (1)

Verunks (1000826) | more than 2 years ago | (#41103709)

For the consoles, I was talking about the fact that 90% of PC games are console ports right now, so with a new generation of consoles the PC versions should also be better optimized; otherwise we will have a lot of games that run at 20 fps.

Re:Err (1)

nabsltd (1313397) | more than 2 years ago | (#41104547)

And while people won't be shutting down every background process to play a game, they don't tend to run anything heavy while gaming.

I was quite interested in TFA's data on performance while transcoding video, as I do that quite often myself. Their data mirrors my own anecdotal experiences...a low-priority video encode won't hurt much if you have a decent number of cores.

And all the regular stuff (web browser with a zillion tabs loaded, email client, IM client, torrent client, ...) is pretty negligible CPU-wise.

One of the things that does kill performance for me is moderately heavy background disk activity. Download-speed activity isn't a big deal, but a few GB of robocopy across the LAN will bring a lot of games to a halt for a second or two.

Re:Err (1)

Anonymous Coward | more than 2 years ago | (#41102905)

I'll one-up that with an Athlon 64 X2 (OC'd to 2.8GHz) on DDR1 (!) that I replaced just a few months ago (eventually paired with an HD4870).
That CPU was released May 2005.

The only game I ever played where I thought "maybe I need a better CPU" was Supreme Commander... and that was years ago.
Only recently did some games start using more than 2 cores effectively.

Re:Err (3, Interesting)

Cute Fuzzy Bunny (2234232) | more than 2 years ago | (#41103099)

For years, absolutely nobody has maintained that CPUs have little impact on gaming performance; all you need is a god-tier video card setup, and a game engine that magically handles everything via GPU.

There, I fixed it.

Seriously, this has to be the most nonsensical Slashdot summary I've read all day. CPU hasn't been a minor factor in gaming for several gaming aeons now, and there are no shortage of games that are critically dependent on it (Hi, Skyrim!).

Check out your favorite hot deals web site. The mantra is that a Celeron or any old AMD chip made in the last 5 years plus a solid GPU = goodness. I could point you to dozens of threads where this is the de facto standard.

But that's what you get when you combine cheap with minimal knowledge. Eventually everyone becomes convinced that it's true.

Re:Err (1, Redundant)

Ironhandx (1762146) | more than 2 years ago | (#41104119)

Um, it is true. Frame latency doesn't even matter. It's less than 1ms in ALL cases, i.e. it's imperceptible.

I just bought an FX4100 purely because it was cheap, had ENOUGH power, and with an excellent video card setup a better Intel chip wouldn't provide any sort of noticeable performance increase. Current-gen CPUs so far overpower current-gen game engines' CPU requirements that this argument is just plain silly.

I even see someone making the argument that AI is causing massive CPU load... get fucking real. AI has improved, but the most CPU-intensive AI in a mainstream video game to date was in Sins of a Solar Empire, and even on low video settings with 8 AIs running (on hardest diff) and putting them on instant build to churn out units, the video card was still the limiter for watching the most massive ridiculous battle.

There are a few game engines (like EVE Online's) that were coded mostly with CPU instructions and run mostly on the processor. CCP is in the process of fixing EVE's engine and has made great strides, but it still eats a lot of CPU time. Not many games are in the same boat.

Re:Err (5, Interesting)

Sir_Sri (199544) | more than 2 years ago | (#41102725)

If you read the charts, the assertion that 'CPU doesn't matter' is kind of true in a lot of cases.

It's not that it doesn't matter at all, but the difference between an 1100 dollar Sandy Bridge i7-3960 and a 200 dollar 2500K - even though they are almost a factor of 2 apart in side-by-side performance (http://www.cpubenchmark.net/high_end_cpus.html) - is less than 10% in games. Now those processors are still *way* better than the AMD offerings, unfortunately, and the AMD processors are in many cases so bad that that becomes the dominant problem.

The new "bulldozer" architecture from AMD is a disaster, in just about every way. They're terrible. The charts clearly show that.

The video card makers (more than the review sites) have correctly pointed out that performance is much more likely to be GPU gated than CPU gated - or, if it's a problem like the one I'm working on now, single-core gated by an algorithm that doesn't neatly parallelize, so more cores don't do anything. If you're given a choice between a 1000 dollar CPU or a 600 dollar one from the same company, odds are you won't be able to tell the difference, so in that sense they're reasonably correct: there's virtually no benefit to buying an extreme CPU or the like if your primary goal is gaming performance. If you're talking about the best use of, say, 1000 dollars to build a gaming PC, well then the cheapest i5 you can find with the best video card you can afford is probably the best bang for your buck.

As someone above said, an RTS like StarCraft is more likely to be CPU limited than GPU limited.

What this tells us is that AMD processors are terrible for gaming, but there's virtually no difference which FX processor you buy (don't buy those though, if you're buying AMD buy a phenom), and within the Intel family there is again, virtually no difference for a factor of 4 or 5 price difference.

What they didn't look at (because you don't really benchmark it) is load times. I think the FX processors have a much faster memory subsystem than their Phenom counterparts if you have a good SSD, but otherwise someone should take a bulldozer to Bulldozer.

If we were to revisit the oft-used car analogy for computing, it's a fair assertion that which brand of car you buy won't help you get to work any faster day to day; slightly better cars, with faster pickup etc., will have a small (but measurable) benefit, but that's about it. Well, unless you buy a Land Rover, or a BMW 7 series (http://www.lovemoney.com/news/cars-computers-and-sport/cars/12461/the-country-that-makes-the-most-reliable-cars, http://www.reliabilityindex.com/ ), at which point you should budget time into your schedule for the vehicle to be in the shop.

Re:Err (1)

cpu6502 (1960974) | more than 2 years ago | (#41102807)

I wonder if this same logic applies to browser performance? As they become more graphical and video-oriented will the GPU power matter more than the CPU?

Maybe I didn't need a new computer... maybe I just needed to keep the Pentium 4 and upgrade the graphics card to something fast. Then I could play HD YouTube.

Re:Err (1)

bhcompy (1877290) | more than 2 years ago | (#41102835)

Only if the software you're using supports GPU acceleration, which I believe Flash does now.

Re:Err (1)

Sir_Sri (199544) | more than 2 years ago | (#41102967)

Only on an OS that supports it. No GPU acceleration on Windows XP generally, and older flavours of Linux are the same deal.

Re:Err (1)

Billly Gates (198444) | more than 2 years ago | (#41104277)

Let's hope it fades soon. When XP goes EOL, my hope is that web developers will drop IE 8 support so we can use HTML5. Even IE 9 supports an OK subset of it, and it will be auto-updated to 10 with Windows Update, unlike past releases.

Re:Err (2)

Sir_Sri (199544) | more than 2 years ago | (#41102891)

I wonder if this same logic applies to browser performance

In Windows 8 it definitely will; Windows 7 and Linux, not so much. GPU acceleration is becoming more and more popular because GPUs are able to solve one type of problem significantly better than CPUs. If you can split your problem up into the rendering problem and the logic problem, the CPU becomes a lot less important, assuming it's fast enough to keep up with the GPU for whatever problem you have.

General-purpose GPU acceleration isn't used well as standard on any OS, although MS is doing it with the desktop and fonts in Windows 8. As you ask, it matters a lot more to a web browser than to your desktop. IE10 is natively GPU accelerated, and I believe Chrome, Opera and Firefox are all going that route to varying degrees.

In terms of total computer performance, though, nothing beats an SSD on a fast memory subsystem (Sandy Bridge or an FX setup). GPU acceleration makes things a bit snappier and more responsive; an SSD completely changes how quickly applications start, how fast the machine boots, and that sort of thing.

Re:Err (1)

Billly Gates (198444) | more than 2 years ago | (#41104235)

CPU matters more for browsers. Each tab in IE 9 and Chrome spawns a different process that can be delegated to the GPU. I know I get flamed for this, but only IE 9 uses GPU acceleration fully, while you have to turn on the extra options in about:flags in Chrome thanks to legacy XP support. DirectX 11 will accelerate more canvas items, which is why IE 9 was never backported to XP.

AJAX is CPU dependent too, so you made a great investment upgrading your turn-of-the-century system. Flash does use some acceleration, but only on Windows and Mac OS X. In games, by comparison, almost everything but logic goes to the GPU. Hell, even polygon physics is moving to the GPU!

When IE 8, the last bastion of ancient browsers, becomes less popular, we will see web workers in HTML5, which let HTML and JavaScript use more cores rather than the one core/process per tab we have today. Then your 8-core system will truly shine when you have 30+ tabs open.

Re:Err (1)

Billly Gates (198444) | more than 2 years ago | (#41104255)

Correction "Each tab in IE 9 and Chrome spawns a different process that can be delegated to the CPU". ... not GPU. Also notice I failed to mention Firefox with this. It does support web workers with more than 1 cpu but it is just one big bloated process the last time I looked if you do not use html 5.

Re:Err (1)

snemarch (1086057) | more than 2 years ago | (#41103275)

Wise words.

Just one thing: whether disk speed matters or not depends a lot on the game, and whether it's the "we have a fixed memory profile, and load all assets to memory while loading a level" or the "we stream stuff as necessary" type. For instance, for Far Cry 2, it made pretty much no difference whether I had the game files on a 2x74GB Raptor RAID-0 or on a ramdisk. For a lot of engines, there are all sorts of things going on... disk I/O, some CPU crunching, some sysmem->gpumem transfers, some GPU crunching... and enough wait states that nothing ever runs at 100% speed.

Re:Err (2)

hairyfeet (841228) | more than 2 years ago | (#41103849)

While I agree with most of what you are saying, I don't agree that AMD Phenom II based chips are "terrible" at gaming. If you look at the chart, they are getting over 60 FPS, and the huge difference in price between an AMD Phenom II quad or hexa and an Intel quad or hexa means you will have more money for a faster GPU or an SSD - which, when you are already getting 60 FPS, is probably the smart way to go. I know I built two hexacores for less than $850 with Win 7 HP x64 and HD4850s last year, and they still blast through any game I care to throw at them with great graphics and no lag.

So I would say if ALL you are gonna do is game? Then the Intel dual cores would be the way to go. But if you are gonna be doing other things as well, then it all comes down to whether you can afford the higher-end Intel quad and still have money left over for the rest of the parts you want. I know that my AMD hexa just chews through video transcodes while still giving me decent framerates, and considering you can get a full hexa kit for just $340 [tigerdirect.com] compared to the cheapest Intel quad kit being $500 without HDD or DVD [tigerdirect.com]? That's a pretty damned big difference for an extra 20 FPS.

All I have to say is... (1)

sudden.zero (981475) | more than 2 years ago | (#41102317)

...DUH!

How is this even news? (0)

Anonymous Coward | more than 2 years ago | (#41102357)

Anyone who has put together a gaming machine has known this for years

What. What?! (5, Interesting)

RyanFenton (230700) | more than 2 years ago | (#41102403)

Who thought that CPUs didn't bottleneck gaming performance? Who ever thought that? Only the smallest of tech demos used only GPU resources - every modern computer/console game I'm aware of uses, well, some regular programming language that needs a CPU to execute its instructions and is inherently limited by the clock speed and interrupt handling of those CPUs.

GPUs only tend to let you offload the straight-shot parallelized stuff - graphics blits, audio, textures & lighting - but the core of the game logic is still tied to the CPU. Even if you aren't straining the limits of the CPU in the final implementation, programmers are still limited by its capacity.

Otherwise, all our games would just be done with simple ray-traced logic, using pure geometry and physics, there would be no limits on the number or kind of interactions allowed in a game world, game logic would be built on unlimited tables of generated content, and we'd quickly build games of infinite recursion simulating all known aspects of the universe far beyond the shallow cut-out worlds we develop today.

But we can't properly design for that - we design for the CPUs we work with, and the other helper processors have never changed that.

Ryan Fenton

Re:What. What?! (2)

Billly Gates (198444) | more than 2 years ago | (#41104297)

I will take a mediocre CPU with a kick-ass GPU rather than the other way around. Sure, I have an underclocked Phenom II at just 2.6GHz, but with the ATI 7870 I plan to get, it will blow away a Core i7 Extreme with HD 4000 graphics by several hundred percent!

GPU is where it is at with games. Just like with Windows, an SSD makes a bigger difference to boot times than a faster CPU.

For years? (5, Interesting)

_Shorty-dammit (555739) | more than 2 years ago | (#41102413)

I don't recall ever reading on any PC hardware site anyone claiming that the CPU doesn't matter and all you need is a good graphics card. How on earth did anyone ever successfully submit that story?

Interesting research - poor Slashdot title (4, Interesting)

WilliamGeorge (816305) | more than 2 years ago | (#41102441)

The research into frame-rate latencies is really interesting, but the whole idea that *anyone* knowledgeable about PC gaming would have *ever* denied that the CPU was an important factor in performance is ridiculous. I am a consultant at a boutique PC builder (http://www.pugetsystems.com/) and I have always told gamers they want to get a good balance of CPU and GPU performance, and enough RAM to avoid excessive paging during gameplay. Anything outside of that is less important... but to ignore the CPU? Preposterous!

Then again, it is a Slashdot headline... I probably should expect nothing less (or more)!

Re:Interesting research - poor Slashdot title (2)

niado (1650369) | more than 2 years ago | (#41102511)

Nice! I recently found out about Puget when I was looking for an oil cooled PC [pugetsystems.com] kit. I've been drooling over that thing for months.

Re:Interesting research - poor Slashdot title (2)

UnknownSoldier (67820) | more than 2 years ago | (#41104621)

> The research into frame-rate latencies is really interesting,
Indeed. There was a VERY interesting article last year on Micro-Stuttering And GPU Scaling In CrossFire And SLI
http://www.tomshardware.com/reviews/radeon-geforce-stutter-crossfire,2995.html [tomshardware.com]

> but the whole idea that *anyone* knowledgeable about PC gaming would have *ever* denied that the CPU was an important factor in performance is ridiculous.
Not exactly. Battlefield 3 doesn't use more than 2 cores.
http://www.bit-tech.net/hardware/2011/11/10/battlefield-3-technical-analysis/7 [bit-tech.net]
http://www.techspot.com/review/458-battlefield-3-performance/page7.html [techspot.com]

If you have a high-profile AAA title with that level of graphics quality, it kind of makes you wonder why other games "need" 4 cores.

Re:Interesting research - poor Slashdot title (0)

Anonymous Coward | more than 2 years ago | (#41104679)

Will George? Liverpool Will George??

FTFY (5, Insightful)

gman003 (1693318) | more than 2 years ago | (#41102443)

For years, stupid PC hardware sites have maintained that CPUs have little impact on gaming performance; all you need is a decent graphics card. That position is largely supported by FPS averages, as most GPU tests are run using the most powerful CPU to prevent the CPU from being the limiting factor, but the FPS metric doesn't tell the whole story. Examining individual frame latencies better exposes the brief moments of stuttering that can disrupt otherwise smooth gameplay. Those methods have now been used to quantify the gaming performance of 18 CPUs spanning three generations by some site that really has nothing better to do than to restate the obvious for morons. [ed: removed fanboy-baiting statements from summary]

Re:FTFY (1)

Twinbee (767046) | more than 2 years ago | (#41102667)

So its mention of AMD CPU latencies increasing (and Intel's decreasing) is wrong is it?

Re:FTFY (1)

gman003 (1693318) | more than 2 years ago | (#41103113)

No, I'm not saying it's factually incorrect. I'm saying that the way they put it into the summary was misleading flamebait.

A simple logical analysis shows that the primary factors in latency are instructions per clock and clock speed (core count matters as well for applications with multithreaded rendering, but those are surprisingly few). The Phenom II series was good at both. The Sandy Bridge/Ivy Bridge Intel processors are also good at both, even a bit better. Bulldozer, unfortunately, went the Pentium IV route of aiming for a high clock speed (and high core count), which backfired when they relearned what Intel learned with the Pentium IV: that does not work. And so they're left with a dud processor line, but they don't even have the market share to survive just because they're big. This analysis also shows why it was a Phenom X4, not a Phenom X6, that earned AMD's top points - the X4s could reach a higher clock speed in the same thermal envelope as the X6, and thus the highest-clocked Phenom was the X4 980, supporting my hypothesis that few applications are able to use multiple threads for rendering.

Had they been able to phrase that information in a way that didn't sound like an Intel press release, they would have been fine. But it's hard to condense that into just a few sentences without coming off as biased in one direction or another.
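
A rough back-of-the-envelope illustration of the IPC-times-clock point made in the comment above; the instruction counts and IPC figures below are invented for the example, not measurements from the article:

    # Illustrative only: time to finish a fixed amount of single-threaded work
    # per frame. All numbers below are made up for the comparison.
    def frame_cpu_time_ms(instructions, ipc, clock_ghz):
        return 1000.0 * instructions / (ipc * clock_ghz * 1e9)

    work = 50e6  # hypothetical instructions of game logic per frame
    print(frame_cpu_time_ms(work, ipc=1.2, clock_ghz=4.2))  # ~9.9 ms: high clock, low IPC
    print(frame_cpu_time_ms(work, ipc=2.0, clock_ghz=3.5))  # ~7.1 ms: lower clock, higher IPC wins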

Re:FTFY (1)

hairyfeet (841228) | more than 2 years ago | (#41104199)

Which makes me wonder a little about their tests, specifically if they turned off Turbocore in the BIOS. I may have missed it but I didn't see anything about TC in the article and the whole point of TC is to crank up the speed when you aren't using all the cores and thus have more thermal envelope to play with.

I know that, stock, my 1035T will jump from 2.6GHz to nearly 3.1GHz when I'm using 3 cores or less, and if I wanted to play with the OC Tuner a little I could easily get that even higher. I've compared it with a Phenom II X2 at 3.1GHz and an Athlon II X3 at 3.2GHz I have at the shop, and I can't really tell any difference with TC on, since even today so few programs use more than 2 cores that I'm often in TC mode running programs and games.

That said, anybody who thinks they can pair a shitty single-core Celeron with a badass GPU and have a decent system is kidding themselves; it's ALWAYS been a balancing act. You look at your budget, decide what is important, get the best deals that you can, and hope for a nice mix. I have to agree with your assessment of Faildozer though, what a bad chip. I've been sticking with the Phenom IIs in my builds precisely because Dozer is a dog: it costs on average 40% more and runs on average 40% slower while using more power and cranking out more heat. There really isn't a single metric that would sell Dozer over Llano or Phenom. I just hope AMD straightens out, because the last thing we need is Intel having a monopoly on x86.

Re:FTFY (1)

snemarch (1086057) | more than 2 years ago | (#41102679)

You're an AMD fanboi, eh? :-)

Re:FTFY (1)

gman003 (1693318) | more than 2 years ago | (#41103253)

*looks at current laptop* Core i7 3610QM
*looks at wreckage of last laptop* Core 2 Duo P8400
*looks at primary desktop* dual Xeon 5150s
*looks at secondary desktop* Athlon 900

Yeah, if I were going to accuse myself of fanboyism, I think I'd accuse myself of fanboying for *Intel*, not AMD. Now granted, I've got a few more AMD-based builds under my belt, but I've either given them away (the Phenom X3 build) or accidentally fried them (the old Athlon XP build).

In all honesty, though, both companies have their good and bad points. Each had at least one total bomb (Pentium IV, Bulldozer). Each has good server chips for different workloads (Opterons (even the Bulldozer ones) are good for massive number crunching, Xeons are better for general server loads). On the desktop, Intel generally skews to higher prices and higher performance, while AMD aims for half the price and powerful enough. I'll even admit that AMD's current mobile chips (Fusion) are great - a decently-powerful CPU and a decently-powerful integrated GPU, with a low power draw and low price. I almost considered getting a cheaper, smaller Fusion-based laptop and a very powerful desktop, instead of getting the powerful monster of a laptop I ended up getting. Might have been a better idea, actually, but oh well.

Re:FTFY (1)

snemarch (1086057) | more than 2 years ago | (#41103325)

Ah, nice to hear - your redacted summary just gave another impression.

Been through both sides myself, depending on what made most sense at the time - first box I owned was a 486dx4-100, obviously AMD. Current rig is third intel generation in a row, though - AMD haven't really been able to keep up (except for the budget segment) since Intel launched Core2, imho. Which is kinda sad - while I kinda would have liked to see x86 die and "something better" emerge rather than getting x86-64, at least AMD obliterated the P4 with AMD64. Would be nice seeing that kind of competition again :-)

Re:FTFY (1)

lasvegasseo (2708229) | more than 2 years ago | (#41102719)

lol true bombs. Seriously, CPUs have always been known to affect performance.

Lolz (0)

Anonymous Coward | more than 2 years ago | (#41102465)

Makes me glad I purchased a i7 - 2600k.

Re:Lolz (2)

darkwing_bmf (178021) | more than 2 years ago | (#41102635)

Mine is a 2600 also! But mine is made by Atari because I wanted a system made for gaming. I'm sure they're pretty much the same thing though.

Not a game player, but (2)

PCK (4192) | more than 2 years ago | (#41102477)

This should be obvious to anyone who has done any realtime/interactive graphics programming. As the frame rate gets higher, the amount of time the CPU has to process the next frame gets smaller. It also becomes more difficult to utilise the CPU fully unless you are willing to add a couple of frames of latency and generate frames in advance, which I'd speculate is not ideal for a game type application.

Why should I care? (1)

Torp (199297) | more than 2 years ago | (#41102479)

My current rig that I built in 2007 and upgraded once in a while has decent gaming performance, even though I haven't put any money into it in 2 years or so... still on a GeForce 450.
Calm down please :)

What sites? (1)

Mathness (145187) | more than 2 years ago | (#41102481)

What sites have claimed that? And doesn't the article come (to an extent) to the same conclusion that HardOCP has had for quite some time now?

agree to disagree (1)

arbiter1 (1204146) | more than 2 years ago | (#41102489)

Overall the GPU does impact FPS the most - a cheap one vs. a little more expensive one - but to say the CPU has no impact is wrong. Overall its impact is very small, but there is some.

Do What Now? (1)

Porksmuggler (1428045) | more than 2 years ago | (#41102497)

"For years, PC hardware sites have maintained that CPUs have little impact on gaming performance; all you need is a decent graphics card." Obviously straw man is obvious. Your aunt Sally would be ashamed...

Find it a bit odd (1)

joshtheitguy (1205998) | more than 2 years ago | (#41102507)

The only statement from the summary I kinda disagree with is the following. "Turns out AMD's Phenom II X4 980, which is over a year old, offers lower frame latencies than the most recent FX processors."

I only mention this because I replaced a Phenom II X4 980 with the FX 8150 last year which increased my average frame rates across the board. Oh well what do I know?
Not like I've experienced the exact opposite of their claims or anything like that.......

Re:Find it a bit odd (2)

cpu6502 (1960974) | more than 2 years ago | (#41103005)

They said frame rate *latencies* increased with the FX..... not that the frame rates went down.

Re:Find it a bit odd (1)

hughJ (1343331) | more than 2 years ago | (#41104237)

Unless I'm reading the article wrong, all they're doing is recording frame rates via FRAPS and using that to calculate the latency between frames. High frame rate = low latency and vice versa.

time for upgrade? (4, Funny)

zlives (2009072) | more than 2 years ago | (#41102509)

so... i should finally give in and buy the coprocessor for my 386!!

Re:time for upgrade? (1)

the_fat_kid (1094399) | more than 2 years ago | (#41102653)

I think it's just a fad.
Wait and see.

Re:time for upgrade? (1)

Hatta (162192) | more than 2 years ago | (#41102753)

Did any games support math coprocessors back in the math coprocessor days? My impression was that they were for office apps, Lotus etc.

Re:time for upgrade? (1)

wierd_w (1375923) | more than 2 years ago | (#41102851)

I dunno, but I could see it being exploited for the additional registers, and doing a floating point op at the same time as an executing loop.

It might have also been useful when doing software blitting on non-accelerated cards.

Re:time for upgrade? (1)

washu_k (1628007) | more than 2 years ago | (#41102869)

Probably an outlier, but I remember that Scorched Earth [wikipedia.org] ran much better after I added a 387 to my 386 machine back in the early 90s. Specifically the projectile trajectories were calculated much quicker.

Re:time for upgrade? (2)

DNS-and-BIND (461968) | more than 2 years ago | (#41103601)

What were you smoking? Like you needed Scorched Earth to run faster? Jeez, I take my turn and then *boop* *boop* *bing* the computer players shoot and I have no idea what just happened. Oh, I'm dead now. That was fun. I don't mind losing, I just want to see WHAT THE HELL HAPPENED.

Re:time for upgrade? (1)

zlives (2009072) | more than 2 years ago | (#41103347)

wing commander 3? i think that helped!!!
Bluehair was the best captain

Re:time for upgrade? (0)

Anonymous Coward | more than 2 years ago | (#41103741)

I remember an FPS game, that in its map editor, you could only do flat ground without the coprocessor, but if you added it, you were able to do inclines and ramps, as it needed the extra processing power to handle the angles better. Can't for the life of me remember which game it was though.

Re:time for upgrade? (0)

Anonymous Coward | more than 2 years ago | (#41103011)

If you upgrade to the 486 DX you can get it on-die which is like way better, man.

Re:time for upgrade? (3, Funny)

danomac (1032160) | more than 2 years ago | (#41103305)

Yep, nothing like being able to calculate 1+1=3 quickly. Err...

Re:time for upgrade? (1)

zlives (2009072) | more than 2 years ago | (#41103389)

if only you had the math-co...

Re:time for upgrade? (1)

perlith (1133671) | more than 2 years ago | (#41103311)

so... i should finally give in and buy the coprocessor for my 386!!

No need to spend the extra money on the coprocessor. Hit the "turbo" button on your computer case.

Re:time for upgrade? (1)

zlives (2009072) | more than 2 years ago | (#41103365)

YES!! i had forgotten the turbo button, money saved... EXCELLENT :)

Re:time for upgrade? (1)

Nimey (114278) | more than 2 years ago | (#41104659)

Nah, I've got a TSR[1] that emulates an 8087. It totally speeds up Doom! ...actually, it really did on my 486SX-25, maybe 2-4 FPS. No, I don't know why; maybe without a co-pro present Doom would use emulated 387 instructions that were less efficient than emulating a simpler 8087.

[1] it was called EM87.

Intel VS AMD Temp Under Clocking (0)

Anonymous Coward | more than 2 years ago | (#41102513)

This video depicts the resultant frame rate drop when the heat sink is removed from both an AMD and Intel CPU. I'm not sure if this mechanism is still in place on Intel chips, although I recently swapped an Intel E6600 from a stock cooler to a tried-and-true aftermarket cooler and saw significant improvement in framerate. http://www.youtube.com/watch?v=06MYYB9bl70

what? (1)

Charliemopps (1157495) | more than 2 years ago | (#41102521)

This doesn't make sense at all. It's clear that the CPU is far more important than the GPU.
CPU speed solves stuttering and lag
Hard drive speed solves long load times
Memory amount decreases the frequency of load times (memory speeds, despite what many think, have relatively little to do with performance, as even the slowest memory is far faster than any other component of the system)
GPU speed/memory amount affects quality of graphics settings and frame rate when those settings are turned on (i.e. you can check more boxes on the advanced tab without dropping to 10 fps)

As far as "bang for your buck" goes, the last thing you want to spend money on is the GPU. More memory is the cheapest way to improve performance, followed by CPU speed/cores. The GPU is the very last thing you want to invest in because the prices are so hyper inflated and the technology their pushing is usually not even used in most games. The difference in frame rates between a $200 card and a $600 card is usually less than 20% and that video card will be obsolete in 6 to 12 months. That's just not a good value.

Completely agree (0)

Anonymous Coward | more than 2 years ago | (#41102529)

Civ 5 for example slows to a crawl when I play any map that's bigger than "tiny". Admittedly, I'm running this on a laptop with a Core2Duo but I do have a semi-decent graphics card and have a 7200rpm drive. A good CPU is important, at least for the kind of games I like to play.

Also big differences between games - e.g. SWTOR (1)

Anonymous Coward | more than 2 years ago | (#41102555)

As someone who was quite keen initially I did a lot of volunteer support in the forum. Holy shit how many people complained about unplayable lag due to an old CPU. So not only is the CPU important, but it's of different importance depending on the game, and there is really no way to know for sure.

The moral of the story should just be that if you want to play the latest games, have the latest CPU and latest GPU. Anything else is a gamble.

Suggested tag for story: (1)

idontgno (624372) | more than 2 years ago | (#41102557)

"Strawmansummary"

Windows 7 stalls for no apparent reason (0)

Anonymous Coward | more than 2 years ago | (#41102615)

This may be because my gaming rig doesn't connect to the internet and moreover I've turned off both the ethernet port and all services to configure it.

Then again, it probably isn't; it's just Windows jerking about.

Not, really, the fault of the game.

PS: who the hell thought that CPUs didn't have an effect? Hell, almost every series of GPU has had its top-of-the-range version (especially in SLI) rendered completely worthless by the entry-level CPU you'd get at a retail store, and you'd need a much faster CPU to see any difference.

I cannot think of a single PC hardware site that maintained that.

Minecraft (1)

Coolhand2120 (1001761) | more than 2 years ago | (#41102623)

Minecraft: I know it's not the best optimized game, but I'm pretty sure it still uses hardware. I have had an Nvidia GTX 275 forever, through many CPUs. When playing Minecraft with an older quad-core Intel CPU (can't remember the model number) I would get around 30 FPS at medium settings; after upgrading to an i7 with the same video card, my Minecraft FPS is now around 90 FPS with the same settings.

So I can attest empirically that "CPU matters" is in fact the case. Also, for games like ARMA 2, Supreme Commander 1, and I'm guessing any game that has a whole lot of entities doing magic stuff in memory at the same time, the CPU matters a great deal. When upgrading my CPU and keeping the same video card, the aforementioned games really improved quite a lot.

This could of course (in my uneducated opinion) be because the programmers of the game didn't load the code that could have gone into the GPU onto the GPU but rather "rolled their own methods" on the CPU and harmed performance unnecessarily. Either because they did so ignorantly or were forced to do so. I can see that happening in Minecraft and ARMA2 which has pretty amateurish (unoptimized) coding, but I can't see that being the explanation for Supreme Commander 1, the programmers on Supreme Commander 1 were indeed supreme coders and I bask in their glory.

Re:Minecraft (1)

SuricouRaven (1897204) | more than 2 years ago | (#41103345)

Tekkit mod certainly is CPU limited. Heavily so. To the point you can bring down a server by building too many buildcraft pipes.

I could have... (1)

CheshireDragon (1183095) | more than 2 years ago | (#41102639)

...told you this! I've been saying this for years. I didn't buy an AMD FX 8-core and 16GB of RAM for kicks!
I will say though, that for a while, RAM was a major player.

They don't know basic chip arch? (2, Interesting)

towermac (752159) | more than 2 years ago | (#41102691)

Hm. First there is:

"...The FX-4170 supplants a lineup of chips known for their strong value, the Athlon II X4 series. Our legacy representative from that series actually bears the Phenom name, but under the covers, the Phenom II X4 850 employs the same silicon with slightly higher clocks."

and then:

"Only the FX-4170 outperforms the CPU it replaces, the Phenom II X4 850, whose lack of L3 cache and modest 3.3GHz clock frequency aren't doing it any favors."

How can I trust them if they are unaware of basic stuff any chip enthusiast should know? (The Phenom is the Athlon with level 3 cache. The Athlon has none.) They could have also touched on what the 2 AMD specific hotfixes were for.

I'm not shocked at the results, but I am skeptical of the degree of disparity.

Re:They don't know basic chip arch? (2)

bigdanmoody (599431) | more than 2 years ago | (#41103377)

The Phenom II X4 850 (and the 840 as well) is based on the C3 stepping Propus core, which means that it is essentially an upclocked Athlon II. It does not have any L3 cache. The article is correct.

Re:They don't know basic chip arch? (1)

towermac (752159) | more than 2 years ago | (#41103973)

Huh. Busted not knowing my model numbers. I was unaware they had a Phenom with the L3 cache disabled. I really thought that was the point of Phenom over Athlon. Wonder why they didn't use an X4 Phenom with the usual 6MB L3 cache.

Re:They don't know basic chip arch? (1)

hairyfeet (841228) | more than 2 years ago | (#41104339)

The X4 850 is NOT a Phenom, that is just marketing BS. The difference between the Athlon and the Phenom has always been the Athlon lacks L3 which the 850 lacks as well. So all the 850 is is an Athlon II with a different name on it, compare it to an X4 945 or even 925 and it becomes painfully clear that the L3 does make a difference.

Platform, not just CPU (0)

Anonymous Coward | more than 2 years ago | (#41102703)

I find these days that when it's time for a new CPU, it's time for a whole new motherboard. Either the socket has entirely changed, or there are new chipsets with more features, higher-bandwidth interfaces, etc. When your system is able to shuttle data around faster, performance as a whole increases.
Just for example - if you get a newer SSD, it can really benefit from 6.0Gbps SATA ports. The only practical way to do this nowadays is to get a newer motherboard. (Add-in SATA cards are either complete crap, or cost 2x what an entire motherboard does. There is no mid-grade add-on SATA market at all.)

I've got a Core 2 Quad based machine at work. It's great! The Core 2 Quad chips can really do their job for their age... but the rest of the system is really starting to show its age. It won't be long before the new Ivy Bridge "Pentium" (low-end 2 core) or i3 (mid-grade 2 core) systems will run circles around it. (Except for tasks that need lots of cores, like running VMs.)

Frequency scaling (1)

fa2k (881632) | more than 2 years ago | (#41102731)

(from TFA)

After consulting with our readers, we've decided to enable Windows' "Balanced" power profile for the bulk of our desktop processor tests, which means power-saving features like SpeedStep and Cool'n'Quiet are operating. (In the past, we only enabled these features for power consumption testing.) Our spot checks demonstrated to us that, typically, there's no performance penalty for enabling these features on today's CPUs. If there is a real-world penalty to enabling these features, well, we think that's worthy of inclusion in our measurements, since the vast majority of desktop processors these days will spend their lives with these features enabled.

That's wrong for the Phenom II. I find a 30% difference when enabling frequency scaling on a 965 for a single-threaded workload. It seems that each core is clocked independently and there is some delay when increasing the clock speed. Maybe the Windows frequency scaler is better, but for this CPU there seems to be a real difference. The problem is that they are talking about bursty loads and trying to quantify delays, so they should really try without frequency scaling on the Phenom II.

they need to look at other genres (0)

Anonymous Coward | more than 2 years ago | (#41102737)

A whole freakin bunch of us are all FPSed out after the last decade. Let's start focusing on some other genres for a change...

VINDICATION!! (1)

dave562 (969951) | more than 2 years ago | (#41102773)

I have been saying this for years, but have never had any data to back it up. For me it has always been a "seat of the pants" sort of metric. Over the last decade I have tried AMD CPUs on a number of occasions, and always found them to be lacking in comparison to Intel CPUs of the same generation. My latest gaming machine is running an i7-960 (got it cheap from NewEgg) and it works great with all of the games I play.

CPU still isn't a bottle neck. (1, Offtopic)

medv4380 (1604309) | more than 2 years ago | (#41102811)

Yeah, if I'm trying to render 120fps then yes, it's a bottleneck. Chances are you only have a 60Hz monitor, so VSync will lock you at 60fps. Most of the tests ran above 60fps, with some exceptions on the older CPUs. So you can spend your money on an expensive Intel i7 to render frames you cannot see, or you can buy a cheaper processor and spend the money on a beefy GPU - or fix the real bottleneck, the HDD; switching to an SSD is a better improvement.

Re:CPU still isn't a bottle neck. (0)

Anonymous Coward | more than 2 years ago | (#41103283)

This really depends on the game. Some games like Starcraft 2 really benefit from a faster CPU. My frame rate nearly doubled after upgrading from a Phenom 9600 to a Phenom II 1090t X6. I later upgraded my video card and while my frame rate didn't improve much, I can run it on much higher settings.

I think the key is to have a balanced system. The CPU + GPU + disk + ram must all be around the same quality. Sometimes just upgrading the GPU can be negative too. I've seen drops in performance because of the extra heat combined with the extra "activity" from the GPU. Bought a new water cooling setup and it gave me a 10% bump. By adding the new video card, I threw off the balance in the system including the heat and power supply.

There isn't a silver bullet for a fast gamer rig. It requires careful planning. In fact, if I were to go exclusively for gaming, I'd probably buy several SSDs, RAID 0 them, and go Intel Ivy Bridge. However, I don't want to spend that much for casual gaming. I actually compile software much more often on my system, and the 6-core Phenom has worked out great for that. It can be a very parallel operation.

Game mechanics (1)

phorm (591458) | more than 2 years ago | (#41103969)

FPSes:
Depends on where the AI etc. is. If you're mostly just getting data on 32 players, then you need a low-latency network connection, and most of the rest is going to be rendering of fancy explosions, fog, scenery, etc. Hence things tend to tie up nicely with games like BF3.

Now take a bigger RTS.
Lots of on-map units. Pathfinding. AI. etc. Latency is important, but the CPU maybe more so than graphics.

Re:CPU still isn't a bottle neck. (1)

DeadboltX (751907) | more than 2 years ago | (#41103395)

Frames per second in video games are not all about what you can see. The FPS that a game plays at is in direct relation to input delay. A game that runs at 30fps is going to have twice as much input delay as a game that runs at 60fps, and 4 times the delay of a game that runs at 120fps. In highly competitive multiplayer games having an additional 20ms delay on all of your inputs compared to an opponent can make a difference.
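
The arithmetic behind that claim, under the simplifying assumption that input delay scales with one frame time (real input pipelines add more stages on top):

    # One frame time at common frame rates; halving it halves this slice of delay.
    for fps in (30, 60, 120):
        print(f"{fps} fps -> {1000.0 / fps:.1f} ms per frame")
    # 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 120 fps -> 8.3 ms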

Re:CPU still isn't a bottle neck. (0)

Anonymous Coward | more than 2 years ago | (#41103859)

I play UT2K4 at a rock solid 200FPS on my i7 with 670GTX and a netspeed tweak. I enjoy playing against folks who run at 60fps or less. :)

civ4? (1)

mcguyver (589810) | more than 2 years ago | (#41102953)

Tell this to someone who plays civilization...or SoF.

FPS is not the right metric (2)

cathector (972646) | more than 2 years ago | (#41103095)

As this article points out, it's not the number of frames per second that really matters:
it's the longest gap between subsequent frames that the eye picks up on.

You could cram 200 frames into the last tenth of a second, but if the other 0.9 seconds only has 1 frame, it'll feel like 1Hz.

I typically chart another metric next to traditional FPS, which is 1 / (max inter-frame period in one second).
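
A sketch of that metric as I read it (hypothetical code, not the poster's): for each one-second bucket, report 1 divided by the worst inter-frame gap seen in that bucket.

    # Hypothetical implementation of 1 / (max inter-frame period in one second).
    # Input: frame timestamps in seconds; output: worst-case rate per one-second bucket.
    def worst_case_fps_per_second(timestamps):
        worst_gap = {}
        for prev, cur in zip(timestamps, timestamps[1:]):
            sec = int(cur)
            worst_gap[sec] = max(cur - prev, worst_gap.get(sec, 0.0))
        return {sec: 1.0 / gap for sec, gap in sorted(worst_gap.items()) if gap > 0}

    # 200 frames crammed into the last tenth of a second still scores ~1 Hz for
    # that second, matching the "feels like 1 Hz" point above.
    ts = [0.0, 0.9] + [0.9 + 0.0005 * i for i in range(1, 201)]
    print(worst_case_fps_per_second(ts))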

Re:FPS is not the right metric (1)

hughJ (1343331) | more than 2 years ago | (#41104323)

It's still about the frames per second, just not the 'average' frames per second over some long arbitrary sample period. The days of showing single figure timedemo-style averages are long gone on most tech websites.

Price VS performance (1)

phorm (591458) | more than 2 years ago | (#41103145)

For me, it comes down to price VS performance, which is actually something I'm trying to figure out as I'm looking for a new rig.

At some point, too-low performance becomes a regular lag-fest. However, if I can get decent load times and run at 780p-ish resolution at good detail levels, I'm fairly happy. 1080p would be nice at medium+ detail for use when connected to a bigger TV, but I don't really see the point on a 22-24" LCD. I don't need 200fps, but somewhere consistently between 40-60+ would be nice for gaming.

Intel definitely tends to deliver more in the performance arena, but even the cheaper CPUs are often double the price of their AMD counterparts, and the motherboards tend to be a bit more costly as well.

Beyond gaming, compile performance is nice. I do a bit of graphics work, so while being able to play something like BF3 or Crysis at 1080p/60FPS or better is nice, not having to wait a long time for my app to compile or my mesh to render is equally important, as is being able to run my desktop OS plus possibly a VM or two.

Lastly of course is drivers. I haven't yet tried one of AMD's FM-series chips, but I'd hope they're Linux-friendly in terms of the graphics driver, etc. One thing I'd say for AMD is that their ATI acquisition seems to have improved their chips' graphical capabilities a notch, but also made a difference in the quality of drivers for the graphics line.

At the end of the day, is double the price or more going to double my performance, or is the more reasonably priced offering going to give me enough performance to suit my needs?

LOL (0)

Anonymous Coward | more than 2 years ago | (#41103413)

Are you joking? Maybe it's just me because I actually work in IT and was educated to do so, but what retard would actually believe anyone who said the CPU doesn't affect gaming performance? What retard would say that as if it wouldn't come back to haunt them?

And no, Cathector - an experienced PC gamer can tell the difference in actual frame rate. Not a 1 fps difference, but tens of fps easily.

how to tell (4, Interesting)

Andrio (2580551) | more than 2 years ago | (#41103663)

In a game, look at the sky. If your framerate shoots up, the video card was your bottleneck. If it doesn't, your CPU is.

It all seems very intel biased AMD bashin (0)

Anonymous Coward | more than 2 years ago | (#41103835)

My observations:
- Biased article; intentional or not, it's biased
- When AMD does do well, they are still bashed by the author

I won't be reading that site again. What crap.

Anything over 30fps or so is a waste... (0)

mark-t (151149) | more than 2 years ago | (#41104269)

Owing to persistence of vision, once you have frame rates faster than about 30fps or so, the eye is just going to blend them together, and you won't perceive all the individual pictures separately. The only reason you might perceive a difference at higher frame rates is that different information is being presented to your eyes in individual frames, which still contributes to the overall image that you see; more frames means more data contributing to the image, but your eyes are still going to blur them all together.

So, theoretically, with all the suitable blurring applied to a single frame, the eyes would probably not be able to distinguish any difference between 30fps and a much higher rate.

Of course, implementing such blur to effectively simulate a faster frame rate can often be just as, if not more, computationally expensive than simply rendering all of the frames at a higher speed anyway, except for very specific (and simple) cases... so I really don't know how practical said theory is.

Re:Anything over 30fps or so is a waste... (2)

WromthraX (948475) | more than 2 years ago | (#41104455)

Your eyes don't blur anything. The blur you refer to (you probably read about it somewhere) is why film looks smooth at 24 FPS - it's because frames on FILM are blurred; your eyes have nothing to do with it. 30 FPS may look smooth, but I definitely notice a big difference between 30 and 60, and even 60 and 90 look different.

Starcraft 2 on Core i7 laptop (1)

Tasha26 (1613349) | more than 2 years ago | (#41104337)

OK, so when I get beautiful StarCraft 2 rendering from my GTX 570M and then there's a big lag (frame rate goes from 40-50 to 10 fps) because the screen is full of units firing at each other, I need to blame the CPU? I assumed it was Windows 7's fault -- they couldn't even code a multi-core OS properly. (I have a Qosmio X770-11C)

just another reason.. (1)

issicus (2031176) | more than 2 years ago | (#41104655)

The i5-2500K is the best gaming CPU.