
Can Legacy Dual-Core CPUs Drive Modern Graphics Cards?

Soulskill posted about a year ago | from the only-after-they-pass-driver's-ed dept.


MojoKid writes "A few weeks back, we discussed whether a new GPU like the GeForce GTX 660 could breathe new life into an older quad-core gaming system built in mid-2008. The conclusion was a definite yes, but many readers asked us to revisit the question with a lower-end dual-core Core 2 Duo. The chip used was a first-generation C2D part based on Intel's 65nm Conroe core, clocked at 3GHz with 4MB of L2 cache and a 1333MHz FSB, paired with 3GB of DDR2-1066 memory. The long and short of it is, you can upgrade the graphics card on a six-year-old dual-core machine and expect a noticeable improvement in game performance; significant gains, in fact, of up to 50 percent or more."

159 comments

Yes of course (2)

colin_faber (1083673) | about a year ago | (#42838863)

Yes, of course they can drive these cards. Will they do it at the same performance as a modern dual- or quad-core CPU? No.

Re:Yes of course (4, Funny)

Hsien-Ko (1090623) | about a year ago | (#42839317)

I await the obviously conclusive "Can a Pentium M / Sempron be revived by a dual GTX680" article...

Re:Yes of course (1)

jedidiah (1196) | about a year ago | (#42840273)

You can revive an old Pentium or Sempron with much less than a dual GTX680.

Intel GPUs? Really?

Re:Yes of course (2)

zabby39103 (1354753) | about a year ago | (#42841751)

You can definitely revive an old Pentium M / Sempron with a modest $50-or-so low-end card. It enhances a good chunk of the browsing experience, not just video acceleration.

Not exactly what you were asking, but I have a Pentium M based board that makes a great media PC now that I've dropped a Radeon HD 6450 in it. It used to be nearly unusable; now 1080p video, YouTube, and web browsing are all great (just don't deviate too far from those core tasks). For whatever reason it seems a lot faster in Firefox than Chrome...

Re:Yes of course (4, Informative)

sortius_nod (1080919) | about a year ago | (#42839571)

Exactly my thoughts. A 50% increase in performance? Not really impressive when you look at the graphics card charts out there. The GTX 260 has far less than half the performance of a GTX 660.

According to PassMark:

GTX 660: 4038
GTX 260: 1123

So with only a 50% increase in performance, I'd say it's a waste of money. The bottom line is that modern processors, chipsets, and RAM make a massive difference in performance for modern high-end graphics cards. If you're going to upgrade your graphics card, you need to reduce the bottlenecks in the system.
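
A quick back-of-the-envelope check on those PassMark numbers (a Python sketch; the scores are just the ones quoted above, treated as rough throughput proxies):

    # Relative throughput implied by the PassMark scores quoted above
    gtx_660 = 4038
    gtx_260 = 1123

    ratio = gtx_660 / gtx_260
    print(f"GTX 660 ~= {ratio:.1f}x a GTX 260 ({(ratio - 1) * 100:.0f}% faster)")
    # -> GTX 660 ~= 3.6x a GTX 260 (260% faster)

    # If the old CPU only realizes a 50% gain, most of the card goes unused:
    print(f"Realized uplift: {0.5 / (ratio - 1) * 100:.0f}% of the card's potential")
    # -> Realized uplift: 19% of the card's potential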

Re:Yes of course (2)

Nikker (749551) | about a year ago | (#42839855)

If you want to play games today then why not?

Bear with me for a second. A GTX 660 runs about $300. A few new components (mobo, RAM, CPU) centred around the 660 would be around $300 (AMD) or $500 (Intel), assuming your case and power supply can handle the upgrade. So you get the GTX 660 today and get decent frame rates just by plugging it in; over the next months/years you save up the cash for the core components you need, and you have the luxury of waiting for sales or good deals on Ebay/Kijiji/Craigslist, etc.

This way you can enjoy your games and know it will only get better from there. Otherwise you risk getting caught up in the oooh-shiny being pushed on you by the salesperson and spending $1K+ on a machine that gives only marginal returns over the equivalent $600 box.

YMMV

Re:Yes of course (4, Insightful)

sortius_nod (1080919) | about a year ago | (#42840575)

It doesn't work like this though. Even if you take the 50% performance increase at face value (not taking into account higher AA/AF/shader settings), that would mean a game running at 15fps would increase to 23fps. Not exactly much of an increase. Even if you were getting 30fps on the GTX 260, that's an increase of 15fps (which is what the tests essentially saw), hardly worth $300.

Meanwhile, if you spent the money on CPU/MBD/RAM and a mid-range graphics card (say a GTX 480 at around $150), you'd see actual performance increases of around 3.5x that of sticking a GTX 660 on a crap motherboard with a crap processor.

Sure, if you had every intention of upgrading the rest of the components, the graphics card is going to be the easiest to swap out, but you're still going to need to upgrade the CPU/MBD/RAM.

The article glosses over the fact that going from a GTX 260 to a GTX 660 in a modern system would be roughly a 3.6x jump in performance, going by the PassMark numbers above. Not sure what they're trying to prove, but to me it proves they know nothing about hardware, gaming or value for money.
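
The frame-rate arithmetic above, spelled out (a trivial Python sketch; the 50% figure is the article's, the baselines are the examples in this thread):

    # Project frame rates under a flat 50% uplift
    def scaled_fps(base_fps, gain=0.5):
        return base_fps * (1 + gain)

    for base in (15, 30):
        print(f"{base} fps -> {scaled_fps(base):.1f} fps")
    # 15 fps -> 22.5 fps  (the ~23 fps above)
    # 30 fps -> 45.0 fps  (the +15 fps the tests essentially saw)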

Re:Yes of course (1)

Nikker (749551) | about a year ago | (#42840657)

That is true, but your figures assume the 15fps baseline is actually what this person is seeing.

APU FTW? (1)

Anonymous Coward | about a year ago | (#42840851)

Hate to sound like a fanboy, but I'm really just thrifty.
Save yourself a lot of money and just get an AMD Vision system :)
A10 desktops are pretty cheap (under $650), come with a monitor and other accessories, and they are pretty much future-proof in terms of games.
I know the A6 laptops are under $600; those will play any game available now, but get an A8 or A10 if you wanna crank all the visuals up to max.
For reference, I'm enjoying games like Borderlands 2 on my A2... and while the new DMC is lag city, I really can't complain about a $300 laptop. Especially when a $300 Intel laptop gets me laggy web browsing, laggy office, slow boot-up... Sorry to bash Intel, I really do love them, they just aren't in my price range :(

Re:Yes of course (1)

Gadget_Guy (627405) | about a year ago | (#42840701)

If you're going to upgrade your graphics card, you need to reduce the bottlenecks in the system.

I think it is a case of finding the right card to upgrade to. The GTX 660 is going to be wasted, and frankly it is way too expensive to consider. But I would like to see the comparison done with the GTX 650 Ti, or even the plain GTX 650. The more interesting article would find the sweet spot of graphics cards for such an old system: the point at which the performance increase no longer matches the price increase.

Re:Yes of course (0)

Anonymous Coward | about a year ago | (#42839723)

Yes, of course they can drive these cards. Will they do it at the same performance as a modern dual- or quad-core CPU? No.

That depends on what you want to do. *Some* CUDA applications require very little CPU usage and PCIE bandwidth.

Re:Yes of course (3, Informative)

hairyfeet (841228) | about a year ago | (#42841221)

I would say it all comes down to what games you are playing. If you are playing games like TF2 and Batman:AC? Well, no problem then; slapping in a new GPU will give it a good kick in the pants. If you are trying to play some huge RTS with a ton of units? Then the CPU is gonna be the bottleneck.

That said, it's often cheap to upgrade your CPU, especially if you have an AMD, as they have so many backwards-compatible chips and hung onto the AM socket for so long. A good place to look for a new CPU is StarMicro [starmicroinc.net], which I've used a LOT in the shop with never any issues; they go from socket 478 on the Intel side to socket 754 on the AMD side with just a ton of chips to choose from. If you want a gaming machine they have plenty of high-clocked Athlons and Phenoms at good prices, and if you want a chip to make a killer HTPC out of, this low-power Phenom X4 [starmicroinc.net] makes a pretty kicking HTPC chip and it's only $68.

So it's really not that hard to keep a system that's a few years old gaming well. My youngest is gaming great on a 3.3GHz Athlon X3, and that chip was only $65 on sale; my oldest got a Phenom II X6 for only $100 as part of a kit. While these aren't gonna beat an i7, or even my 1035T, they are still great for gaming and have no trouble playing all the new games we have run on them.

No, CPUs have different instruction sets than GPUs (1)

Anonymous Coward | about a year ago | (#42838905)

Next question.

Re:No, CPUs have different instruction sets than G (0)

Anonymous Coward | about a year ago | (#42840155)

That's how I read the title at first. It's a shit title.

Not really (4, Interesting)

Billly Gates (198444) | about a year ago | (#42838915)

At least not reliably.

The issue is PCI Express 1.0 and 1.1 performance with 2.0 cards and later. GeForces have been known to crash when used in an earlier slot or in lower-end systems. Maybe that has changed since the 9600GT, but I switched to ATI for this reason. Even many Radeons are only tested with later hardware, and instability and other bottlenecks happen, since in many games Windows swaps video RAM out to system RAM even when there is plenty available.

Re:Not really (0)

Anonymous Coward | about a year ago | (#42839379)

A 4-year-old 9600GT? My guess would be that yes, they fixed that :)

Re:Not really (1)

Osgeld (1900440) | about a year ago | (#42839517)

Hell, they fixed that on my 9600GT: PCI-E 1.0 slot, 2.0 card, no problems from 2008 till now, running on a cheap-ass MSI motherboard with a GeForce 7 chipset.

Re:Not really (1)

nanoflower (1077145) | about a year ago | (#42840091)

Just upgraded from an AMD 4670 to a GeForce GTX 650 Ti on my socket 775 board with an E5200 and 4GB of DDR2 memory. It's making a significant difference in the performance of many games and works just fine with my PCI-E 1.1 slot.

Could I see a bigger boost if I upgraded the MB/CPU/memory? Sure, but I would rather wait until Haswell desktop CPUs hit the market to see just what they bring to the table, since what I have now is working for me.

Re:Not really (1)

Jupix (916634) | about a year ago | (#42841979)

I don't think the problems are gone on the NVIDIA side. The CPU might drive the GPU well, but the motherboard might not. Let me explain.

For the past month I've been reviving an old system originally built for office duties and photo/video editing. That included a move from a PATA HDD to an SSD, and a new GPU. The old GPU was a GTX 280 that had to be manually underclocked at every boot for it to work.

The system is now
ASUS P5K Deluxe Wi-Fi motherboard with Intel P35 chipset
Intel Core 2 Duo E6750 CPU
ASUS GTX 650 TOP GPU
Chieftec TX series 650W PSU

The system gives no video signal, not even during POST. It boots normally (I can hear the OS sounds from the speakers and can interact with the OS), but there is never a signal to the monitor. This happens with both DVI and HDMI cables. The system and display work fine with the GTX 280, but as soon as I swap in the GTX 650, there's no signal.

The P5K Deluxe has some relevant BIOS settings which I tried to preset before swapping in the new GPU, but that was a no-go.

I tried another PSU, a Chieftec HX series 650W, which I know to be faultless, again no signal. I also tried a recent ATI video card (I think it was a 7550 or 7750) and that gave no signal as well. I also got no signal with a GTX 560 from the same manufacturer as the 650.

The machine works fine-ish with a GTX 280, just not with newer GPUs. All of this may just be an incompatibility between old ASUS motherboards and new GPUs, meaning the combination should work by default elsewhere, but the fact remains: the old machine won't work with the new GPU.

Gotta love (or hate) the consoles (0)

Anonymous Coward | about a year ago | (#42838971)

Most new games are 90% console conversions, so a 5-year-old CPU with a modern GPU will do fine, since textures are low-res anyway.

Re:Gotta love (or hate) the consoles (1)

tepples (727027) | about a year ago | (#42840653)

And those games that aren't console conversions are low-budget indie games or low-budget casual games that can probably run on a GMA anyway.

why 3gb ram and not 4gb or 8gb++? (1, Interesting)

Joe_Dragon (2206452) | about a year ago | (#42838993)

Why 3GB of RAM and not 4GB or 8GB+? At least have dual-channel RAM with 2x 2GB sticks.

Re:why 3gb ram and not 4gb or 8gb++? (1)

medv4380 (1604309) | about a year ago | (#42839071)

Maybe the old dual core doesn't have PAE support and is stuck at a 3 gig limit.

Re:why 3gb ram and not 4gb or 8gb++? (2)

Billly Gates (198444) | about a year ago | (#42839153)

Most corporate machines only have 512-1024 megs of RAM. The ones that have finished moving to Windows 7 have more sane amounts, but many still have 512 and keep 5 tabs or fewer open in Firefox or IE when browsing, and that is perfectly fine for general use.

Not everyone is a slashdot geek with 8 gigs of ram, SSD, and decent video cards with their modded desktops.

Re:why 3gb ram and not 4gb or 8gb++? (0)

Anonymous Coward | about a year ago | (#42839369)

My home desktop has 16GB, my work desktop has 8GB, and my work laptop has 4GB. You can get 16GB of RAM for $70 these days; it ain't 2008 no more.

Re:why 3gb ram and not 4gb or 8gb++? (0)

Anonymous Coward | about a year ago | (#42839845)

And you don't consider yourself a slashdot geek? Read their whole post.

Re:why 3gb ram and not 4gb or 8gb++? (1)

AK Marc (707885) | about a year ago | (#42841217)

My home laptop has 16 GB RAM, but my work system has 2 GB on XP (Win7 by the end of the year is the current rumor).

Re:why 3gb ram and not 4gb or 8gb++? (1)

ninlilizi (2759613) | about a year ago | (#42839725)

My year-old, £200 /netbook/ came with 8GB of RAM.

The 32GB in my desktop cost half what the netbook did.

2008 would like its recommended spec back.

Re:why 3gb ram and not 4gb or 8gb++? (0)

Anonymous Coward | about a year ago | (#42841511)

I work for a TV station, and I edit video on systems with 1.5-2GB of RAM. Mind you, these systems are 10 years old. Utterly useless for anything but re-encoding to a different format - which is fine, if you've got a few hours to waste.

I often end up bringing work home to encode. I can do an hour-long show in 15 minutes here; at work, it can take up to 4 hours to rip and encode it. The manager says that his IT guys tell him the machines are fine for what they do, and that he's willing to pay me to sit around while this is happening.

The trouble is, he's not actually willing to pay me to sit around. The sitting around happens in the 40 minutes after they stop paying me.

Re:why 3gb ram and not 4gb or 8gb++? (2)

Billly Gates (198444) | about a year ago | (#42841771)

Sounds like your IT guy needs to have his walking papers signed by HR.

Make a business case, document it, and report it. I am sure the beancounters will shit their pants when they see you wasted $30,000 in lost salary to save $3,000 on a workstation. My earlier example was for office users with 4,000+ computers, where it is simply not possible to upgrade piecemeal; only a major refresh signed off by the CIO all at once will get the dinosaurs out. In that scenario 512 megs of RAM is very slow, but it can work fine for light office work if you stick with Office 2k3, Adobe 8, and IE6.

Many of these systems are common for home users too. In this economy $11/hr is the new norm for college students. At that wage, throwing out a perfectly working computer makes far less economic sense than saving up for a while and replacing just the video card, if you are a gamer.

But if you are more valuable than working off the clock, demonstrate it to your boss's boss, with your boss around: casually mention how you can do half your day's job in 15 minutes on a modern computer, and see what he says.

Backward-thinking IT people who do not want to do their job are incompetent and need to go.

Re:why 3gb ram and not 4gb or 8gb++? (2)

arbiter1 (1204146) | about a year ago | (#42839195)

Sadly, PAE in my experience is iffy, period. I've got a Q6600 machine with 2x 2GB sticks in it, and it doesn't even have 3GB usable; it runs XP 32-bit for a reason. PAE has never given me the full 4GB of RAM on any 32-bit OS I have ever used.

Re:why 3gb ram and not 4gb or 8gb++? (2)

colin_faber (1083673) | about a year ago | (#42839269)

When my Q6600 was running XP it addressed 3.5GB of memory. As soon as I installed Win7 it addressed all 6GB I had in the box.

PAE under FreeBSD and Linux works fine; it's just the split between userland- and kernel-addressable memory that's the issue. On a PAE kernel you're still stuck with 3.5GB of kernel-addressable memory.

Re:why 3gb ram and not 4gb or 8gb++? (1)

Anonymous Coward | about a year ago | (#42839793)

That's because only the server versions of Windows allow PAE for memory address space expansion.

Driver compatibility (1)

tepples (727027) | about a year ago | (#42840673)

only the server versions of Windows allow PAE for memory address space expansion

And Microsoft put this policy into place because manufacturers of workstation hardware and peripherals couldn't clean up their drivers to make them compatible with PAE.

Re:why 3gb ram and not 4gb or 8gb++? (1)

loufoque (1400831) | about a year ago | (#42841441)

You need to add /PAE to the boot options or Windows XP will not enable PAE support.
Of course, each application is still limited to 3GB of address space; PAE just means each process can have its own 3GB.
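
For reference, the switch goes on the OS entry in boot.ini; a sketch of what that looks like (the ARC path and description are illustrative, yours will differ):

    [operating systems]
    multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /fastdetect /PAE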

Re:why 3gb ram and not 4gb or 8gb++? (1)

Jaysyn (203771) | about a year ago | (#42841775)

Correct me if I'm wrong, but I believe your video RAM should also be taken into consideration regarding the memory addressing limitation.

Re:why 3gb ram and not 4gb or 8gb++? (1)

washu_k (1628007) | about a year ago | (#42839243)

All Core 2 CPUs have PAE, even the Celeron versions

Some lower end chipsets from the Core 2 era don't support more than 4GB of physical address space, even with 64 bit OSes.

Re:why 3gb ram and not 4gb or 8gb++? (1)

Khyber (864651) | about a year ago | (#42839093)

Because a GTX 660 likely comes with a starting MINIMUM of 1GB of video RAM? On an older system still running a 32-bit OS, that VRAM has to be mapped into the same 4GB address space as your system RAM, so with only 3GB installed you can actually use ALL of it instead of losing a chunk to the 4GB ceiling (no flaky PAE required).
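
The arithmetic behind that, as an illustrative Python sketch (the MMIO reservation sizes are rough assumptions; actual numbers vary by board and BIOS):

    # Why 32-bit systems "lose" RAM: devices claim part of the 4GB space
    total_gb = 4.0        # 2**32 bytes of physical address space
    vram_gb  = 1.0        # a 1GB card's memory mapped for CPU access
    mmio_gb  = 0.5        # chipset/PCI/firmware reservations (rough guess)

    print(f"RAM a 32-bit OS can actually reach: ~{total_gb - vram_gb - mmio_gb:.1f} GB")
    # -> ~2.5 GB ... so with 3GB installed, little or nothing is wasted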

Re:why 3gb ram and not 4gb or 8gb++? (2)

Dputiger (561114) | about a year ago | (#42839477)

As the author:

Because the point was to test a system assembled with an upper-midrange configuration in 2008. Back then, the majority of customers were still using 32-bit Windows, and while 2GB DDR2 DIMMs were available, 1GB DIMMs were the sweet spot.

My first configuration was a Q6600 with a GTX 260 and 3GB of RAM. I swapped in the E6850 to settle the dual-core question.

Also because that's all the DDR2 I still had on hand after so long.

But 3GB is reasonable. It's enough RAM that someone who upgraded to 64-bit Windows 7 (the OS I tested) might not have felt the need to upgrade more.

Re:why 3gb ram and not 4gb or 8gb++? (1)

jackb_guppy (204733) | about a year ago | (#42839785)

Just got my wife a "new" machine from eBay... a Q6600 with 3GB: 2x 1GB and 2x 512MB, so dual channel still works on a 32-bit OS.

Once I get her to 64-bit I will load 4x 2GB, so her machine will be the same as mine.

Re:why 3gb ram and not 4gb or 8gb++? (4, Informative)

tlambert (566799) | about a year ago | (#42840047)

Why 3GB of RAM and not 4GB or 8GB+? At least have dual-channel RAM with 2x 2GB sticks.

Because the defective Merom chipset in use in Core2Duo systems did not support more than a 4GB memory-mapping space, and 1GB of that was taken up as I/O space, so it was unable to remap the extra 1GB of physical RAM and/or move the I/O hole, even though it had the physical address lines to do so.

The chipset was manufactured between Nov 2006 and Oct 2007, but was used far longer than that by many manufacturers, since Apple was soaking up almost the entire supply of the corrected chipset, which was manufactured between Nov 2007 and Oct 2009.

Intel screwed up, and then taped out anyway in order to meet market deadlines.

It typically wasn't a big deal for most people, since 2GB DIMMs were very unstable at that point, and even desktop systems rarely had more than 3 memory slots. This changed in 2009 when Hynix finally fixed their 2GB DIMMs, but the company nearly bit the dust anyway, as by then it had defaulted on several loans and one debt-equity swap.

Most people only discovered the screwup in the Merom chipset that happened to be in their machine when they started trying to use 2x 2GB DIMMs in their Core2Duo machines with the old Merom, and only saw 3GB of RAM show up in the OS.

Re:why 3gb ram and not 4gb or 8gb++? (1)

yuhong (1378501) | about a year ago | (#42841123)

The chipset was called Lakeport (otherwise known as the Intel 945), not Merom, and it was the current mobile chipset between Jan 2006 and Jan 2007; it was actually introduced to mobile with the 32-bit-only Yonah (Core 1) processors. On the desktop, the original Intel 965 chipset was introduced with the original Core 2 launch in 2006, but it is true that Core 2 can also be used with the old 945 chipset that was introduced with the old Pentium D in 2005.

The key conclusion, if you won't RTFA (4, Informative)

rcastro0 (241450) | about a year ago | (#42838999)

To save you a few clicks, here's the key conclusion (and much better said than the summary from /.) :

  Intel Core 2 Q6600 chips aren't available new these days, but eBay has a ton of them, regularly priced between $50 and $70. (...) Is a new CPU worth the price? I'd say yes -- especially if you've currently got a dual-core CPU in the 2.2 - 2.6GHz range. The combined cost of a used Q6600 and a GeForce GTX 660 should still come in below $300 while delivering far better performance than any bottom-end desktop you might assemble for that price tag.

Re:The key conclusion, if you won't RTFA (-1)

Anonymous Coward | about a year ago | (#42839085)

Thanks for ripping the article. Nothing like biting the hand that feeds you.

Re:The key conclusion, if you won't RTFA (0)

Anonymous Coward | about a year ago | (#42839473)

I ad-block a fair bit. A site with multi-page articles where the page won't even fit on the screen is definitely not getting unblocked. It doesn't even stretch, so there are grey bars down the sides. Come on.

Re:The key conclusion, if you won't RTFA (2)

Joe U (443617) | about a year ago | (#42839649)

Intel now and then makes some real 'stand out' chips, and the Q6600 is one of them. It runs pretty great for its line and can be overclocked.

Re:The key conclusion, if you won't RTFA (1)

spire3661 (1038968) | about a year ago | (#42840285)

It was good, but hot and power-hungry. I was using a Q6600 as a DVR/video compressor after I upgraded to Sandy Bridge. Recording and compressing the 2012 Olympics killed it after about 6 days of solid CPU use. I'm sure it was the mobo that died, but I wasn't going to resurrect it in the Ivy Bridge era.

Re:The key conclusion, if you won't RTFA (1)

Mashiki (184564) | about a year ago | (#42839679)

Wow, really now? I guess that's a good deal if they didn't really do any shopping around. But I recently built an FX-6100 ($119) system with an MSI 970A-G46 ($85), 8GB of RAM (G.Skill F3-14900CL9D-8GBSR, $29), and picked up a 560 Ti on sale with instant rebates for $99. What does that work out to? $332 plus tax, or $381 w/tax where I live. I mean, come on, it's not a blazing fast machine or anything, but it's sure not a bottom-end desktop. And it'll handle pretty much everything on the market in terms of gaming well, if not exceptionally well. The only thing it's choking on is Crysis 3 (insert jokes about Crysis).

Re:The key conclusion, if you won't RTFA (2)

Curate (783077) | about a year ago | (#42839983)

How's Duke Nukem Forever on that rig?

Re:The key conclusion, if you won't RTFA (1)

Mashiki (184564) | about a year ago | (#42840439)

How's Duke Nukem Forever on that rig?

Well, that would require me owning it, and even though it's been on sale a dozen times on Steam for as little as $5, I still don't own it. Maybe if it gets down to $2.99. Hah.

Though my Shogun 2 DX11 high bench gives me an average FPS of 41.6875 [pastebin.com] on the new 313.96 drivers. And here's the old 3DMark score [3dmark.com] from a while back, before I started tinkering with it; I still haven't gotten around to running a new bench. Back when I first did the build it was in the top 3 fastest in the FX-6100/560 Ti category at stock.

Re:The key conclusion, if you won't RTFA (3, Interesting)

Klinky (636952) | about a year ago | (#42840001)

The caveat to sticking with the Socket 775 platform is DDR2 memory, which usually goes for twice as much as comparable DDR3. With 2GB being the maximum practical size for a DDR2 DIMM, many boards are limited to a 4-8GB maximum.

Some might entertain the notion of going with an AMD AM3+ board instead. Going from a low-end dual-core Intel solution to an AMD quad-core solution with 8GB of RAM for around $150-$175 is a nice performance boost. You could put that money towards a Q6600 and some more RAM, but then you have effectively maxed out your system, and the next time you upgrade you will have to rip everything out anyway. If you wanted to jump to Intel's new lineup, you would be spending $150-$175 on the CPU alone to see a performance increase.

Re:The key conclusion, if you won't RTFA (1)

armanox (826486) | about a year ago | (#42840445)

There are LGA775 boards with DDR3. I have one currently taken apart at work (has a Core 2 E8400 in it).

My upgrade experience (2, Interesting)

Anonymous Coward | about a year ago | (#42839015)

I recently upgraded from an E4400 to an FX-6100 and added an SSD. I would say the SSD was probably the only reasonable upgrade in terms of gaming. The FPS is certainly better, but it was already above 50-60 FPS in Team Fortress 2. What's the point of a difference your eyes aren't going to register?

The SSD was an excellent upgrade. I used to launch TF2 and go heat up some dinner while waiting for it to load. Now it launches and loads levels in under 30 seconds. That's much, MUCH better than before.

On the other hand, I work a lot with Xen on my Linux partition. Upgrading from a CPU that didn't have any virtualization extensions to one that did made my life so much easier. Being able to launch any kind of OS with very good performance (for a VM) is such a nice upgrade from a VM that could only launch Linux guests.

Re:My upgrade experience (1)

loufoque (1400831) | about a year ago | (#42841465)

I hope you chose a motherboard and chip combination that supports IOMMU as well.
Though this might not matter too much if you're only launching Linux guests.

Depends on what game/app the GPU "drives" (1)

SpaceManFlip (2720507) | about a year ago | (#42839081)

I used to game on a dual-core, but I upgraded to a quad when I could. $75 on eBay got me a used Q6600 (Core 2 Quad, 65nm), which I now run at 3GHz+.
Recently I upgraded my older midrange GPU to a newer one (not the newest, mind you): a GeForce 560 Ti 2GB card.
Now I can play the highest-end games by squeezing every bit of juice out of my old mobo/cpu/ram combo. I play Battlefield 3 and the new Crysis 3 open beta. This is where my comment can shed light on OP's question: both of those recent high-end games pretty much max out all 4 cores of my quad-core.
BF3 usually eats up at least 85-90% of all 4 of my Q6600's cores running at 3.07GHz, and I get 20-60 fps depending on a variety of factors like number of players (networking bottlenecks), size of the map, number of explosions happening at once, etc.
Just recently I tried the Crysis 3 open beta and ran the graphics up most of the way to max, and it uses even more CPU than Battlefield 3.
So I think if you want to play the bestest of all teh games, in terms of how many fancy pixels will dazzle your optic nerves, then you need more than 2 cores now. But if you want to play new games like Borderlands 2 that use older engines (UE3 etc.), then a dual-core may work. Hey! Look at the box's system requirements or something, maybe?
BTW, I have 6GB of DDR2 at about 900MHz with 4-4-4-12 timings, and a PCIe 2.0 x16 slot for the aforementioned GPU. I know all of these specs are behind the times, and I do a lot of work at work with newer stuff like AMD Bulldozer-equipped servers and i-series Xeon workstations, so I have a fairly good idea of how much better the new CPU architectures are. Still, I choose to postpone my personal upgrades until extra money magically appears, because it all just works right now.


No surprise (3, Informative)

FranTaylor (164577) | about a year ago | (#42839183)

It's no surprise that you can hook a fast GPU to a slow CPU and get good results. Look at the Raspberry Pi: who could imagine doing HDMI video with a single-core 700 MHz processor?

Re:No surprise (0)

Anonymous Coward | about a year ago | (#42839441)

You probably meant HD video.

Re:No surprise (1)

FranTaylor (164577) | about a year ago | (#42840375)

I'm looking right at the HDMI connector

Re:No surprise (0)

Anonymous Coward | about a year ago | (#42840681)

Amazing... is the Raspberry Pi also capable of doing USB audio or Ethernet spreadsheets?

Re:No surprise (0)

Anonymous Coward | about a year ago | (#42841277)

How about you look at the dedicated decoder included in the graphics chip, the only thing the HDMI connector provides is an alternative to the other high-res connector, LVDS, and component video connections.

Re:No surprise (1)

Kjella (173770) | about a year ago | (#42841759)

HDMI video? Computers have been able to put out HD resolutions since the 90s; maybe you're thinking of H.264 or some other video codec? Those only work because the Pi has hardware decoding capability. You don't need a fancy CPU if it isn't going to be doing the work...

"up to 50% or more" (2, Funny)

Anonymous Coward | about a year ago | (#42839207)

Seriously? "up to 50% or more"? Can the submitter get any more vague [xkcd.com] ?

Console ports. (1)

Kaenneth (82978) | about a year ago | (#42839389)

That's because most commercial PC games are coded for an Xbox 360 level of CPU; most of what the better GPU does is push the same image to more pixels. If a game could use more CPU for anything aside from eye candy, it could end up affecting the gameplay itself in unpredictable ways. Like when I tried playing Wing Commander on a modern CPU: undock, and WOOOOOOOSH SMASH! into an asteroid, instantly. Or 'El Fish', which on a 386 took 10-15 minutes to generate a fish; I tried it on a modern CPU, and when it starts it divides some number by the number of minutes taken to generate a fish... less than 1 minute? Divide-by-zero crash.

Turn-based games like Civ 4 fortunately scale very well; I no longer have time to get a snack while waiting for the computer-controlled civs until the endgame.

Although ideally more of the graphics pipeline could be offloaded to the GPU hardware instead of the driver software, leaving a smidgen more CPU for the game code itself.
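
The 'El Fish' failure mode above is the classic speed-dependent timing bug; a hypothetical Python sketch of the pattern (names and numbers invented for illustration):

    import time

    def fish_rate(fish_generated, start, now):
        minutes = int((now - start) / 60)    # truncates to 0 on fast hardware
        return fish_generated / minutes      # ZeroDivisionError when < 1 minute

    start = time.time()
    # ...generation that took 10-15 minutes on a 386 now finishes in seconds...
    try:
        fish_rate(1, start, start + 5)       # only 5 seconds "elapsed"
    except ZeroDivisionError:
        print("divide by zero -- the crash described above")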

Please, try not to laugh. Seriously. (1)

ma1wrbu5tr (1066262) | about a year ago | (#42839471)

We have an older Socket AM2 board with an Athlon 64 X2 4200+ CPU. I paired it with an ATI Radeon HD 4670 1GB video card and 2x 2GB of RAM, and it still does almost everything I throw at it. However, I've noticed my newer games are struggling in spots. This mainboard will handle the 6000+ CPU, which has double the L2 cache and a faster clock. My question is: is it worth the $60+ to upgrade, or should I just be looking for a newer machine? Please note that I don't have a lot of cash to throw around, and that eventually I want to get something newer, as I have an interest in playing MechWarrior Online. This machine will probably get HTPC status via Linux/XBMC. That is my quandary.

Re:Please, try not to laugh. Seriously. (2)

Osgeld (1900440) | about a year ago | (#42839691)

For about the same amount of money you can get an AM3 chip. It will work in an AM2 / AM2+ system (but check with your motherboard maker first!).

80 bucks gets you a 3.2GHz Phenom II quad-core:
http://www.geeks.com/details.asp?invtid=HDZ955FBK4DGM-BP&cat=CPU [geeks.com]

Currently I am using a Phenom II tri-core at 2.8GHz with a GTS 250 on a motherboard that is AM2+ but "AM3 ready" (whatever that means), and it made a noticeable improvement over the 2.5GHz X2 I was using; not an "OMFG punch your momma" improvement, just enough to smooth out many jitters.

Re:Please, try not to laugh. Seriously. (2)

nanoflower (1077145) | about a year ago | (#42840219)

It really comes down to what is holding back your games. Depending on the game it could be the CPU, the GPU, or both. I know I'm seeing a nice boost moving from a 4670 to a GTX 650 Ti even though I'm still on an E5200 (OC'ed) CPU. But the games I play tend to stress the GPU more than the CPU. If I were playing a game like Civilization V, I would have been better off upgrading the CPU and sticking with the 4670. So take a look at the games you play and how they stress the CPU. If they are regularly hitting over 75% CPU usage, then it may be worth upgrading.

Re:Please, try not to laugh. Seriously. (1)

AK Marc (707885) | about a year ago | (#42841661)

It's almost always the video card (usually the GPU itself, though for high-end GPU/game combinations it can be the interface). Back around 1998, a friend built a $3000 gaming machine with good everything. About a year later, I built one for about $700 that was faster: I got the oldest, most out-of-date setup that could still run the best cards of the day, and $450 of the $700 was the video card. My load times sucked, but once the game was loaded my FPS was better. That's for the games that were graphics-limited (most of the ones that had a concept of "FPS" were).

Known this for years (1)

DCFusor (1763438) | about a year ago | (#42839647)

Ever since the PII and PIII days, we'd been speccing an above-average graphics card on our dev machines in a software shop to get better performance per buck - and not just in games.

Probably not. (1)

usagimaru (2327148) | about a year ago | (#42839685)

I have an AMD Phenom 9600, which is 4 cores at 2.3GHz. It still gets by well enough for anything except games that have come out in the last 2 years; many newer games max out my CPU and leave me with an unplayable experience. I have a friend with the same GTS 450 as me who gets by fine on a mid-tier Intel machine purchased last year.

Re:Probably not. (1)

Osgeld (1900440) | about a year ago | (#42841819)

What video card do you have? You mention your Phenom won't play games, but then say your buddy with the same GTS 450 gets by fine.

Duh, what's the link? Are you running the integrated GeForce 720 on your machine?

I can run Xbox 360 titles at higher than 720p on an X2 and a 9600GT; what's your problem?

AMD X2 6000+ & GeForce 580 here! (2)

Orphis (1356561) | about a year ago | (#42839781)

It seems quite silly to have such an old CPU (dual-core 3GHz) with a (back then) top-of-the-line GPU, but it's working great! Note that I'm also using 6GB of RAM at 800MHz dual channel (1+1 + 2+2 GB).
I am able to play LOTS of (if not all?) games at high / very high graphics detail. There are a few settings that are tightly coupled to the CPU, and I avoid those, but the rest works great at 1920x1200 (24" screen), even with new games.

My next upgrade will probably be the CPU, probably the new Intel Haswell once it's released, but I'm not expecting a big boost in games; mostly a faster system overall (a dual core is still a bit limited when you have so many programs running in the background).

My question is (1)

I+AOk (2264416) | about a year ago | (#42839791)

Can an AMD Athlon 64 X2 drive a Radeon HD 7770?

I've got a Matrox Parhelia APVe, which has no open-source drivers, and the latest proprietary ones are for Ubuntu 8. I've been thinking of getting a Radeon; will I be able to play any good games with it???

1995 era computer vs. 2000 era computer (1)

matty619 (630957) | about a year ago | (#42839795)

Imagine trying to pair a graphics card from 2000 with a CPU from 1995. Not only would the 1995 CPU be wed to a motherboard with no AGP slot, but the real-world benchmark gap between a 133MHz Pentium and a 1GHz Pentium III was HUGE: the clock speed alone was nearly 8x greater, not to mention the greatly improved instruction sets and FSB improvements. I honestly thought that by now there would have been some sort of "killer app" to put pressure back on the desktop, to where the average person would really *need* that Core i7 over the i3, but to the average user it doesn't make a bit of difference. Even for me, my 4-year-old Q9400 paired with DDR2-800 is still more than adequate for driving 3 1920x1200 monitors and massive multitasking. It even handles the occasional gaming weekend quite well, as well as ripping HD video content. Not to mention today's video cards still physically fit in my PCIe slots!

Re:1995 era computer vs. 2000 era computer (1)

Osgeld (1900440) | about a year ago | (#42841243)

I kind of did that: dropped a circa-2001 Radeon 7000 into a 300MHz PowerMac 9600 from 1997. It plays Quake III Arena really nicely; other than that, yeah, the rest of the system really holds it back.

Though on a 1997 system with a 1997 video card, the video card was obviously what held the system back; with just simple 2D QuickDraw the CPU would go idle while waiting for it, so there's a balance.

Will your old machine benefit from a new video card? Hell yes. Will your new video card make your old machine perform like a new machine? Hell no. But it will go quite a way if you just need an extra little boost for a system that's doing fine but is a bit draggy in the video department.

HD 4770 (0)

Anonymous Coward | about a year ago | (#42839805)

I bought an HD 4770 back when I had a dual-core Athlon; according to benchmarks, I want to say I was losing about 5% performance to CPU bottlenecking.

Speaking from experience... (3, Interesting)

Anachragnome (1008495) | about a year ago | (#42839817)

Speaking from experience, I can attest to the conclusions of the article.

The machine I am using as I write this is similar to the machine described, though I am running 3.25GB of DDR3 (the most this motherboard can utilize, for some odd reason). This computer was one of the 1st-generation "Built for Vista" machines: a Gateway my daughter bought intending to put XP on it. It turned out much of the hardware had no XP drivers, and... well, to be honest, it sucked so badly that she bought ANOTHER computer (Best Buy wouldn't give her a refund).

I ended up with it eventually. I upgraded the RAM as best I could (I had sticks lying around), installed Windows 7, and dropped an HD7550 in it. While it isn't a screamer, I actually use it as my gaming machine. The biggest visually noticeable performance gain came, by far, from installing Windows 7; the drivers Windows found worked great. The video card was the next increase in performance, and it was astounding.

But here is the important thing I discovered with this arrangement: the gains are entirely dependent on the software being used. Some games use massive amounts of CPU when they could be handing some of that load off to the video card, and those games don't run so well. Other games are better in this regard and take advantage of the video card, and those I can usually run at maximum settings.

I play an emulator of Star Wars Galaxies, and most times I have two instances of the game running concurrently, as well as a browser on a secondary monitor. I usually have Ventrilo running at the same time. Sure, only one instance of the game is actually being rendered, but the CPU load is doubled... and this machine handles it wonderfully, with game settings maxed out. I've also run Skyrim easily on this machine, mods galore.

I am quite pleased with the arrangement.

Re:Speaking from experience... (1)

Anachragnome (1008495) | about a year ago | (#42839887)

"... though I am running 3.25GB of DDR3 (the most this motherboard can utilize for some odd reason)..."

I was incorrect--4 sticks of Crucial 1GB DDR2.

Re:Speaking from experience... (0)

Anonymous Coward | about a year ago | (#42841283)

You should consider running a 64-bit OS, if that's not already the case.

Re:Speaking from experience... (1)

O('_')O_Bush (1162487) | about a year ago | (#42841761)

The kookiest part: you'd see the largest performance gains by wiping off W7 and replacing it with Windows XP. In almost all games I tested, running Windows XP with a 7950GT was exactly equivalent to running a 9800GT on W7. Incredible how much less efficient the W7 OS is; you need roughly twice the power to achieve the same performance.

Maybe I'm an outlier? (1)

Anonymous Coward | about a year ago | (#42840087)

I'd like to see Metro 2033 in those tests. That's the most demanding FPS I'm aware of, beating out even Crysis.

Presently, I run triple 28" monitors at 5760x1200 in NVIDIA Surround off an overclocked GTX 680, backed by a stock Intel i5-2500K and 8GB of DDR3-1333 RAM.

Even with this configuration, I struggle to get acceptable framerates in Metro 2033. I had to turn the settings way down to achieve them, and it's still the minimum of what I would consider playable. I suspect my biggest limiter is that when I got my GTX 680 they were new and hard to get hold of, so I went with the 2GB model. I've personally watched my system hit 100% GPU memory utilization while playing WoW at 5760x1200 on Ultra; I suspect the 4GB models would not have this problem.

Sure, for single-monitor gaming you can skimp on components. But once you move past that, forget it: you need prime hardware or you're going to see the difference plainly.
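
For a sense of scale, the raw framebuffer math at that resolution (an illustrative Python sketch; real VRAM use is dominated by textures and render targets, not the final framebuffer):

    width, height = 5760, 1200
    bytes_per_pixel = 4                                  # 32-bit color
    buffer_mb = width * height * bytes_per_pixel / 2**20
    print(f"one 5760x1200 buffer: {buffer_mb:.0f} MB")   # ~26 MB
    # Multiply by double/triple buffering, depth, and MSAA surfaces and the
    # fixed costs climb fast -- but textures are what actually fill a 2GB card.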

Re:Maybe I'm an outlier? (2)

0123456 (636235) | about a year ago | (#42840437)

The GTX 660 is memory-limited, so no CPU will give good frame rates in Metro 2033 at that resolution. It can't even sustain 60fps with the game maxed out at 1920x1080.

Dumbfounded (1)

psinet (1665115) | about a year ago | (#42840195)

How can anyone be surprised by this - let alone /. readers/submitters?

Re:Dumbfounded (1)

osu-neko (2604) | about a year ago | (#42840945)

How can anyone be surprised by this - let alone /. readers/submitters?

Because a lie repeated often enough gains the ring of truth. You hear it said quite frequently in gaming fora that there's no point in sticking a top-end graphics card in an old machine: the CPU won't be able to keep up with the demands of the game, so the whiz-bang GPU just goes to waste, since it's the CPU that's holding you back, like putting big speakers on a small stereo that doesn't have the power to drive them. It's poppycock, yes, but not everyone is expert enough in what is really going on in a video game to know that...

Re:Dumbfounded (0)

Anonymous Coward | about a year ago | (#42841193)

Taken in context, though, a GPU upgrade providing more performance is great in this specific case. Since games are no more demanding of the CPU now than they were when the Core 2 Duo first came out (thanks, consoles!), it makes sense that improving your GPU would provide a performance increase.

On the other hand, look at Planetside 2 or Mechwarrior Online if you want an example of where this upgrade would fall down.

Speaking of legacy... (1)

macraig (621737) | about a year ago | (#42840203)

... can I get that GTX 660 for an AGP slot?

Re:Speaking of legacy... (0)

Anonymous Coward | about a year ago | (#42840573)

Forget about AGP, how about that GTX 660 for an 8 bit ISA slot? :P

Athlon X2-5600+ and GTS-250 (1)

EmagGeek (574360) | about a year ago | (#42840343)

I play games just fine on this rig... 4GB of DDR2 and an SSD on Windows 7...

Re:Athlon X2-5600+ and GTS-250 (1)

Osgeld (1900440) | about a year ago | (#42841271)

I have a tri-core Phenom at 2.8GHz, a GTS 250, and 4 gigs of DDR2 on Windows 7.

At the very worst you get a stutter now and again in most "normal" games; heavy number-crunching games can bog down here and there, but not enough to bother me.

Makes sense (1)

Murdoch5 (1563847) | about a year ago | (#42840351)

Why wouldn't you be able to? The issue with running a graphics card is actually a combination of the chipset on the motherboard and the available power delivery. The CPU has very little to do with interfacing to the graphics card; the point of DMA (Direct Memory Access) and other transport mechanisms is to separate the CPU from the rest of the hardware. The motherboard acts like a crossing guard steering all the "traffic", the PSU delivers all the "food", and the CPU's only job is to think about what it's handed.
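
A toy model of that decoupling (a Python sketch, illustrative only: in real hardware the "queue" is a command ring buffer the GPU drains via DMA, not a Python object):

    import queue
    import threading
    import time

    commands = queue.Queue()

    def gpu_worker():                             # stands in for the GPU
        while True:
            cmd = commands.get()
            time.sleep(0.01)                      # pretend to render
            commands.task_done()

    threading.Thread(target=gpu_worker, daemon=True).start()

    for frame in range(3):
        commands.put(f"draw frame {frame}")       # the CPU's involvement ends here
    commands.join()                               # the "GPU" catches up on its own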

well, duh (0)

Anonymous Coward | about a year ago | (#42841195)

Uh.. yeah. Every single person here knows this. Gaming has not been exceeding our CPUs for several generations of hardware now. It's all about pushing the GPUs.

I've got this build, it works great (0)

Anonymous Coward | about a year ago | (#42841431)

My 6-year-old Core 2 Duo machine just got moved into a new case and given a Radeon 7870, and it can run new games on medium at max framerate, and on high at a 30-50 fps average. The CPU and motherboard outlived the plastic power button and other parts of my old case and are still running.

Another huge improvement was adding a solid-state drive. I upgraded the graphics card first and noticed a big improvement, but adding the SSD felt like a big improvement as well. Chivalry: Medieval Warfare runs really smoothly now, even on mostly-high settings.

I recommend upgrading your video card if you have a core 2 duo for sure.

It worked on my P4 (1)

jamessnell (857336) | about a year ago | (#42841835)

In 2010, I put a then-modern PCIe video card in my P4 3GHz HT box. Suffice to say, StarCraft II ran on Ultra settings just fine. I think the big difference between my beater and a much newer machine was load times, but once the core game was up and running, it kept up really, really well. It probably helped that the video card had a fair bit of memory. Perhaps if I wanted to revive that machine further, I could also throw in an SSD, which would probably offer only limited benefit, but would certainly keep disk I/O from being a choke point. lol.