
Can a New GPU Rejuvenate a 5 Year Old Gaming PC?

samzenpus posted about a year ago | from the making-the-best-of-it dept.

Graphics 264

MojoKid writes "New video card launches from AMD and NVIDIA are almost always reviewed on hardware less than 12 months old. That's not an arbitrary decision — it helps reviewers make certain that GPU performance isn't held back by older CPUs and can be particularly important when evaluating the impact of new interfaces or bus designs. That said, an equally interesting perspective might be to compare the performance impact of upgrading a graphics card in an older system that doesn't have access to the substantial performance gains of integrated memory controllers, high speed DDR3 memory, deep multithreading or internal serial links. As it turns out, even using a midrange graphics card like a GeForce GTX 660, substantial gains up to 150 percent can be achieved without the need for a complete system overhaul."


264 comments

no surprise there (1)

Anonymous Coward | about a year ago | (#42687971)

this doesn't surprise me one bit.. the GPU does most of the heavy lifting anyway, when it comes to games

still, an i7 will show you substantial performance enhancements

Re:no surprise there (4, Informative)

Ancient123 (724901) | about a year ago | (#42688133)

I have an i7-920. Still have yet to hit a reason to upgrade.... I bought it on new years eve 2008.

Re:no surprise there (3, Informative)

Gerzel (240421) | about a year ago | (#42688287)

I'm still using a core2 quad and am getting by fine (GeForce 9800 gpu). Sure I don't turn the graphics full up on games but I'm doing alright.

Re:no surprise there (4, Informative)

Warma (1220342) | about a year ago | (#42688687)

I had a comparable processor, which I bought at Christmas 2009. However, some new games such as Mechwarrior Online and Planetside 2 are heavily CPU-bound, and the machine was lacking when running them. I upgraded to an i7-3770K and the improvement was dramatic (30-40 -> 60 fps for MWO and 40-50 -> 90 fps for Planetside). The graphics card did not change, as it was already rather powerful (Radeon 6970) and not a bottleneck at the detail levels I was using.

This was literally the difference between unplayable and playable, so if you play those games, there absolutely is a reason to upgrade.

Re:no surprise there (4, Insightful)

fuzzyfuzzyfungus (1223518) | about a year ago | (#42688307)

this doesn't surprise me one bit.. the GPU does most of the heavy lifting anyway, when it comes to games

still, an i7 will show you substantial performance enhancements

It's a bit more nuanced than that: certain upgrades lean almost entirely on the GPU(say you get a fancy new monitor and want Game X to look good on a 1920x1080 or 2560x1440 instead of a 1280x1024); but you can run into situations where no CPU is really enough CPU(RTS pathfinding in games that permit a lot of units is a particularly hairy case. Supreme Commander, say, can merrily chug along at 60fps with a screen full of units cranking out idle animations; but a few hundred bots scrambling to navigate can bring your CPU to its knees.) It's certainly a less common issue than an inadequate GPU; but it can happen.
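A toy sketch (not Supreme Commander's actual code, and the map size and unit counts below are illustrative assumptions) of why unit pathfinding gets CPU-bound: each unit runs its own search over the map graph, so total work grows roughly with units times map size, and no GPU upgrade helps with that.

```python
# Rough illustration: per-unit pathfinding cost on an open grid.
# Each unit runs its own search, so work scales ~linearly with unit count.
from collections import deque

def bfs_path_cost(grid_w, grid_h, start, goal):
    """Count nodes expanded by a plain BFS from start to goal."""
    seen = {start}
    q = deque([start])
    expanded = 0
    while q:
        x, y = q.popleft()
        expanded += 1
        if (x, y) == goal:
            return expanded
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < grid_w and 0 <= ny < grid_h and (nx, ny) not in seen:
                seen.add((nx, ny))
                q.append((nx, ny))
    return expanded

# One unit crossing a 64x64 map expands 4096 nodes worst-case;
# 300 units scrambling at once is ~1.2M node expansions per repath.
one = bfs_path_cost(64, 64, (0, 0), (63, 63))
print(one, 300 * one)
```

Real RTS engines use A* with hierarchical maps and flow fields, which cuts the constant factor but not the per-unit scaling.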

Re:no surprise there (1)

Danieljury3 (1809634) | about a year ago | (#42688641)

Or late game in Sins of a Solar Empire, where my GPU is sometimes almost idling because the one CPU core it's using is at 100%. I would have thought running hundreds of AI subroutines would be easy to multithread. Makes me want to get an i5 to replace my AMD Athlon II, which was mid-low end when I bought it 3 years ago.

Re:no surprise there (1)

Molochi (555357) | about a year ago | (#42688343)

Sure, the i7 kicks ass. But I'm still waiting for a reason to upgrade from an OC'd Q6600 main system.

Of course, I'm "only" running my games at 1080p and the games are just Xbox 360 ports, so I don't expect to need to upgrade.

Re:no surprise there (0)

Anonymous Coward | about a year ago | (#42688505)

Motherboards that support Q6600s don't support DDR3, PCIe 3.0, SATA 3.0, or USB 3.0. (I'm sure about the first two and making an educated guess on the latter two.) You can get 2x faster compile times if you upgrade to an i7 (2x as many threads, 2x main memory bandwidth, 2x hard disk bandwidth), and you can upload textures to the video card 2x faster. And you can back up your stuff to a thumb drive 10x faster.

p.s. Let me get this straight. You admit to being a PC ricer (OC), but you don't want the latest shiny? Does not compute.
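The "2x faster" claims above assume the workload scales perfectly with threads and bandwidth. As a hedged back-of-envelope check, Amdahl's law shows how even a modest serial fraction caps the gain (the 90%-parallel figure below is an illustrative assumption, not a measured number):

```python
# Amdahl's law: speedup is limited by the fraction of work that stays serial.
def amdahl_speedup(parallel_fraction: float, n_threads: int) -> float:
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_threads)

# A build that is 90% parallelizable:
print(amdahl_speedup(0.9, 4))  # ~3.08x over single-threaded
print(amdahl_speedup(0.9, 8))  # ~4.71x -- doubling threads gives well under 2x
```

So "2x the threads" only approaches "2x the compile speed" when the build is nearly 100% parallel and not bottlenecked elsewhere.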

Re:no surprise there (0)

Anonymous Coward | about a year ago | (#42688695)

Yeah, what you say is not entirely true. There's no problem with a Q6600 and DDR3. This CPU sits on a cheapass G41 mobo with CPU pin mods (this just lies to the mobo and makes it think it's supposed to run at 1333). The LGA775 motherboard requires DDR3 (as many do) and it's running 1:1 at a modest 1333 (the Q6600 is at 3GHz).
The system of note uses a uATX board that came out in 2008, so I think it fits.

I'm not dissing i7 vs C2Q; I know that even an i3 can match it nowadays. But IT'S JUST NOT SLOW ENOUGH to MATTER YET. I'm old school. OC for effect, not EPEEN. Also, ricers use watercooling; this is on a stock HSF :)

A long time ago in a land far far away you used a turbo button and a heatsink (gasp! what's that?) to run an XT system as fast as a 286...
Many systems later, you bought an ABit BP6, put 2 Celeron 300A CPUs on it, overclocked them to 450, ran Win 2000 and called it a day...
until you bought a Duron 600 and overclocked that to 1GHz (or, if you prefer, a Coppermine Celeron) (because MT wasn't used as much as MHz)...
then you picked up a Willamette or Barton...
then an underclocked A64...
Get it?

None of these systems were ever "limited in games" by their CPUs before they were retired.

Re:no surprise there (0)

Anonymous Coward | about a year ago | (#42688845)

Part of the problem with me wanting to upgrade is how little the latest (Intel) CPUs can overclock past older CPUs, and the base price for that limited upgrade. I'm not really interested if I have to spend 2 or 3 times as much on the hardware when an Xbox runs the game just fine.

If they want to be lazy and say the latest PC game (that runs fine on an old fucking Xbox 360) needs a new i7 system with an additional $200 vidcard, well, that's someone else's problem.

Re:no surprise there (2)

TapeCutter (624760) | about a year ago | (#42688827)

I have an i5 with an SSD and an i7 with a traditional HDD. Both have very similar GPUs, and both run about the same for demanding video games ('World of Tanks' at highest quality, to be specific). Prior to installing the SSD, the i5 was clearly the inferior setup and could not cope with the game without setting the quality slightly above "total crap". I've had that setup for about a year, though I also had the SSD replaced under warranty after about 6 months of use; overall I think it's been well worth the dollars and the hassle.

From the credit_where_credit_is_due_dept: the standard Win7 "Performance Information and Tools" really is a very useful "upgrades for dummies" guide as to where you should focus your hardware dollars. With an i5 or better, it's unlikely it would recommend a CPU upgrade.

Re:no surprise there (2)

hairyfeet (841228) | about a year ago | (#42688597)

Frankly, games even to this day use so little CPU that an i7 for gaming is honestly overkill. You can take any $70 Athlon triple and have a great time gaming on it as long as your GPU has a little muscle. Oh, and this was one of the nice things about socket AM3 lasting so long on the AMD side: you could buy a dirt cheap dual in '08 and upgrade to a quad or hexacore now for very little, without having to toss your board and RAM.

If I were building a gaming PC today on a budget I'd probably go for the Thuban hexacore. Not only can it be had for as little as $100 for a true 6 cores (unlike the FX series, whose BD/PD chips are frankly just hyperthreaded half cores instead of real cores), but when you are playing most games turbocore kicks in, so it's like having a really fast triple core; then, when you actually need 6 cores for transcoding or the few games that will scale that high, you have 6 actual cores.

But playing games on a 5 year old PC really isn't that big a deal. I was selling Phenom quads at that time and they play games just fine, and my GPU is nearly that old (an HD4850); while I'll be upgrading in a couple of months (the HD6850s seem to be the sweet spot ATM), honestly it plays all the games just fine. Just Cause II, Saints Row 3, the Borderlands series, the Crysis series: all play with plenty of bling, so I don't see what the "ZOMFG!" is about a 5 year old PC. After all, this isn't the MHz wars anymore; since the switch to multiple cores, programs just haven't kept up with the speed of the hardware, not even close. Hell, I often transcode or build DVDs WHILE gaming and I still don't get all laggy, because the games just haven't kept up with the hardware.

And if you look at the specs of the next-gen consoles, I honestly doubt I'll have to build a new system anytime soon. The PS4 looks to be a COTS middle-tier AMD APU, and if rumors are true MSFT will be going with an AMD octocore, which is just a quad with hyperthreading, so it doesn't look like anybody who built a PC in the past few years will have trouble gaming. A word of advice on GPUs though: Geeks has some truly crazy deals on their refurbs, and I must have gone through over a hundred at the shop without a single problem. They really are rock solid.

Re:no surprise there (0)

Anonymous Coward | about a year ago | (#42688797)

I'm currently building an i7 3770 + VT-d motherboard so that I can pass the PCI Express graphics card through to a VM and game in it while running Linux as my main OS.
From what I've read around, this setup will chop 20% off the CPU but will otherwise let me game on Linux without rebooting.

Note however that it's not an economical decision since I can probably buy two PCs for the same money.
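For readers curious what that setup involves, a rough sketch of the host-side checks for VFIO-style passthrough on Linux (the PCI address and the 10de:1183 vendor:device ID below are placeholders, not details from the comment):

```shell
# Confirm the kernel sees an IOMMU. Requires VT-d enabled in the BIOS
# and intel_iommu=on on the kernel command line.
dmesg | grep -i -e DMAR -e IOMMU

# Find the GPU's PCI address and vendor:device ID.
lspci -nn | grep -i vga
# e.g. "01:00.0 VGA compatible controller [0300]: NVIDIA ... [10de:1183]"

# Bind the GPU to the vfio-pci stub driver so the VM can claim it
# (run as root; ID is the placeholder from above).
echo 10de 1183 > /sys/bus/pci/drivers/vfio-pci/new_id
```

This is a config sketch only; the exact steps depend on the kernel version, and the VM manager (QEMU/KVM, etc.) then attaches the device to the guest.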

No (1, Insightful)

Hsien-Ko (1090623) | about a year ago | (#42688005)

AGP bridges suck.

PCI-E DDR2 rigs aren't even that old or even considered "obsolete" either.

Re:No (1)

Tourney3p0 (772619) | about a year ago | (#42688075)

Seeing as how he mentions the Geforce GTX 660 specifically, I don't think he's talking about a 10 year old AGP system.

Re:No (0)

Anonymous Coward | about a year ago | (#42688083)

I also say no, but for different reasons.

1) Games are still using DirectX 9. Even the most budget of GPUs today can handle these graphics demands. (Thanks consoles!)

2) Realtime lighting and physics processing is standard in most games thanks to middleware solutions. The CPU is used for these tasks.

3) Thanks to crappy coding standards among game developers, GPU features are often ignored in favor of brute-force calculations through the CPU. (Sony's new Planetside 2 is a high-profile victim, but several other major games have suffered from performance issues that severely curtail performance on even relatively modern dual-core CPUs.)

4) Thanks to crappy coding standards among game developers, multi-processor features are often ignored in favor of brute-force calculations through the CPU. Hardware manufacturers are taking notice of this, however, and optimizing around the issue.

Right now the best gaming PC upgrade money can buy is an SSD, followed very closely by an extremely fast four-core or better CPU.

DX10 requires Vista (1)

tepples (727027) | about a year ago | (#42688211)

Games are still using DirectX 9. Even the most budget of GPUs today can handle these graphics demands. (Thanks consoles!)

Thanks consoles, or thanks Windows XP?

Re:DX10 requires Vista (4, Insightful)

0123456 (636235) | about a year ago | (#42688219)

Thanks consoles, or thanks Windows XP?

Thanks Microsoft for trying to use DirectX as a stick to force people to switch from XP to Vista. Hey, kind of like Window 8.

Re:DX10 requires Vista (0)

Anonymous Coward | about a year ago | (#42688279)

Thanks Microsoft for trying to use DirectX as a stick to force people to switch from XP to Vista. Hey, kind of like Window 8.

Hell, DX10 and DX11 can run on Linux now. Why use Windows?

Re:DX10 requires Vista (0)

Anonymous Coward | about a year ago | (#42688683)

Because DX works much better under Windows.

Re:DX10 requires Vista (1)

Anonymous Coward | about a year ago | (#42688385)

Thanks consoles, or thanks Windows XP?

Thanks to the 90% of normal consumers who own a laptop.

Desktop systems are pretty much dead in consumer space outside of "hardcore gamers". So if you expect your title to sell, you either make Crysis or something which works okay with Intel video.

Re:No (1)

witherstaff (713820) | about a year ago | (#42688229)

You're absolutely right on your suggestions. I have CrossFire Radeon 6970s and I'm CPU-bound in Planetside 2 with my Phenom II X4 AMD chip.

Re:No (0)

Anonymous Coward | about a year ago | (#42688115)

DDR2-based systems aren't that old indeed, and the Q6600 wasn't an average CPU at the time either.

I just upgraded a pair of old non-gaming PCs with Athlon 64 X2 3800+ CPUs to Phenom II X4 965 BE CPUs. It's quite a massive upgrade. Sure, it's not a 3570K, but it's more than good enough for several more years and it was an $80 upgrade. A 3570K with a new good quality Z77 motherboard and 8GB of DDR3, plus sales tax and shipping, would have cost me over $400 per PC.

If you shop wisely and know the hardware you can upgrade on the cheap.

Re:No (1)

Gadget_Guy (627405) | about a year ago | (#42688727)

AGP bridges suck. PCI-E DDR2 rigs aren't even that old or even considered "obsolete" either.

You obviously didn't read the article. They tested whether there was any benefit to upgrading the graphics card, and the figures show that there is. They didn't use an AGP motherboard. And it doesn't matter whether you call the system old or not, because the topic was whether you could improve a 5 year old system.

The answer is yes. It doesn't matter what your theory says, because in practice you can extend the life of an old system with a single hardware upgrade.

Older = how old? (4, Insightful)

girlintraining (1395911) | about a year ago | (#42688007)

The thing is, most serious gamers willing to plunk down $400 for a video card aren't going to skimp on upgrading the rest of the computer. That's why nobody reviews it: Because you, McThrifty, aren't the target market and nobody's going to send you free hardware to test since your readers are, well... cheap.

Most of those hardware reviews you see online get the newest video cards for free specifically because their reviews are tailored to the guy who has a McDuck-sized vault of cash ready to be spent getting that extra .8 FPS out of Crysis.

Re:Older = how old? (4, Informative)

0123456 (636235) | about a year ago | (#42688027)

The thing is, most serious gamers willing to plunk down $400 for a video card aren't going to skimp on upgrading the rest of the computer.

And a GTX 660 is not a $400 card, it's more like $200.

The real issue is that most games are designed to run on consoles with their ultra-crappy CPUs, so they do very little on the CPU even on a PC. I've rarely seen my i7 go over 20% CPU usage in any game I've played in Windows with the CPU monitor running.

Re:Older = how old? (0)

Anonymous Coward | about a year ago | (#42688713)

If you have a 4- or 6-core i7 and the game is mostly single-threaded, you're not going to see more than ~20% CPU usage in Windows, since the game will only use one core and many CPU monitors (e.g. Windows Task Manager) report usage as an average across all cores. So one core at 100% on a 6-core machine with the other cores idle would show as about 16% CPU usage.
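The arithmetic the parent describes can be sketched as follows (a hedged illustration; Task Manager's exact accounting, e.g. with Hyper-Threading doubling the logical core count, differs):

```python
# How one pegged core shows up as low "overall" CPU usage when the
# monitor averages across all cores.
def overall_cpu_percent(busy_cores: int, total_cores: int) -> float:
    """Overall usage as a Task Manager-style average across cores."""
    return 100.0 * busy_cores / total_cores

# One thread maxing one core of a 6-core CPU:
print(overall_cpu_percent(1, 6))  # ~16.7%, matching the parent's estimate
```

With Hyper-Threading enabled the monitor sees 12 logical cores, so the same single-threaded game would read as only ~8%.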

Re:Older = how old? (4, Interesting)

Rockoon (1252108) | about a year ago | (#42688821)

And a GTX 660 is not a $400 card, it's more like $200.

..and its 140W TDP is significantly more than the 8800 GT or 9800 GT NVIDIA card that was $200 when they pieced together their 5 year old system, so they need a new power supply too.

Re:Older = how old? (2)

Tough Love (215404) | about a year ago | (#42688055)

However, a $100 graphic card of today is most likely going to leave any high end card of 5 years ago well back in the dust. Probably worth sticking one in, should be good enough for most games.

Re:Older = how old? (1)

Anonymous Coward | about a year ago | (#42688149)

Wrong, sorry, but it's plain old wrong. The biggest feature on new cards is not a faster GPU, but rather more and better memory on the graphics card. A new card with 32 Mb of ram is not going to beat the performance of an older card with 128Mb by much.

Let's not ignore the fact that the CPU speed (bus speed) is going to determine a whole assload of the performance. I have purchased new high end graphics cards a few months before upgrading CPUs, and been amazed at the difference a complete upgrade made. No matter how badass the graphics card is, the instructions all have to pass through the CPU. Multi-core does not help this as much as people think, especially on an Intel chip where the bus is shared by all cores.

Re:Older = how old? (5, Funny)

Anonymous Coward | about a year ago | (#42688407)

A new card with 32 Mb of ram

When was the last time you were shopping for a graphics card? 1998?

Re:Older = how old? (5, Funny)

MrBippers (1091791) | about a year ago | (#42688587)

He's a hardware architect that's been out of work since 3dfx closed down, you insensitive clod!

Re:Older = how old? (1)

Anonymous Coward | about a year ago | (#42688943)

No matter how badass the graphics card is, the instructions all have to pass through the CPU..

Mentioning 32Mb GPUs, and then this gem. Seriously, if you don't know anything about the subject, shut the hell up.

Re:Older = how old? (0)

Rockoon (1252108) | about a year ago | (#42688859)

However, a $100 graphic card of today is most likely going to leave any high end card of 5 years ago well back in the dust.

The high end cards from 5 years ago, such as the 8800 GTX/Ultra, are pretty much on par with today's 650 GT, which is on the market for the $100 you just pulled out of your ass.

You've got to spend closer to $200 to beat the 8800 Ultra.

Re:Older = how old? (0)

Anonymous Coward | about a year ago | (#42688071)

The thing is, most serious gamers willing to plunk down $400 for a video card aren't going to skimp on upgrading the rest of the computer. That's why nobody reviews it: Because you, McThrifty, aren't the target market and nobody's going to send you free hardware to test since your readers are, well... cheap.

Most of those hardware reviews you see online get the newest video cards for free specifically because their reviews are tailored to the guy who has a McDuck-sized vault of cash ready to be spent getting that extra .8 FPS out of Crysis.

Not all reviewed cards are $400 high end cards. Hothardware, Tomshardware, et al review cards at many different price points for the McThrifty's and Epeeners alike. In TFA, the card reviewed is $220 (after rebate.)

Usually, just as they test high-end cards in systems with high-end gear, they test the modest cards with new, modest gear. What's novel in TFA is that they're testing a new, modest card on older gear.

Re:Older = how old? (0)

Anonymous Coward | about a year ago | (#42688659)

I remember in 2008, I plunked down about $400 on a barebones kit, I think I've upgraded the GPU once since then. I haven't used it in a year, but up until that point, it was holding up just fine for most of the games I was playing. It couldn't handle The Witcher 2, but I think that was about it.

And the reason they use those shiny new computers is mostly to encourage you to buy shiny new computers, even when a GPU upgrade is sufficient to play the game.

Re:Older = how old? (4, Informative)

B1oodAnge1 (1485419) | about a year ago | (#42688721)

To be fair, at 25 years old and with over 200 games bought on Steam, I think I fit the target market for PC games pretty squarely, and I just upgraded the 8800 GTS to a GTX 550 Ti in my computer that is around 6 years old.
I went from needing to run at medium/low settings at 1080 to being able to run just about everything maxed out at 1920x1200, for about $120.

This just in, duh (3, Informative)

redmid17 (1217076) | about a year ago | (#42688013)

Is Ric Romero posting stuff to Slashdot? "Upgrading the largest bottleneck for game performance can substantially improve your playing experience!" Whether or not it's worth doing is another matter, but anyone who's built their own computer or even reads websites like tom's hardware or benchmarking sites knows this.

Re:This just in, duh (1)

illaqueate (416118) | about a year ago | (#42688089)

The question was whether it improved it enough to be a viable upgrade. And the answer is yes, assuming the CPU in the system is a quad core or better. Dual core, no. Luckily people stopped buying those around 2007 (e8400)

Re:This just in, duh (4, Informative)

illaqueate (416118) | about a year ago | (#42688101)

That said, ~48% of Steam users still have a dual core, according to the Steam hardware survey

Re:This just in, duh (2, Insightful)

Anonymous Coward | about a year ago | (#42688295)

That is easy to explain, as a fair number of new laptops are still dual-core. People try to game on laptops.

Re:This just in, duh (0)

Anonymous Coward | about a year ago | (#42688711)

I still have a pc with an Athlon X2 4400 and an nvidia 8800GT. It runs most games absolutely fine. I don't know about Crysis because frankly those sorts of games do not interest me very much, but the only games in my collection with problems are late-game X3:TC and Civ5 which are mostly cpu-bound.

So, at least until X:Rebirth comes out, I see no compelling reason to upgrade right now, even though this rig is already five years old. It's great.

Re:This just in, duh (1)

Rockoon (1252108) | about a year ago | (#42688907)

I still have a pc with an Athlon X2 4400 and an nvidia 8800GT.

That's because the 8800 GT was only marginally slower than the fastest card you could buy when it was released (which was the 8800 Ultra.)

The 8800 GT falls flat at higher resolutions with complex pixel shaders, where a new comparable card like the 650 GT (for $100) would not, but it is just as weak at actual texture lookups. The upshot is that for older games that heavily abuse multi-texturing, these two cards are almost exactly equal, but for newer games that just go ahead and compute stuff every frame, the 650 is marginally better (always less than twice as good.)

Re:This just in, duh (1)

redmid17 (1217076) | about a year ago | (#42688161)

People stopped buying dual cores around 2007? Maybe you missed all the current computers with Core i3s and Core i5s?

Re:This just in, duh (1)

0123456 (636235) | about a year ago | (#42688195)

Maybe you missed all the current computers with Core i3s and Core i5s?

Most or all desktop i5s are quads. Laptop i5s are duals with hyperthreading and 'turbo' mode. I think both desktop and laptop i3s are dual with hyperthreading.

So finding a dual-core that can't run four threads is becoming difficult.

Re:This just in, duh (2)

Osgeld (1900440) | about a year ago | (#42688213)

Not really true. I dropped a rather inexpensive HD 6870 into my tri-core and game performance noticeably increased: more detail at longer distances with a higher framerate... on a PCIe 2.0 box.

I plan on upgrading the rest eventually, but going from a GTS 250 to the ATI card made a pretty substantial difference.

Re:This just in, duh (1)

Anonymous Coward | about a year ago | (#42688223)

... anyone who's built their own computer or even reads websites like tom's hardware or benchmarking sites knows this.

Tom's Hardware was heavily suspected of falsifying benchmarks in exchange for money from Intel and Microsoft years ago (Google it, you lazy pricks!). Tom's and other sites often provide BS benchmarks. Meaning, they spec a new Intel against an old AMD and claim Intel is way better. In other cases, they load up a system with 256GB of memory and give the system they want to talk down 4GB. While these cases are technically not false, the merit and the motive behind such benchmarks make the people who publish this type of information worth chastising. Often the specs of the real test gear are buried and extremely difficult to find.

Sorry, but I trust companies providing benchmarks about as far as I trust my turds to fly. If you work in IT, setting up your own benchmarks for these types of applications should be trivial. Most vendors will give you gear to demo. I don't trust companies that rely on vendors for gear and revenue to provide real data; there has been way too much shit in the pool. If I don't personally test, or know the person that did, it did not happen.

Re:This just in, duh (2)

Racemaniac (1099281) | about a year ago | (#42688777)

So I read your comment and tried to Google it, and couldn't find anything...
Care to give a link? (Or keywords I should Google?)
It sounds interesting, but I can't seem to find much about it.

Re:This just in, duh (0)

Anonymous Coward | about a year ago | (#42688977)

https://www.google.co.uk/search?q=tom's+hardware+intel+biased

Re:This just in, duh (0)

Anonymous Coward | about a year ago | (#42688583)

"Largest" does not tell us how large. This review does.

It also lets people know if it's worth it to upgrade a non-gaming PC that never had a decent GPU.

That's two more points of useful information than the number contained in your post.

It all depends on the CPU in the old PC (1)

D,Petkow (793457) | about a year ago | (#42688053)

Yes and no; it depends on the old rig's specs, not only the video card! The older CPU _could_ become a major bottleneck if it's not multi-core or HT-aware. 5 years ago a high end rig could be powered by a decent Core 2 Duo CPU circa 2008 or early 2009, which is pretty decent if overclocked to a nice and stable 3.5-4.0 GHz. On the other hand, an older single-core, non-HT-aware CPU will be a huge bottleneck even if a modern GPU is put in, for example Barton-core Athlons (a bit older, though). Two older HDDs could also be in a RAID setup and defragmented, so no HDD bottleneck. The FSB and motherboard chipset could be a culprit. The PSU must not be underestimated, especially if the new GPU has a high TDP. Overall my opinion is that it all boils down to what you need and want. If the old rig is not too old, a newer GPU could be a life saver, but do not expect maxed-out Eyefinity play on three/six displays.

The CARD in your HEAD! (-1)

Anonymous Coward | about a year ago | (#42688059)

"Everything we see has some hidden message. A lot of awful messages are coming in under the radar - subliminal consumer messages, all kinds of politically incorrect messages..." - Harold Ramis

"RFID in School Shirts must be trial run"

The trial runs began a LONG time ago!

We're way past that process.

Now we're in the portion of the game where they will try and BRAINWASH us into accepting these things because not everyone BROADCASTS themselves on and offline, so RFID tracking will NEED to be EVERYWHERE, eventually.

RFID is employed in MANY areas of society. RFID is used to TRACK their livestock (humans) in:

* 1. A lot of BANKS' ATM & DEBIT cards (easily cloned and tracked)
* 2. Subway, rail, bus, other mass transit passes (all of your daily activities, where you go, are being recorded in many ways)
* 3. A lot of RETAIL stores' goods
* 4. Corporate slaves (in badges, tags, etc)

and many more ways!

Search the web about RFID and look at the pictures of various RFID devices; they're not all the same in form or function! When you see how tiny some of them are, you'll be amazed! Search for GPS tracking and devices, too, along with the more obscure:

- FM Fingerprinting
- Writeprint
- Stylometry

tracking methods! Let's not forget the LIQUIDS at their disposal which can be sprayed on you and/or your devices/clothing and TRACKED, similar to STASI methods of tracking their livestock (humans).

Visit David Icke's and Prison Planet's discussion forums and VC's discussion forums and READ the threads about RFID and electronic tagging, PARTICIPATE in discussions. SHARE what you know with others!

These TRACKING technologies, on and off the net, are being THROWN at us by the MEDIA, just as cigarettes and alcohol have been and continue to be, though the former less than they used to. The effort to get you to join FACEBOOK and TWITTER, for example, is EVERYWHERE.

Maybe, you think, you'll join FACEBOOK or TWITTER for an innocent reason, in part perhaps because your family, friends, business partners, college ties want or need you. Then it'll start with one photo of yourself or you in a group, then another, then another, and pretty soon you are telling STRANGERS as far away as NIGERIA, with scammers reading and archiving your PERSONAL LIFE, and many of these CRIMINALS have the MEANS and MOTIVES to use it how they please.

One family was astonished to discover a photo of theirs was being used in an ADVERTISEMENT (on one of those BILLBOARDS you pass by on the road) in ANOTHER COUNTRY! There are other stories. I've witnessed people posting their photo on social networking sites, only to have others who dis/like them COPY the photo and use it for THEIR photo! It's a complete mess.

The whole GAME stretches much farther than the simple RFID device(s), but how far are you willing to READ about these types of intrusive technologies? If you've heard, Wikileaks exposed corporations selling SPYWARE in software and hardware form to GOVERNMENTS!

You have to wonder, "Will my anti-malware program actually DISCOVER government-controlled malware? Or has it been WHITELISTED? Or obscured to the point where it cannot be detected? Does it carve a nest for itself in your hardware devices' FIRMWARE? What about your BIOS?

Has your graphics card been poisoned, too?" No antivirus programs scan the FIRMWARE on your devices, especially not your ROUTERS, which often contain commercially rubber-stamped BACKDOORS for certain organizations which hackers may be exploiting right now! Search the web for CISCO routers and BACKDOORS. That is one of many examples.

Some struggle for privacy, some argue about it, some take preventive measures, but those who are wise know:

Privacy is DEAD. You've just never seen the tombstone.

Re:The CARD in your HEAD! (-1, Flamebait)

Anonymous Coward | about a year ago | (#42688093)

You posted as AC, but We know who you are. You will not escape this time, we are coming for you. The reptilians have commanded us to end all opposition.

sure, but... (1)

smash (1351) | about a year ago | (#42688099)

.... take the money you'd spend on the GPU and spend it on a motherboard with an i7 and integrated GPU, and you'll likely get a speed-up as well, with faster processing for everything else.

Also depends on the game (2)

Sycraft-fu (314770) | about a year ago | (#42688199)

Some games hit the CPU much heavier these days than they used to. Many games really don't perform well if they aren't given multi-core CPUs with reasonable speed.

So how much upgrading a given component makes a difference depends on what else you have in your computer. If your system has a CPU that was top of the line 5 years ago, but an integrated GPU, then ya a new GPU will probably be the best use of money. However if the CPU is underpowered, then a new GPU will do little if anything.

Also you are right in that integrated GPUs have gotten way better. Time was, integrated Intel GPUs sucked even at desktop operations. Back in the P3 days I recommended a discrete GPU to everyone because the integrated ones were that bad. Now with Sandy/Ivy Bridge they are quite good. You can game on them, even new games. No they don't do as well as a discrete GPU, but they really are more powerful than you might think.

Re:Also depends on the game (2)

EvanED (569694) | about a year ago | (#42688275)

Back in the P3 days I recommended a discrete GPU to everyone because the integrated ones were that bad. Now with Sandy/Ivy Bridge they are quite good. You can game on them, even new games. No they don't do as well as a discrete GPU, but they really are more powerful than you might think.

Hmmm, my research from a few months ago suggested otherwise, at least to some extent. My home desktop is from 2008; it had a GeForce 8800 GTS in it which unfortunately decided to go kaput. The timing was kind of bad because I will probably be getting a new computer in the second half of this year but I didn't want to pay for one now, so I had to decide what to do: (1) get some cheapass GPU that would be on par with or better than my 8800, (2) get a midrange card now and have an overpowered GPU for a while until I get a new system (when I'd migrate the GPU), (3) move my anticipated system upgrade earlier and just get a new system now, or (4) do the opposite of #2 -- get a new mobo and CPU now, and live off integrated graphics for a while. (My final decision was #2.)

(1) was my favorite choice but got cut out because I couldn't find a cheapass-enough cheapass card. (4) was my next choice -- I do enough stuff that would benefit from extra CPU power. My reasoning was that in the 4 years since I got my system, not only would the normal rate of technology change have worked its magic but that Intel seemed to be paying attention to integrated graphics and so I expected that to have improved more than, say, the improvement in CPUs and discrete GPUs during that time. I figured that my 8800 was running most things I was interested in reasonably enough, so if I could match the 8800's power in integrated graphics, that would be sufficient. But that didn't seem to be the case. Unsurprisingly it was hard to find comparisons of modern integrated graphics with a card as old as the 8800, but the couple that I could find didn't paint a particularly good picture. From everything I could tell, and from the discussion on the forum where I was talking about my options, even the Intel HD4000 would be inferior to my 8800. (And actually, what someone said was that the 8800 slightly beats even the AMD A8-3870k, which beats the HD4000.) About the only good news would have been DX11 support.

So... it depends on what you mean. Integrated is definitely way better than it was, but at the same time... it's still got a loooong way to go before it matches discrete.

Re:Also depends on the game (3, Insightful)

adolf (21054) | about a year ago | (#42688471)

My home system is from 2008 also, and sports a pair of 9800GTs.

I've gone through many of the same thought processes as you, and come to many of the same conclusions.

Here's what I've gleaned:

1. A five-year-old video card (or a pair of them) should be trivially-cheap to replace with an efficient and modern equivalent, but it's not.

2. The prettiest games I want to play today bog my Q6600 CPU more than my video cards, which just loaf along on such titles.

3. I need more RAM. 4GB isn't enough and DDR2 is fucking expensive. A motherboard+CPU sidegrade is damn near free with 2x4GB DDR3, compared to 2x4GB of DDR2 by itself. And getting a significantly faster CPU at the same time isn't significantly more expensive.

4. Integrated graphics, no matter the claims by people who say they're quite good enough, suck in comparison to even quite old dedicated hardware.

5. Conclusion: To upgrade my 5-year-old gaming rig piecemeal, keep the GPU(s), replace everything else, and ignore integrated graphics.

Re:Also depends on the game (1)

turing_m (1030530) | about a year ago | (#42688783)

Some games hit the CPU much heavier these days than they used to. Many games really don't perform well if they aren't given multi-core CPUs with reasonable speed.

One thing to bear in mind with gaming benchmarks - they are performed running just the game, to keep everything else equal. In real world use it's nice to have the flexibility not to have to close down your browser and other applications, especially if you aren't the only user logged into the system. For that reason, you want more cores than you need just for the game. Maybe a quad core if you want dual core performance, or hex core if you want quad. And given how games have adapted to using multiple cores, it would pay to get more cores than you need if you are going to futureproof.

SSD (4, Informative)

Barlo_Mung_42 (411228) | about a year ago | (#42688105)

One thing that helped boost my older system was switching the drive to an SSD.

That speed improvement is in your mind (1)

Anonymous Coward | about a year ago | (#42688269)

SSDs only improve startup times. They don't improve runtime performance, even in I/O-intensive applications.

Re:That speed improvement is in your mind (2)

polyp2000 (444682) | about a year ago | (#42688623)

Not sure about anyone else, but on my Ubuntu systems the SSD made a significant difference to more than just startup times. Web browsing, for example, is much snappier; I'm guessing this is because the drive can more readily write image/web page data and fetch from the disk cache for rendering.

Any "runtime" performance that relies heavily on disk-based caches will see a benefit here. I've used SSDs for I/O-intensive applications such as running a Solr/Lucene search engine, and the improvement there is also very significant. Applications that are heavy on writes will benefit as well.

So I kind of disagree with you on that one, I guess!

Re:That speed improvement is in your mind (3, Interesting)

Barlo_Mung_42 (411228) | about a year ago | (#42688625)

Perhaps it depends on the game. Counter Strike gives an advantage to those that load the map the quickest. Being able to get to the bottom of the ramp in Dust2 to counter snipe the inevitable sniper is huge. Just sayin.

Re:That speed improvement is in your mind (1)

Anonymous Coward | about a year ago | (#42688637)

Not true. It's not as if programs don't open files after start up.

My computer runs much, much faster with the OS on an SSD. Now all the DLLs, dependencies and whatnot are on an SSD too, so everything is much faster.

Re:SSD (1)

uvajed_ekil (914487) | about a year ago | (#42688333)

EVERY component is relevant to gaming performance: HD/SSD, RAM, CPU, and GPU are all important, especially with some of the latest games. And you need to get enough juice from the power supply (without immediately killing it), you have to be able to keep it all cool, you want a motherboard that isn't itself a bottleneck or otherwise a hindrance, and of course you don't want to watch the action on a 17" CRT. So while I wouldn't recommend relying on integrated graphics or a $50 card, you can't forget about all the other components if you are building a gaming rig on a budget.

Definitely go SSD for OS and game installs if you can afford a second, bigger HD for other stuff, unless it will be a dedicated gamer and you can live with the low capacity. But on a tight build/rebuild budget I'd skip the SSD rather than forego something else.

Re:SSD (1)

Slugster (635830) | about a year ago | (#42688715)

I have a computer several years old I upgraded to SSD because the mechanical drives were failing. I saw a significant improvement in gaming after the SSD swap: with FPS games, previously I had to turn most of the visual effects off because the video was rather choppy. Now they're all left on and the game still runs just fine.

SSD's don't cure everything, nor do they speed everything up (some stuff takes just as long, because it's set to take X amount of time anyway). But for a lot of things the instant-response is very nice. I have a SSD OS and SSD swap/small storage drive, and I still have a huge archive mechanical drive for less-frequently-used things.

My main complaint after switching to mostly-SSD is that--when starting up in Win7--some programs insist on spinning up ALL the drives before they do their shit. Even when nothing they need is on the mechanical drive at all. (sigh) So it seems you cannot have the maximally-super-fast computer setup until it is cheap enough to switch to all-SSD drives. :\

Depends on PCIe x16 lane (0)

Anonymous Coward | about a year ago | (#42688109)

If it's version 1 a high end card will be bottle-necked. If it's version 2 or higher you should be good to go.

jees (0)

Anonymous Coward | about a year ago | (#42688119)

get a new mobo and PC with that card and it will play all games.
Toss in 16GB of RAM and you can do 3D dev too.
Get a mobo that can go to 64GB with 64-bit Win7 and you're rocking for making your own games.

Migrate to a Quadro when you can afford it and really take it to the next level.
At $900 for the lowest Quadro, that's more of a price than most PCs.

Re:jees (1)

Osgeld (1900440) | about a year ago | (#42688193)

jeez, almost all PC games are spec'd to 2005 Xbox 360 hardware

my old spare machine, which is an AMD X2 2.4GHz with 4 gigs of DDR2 and a GeForce 9600GT, can run most modern games at mid-high to high quality at 1280x1024, and that's coming up on 6 years old ... cost like 300 bucks in parts new

More cores == better performance (0)

Anonymous Coward | about a year ago | (#42688135)

There are several bottlenecks in modern (or not-so-modern) systems: the CPU's number of cores and speed, the memory and I/O buses that move data around the system from memory to CPU to peripheral devices such as your video card, the GPU itself, and raw RAM size. For games, the GPU and the I/O bus and memory interconnect speeds are key. Unless you replace the motherboard (and CPU), this last item cannot be improved. Given the low cost of some pretty awesome GPU gear these days, that is one way to make instant improvements in gaming performance, especially with current 3D rendering in hardware. Adding memory, and running a true 64-bit operating system (if your system supports that), will also help for an incremental price.

That is ONLY if the software is written for it (0)

Anonymous Coward | about a year ago | (#42688293)

Most software ... and games especially ... is so poorly written that you can have 1,000,000 cores and it will peg only one.

Re:That is ONLY if the software is written for it (1)

petteyg359 (1847514) | about a year ago | (#42688313)

Actually, it's you that is poorly written. You are preventing them from multi-threading input, which is the main task of most games.

Perhaps, perhaps not (0)

Anonymous Coward | about a year ago | (#42688147)

I've been down this road many times, having worked at a AAA game studio as IT Manager, and here is what I've found. Most newer cards with the desirable features consume lots of electricity; at this point in time, as much as a refrigerator, it would seem. They also generate an excessive amount of heat. Before purchasing, make sure your power supply is up to the task or you will be in store for some interesting side effects. For example, a newer card shoehorned into an older Dell desktop would at times actually melt the card and cause fires (I'm looking at you, Unreal 3 engine), depending on how untested the combination was. Other times it worked great, ran great, and you saved some money. Just dot your "i"s and cross your "t"s and you'll be fine. Also, newer cards tend to have extra power connections that just aren't available in older computers. Sometimes older games ran great; other times they locked up the CPU thanks to drivers from the newer cards or questionable graphics extensions being used in an unfamiliar context. Hopefully this advice will save you some $$$; it has saved my friends some $$$ many times. There is nothing more exasperating than buying a graphics card that just doesn't work.

Re:Perhaps, perhaps not (1)

0123456 (636235) | about a year ago | (#42688203)

Most newer cards with the desirable features consume lots of electricity and at this point in time actually as much as a refrigerator it would seem. They also generate an excessive amount of heat as well. Before purchasing make sure your power supply is up to the task or you will be in store for some interesting side effects.

When playing games, my i7 + GTX 660 system takes a whole 200W at the wall. My Pentium-4 with Nvidia 7800 used to take more like 350W.

Re:Perhaps, perhaps not (0)

Anonymous Coward | about a year ago | (#42688401)

Minimum power requirement for the GTX 660 is 450w while the 7800 minimum is 400w. Taking the whole system into account you can still run it out of spec and get away with it but power consumption as a whole will suffer depending on the quality of the power supply/mobo combo and the ability to convert the various voltages. The card itself is only part of a bigger picture. The difference between running 20psi through a turbocharger on a lower compression engine and 15psi on a high compression engine can sometimes be kaboom. There are a lot of similarities between power adders for cars and power adders for computers.

Re:Perhaps, perhaps not (1)

Anonymous Coward | about a year ago | (#42688497)

You're virtually incomprehensible, but I feel compelled to respond. People have been pushing stupidly high power supply wattage ratings on folks for years now.

The power listings for video cards are published under the assumption that the buyer has a "free" power supply which came with their case. Those power supplies aren't worth their weight as scrap metal. It's highly likely that the i7 + GTX 660 system pulls about 200W from the wall during gaming. The card only has a TDP of 140W.

Unless you're overclocking an AMD processor hard, I'd wager that a Corsair 430W CX430 would run just about any reasonably constructed single card $700 computer you could build.
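The wattage math being argued here can be sanity-checked with some back-of-the-envelope arithmetic. In the sketch below, the GTX 660's ~140 W TDP comes from the thread; the CPU and "everything else" figures are my own illustrative guesses, not measured values:

```python
# Back-of-the-envelope PSU sizing. The GTX 660's ~140 W TDP is cited in
# the thread; the other component figures are illustrative estimates.
components_w = {
    "GPU (GTX 660 TDP)": 140,
    "CPU (quad-core, estimate)": 95,
    "Board/RAM/drives (estimate)": 50,
}

peak_draw_w = sum(components_w.values())   # worst-case simultaneous load
headroom = 1.3                             # ~30% margin for efficiency and aging
recommended_psu_w = peak_draw_w * headroom

print(f"Estimated peak draw: {peak_draw_w} W")
print(f"Recommended PSU: ~{recommended_psu_w:.0f} W")
```

Even with margin, the estimate lands well under the 430 W unit suggested above, which is consistent with the claim that a quality ~400 W supply covers a single mid-range card.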

Re:Perhaps, perhaps not (0)

Anonymous Coward | about a year ago | (#42688691)

Woosh. Take the GTX 660 and put it into the 350W P4, and then what do you get? The GTX 660's average draw is higher than the 7800's peak consumption. It's nice that an i7 + GTX 660 pulls less, but that has nothing to do with JUST replacing the graphics card with a newer one.

it can (1)

GarretSidzaka (1417217) | about a year ago | (#42688155)

i put a 9800 GSO in an old Sony Vaio that was new enough to be first-gen PCI Express. It's a single-core 3.2GHz Hyper-Threaded P4 with 1.5GB of RAM (I added an extra gig; it was hard to find old RAM, but you can). Since the card was going to be a strain on the power supply, I pulled everything like the floppy and TV tuner, lots of crap. It runs Sims 3 and Minecraft smoothly for my 1st-grade son. Good computer now, and the PSU hasn't burned out!

it can be done!!

See if you can upgrade all the parts you can: RAM, CPU, and video card (even upgrade the PSU if it's not proprietary like the Sony's).
The prices might be dirt cheap for obsolete parts (if you can find 'em).

Hidden loss, underutilisation (0)

Anonymous Coward | about a year ago | (#42688177)

What is the amount of waste? Putting in a 600-series card over a 200-series will certainly improve things, but what is the utilization of the card?
The CPU/RAM combo needs to be fast enough to move data between the GPU and main memory; if it is not quick enough, the card cannot do any more work. E.g. $200, ~80% busy, 40fps.

Could one get away with buying a 500-series card, with only a marginal FPS loss but nearly half the cost? E.g. $120, 95% busy, 37fps.
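The trade-off sketched above reduces to FPS per dollar. All the numbers below (prices, FPS, utilization) are the commenter's hypothetical examples, not real benchmarks:

```python
# FPS-per-dollar comparison of the two hypothetical cards from the comment.
# Every figure here is the commenter's example, not a measured result.
cards = {
    "600-series": {"price": 200, "fps": 40, "busy": 0.80},
    "500-series": {"price": 120, "fps": 37, "busy": 0.95},
}

fps_per_dollar = {name: c["fps"] / c["price"] for name, c in cards.items()}

for name, value in fps_per_dollar.items():
    busy = cards[name]["busy"]
    print(f"{name}: {value:.3f} FPS/$ at {busy:.0%} GPU utilization")
```

On these numbers the cheaper card delivers roughly 50% more frames per dollar, which is the point of the question: a CPU-limited system may not be able to use the headroom the pricier card offers.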

Similar System (1)

baynham (874879) | about a year ago | (#42688209)

I have a Q6600 @ 3.4GHz and paired it with a cheap 5870. I'm still able to play most games at high settings, even at 2560x1600. IMO it is one of the 'best' CPUs ever made. It will be somewhat sad to upgrade when Ivy Bridge-E is released. I would have liked to have seen the same benchmarks with the processor overclocked too, but nice article. Q6600 FTW!

Yes and no (1)

uvajed_ekil (914487) | about a year ago | (#42688271)

A newer/better GPU can indeed improve the graphics and gaming performance of an older computer, but it won't make it perform like a newer machine with other superior hardware. Duh.

It seems like perfect common sense, but obviously not everyone gets it so I'll state it like this: If you took a shiny 2012 BMW V8 engine and plopped it into your rusty 1982 BMW 733i, your car would be faster and more fuel efficient (assuming you could even mount the new motor and get everything hooked up), but it wouldn't automatically handle like a 2012 or have bluetooth or a navigation system, and you'd eventually run into unforeseen problems if you really got on it and tried to drive it like it was new. Straight line acceleration would be fantastic (like running certain benchmarks on a new GPU in an old machine) but real world drivability would be more lackluster.

Driving isn't all about raw horsepower, just as PC gaming isn't all about the graphics card. If you have a 5400 RPM hard drive, maybe even an ATA one, with 2 GB of DDR2-667 and a single-core Pentium, you might not want to play the latest games, even if they technically will run on your rig. I DO have a 5 year-old desktop at home (among other machines) and I have vowed not to spend another cent on it. It works fine for basic stuff, but at some point you just need to think about starting over.

Re:Yes and no (0)

Anonymous Coward | about a year ago | (#42688473)

Car analogies about the performance of 80's crap don't work, because cars were restricted in performance for about 20 years. Please consider comparing advances in car tech to either pre-1973 or post-1993 vehicles in future posts. Thank you.

Re:Yes and no (0)

Anonymous Coward | about a year ago | (#42688589)

Performance 80's crap like the illegal 1989 Nissan Skyline R32? Riiiight. Sounds like the only thing restricting your performance is a restrictor plate you cannot figure out how to remove.

Here is some food for thought: http://www.youtube.com/watch?v=jurkxqPH83A

Yeah put a bigger graphics card in it has become "put a bigger turbo in it bro".

CPU can be a bottleneck (1)

hsa (598343) | about a year ago | (#42688493)

Here, have a look at this Anandtech E-350 review:
http://www.anandtech.com/show/4499/fusion-e350-review-asus-e35m1i-deluxe-ecs-hdci-and-zotac-fusion350ae/15 [anandtech.com]

They pair a very low-end AMD CPU with the best GPU on the market at the time. Result: the CPU does affect performance. No surprises there.

You need to be more specific with your hardware.

Also, take a look here:
http://www.anandtech.com/bench/CPU/48 [anandtech.com]

Re:CPU can be a bottleneck (1)

Gadget_Guy (627405) | about a year ago | (#42688905)

They pair very low-end AMD CPU with best GPU on the market at the time. Results: the CPU does affect the performance. No suprises there..

The quoted test was probably more limited by the PCIe slot running at x4 instead of x16 rather than the low spec CPU. If you can't get data to the graphics card quickly enough then even the fastest CPU will be hampered.

If you are playing games (0)

tlambert (566799) | about a year ago | (#42688519)

Buy a gaming console instead of playing them on your PC.

Re:If you are playing games (1)

Krneki (1192201) | about a year ago | (#42688547)

Only if you like the games on the consoles. I don't.

I'll avoid the whole graphical differences that PC can bring.

Some like to waste their money on cars, booze, ..... I like to spend 400Eur on the PC every year.

Re:If you are playing games (2)

Mike Frett (2811077) | about a year ago | (#42688603)

You're kidding right? My brother is always complaining about having to pay various fees, as he says "To even turn my Xbox on!" sometimes. That and $70 games are enough to put a person in the poor house. The article is correct in saying a new GPU will boost performance on an older system. But don't buy a super high end card for it, eventually the CPU will become too much of a bottleneck.

I think I'll stick to my PC with Xubuntu (I actually BUY games I like). Oh noesss I mentioned Linux, will I be Modded as a Troll again? Beaten into oblivion by Slashdots Secret Society of Microsoft Fanboys AKA SSMF? Tune in next week for the exciting conclusion!. And don't forget to drink your Ovaltine.

Re:If you are playing games (1)

Warma (1220342) | about a year ago | (#42688963)

This is not a valid statement and has never been. Entirely different games are marketed for PC and consoles, and if you want to play all the games you enjoy, eventually you will be forced to own both a fast gaming PC and a console.

backwards compatibility rant/warning (1)

volvox_voxel (2752469) | about a year ago | (#42688551)

There are a lot of great [old] games out there. I have a "Windows XP Mode" Virtual PC, but found that my video card does not have XP drivers, so I can't play them. Has anyone here had a similar experience? I personally have an Nvidia 690 card, but now I find that I must maintain a slew of older PCs so I can get my programs/hardware/"old" games to run.

If you do get a new video card, make sure that it has drivers for XP [or older] as well so that you can play your old games in a virtual machine.

Re:backwards compatibility rant/warning (1)

Psyborgue (699890) | about a year ago | (#42688673)

No need to use a VM. Just use compatibility mode in Windows 7 or 8. 8 does some cool stuff with retro games (automatically sets the correct settings for many games). I'm currently playing Black & White as well as Neverwinter Nights... both Win 95/98-era games. No issues. I run lots of retro games.

Also. If you're trying to get 3d to work in a VM, you need to use drivers supplied by the VM vendor in the client and enable the relevant settings on the host.

So what's new? (1)

Torp (199297) | about a year ago | (#42688655)

I've lately upgraded the GPU every other generation (I buy mid-range cards like the 660) and the CPU every 4 years or more. It's been fast enough for my purposes.

Good question (0)

Anonymous Coward | about a year ago | (#42688717)

Bad answer. They only tested one system. Not exactly statistically (read scientifically) interesting...

Of course, the real answer is the dreaded 'depends'. Still, inquiring minds would like to know WHAT it depends on and HOW much.

Mathematical question (1)

Kokuyo (549451) | about a year ago | (#42688743)

If card A has a performance of x (which I'll define as 1) and card B a performance of x+2, wouldn't that mean it's two times better?

The article keeps saying three times better, but wouldn't the correct way to phrase that be "It's three times as good?"

Similar things with percentages. If something has 200% the value of something else, it's twice as valuable and not two times more valuable, right?

I notice similar things in German, which is my main language. Am I just a grammar Nazi (badum-tis) or does that bother you too?
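The distinction being drawn works out like this (a trivial sketch using the comment's x and x+2 figures):

```python
# "N times as good" vs. "N times better": card A = x (defined as 1),
# card B = x + 2, per the comment above.
baseline = 1.0
card_b = baseline + 2.0

times_as_good = card_b / baseline              # 3.0: "three times as good"
times_better = (card_b - baseline) / baseline  # 2.0: "two times better"

print(f"{times_as_good:.0f} times as good")
print(f"{times_better:.0f} times better, i.e. +{times_better:.0%}")
```

So "three times as good" and "two times better" describe the same card; the ratio measures the total, while "better" measures the increase over the baseline.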

Works for me (1)

burisch_research (1095299) | about a year ago | (#42688753)

I'm running a new-ish HD5970 card on a five-year-old Intel Core 2 Duo, 2.66GHz. Over those years I've also added an SSD (boot + apps), some extra hard drives, and an extra monitor. The machine is very reliable and quick enough that I really don't need to upgrade. Although I definitely will upgrade this year; five years is really old for a PC.

Settings in games should match your setup. (1)

johan jacobsson (2824281) | about a year ago | (#42688803)

If you have an old computer you should use settings in games to match that, even though you've bought yourself a shiny new gfx card. Going for HQ and 4x MSAA in BF3 is just plain stupid. There are countless guides out there on which settings to tweak to match your setup. Nvidia has a great guide that explains all settings and their impact on FPS and suggests what to set depending on your setup. Using MSAA, for example, is one of the best ways to ensure you get low FPS, and most guides suggest you turn it off and use FXAA to get better FPS.

While in whine mode I'll add a few more things. What resolution was used? A comparison of what FPS to expect when buying a GTX 660 for your old computer vs. buying a new mid-range MB/CPU and a GTX 660 would have helped. An analysis of whether the CPU was the bottleneck for these games, and if so, by how much? Is the GTX 660 at $220 the card to get when upgrading a 5-6 year old computer, or will a GTX 650 Ti at $150 give you the same result?

All in all I appreciate that he wrote the guide, as I'm sure many people are asking themselves whether to do a full upgrade or just upgrade, for example, the gfx card. But at the same time.. a job worth doing is worth doing well.

Anyone here with Socket 939? (0)

Anonymous Coward | about a year ago | (#42689105)

I have an AMD Opteron 185 2.6GHz on Socket 939, an eVGA GeForce GTX 260 SuperClocked, 3GB of RAM and a WD Raptor HD.
I can play StarCraft 2 and Call of Duty 5 on my 1680x1050 monitor.
The computer is also my development machine, and for that task it's OK.
Do you think upgrading to a more powerful video card will allow me to play the latest Call of Duty with max settings?
