
Forget Expensive Video Cards

CmdrTaco posted more than 8 years ago | from the oversimplifications-are-funny dept.


Anonymous Reader writes "Apparently, the $200 in video cards does not produce the difference. While $500 video cards steal the spotlight on review sites and offer the best performance possible for a single gpu, most enthusiasts find the $300 range to be a good balance between price and performance. Today TechArray took a look at the ATI x1900xtx and Nvidia 7900gtx along with the ATI x1800xt and Nvidia 7900gt."


322 comments


Forget expensive English lessons (4, Funny)

Anonymous Coward | more than 8 years ago | (#15231659)

You apparently don't need them to get your submissions approved.

Re:Forget expensive English lessons (0)

Zeussy (868062) | more than 8 years ago | (#15231888)

Someone set us up the graphics card.

Not directly related to TFA (2, Insightful)

remembertomorrow (959064) | more than 8 years ago | (#15231662)

But I will not even consider purchasing an ATI card until they get their Linux compatibility (drivers) up to snuff.

I'd rather not be locked to one platform because of a piece of hardware.

Re:Not directly related to TFA (1)

Timesprout (579035) | more than 8 years ago | (#15231687)

I'm sure ATI engineers are rushing back to the office as I type this to work on Linux compatibility. That, or making mental notes to give the boss an earful on Monday morning about dropping Linux compatibility altogether, since it's not appreciated or increasing revenue.

Re:Not directly related to TFA (0)

Anonymous Coward | more than 8 years ago | (#15231710)

That, or making mental notes to give the boss an earful on Monday morning about dropping Linux compatibility altogether, since it's not appreciated or increasing revenue

And thus completely lose out on the rendering farms that use linux machines. His boss will be very happy he has a smart employee when they both lose their jobs.

Re:Not directly related to TFA (1)

ScrewMaster (602015) | more than 8 years ago | (#15231764)

Render farms don't need or have video on each processor ... you just have thousands of machines in racks processing data streams. The resulting output files are then taken and rendered to video or film. Besides, the number of systems in all the world's render farms is a tiny fraction of the number of chipsets ATI sells to the notebook and desktop markets. When Linux makes significant inroads into those markets, ATI will follow suit. What you're really complaining about is the fact that ATI is making no particular effort to develop that market by supplying video drivers in a format that you find acceptable. And there's no particular reason that they should, unless their own analysis says that it will be profitable. Companies get to choose where they put their development dollars. Now, if there was some proof that, say, Microsoft was paying off or pressuring ATI to keep their Linux drivers on the back burner, that might be different.

On the other hand, ATI has never been particularly good about drivers, even under Windows, so don't expect them to be much help in the Linux world either. That stupid-ass .Net-based "Catalyst Control Panel" they've been shipping for a while now is just obnoxious.

Re:Not directly related to TFA (5, Informative)

TheRaven64 (641858) | more than 8 years ago | (#15231918)

Render farms don't need or have video on each processor ... you just have thousands of machines in racks processing data streams.

Actually, that's not quite true these days. A modern render farm has a GPU (or two) in each node, and uses it for all sorts of things. If you are only doing relatively low-quality renderings, you can use something like Chromium and get high framerate, enormous images rendered through OpenGL. If you are doing ray tracing, you can speed this up hugely using the GPU.

Even volume rendering runs on the GPU these days. You can split an enormous volume into 256^3 cubes, render these quickly on a large array of GPUs, and then composite the individual rays using the alpha-blending hardware on a smaller array of machines in a tree configuration until you have the final image[1].

So, no, not every node needs a video output capability, but if you want state-of-the-art performance they do all need at least one GPU.


[1] Some people are using other kinds of stream processor for this step these days, but that's still a relatively young research area.
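The tree compositing described above can be sketched in a few lines. This is a hypothetical illustration (the function names and slab values are invented, not from any actual render-farm code): with premultiplied colors, the "over" operator is associative, so a pairwise tree of compositing nodes produces the same image as a sequential front-to-back blend.

```python
# Hypothetical sketch of tree compositing: each "layer" is the partial
# image from one GPU's sub-volume, reduced here to a single
# premultiplied-RGBA pixel for brevity. Layers are ordered front-to-back.

def over(front, back):
    """Blend two premultiplied (r, g, b, a) pixels, front over back."""
    fr, fg, fb, fa = front
    br, bg, bb, ba = back
    k = 1.0 - fa  # fraction of light that passes through the front layer
    return (fr + k * br, fg + k * bg, fb + k * bb, fa + k * ba)

def tree_composite(layers):
    """Pairwise-reduce the layers, as a tree of compositing nodes would.

    Because 'over' on premultiplied colors is associative, the result is
    identical to folding the list sequentially front-to-back.
    """
    while len(layers) > 1:
        layers = [over(*layers[i:i + 2]) if i + 1 < len(layers) else layers[i]
                  for i in range(0, len(layers), 2)]
    return layers[0]

if __name__ == "__main__":
    # Eight hypothetical nodes, one translucent reddish slab each.
    slabs = [(0.02 * i, 0.01, 0.01, 0.2) for i in range(1, 9)]
    print(tree_composite(slabs))
```

Only the leaves need real rendering hardware; the interior nodes of the tree just blend images, which is why the parent notes that other kinds of stream processor can take over that step.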

Re:Not directly related to TFA (2, Interesting)

It'sYerMam (762418) | more than 8 years ago | (#15231879)

Having talked personally with the ATi linux team (back before I bought an nVidia) I know they do try with the resources they're given by the management. They also take into account the complaints of the users - although, being bound by NDA, I'm pretty sure they can't give out "coming soon" notices. Certainly, way back when there was this nasty problem with UT2k4 and the ATi linux drivers, they wouldn't disclose that it was fixed before they released.

What about games? (1)

Junta (36770) | more than 8 years ago | (#15232036)

I realize there is more to 3D than games, but generally speaking only games provide reason for new video card purchases, since moderate graphics cards can handle XGL and such sufficiently.

I know UT2k4, Quake3 (and below), Doom3, and Neverwinter Nights all run native linux, but why not call for linux compatibility from game publishers? I know at least I'm disappointed in things like Oblivion and NWN2 not being on linux (and by extension not on my list of stuff to buy). NWN1 sucked a fair amount of licenses out of my household, but without Windows licenses and not wanting to use something as much of a kludge as wine to play a game, no more licenses for us...

Re:Not directly related to TFA (1)

repruhsent (672799) | more than 8 years ago | (#15231713)

Why would you need anything more than around a GeForce FX 5200 or so with Linux? Sure, you can play Doom 3, but that's about it. I also think that a lowly GeForce 2 MX would run your dumb glxgears at like 15000 fps or whatever. You hardly need a really fast card for compiz. I just don't see the point. Please don't use an example like Nexuiz for rebuttal purposes, either; nobody cares about FSF-approved games. The fact of the matter is, nobody really cares about blazing fast video performance on Linux. ATI doesn't either - that's why the drivers suck.

You're not locked to Windows because of ATI hardware. You can always use the drivers supplied with X, although those aren't accelerated. Sure, you won't be able to run your dumb GL-accelerated terminal emulators then, but that's hardly being locked into Windows. You could probably run from a console without X on ATI hardware, too.

Also, it's pretty presumptuous of you to assume that anyone here would care about you not purchasing ATI hardware. In short, you're an idiot.

Re:Not directly related to TFA (0)

Anonymous Coward | more than 8 years ago | (#15231758)

It's pretty presumptuous of you to assume that Linux has a total of two 3D games, both of them FPSes. You're the idiot, and a bigot to boot. Go back to playing that flight sim in MS Excel on your Intel Extreme Graphics.

Re:Not directly related to TFA (1)

heinousjay (683506) | more than 8 years ago | (#15231870)

There's no need to be presumptuous - the gaming choices in Windows are vastly more plentiful, and the performance is higher. It's what those of us who participate call a 'win-win.' Go ahead and be morally superior (haha) but realize that sometimes that comes with a price.

Re:Not directly related to TFA (0, Flamebait)

nstlgc (945418) | more than 8 years ago | (#15231760)

Personally, I'd rather not give up technological advantage because of some crappy hippie OS. Oops, did I say that on Slashdot?

Re:Not directly related to TFA (0)

Anonymous Coward | more than 8 years ago | (#15231824)

Then don't buy an nVidia card. nVidia's binary x86 drivers may be better than ATi's, but if you happen to use another platform, you're stuck with the open source drivers. The open source drivers for nVidia cards plainly suck.

Re:Not directly related to TFA (1)

HaDAk (913691) | more than 8 years ago | (#15231958)

ati's drivers are crap no matter what os you're using. i much prefer forceware.

Re:Not directly related to TFA (0)

Anonymous Coward | more than 8 years ago | (#15231961)

I'd rather not be locked to one platform because of a piece of hardware.

That's why I like to program on my abacus I made at home with some two liter caps. / you're a dorkus

Practically irrelevant - unfortunately.... (1)

King_TJ (85913) | more than 8 years ago | (#15232012)

This whole discussion centers around the best 3D gaming cards for the money. This is only *barely* a concern for the Mac using audience, much less Linux users. Just because you can play a few games like Quake or Doom in a native Linux version doesn't mean it's a primary concern of many Linux users to have optimal 3D gaming performance.

The OS just doesn't really have gaming as a primary focus. So ATI's lack of focus on Linux compatibility isn't all that surprising on their $300-500 cards made for gamers, is it?

Re:Not directly related to TFA (0)

Anonymous Coward | more than 8 years ago | (#15232088)

Then you better crack open the piggy bank..

http://www.theinquirer.net/?article=31009 [theinquirer.net]

Whatever... (5, Funny)

Kjella (173770) | more than 8 years ago | (#15231664)

I'm sure the $500 GFX cards only exist to make spending $300 on a single component of a computer seem reasonable.

Re:Whatever... (1)

raz0 (899158) | more than 8 years ago | (#15231700)

Actually, almost every new game today is GPU bound, so naturally the most significant performance boost will come from the graphics card. I used to think the same way as you do, but you will gain MUCH more performance by getting a low-end CPU, motherboard, and RAM and then spending most of your budget on the graphics card.

Re:Whatever... (4, Insightful)

ivan256 (17499) | more than 8 years ago | (#15231850)

They write the games for the hardware that is out there. If the GPU sales strategy didn't work, you'd see more games for lower-powered cards, not more people with machines that can't handle modern games.

Re:Whatever... (0)

Anonymous Coward | more than 8 years ago | (#15231988)

actually most games are not gpu bound anymore with the new 7900 gt SLI and x1900XT.
even less gpu bound with the quad sli ones.

Re:Whatever... (0)

Anonymous Coward | more than 8 years ago | (#15231707)

Seriously, some people buy complete computers for half that.

Re:Whatever... (5, Insightful)

X43B (577258) | more than 8 years ago | (#15231771)

"I'm sure the $500 GFX cards only exist to make spending $300 on a single component of a computer seem reasonable."

I'm sure you are probably joking, but I think you hit the nail on the head. Having a super-expensive card, even if it is a low seller, has many benefits.

1) You will sell some to those who want to be ub3r133t
2) You get the publicity of being "the best" even if no one actually buys the best
3) Perhaps most importantly, the "Wendy's Effect". It is oft quoted that no one buys Wendy's triple cheeseburger. Someone at Wendy's decided that offering it was a waste so they removed it. However, this almost immediately reduced the number of double cheeseburgers sold. Apparently when people see that there is something more expensive and more "over the top" they are much more compelled to buy the next lower version than if that same version was the high end.

Re:Whatever... (5, Funny)

Bottlemaster (449635) | more than 8 years ago | (#15232069)

It is oft quoted that no one buys Wendy's triple cheeseburger.


I once bought a Wendy's triple cheeseburger. At the time they had a "double the meat for 99 cents" offer, and I could get the best deal if I bought a triple cheeseburger and doubled the meat. Unfortunately this resulted in a burger with only four patties, and I had to return it. Apparently "double" means the same thing as "increment" at Wendy's.

Re:Whatever... (3, Interesting)

Tyrant Chang (69320) | more than 8 years ago | (#15232081)

To add a little more to your post, I think the term is the compromise effect: http://www.everything2.com/index.pl?node_id=1273029 [everything2.com] (also known as the extreme-aversion effect). People will generally choose the midpoint of an option set, and framing an option as the middle makes it more attractive.

Apparently, this effect has been "applied" to many fields like marketing, sales, and negotiation, and also in the legislative world, where a legislator will present a stupid bill that he knows will fail because of the backlash, but which will make his next bill seem more reasonable (as we see too often).

Re:Whatever... (4, Interesting)

blair1q (305137) | more than 8 years ago | (#15232106)

>Apparently when people see that there is something more expensive and more "over the top" they are much more compelled to buy the next lower version than if that same version was the high end.

don't confuse compelled for enabled

people don't want to feel like pigs

they feel like pigs when they get the biggest item

if they take the next-biggest item, they both satisfy their need to serve themselves, and their need not to be gluttonous

also, it's very common that the best value is to be had by taking the second-tier item; the reason is that on a learning-curve pricing scheme, the slope is steepest between items near the premium end of the curve; why a learning-curve pricing scheme applies is beyond the scope of this article, many reasons can be found, and exceptions as well

well, (2, Insightful)

joe 155 (937621) | more than 8 years ago | (#15231665)

Obviously, just sticking in a crazily expensive video card won't make a system radically better; computers are somewhat bound to go at the speed of the slowest part (I know that doesn't always hold true). But if your computer costs $1000, then spending $500 on a card wouldn't be sensible.

Shock! Horror! (1)

tomstdenis (446163) | more than 8 years ago | (#15231666)

You mean you don't need the most expensive hardware possible to enjoy life?

No way!!! BUY BUY BUY!!! /me happy with my 6600 :-) [it's the cheapest non-crippled PCIe card I could find at the time]

Tom

Re:Shock! Horror! (0)

Anonymous Coward | more than 8 years ago | (#15231945)

Yeah, but try to play Fear or Oblivion at 1600x1200 resolution with all features on and AF at 16X on your 6600, and then tell me a $500 card isn't better

Re:Shock! Horror! (2, Insightful)

tomstdenis (446163) | more than 8 years ago | (#15231967)

I'd need to buy a 1600x1200 monitor first...

Besides I don't view "game not working on setup" as a feature. If the game was well put together it would have reduced texture/polycount modes so it could work on a more appropriate range of hardware.

There used to be a day when programmers were judged by how well they could make software fit the hardware, not the other way around.

Tom

Re:Shock! Horror! (3, Informative)

Professor_UNIX (867045) | more than 8 years ago | (#15232076)

Yeah, but try to play Fear or Oblivion at 1600x1200 resolution with all features on and AF at 16X on your 6600, and then tell me a $500 card isn't better

I rarely play games at more than 800x600 anyway so no loss for me. My $150 GeForce 6600 card came with a $50 instant rebate for a video game at Best Buy so I picked up a copy of Battlefield 2 with the card. It plays absolutely fine on my AMD Athlon XP 2400+ system with the 6600 card at 800x600. It's AGP to boot! I imagine I'll need a better motherboard and processor if I really wanted to take advantage of some higher performance graphics cards, but I have other priorities at this time in my life. Maximizing my 401(k), building a house, putting away money for my child's college education, etc.

Have you sat back and thought about how far that $500 would go if you didn't just throw it away on a piece of computer equipment that will be obsolete in 3 months? For example, find some financial calculators and do some calculations of putting $500 every 3 months into a high growth rate mutual fund or stocks for example. I bet you'd be pleasantly suprised by the kind of growth your investment would return. Who am I kidding eh? This is Slashdot. Spend spend spend fools! Spend so my stocks will increase in value! Woohoo.

Re:Shock! Horror! (0)

Anonymous Coward | more than 8 years ago | (#15232006)

Hear Hear! Well spoken, Bruce!

I still find no reason to upgrade my 6600 I got for $130.

Though I guess if I felt the need to run something at a resolution that did nothing but make the close textures blurry and the polygon angles more noticeable, I'd sing another tune.

1024x768 2xAA 4xAF is more than adequate for me, holmes!

Try $200 (4, Interesting)

eln (21727) | more than 8 years ago | (#15231679)

I find the cards that are at the price point of around $150 to $200 are usually good enough to play new games for about 2 years after they're purchased with all of the eye candy enabled. After that, you can either buy another $150 to $200 card (which obviously is far more advanced than the one you bought 2 years previously) or continue to play newer games without all of the eye candy enabled.

Re:Try $200 (1)

Hnice (60994) | more than 8 years ago | (#15231965)

This is about where I've always bought, too, and I'm very happy. I do find, though, that I can't usually run things all out (8xAA and the anisotropic all the way up). I mean, I really don't care, because it's an arms race I simply can't afford to get caught up in, but I do think that there's a difference.

Not very surprising? (1)

raz0 (899158) | more than 8 years ago | (#15231689)

Not going for the top of the line in graphics card, motherboard, CPU, RAM (heck, virtually every piece of hardware) yields you the most bang for the buck. Is anyone really surprised by this? It's common knowledge that companies will use high prices for their top-of-the-line hardware to cover the price of R&D, while later supplying nerfed versions of the same hardware to cover the casual or mid-range consumers.

7900GTX = top of the line performance
7900GT = high-end though much better performance/dollar ratio.
7600GT = mid-range, runs-all graphics card that's great for the budget-aware gamer.
7300GS = low-range graphics card for the casual gamer who wants to play the odd game of The Sims 2 or similar.

Re:Not very surprising? (4, Interesting)

Kjella (173770) | more than 8 years ago | (#15231766)

Not going for the top of the line graphics card, motherboard, CPU, RAM heck virtually every piece of hardware yields you the most bang for the buck.

Actually, it's more generic than that. If you look at hard disks (because they have such a good metric, but the same applies to all hardware) you'll see $/GB is not lowest at the low end; there's the infamous "sweet spot" in the middle. Same with CPUs: the lowest CPUs don't give the most bang for the buck. There are some inherent costs in just producing and shipping the product, which means the lowest-end parts are typically very crippled but not that much cheaper. In terms of absolute performance per dollar, mainstream is the best. Of course, that does not mean your utility is maximized, unless your utility maps exactly 1:1 to performance per dollar. My parents could get a 7900GTX SLI setup & 750GB Seagate disks and their utility would be 0 (over their current machine). There's no sense spending money on performance if you're not getting utility, and it makes good sense to spend money where you are getting utility, even if you're moving away from the sweet spot.
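The $/GB "sweet spot" the parent describes is easy to see with a toy calculation. The prices below are invented for illustration (they are not real 2006 street prices): fixed per-unit costs dominate at the small end, and the premium tier carries an early-adopter markup, so cost per GB bottoms out in the middle.

```python
# Hypothetical drive prices (capacity in GB -> price in dollars), invented
# to illustrate the parent's point: $/GB is not lowest at the low end,
# and not at the top of the line either.
drives = {80: 60.0, 160: 75.0, 250: 90.0, 320: 110.0, 500: 220.0, 750: 400.0}

cost_per_gb = {gb: price / gb for gb, price in drives.items()}
sweet_spot = min(cost_per_gb, key=cost_per_gb.get)

for gb in sorted(drives):
    print(f"{gb:4d} GB: ${cost_per_gb[gb]:.3f}/GB")
print("sweet spot:", sweet_spot, "GB")
```

With these made-up numbers the minimum lands on the mid-range 320 GB drive, not the cheapest or the biggest, which is the shape of the curve the parent is pointing at.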

Re:Not very surprising? (1)

tomstdenis (446163) | more than 8 years ago | (#15231783)

You need a 7xxx series to get "mid gaming"?

I can run most FPSes at 1024x768 with my GeForce 6600.

I only spent $175 on my card, not $300.

Methinks you're not really viewing things objectively.

Tom

Re:Not very surprising? (1)

raz0 (899158) | more than 8 years ago | (#15231911)

7xxx doesn't say *anything* about the performance. It's just the graphics card "generation". What it *does* mean is that it will support the latest technologies like HDR and SM3.0, but again I stress: it doesn't say anything about the performance. In fact, your GeForce 6600 is most likely more powerful (I haven't seen benchmarks, but this is almost certainly true) than the 7300 chip. And yes, I deliberately excluded previous-generation cards from my posts, since it wouldn't add to my point. Buying last-generation graphics cards is almost never a good idea if you want the most bang for the buck, since Nvidia and ATI repeatedly produce new generations which cover all the price points from top of the line to low-range, with support for the latest technologies, which you'll miss out on with a previous-generation card. Yes, your 6600 (GT?) has now been replaced by the 7600GT.

Re:Not very surprising? (1)

tomstdenis (446163) | more than 8 years ago | (#15231960)

The 7600-256MB costs $180, the 6600-256MB costs $120, and the 128MB version is $20 cheaper.

That's today. But 6 months ago the price disparity was much higher. My 6600 cost $180 when I bought it, and the 7600 was over $250.

Some people wanna game but don't care if they have to do it at 1024 or 800. Admittedly, today if I had to buy either, I might just go for the 7600 since the price difference is so low. So I guess you have a point about just sticking to the 7xxx series.

My bad...

Tom

Not actually news to me... (2, Interesting)

Inverted Intellect (950622) | more than 8 years ago | (#15231693)

The price/performance graph for most every imaginable computer component can be represented by a bell curve. It just so happens that I'm in the market for a $300 graphics card. I plan on buying the Nvidia 7800 GS, which is the most powerful AGP card available. While it sucks that those with AGP mobos have been left without an upgrade path, this particular price range works fine for me. I figure it'll be the last major upgrade to my close-to-obsolete AGP-slotted computer.

Re:Not actually news to me... (1)

Vidiot3k (612026) | more than 8 years ago | (#15232052)

Sure there's an upgrade path... time to build a new system!

MDA (1)

tonigonenstein (912347) | more than 8 years ago | (#15231695)

What's wrong with an MDA ?

$300 is not expensive? (5, Insightful)

suv4x4 (956391) | more than 8 years ago | (#15231698)

Wait a second, since when $300 for a friggin' video card is not expensive? Because there's $500 cards?
If there were plenty of $2000 video cards, would $1000 be not expensive then?

Someone's being brainwashed here...

When a pretty good video card is in the range of $80-$160... now that's more reasonable.

Re:$300 is not expensive? (3, Interesting)

chrismcdirty (677039) | more than 8 years ago | (#15231746)

http://www.newegg.com/Product/Product.asp?Item=N82E16814150098 [newegg.com] . Try that. If you're willing to spend twice the price (and have an SLI-capable system), I hear they perform very nicely in SLI, and still for less than the price of a $300 card.

Re:$300 is not expensive? (3, Funny)

suv4x4 (956391) | more than 8 years ago | (#15231839)

http://www.newegg.com/Product/Product.asp?Item=N82E16814150098. Try that. If you're willing to spend twice the price

Thanks for that. I'm looking to upgrade my card, but the next big thing always comes up (or how about HD support now... the latest bullshit) and I'd have to sell my grandma to afford it, so I'm just stuck with my GeForce MX4 here.

I felt really cheated by the article title, you know? It went like: Forget about expensive video cards! OK, did you forget? Now we'll remind you with this review of $300 video cards...

Re:$300 is not expensive? (1)

chrismcdirty (677039) | more than 8 years ago | (#15231855)

I agree. When my Radeon 9700 died a few months back, I was pretty mad that I could no longer buy that in a store, but I could still buy a 9550. So my choice was either that or an X7/800 for the same price I paid for my 9700, neither of which I wanted.

Re:$300 is not expensive? (2, Insightful)

jellomizer (103300) | more than 8 years ago | (#15231812)

I wish I had mod points today for that. The hard-core gamer is expecting so much out of their systems, and the game companies are obliging, thus making these cards ultra expensive. I say stick with seeing some polygons for a while and use the money on fun games, or food, or dinner(s) on a date.

Back in the old days of the late 80s and early 90s, computer games were designed to run on the average-spec computer. CGA graphics were still available, EGA was widely used when VGA was new, and games could run on 256k of RAM. At the time, getting a VGA display and card cost close to $500-$600, and for a couple of years after that I saw only one game released for VGA. Computer game companies stuck with CGA and EGA for a long time because most people didn't have VGA displays. VGA finally became standard around 92/93, and then SVGA was out, but no games used SVGA; they were all EGA or VGA. Not until 94 were SVGA games introduced, running on Windows 3.1 right before 95 was released with Win32 abilities. But the games were targeted at, and performed optimally on, the standard computer of the time.

Then when Quake came out with support for 3D graphics cards and networking, hard-core gamers started spending all that money on these cards: if you can see the moving pixels of your opponent while he's stuck at a lower resolution getting skips, you have the advantage and can kill him without him seeing you. So hard-core gamers spent more and more money. And game companies, realizing there was a market, made games that pushed the edge more and more, leaving a class break between people who play computer games for fun and hard-core gamers who do it for more than fun.

Re:$300 is not expensive? (2, Interesting)

ZeroExistenZ (721849) | more than 8 years ago | (#15232011)

I agree... I'm still using my GeForce Ti4200; I bought it cheap because the DirectX 9.0 cards were coming out and I didn't feel like going for the premature bugs and such.

There hasn't really been a game I couldn't play; I've finished Half-Life 2, all Need for Speed titles, all GTA titles, and so many more...
So why would I need to fork out $300-600 for playing what I can play now, but "better"?

Nothing has felt as sluggish and jaggy as trying to play Blood II with a Voodoo2 card on a P200. Are kids these days spoiled rotten? I had to work to finance my PC spending, and I still do.

Re:$300 is not expensive? (1)

mbourgon (186257) | more than 8 years ago | (#15231823)

I remember buying the original GeForce. $300. Amazing graphics; I used it for a long while. But if I'd known that it would cause the "average" video card's price to be $300, I would not have bought it.

Re:$300 is not expensive? (1)

SeeMyNuts! (955740) | more than 8 years ago | (#15232080)


How much money are people supposed to spend on passive entertainment? It'd be better to spend that money on tickets to a traveling Broadway play or a live concert. Pre-produced entertainment has become so common and so lacking in novelty that live entertainment is actually more worthwhile now. Perhaps it'll help remind people that they are actually alive and not stuck in a cube-shaped room with a glowing window to an imagined world.

It worries me when I talk to a teenager and the first thing they want to talk about is video games. Video games video games video games. The perfect thing to occupy a mind that doesn't want to engage real issues and real responsibility.

Games aren't all bad, but there is just way too much of them! Throw a game at a kid and he'll slink away to a room for the next two weeks. Do that 20 times a year, and it's as if there's no child at all! Do parents feel good about this?

Re:$300 is not expensive? (1)

ergo98 (9391) | more than 8 years ago | (#15232104)

How much money are people supposed to spend on passive entertainment? It'd be better to spend the money not spent on gaming on tickets to a traveling Broadway play or a live concert. Pre-produced entertainment has become so common and without novelty that live entertainment is actually more worthwhile, now. Perhaps it'll help remind people they are actually alive and not stuck in a cube-shaped room with a glowing window to an imagined world.

Your comment is very confusing. Plays and concerts are passive entertainment, and video games are active entertainment. Not really sure what any of it has to do with pre-produced entertainment.

In any case, many video gamers are busy playing challenging games socially (online games are prevalent now, often with dozens of players), often communicating and having fun with people from around the globe. Pretty hard to villainize that.

Me my Mum and I.... (3, Interesting)

MosesJones (55544) | more than 8 years ago | (#15231706)

And then of course you have the home computer that I'm currently fixing for my mum (mom to USitiens), which has a very basic graphics card that powers the 17" TFT rather nicely. Sitting next to that is the one my wife uses, which has a Voodoo3 3500 TV, runs SUSE, and works fine for her.

The ONLY people who need these graphics cards are people who play top-end games. I find it stunning when I come across work desktops for people who do MS Office stuff that have only 512MB of RAM but a graphics card capable of doing Doom 3 at decent framerates. 80%+ of people don't need even the 7900GT, let alone the GTX, and it would take a completely brain-dead operating system to require people to have top-line graphics cards just to run a word processor....

That of course is where my theory breaks down, Vista... you might not play games... but our developers do.

Re:Me my Mum and I.... (5, Interesting)

ScrewMaster (602015) | more than 8 years ago | (#15231802)

The ONLY people who need these graphics cards are people who play top-end games.

That's not entirely true. For example, in the mechanical engineering department where I work there's one guy with a really fast PC and a high-end (I think nVidia but I'm not sure) graphics card that does 3-D design and rendering of parts for the automated machine tools on the plant floor. Not that many years ago, he would have had some kind of special "workstation video board" that would have cost a couple of grand. Those have all but died out as the likes of nVidia and ATI have pushed the performance envelope so far that engineering tasks pale in comparison to the requirements of a game. I guess my point is that there are many tasks that need high-performance 3D, they're just not as high-profile as gaming. And even that is a rather small subset of the total number of computer users out there.

Re:Me my Mum and I.... (1)

LiquidCoooled (634315) | more than 8 years ago | (#15231829)

I recently "upgraded" from my nVidia 5900 AGP system to a PCI Express system, but was unwilling to spend lots on the graphics card.

I opted for a nice motherboard with an onboard nVidia 6100 (only £40!).

I chose not to play Half-Life 2 for a while because I didn't think it would run, but I've been pleasantly surprised with the results since.
Sure, I can't run at a super-high resolution (I think I use 800*600 now), but I have all the options turned up and it seems to handle HDR OK.
It barely slows down, and I've held off buying a new card until something comes along that warrants it.

(I have a couple of hundred pounds put away until I need a new card)

Re:Me my Mum and I.... (0)

Anonymous Coward | more than 8 years ago | (#15231975)

fixing for my mum


huh? i'm totally confused!


(mom to USitiens)


oh, now i see! 'cause as a "USitien" i'm incredibly ignorant. thanks, UKitien.

Erm (1)

Luthair (847766) | more than 8 years ago | (#15231708)

If this story was meant to be about how one can build using budget hardware, why are they using a top of the line CPU? Would it not have been better to pair the GPUs with a cost appropriate one?

Re:Erm (4, Insightful)

Proud like a god (656928) | more than 8 years ago | (#15231788)

To take CPU bottlenecking out of the equation. Comparisons of CPUs with the best graphics card likewise attempt to take GPU bottlenecking out of the equation.

Well... (4, Insightful)

Aphrika (756248) | more than 8 years ago | (#15231715)

I have a PC with an ATi 9800 Pro in it which I use for gaming. I've had this since 2003 and it still plays a mean game of Battlefield 2 when I feel like it. If it runs a bit slow then I plonk the resolution down. This is by far the best way to get your game to run faster. Anyway, bottom line is - it runs whatever current game I'd care to buy for it.

Now I've thought about upgrading, but two things have hampered me. The first is strictly technical - I have an AGP machine, so there's not a huge amount of difference over a 9800 Pro whatever I plug in there because it'll always be limited by the bus speed.

The second is probably more of a personal thing - I've got mates who have the latest and greatest GFX cards in their machines, but I'll be damned if I can tell the difference between their games and mine. Sure, it's a slightly higher res, but are there any bonus features like fog or smoke? No. Better anti-aliasing? No. I spent my hard-earned cash on a Dell 20" widescreen monitor and I can assure you that as far as gaming experiences go, this added to mine much more than a new GFX card would.

Maybe it's me getting old, but hardware upgrades now tend to come when I buy a new PC, and be a notch under the top o' the range. Although having said all this, I just picked up an Inspiron 9400 for work which did come with a GeForce 7800 in it, which I guess'll be useful for um.... spreadsheets *cough*

Re:Well... (3, Funny)

ScrewMaster (602015) | more than 8 years ago | (#15231814)

Just make sure that you configure your "boss button" properly.

Not a huge amount of difference over a 9800 Pro (0)

Anonymous Coward | more than 8 years ago | (#15232034)

I replaced my trusty 9700 Pro with a Sapphire X800 GTO. I patched the BIOS to unlock an extra quad of pipes and my 3DMark score doubled (before overclocking). I was able to run BF2 at two higher resolutions, up the AA from 2x to 4x, and importantly enable vsync. Very different BF2 experience. Considering I spent only $170 (my 9700 Pro cost more than twice that), I am very happy. PCI Express does have better bi-directional support, but I think AGP 8x support was dropped too soon.

Re:Well... (1)

AaronLawrence (600990) | more than 8 years ago | (#15232093)

it'll always be limited by the bus speed.
I doubt it. Like AGP before it, the PCI Express bus is still far slower than onboard video RAM, so it isn't especially important in terms of framerate.

Re:Well... (2, Insightful)

soupforare (542403) | more than 8 years ago | (#15232098)

The X800 Pro (and up) beats the 9800 Pro. Get something like the X850XT/PE and you're good for another long while.
Remember, PCI-E was introduced for the future; we haven't yet hit games that saturate an AGP 8x bus.

What am I missing out on ? (1)

aspeer (131086) | more than 8 years ago | (#15231716)

I have an Nvidia 5200fx card that drives a Dell LCD panel at 1600x1200 via a DVI connector. I use KDE as my desktop manager.

I am genuinely curious what (if any) performance gains I would see by upgrading the video card to one of those mentioned in the article.

Are these mid/high-end cards only beneficial when running 3D games or OpenGL apps, or do "Joe Sixpack" users such as myself gain something as well?

Re:What am I missing out on ? (1)

raz0 (899158) | more than 8 years ago | (#15231741)

The short answer: yes. The long answer: unless you play new games on a regular basis and find them to be rather sluggish, then no, you won't benefit from upgrading. The only reason to upgrade your graphics card today is to get better performance in games and other 3D applications. Nothing will be gained 2D-wise. Perhaps with AIGLX and the like you will be able to use the true power of your graphics card, but that's not now. Also, these new effects would have to be damn shiny for them to put pressure on even a low-end card like the 5200FX.

Re:What am I missing out on ? (1)

arkhan_jg (618674) | more than 8 years ago | (#15231863)

The 2D performance of graphics cards has barely moved for years; all the work (and extra transistors) goes into much beefier 3D capability. Unless you're thinking of moving to a shiny new 3D desktop - of which there are several in development for linux - you'll see next to no advantage on your existing X11/KDE desktop. About the only advantage of upgrading for 2D desktops is higher resolutions, but the limit is generally the monitor now, not the graphics card. If you're going to dual-boot Vista, of course, a good 3D card will be an advantage, but a 5200fx should actually do reasonably well; it's not that old a card - I used that card in a 2nd gaming rig a couple of years ago!

As you surmise, the current gen high/mid-end cards are basically for games and openGL apps. About the only other advantage is improved MPEG2 and MPEG4 decoding, which lowers the CPU load when playing videos - useful if you're building a PVR. The nvidia 6 series and upwards do have the purevideo software for windows as well, which does improve video quality somewhat.

To be honest, I now prefer a slower passively cooled graphic card (currently got an nvidia 4200) in my workstation/servers, and I've gone for the hot and noisy SLI 7800GTs in my games rig, which is only on while I play games.

Re:What am I missing out on ? (4, Informative)

TheRaven64 (641858) | more than 8 years ago | (#15231996)

I read a paper from Microsoft Research[1] about rendering text using the GPU. The idea was that the raw bezier paths of each character in a font would be loaded onto the GPU and then each character would be created on the fly by a shader. This gave a huge performance benefit; it reduced the GPU-RAM bandwidth requirement hugely and allowed the CPU to offload pretty much all text rendering to the GPU.

For comparison, take a look at Apple's Quartz 2D Extreme. This uses the CPU to render each character to a texture and stores them in the graphics RAM. These are then composited by the GPU. The downside of this, of course, is that the CPU needs to render the text for every size at which it is used. Even so, this gives about an order of magnitude better performance than the traditional way (and, of course, lower CPU usage).

If this becomes mainstream then a GPU with fast shader support will give:

  • Faster text-rendering performance.
  • Lower CPU usage when rendering large amounts of text.
  • The ability to have effects like Apple's Exposé, but with sharp, fully anti-aliased text at all stages in the zoom effect (the performance on current generation GPUs was fast enough to render entire screens full of small text every frame).


[1] See? They do actually do interesting things. It's a real shame nothing from MS Research ever seems to make it into shipping products though.
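(For the curious: the Microsoft Research work the parent comment describes sounds like the Loop/Blinn technique, where the GPU classifies each pixel against the glyph's Bézier outline directly. As a rough illustration only - this is not code from the paper, and the function names are mine - here is a CPU-side Python sketch of the per-pixel test such a shader performs for one quadratic curve segment. Each segment's three control points get the canonical texture coordinates (0,0), (0.5,0) and (1,1); interpolating those and checking the sign of u² − v tells you which side of the curve a pixel is on, with no tessellation of the outline needed.)

```python
# Hypothetical CPU sketch of the per-pixel curve test a shader-based
# glyph renderer performs for one quadratic Bezier segment.
# The three control points carry the canonical coords (0,0), (0.5,0), (1,1);
# a pixel is on the filled side of the curve iff u*u - v <= 0.

def curve_coords(b0, b1, b2, p):
    """Barycentric-interpolate the canonical (u, v) coordinates of
    point p inside the control triangle (b0, b1, b2)."""
    (x0, y0), (x1, y1), (x2, y2) = b0, b1, b2
    px, py = p
    det = (y1 - y2) * (x0 - x2) + (x2 - x1) * (y0 - y2)
    w0 = ((y1 - y2) * (px - x2) + (x2 - x1) * (py - y2)) / det
    w1 = ((y2 - y0) * (px - x2) + (x0 - x2) * (py - y2)) / det
    w2 = 1.0 - w0 - w1
    # interpolate the canonical texture coords assigned to b0, b1, b2
    u = w0 * 0.0 + w1 * 0.5 + w2 * 1.0
    v = w0 * 0.0 + w1 * 0.0 + w2 * 1.0
    return u, v

def inside_curve(b0, b1, b2, p):
    """True if p lies on the filled side of the quadratic curve b0 -> b2
    (a fragment shader would 'discard' when this is False)."""
    u, v = curve_coords(b0, b1, b2, p)
    return u * u - v <= 0.0

# Control triangle with apex b1: the chord side is filled, the apex side is not.
b0, b1, b2 = (0.0, 0.0), (1.0, 2.0), (2.0, 0.0)
print(inside_curve(b0, b1, b2, (1.0, 0.0)))   # midpoint of the chord: inside
print(inside_curve(b0, b1, b2, (1.0, 1.9)))   # near the apex b1: outside
```

Because u and v interpolate linearly across the triangle (exactly what GPU rasterizers do for free), the per-pixel work is just one multiply and one compare, which is why glyph rendering this way barely touches GPU-RAM bandwidth.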

7800GTX and lack of power (1)

cbelle13013 (812401) | more than 8 years ago | (#15231720)

I recently purchased a 7800GTX for work and was surprised about the additional power consumption needed. A regular Joe Blow probably doesn't have the resources needed to even use one of these $500 cards.

I've got a Dell Precision 370 and after plugging in my new card, I don't have any power cables left to go anywhere else. It would be nice if they could just draw off the motherboard power. Now my home PC is a different beast, but there's something about taking home expensive work resources that my employer frowns upon.

The emperor has no clothes? (1)

Evro (18923) | more than 8 years ago | (#15231726)

I'm so glad they've found that you don't need to buy an "expensive" video card, just a $300 one. Personally I can't see spending over $150, and even that seems extreme to me now that my days of trying to eke out every frame in Q3 to hit the magic number [savagehelp.com] have passed.

Common sense! (1)

Parallax Blue (836836) | more than 8 years ago | (#15231735)

This sort of thing is just common sense... doing a bit of research on video cards and their performance instead of going with the most expensive one out there is being a smart consumer and really isn't too hard what with all the reviews available on the 'net. Unfortunately, many people just don't do this for various reasons.

Summary does not match article (1)

GauteL (29207) | more than 8 years ago | (#15231736)

In fact, the $500 cards perform noticeably better than the $300 cards. You may not think it matters much, but new games, such as Oblivion, are incredibly graphics-intensive. Only the top-end cards from ATI are able to play Oblivion completely smoothly at 1600x1200 with all the buzzwords activated.

If you play highly intensive games at insane resolutions, then the high-end cards may be for you.

On the other hand, if you ever think about buying a $500 card because it will "last you longer", then you are kidding yourself. You are almost always better off buying $250 cards and replacing them twice as often.

Re:Summary does not match article (1)

TomHandy (578620) | more than 8 years ago | (#15232032)

Only the top-of-the-line ATI cards can do this, but not the NVidia cards? You're saying that an Nvidia 7900GT or 7900GTX can't play it completely smoothly with 1600x1200 with everything activated? I have to say, I'd be pretty ticked off if I spent $600 on a 7900GTX and I noticed even some slight amount of jerkiness in Oblivion.

A 'Wow!' moment. (2, Informative)

eddy (18759) | more than 8 years ago | (#15231742)

It's not often that I go "wow" after a hardware upgrade. 486 -> Pentium class. First Athlon. Virge3D -> 3Dfx Voodoo 1 (glquake for teh win)... and just a week ago I went from a nVidia PCX5900 (and ATI 9600XT/256) to a 7900GT. Everything on High in BF2 (and 2x FSAA); smooth as butter. Going from 800x600, low textures, everything turned down in Oblivion to 1280x960 HDR: Wow.

Re:A 'Wow!' moment. (5, Funny)

Jeff DeMaagd (2015) | more than 8 years ago | (#15231795)

I have good news and bad news. The good news is that your post is buzz word and hip-speak compliant. The bad news is that I have no idea what you are saying.

Haven't had a 'wow' moment since... (1)

Junta (36770) | more than 8 years ago | (#15231861)

3dfx Voodoo 1 (also from a Virge that didn't really do much interesting at all). Before that it was 286 to Pentium (I could play Ultima Underworld so smoothly compared to my friend's low-end 486).

Maybe I'm just not easily impressed anymore; I can recognize that everything is better now, but it seems evolutionary rather than revolutionary. Going to the Pentium is when I got the horsepower to play fundamentally 3D-looking games (Doom, Ultima Underworld), whereas before the best I did was Wolf3D. And again, going to the Voodoo 1 added a depth to 3D games that has evolved from there with higher polycounts. If I went from a Voodoo 1 straight to a modern video card, I'd probably be more wowed, but from a 5900 to a 7900, I really just shrug that sort of difference off when I see it.

Re:Haven't had a 'wow' moment since... (1)

Jeff DeMaagd (2015) | more than 8 years ago | (#15232078)

Going from a 286 to a Pentium is about a decade's leap in technology. A 5900 to a 7900 is about a couple of years' leap. I think that might be why you weren't "wowed".

Who let this get through? (2, Insightful)

ecuador_gr (944749) | more than 8 years ago | (#15231750)

I really don't get it.

It reaches exactly the same (and obvious) conclusion as any review I've seen on sites like HardOCP, Anandtech, or Tom's Hardware. And this article is one of the most amateurish attempts at reviewing cards in recent memory: 4 benchmark runs (at least they use games) put together in little fps graphs, along with a 2-page grade-school-level analysis and, of course, no details about more important stuff like image quality.

Maybe it's just me, since I have never paid over $200 for any kind of card, and I would probably object to seeing even an article from [H], Anand, Tom's etc. being made "news". This particular article, however, is not even close to that level. It really doesn't offer anything noteworthy.

Skewed results? (4, Interesting)

travail_jgd (80602) | more than 8 years ago | (#15231762)

All of the benchmarks in TFA are run at 1600x1200.

I understand that maximum resolution is the best way to highlight the limitations of the cards. But how many "budget" gamers are going to have monitors capable of running at those resolutions?

All of these cards produce "acceptable" results at 1600x1200. I read the article as "the cards are identical at lower resolutions, but reporting you need to spend more money makes our advertisers happy." Or maybe I'm just cynical.

Re:Skewed results? (4, Insightful)

CodeMonkey42 (965077) | more than 8 years ago | (#15231921)

Most real gamers (budget or otherwise) still use inexpensive CRTs, which produce the best image quality, have zero ghosting, zero dead pixels, etc., and easily do 1600x1200 for the latest games or 800x600 for "classic" games. My $200 NEC AS900 easily outshines the majority of LCD monitors in image quality, and the majority of games I play on my NVIDIA 6800 GT are indeed at 1600x1200.

Re:Skewed results? (1)

heinousjay (683506) | more than 8 years ago | (#15232003)

I object to your use of the word 'real.' Anyone that plays games is a real gamer. You refer specifically to the subset who somehow ties their penis size to the 1% increase in performance they get from putting their system at risk of meltdown by massively overclocking everything.

Re:Skewed results? (1)

vondo (303621) | more than 8 years ago | (#15232050)

Soon I will be one of these "budget gamers." By that I mean that I don't spend a lot of $$$ on game-related hardware and software, not that I have no money. I'm buying a 1600x1200 LCD to increase my productivity at home. I also like to play the odd game (generally a year or two old) and want a video card that is able to play my games at the highest resolution my monitor will support. My next game purchase may be Neverwinter Nights 2 in early 2007, so at that point I will probably be looking at a 7600-level card. By that time, those cards should be in my sweet spot, about $100-150. At the moment I am using the 6150 graphics built into my motherboard, and for the things I play, that's fine.

Ironic (1)

Life700MB (930032) | more than 8 years ago | (#15231784)


I find it funny to read about paying 500 dollars for a GPU the day after John K. Galbraith [wikipedia.org] died.



My 64MB Nvidia card gives good performance (1)

ravee (201020) | more than 8 years ago | (#15231820)

I have a 64MB Nvidia card which gives decent performance while playing many OpenGL-based games. I wonder what specification gets a PC classified as a gaming rig...

Umm $300 IS expensive (2, Insightful)

nurb432 (527695) | more than 8 years ago | (#15231822)

When you can buy the rest of the box for about the same price, spending that much on just video is lunacy.

But... (1)

Yvanhoe (564877) | more than 8 years ago | (#15231826)

... how can a $300 card fit in a $100 PC?

Re:But... (4, Funny)

lastchance_000 (847415) | more than 8 years ago | (#15231989)

Easy, you just need a big enough hammer.

Very happy with my 6800GS (1)

DrXym (126579) | more than 8 years ago | (#15231868)

I built a computer recently. While I briefly considered buying a high performance card, at the end of the day I don't believe they are worth it. Cards and CPUs seem to be governed by a law of diminishing returns. What's cutting edge now won't even raise an eyebrow in a few years. So why fork out absurd amounts of currency for one? The same with the various "extreme" CPUs. Spending 2-3x the cash for something which delivers a 30% gain is just stupid.

In the end I picked up a pretty good NVidia 6800GS which is more than adequate for most games. It works well at full res and detail on most games. If for some reason I want to improve the performance even further, it also supports SLI (and my motherboard too), though I doubt I will bother with it.

wtf (0)

Anonymous Coward | more than 8 years ago | (#15231901)

who doesn't know that already??

Do you game in 1600x1200? (3, Interesting)

BassKadet (936182) | more than 8 years ago | (#15231928)

I know that 1600x1200 really stresses the GPUs in these cards, but I often wonder how many people are actually gaming at that resolution. I have lots of hardcore gamer friends in the area, I've seen their rigs, and I know that only one of them has a monitor bigger than 19" and runs 1600x1200. Sure, 1600x1200 looks great on a 19" monitor too, but with a monitor that small, 1280x1024 still looks very nice, and pushing the res up to 1600 really isn't worth the FPS hit. Or at least, that is the consensus amongst my friends. I don't mind paying $500 for something I want; I have camera lenses that cost twice that amount. But somehow it just seems excessive to spend an extra $200 over a $299 card to gain 5-15 FPS at some high resolution I'll never use anyway.

Cost/Performance Breakdown (5, Insightful)

Starcub (527362) | more than 8 years ago | (#15231947)

I remember when the V1 3D cards first came onto the market. They were easily top of the line, and the best cards went for about $200. When the next-generation V2s came out, I pre-purchased the very first V2 SLI card (actually 2 cards bridged together) at the incredibly expensive price of about $600. It was a lot, but the card literally quadrupled the performance of the V1 I had, and the price very quickly fell another $200 before the V3s were out. Today you pay $500 for a top-of-the-line single-GPU card that doesn't even double the previous generation's performance. It seems video cards are becoming a disproportionately expensive component of the PC and just aren't providing the same value.

Whats the friggen point? (3, Insightful)

a_greer2005 (863926) | more than 8 years ago | (#15232000)

A buddy of mine has an AMD 3800, an ATI Radeon x1900xtx, and 2GB RAM, and maxing out the graphics in some of the latest games makes them noticeably jittery, so why spend $2000 on a gaming PC when an Xbox 360 does jitter-free HD for $400?

Re:Whats the friggen point? (2, Insightful)

TomHandy (578620) | more than 8 years ago | (#15232020)

Because not every PC game someone might want to play comes out on the XBox 360? Because some people don't like playing FPS's on consoles when they can play them with a mouse and keyboard on their PC? Because some people find XBox live to be overpopulated with whiny 12-year olds, ruining the multiplayer experience? Because you miss out on some of the customizability and modding that you get with PC games (in Oblivion, for example)? Just a thought.....

right, so the budget video card is $300 (0, Troll)

nazsco (695026) | more than 8 years ago | (#15232016)

earth is calling. moron.

the lowest-reviewed board had Doom3 running at 1600x1200 with all the eye candy maxed out at some 99FPS! anything beyond 25 is a damn waste! WASTE!

the budget card should be one that runs Doom3 at 1280x1024 at 25fps.

$300 is the price for the crippled over-the-top video card, the extravaganza you will never need. $500 is for dumb snobs that wanna pay double for a dick increase in GHz.

'Expensive' video cards? (1)

Bing Tsher E (943915) | more than 8 years ago | (#15232024)

When I started reading the summary, I latched onto the '$200' as an expensive video card. Which it is. Then I discovered the article is talking about $300, rather than $500, video cards.

Uh....

It's similar to when you see a 'Save $17,000 on your next car purchase" advertisement, but you've never spent even $15,000 on a car.

Okay, I confess that I've spent more than $150 on a video card, once. It was an STB PCI card that had a daughterboard on it that gave it the extra 4 megs of RAM.

The idea that my 'gaming experience' would improve to the point where it would justify spending $500 on a video card is just bewildering. And I played through ALL the levels of Wolfenstein 3D back in its time.

I guess gameplay is more important to me than gee-whiz graphics. I still buy games, but usually check to make sure my hardware meets the minimum requirement.

This whole 'subculture' reminds me of the culture of riced out cars, to be honest.

$100 is my upper limit (3, Insightful)

macemoneta (154740) | more than 8 years ago | (#15232033)

When I buy a replacement card (which I haven't had to do since I bought my GeForce FX 5600XT a couple of years ago), I buy whatever is currently at the $100 price point. That lets me play better than 95% of games well. If I were buying today, I'd get a GeForce 6600. It's more than good enough.

No matter what card you buy, in a short period of time there will be a small number of games that need better. Chasing that carrot with no self control is an exercise in futility.

A bit offtopic - need gamer advice (1)

abigor (540274) | more than 8 years ago | (#15232058)

I'm off on a 4 month holiday, and I'm taking a 3 year old laptop with me to play a few games on during downtime (I travel a lot, and I know from experience how many boring moments there can be). I've never played any games on it, and I need advice as to what games I should buy, since obviously it won't support anything close to the latest and greatest. What older games are there that I should seek out?

2.4 GHz P4, 512M, ATI Mobility Radeon 9000

My ideal game is Elder Scrolls IV: Oblivion. Of course, I can't play it on this laptop, but that should indicate the sort of gameplay I enjoy. Doesn't have to be strictly fantasy, though.

Any suggestions are much appreciated, thanks.

Expensive Video Cards make sex better. (1)

LibertineR (591918) | more than 8 years ago | (#15232099)

Performance anxiety? Not here!

I run two 7800GTX cards to run 3 Viewsonic VP201b 20" displays and one 30" HDTV.

"Stop complaining, Bitch! If you had an HDMI port, I'd be done by now."
