
Nvidia Launches New Affordable GPU

ScuttleMonkey posted more than 8 years ago | from the bargain-basement-with-a-twist dept.

Graphics 321

mikemuch writes "Today Nvidia unveiled a new low-cost, high-power graphics processor SKU. ExtremeTech's Jason Cross has done all the benchmarking, and concludes ' This makes for an impressive bargain and a huge step up from the generic GeForce 6800. The big question: How will this fare against ATI's similarly priced X1000 series card, the Radeon X1600 XT?'"


Tech Report Review (3, Informative)

hattig (47930) | more than 8 years ago | (#13970873)

Pretty decent review here I read earlier:

nVidia 6800GS []

Re:Tech Report Review (4, Informative)

LehiNephi (695428) | more than 8 years ago | (#13971212)

There are lots of other reviews out there, too. Looks like the 6800GS kicks the X1600 where it hurts. Over and over and over again.

- [H]ard|OCP []
- Avault []
- Computer Base []
- Driver Heaven []
- Guru3D []
- Hartware []
- HotHardware []
- Noticia3D []
- nV News []
- The Tech Report []

I shamelessly stole this list from

FR1st p0st! (-1, Offtopic)

Anonymous Coward | more than 8 years ago | (#13970912)


No AGP! (0)

Anonymous Coward | more than 8 years ago | (#13970916)

I still see no current reason to upgrade my 9700 pro (which has done fine with HL-2 and Far Cry) if I have to dump my current hardware investment, which seems to run all my apps with ease. FEAR may change my situation, but I'm a little disappointed my AGP8X seemed to be obsolete on arrival.

Re:No AGP! (1)

JazzCrazed (862074) | more than 8 years ago | (#13971412)

Moving to PCI-E is definitely a pain... Lacking the option of a motherboard that had both AGP and PCI-E, I bought a dead-end stop-gap, which is a 6600GT AGP card. I love it so far, but I knew I'd have to take the big gulp of getting a new mobo and a new card at once down the line. :(

But finally, at least nowadays we have options like this one [] with both kinds of interfaces on them, so I can buy the mobo now and the graphics card later.

Re:No AGP! (5, Insightful)

DigiShaman (671371) | more than 8 years ago | (#13971703)

The AC is correct. The last and fastest AGP card from ATI was the X850 XT PE. If you want anything faster or newer, it's only offered in PCI-E. To be frank, this pisses me off. There is a whole market of people running fast CPUs and DDR 3200 memory who do NOT want to swap out their motherboard. I cannot imagine why in the hell the current crop of video chipsets cannot handle the bandwidth provided by AGP 8x. I mean, clearly there is a market for AGP cards.

I'm sorry, but I will not swap out my CPU and motherboard just so I can install faster cards only available in PCI-E.

Uninformative: Here's a summary (3, Informative)

fuzzy12345 (745891) | more than 8 years ago | (#13970929)

It's been some time since we last ran a GPU Price-Performance shootout. Despite nine months having passed, not a whole lot has changed in the landscape.

The real sweet spot for graphics is in the $250 to $300 price range.

We have no idea what the heck is going on here.

The big question: How will this fare against ATI's similarly priced X1000 series card, the Radeon X1600 XT? In short, we don't know.

Re:Uninformative: Here's a summary (5, Informative)

ruiner5000 (241452) | more than 8 years ago | (#13971388)

Yeah, Extremetech is, after all, a big tech publisher's attempt at a tech enthusiast site. If you are in the $250-$300 range then you should spend an extra $33 and go with this evga 7800GT. [] It is worth the extra chunk of change. Not only will it be much faster than the cards that Extremetech recommends, but it also uses less power than the 6800GT, and therefore puts off less heat. That is a no-brainer in my book.

Re:Uninformative: Here's a summary (5, Insightful)

aywwts4 (610966) | more than 8 years ago | (#13971445)

I think this shows everything that's wrong with the tech review industry. They adver-review cards pretty much only for kids to drool over and feel bad about their existing card, which works just fine on pretty much every game they play, and for "enthusiasts" -- i.e., one born every minute.

There's no site working as a Consumer Reports-type outfit: if I want to buy a good graphics card for my ~$700-1100 computer (not my four-grand Alienware), I'm left digging through archaic reviews from a few years ago, with test results on old drivers.

Wow, this just in: a $700 dual-SLI card setup can play games at resolutions larger than my monitor can handle, at colour depths the human eye can't discern, at a framerate so fast the human eye doesn't pick it up, on a game that probably wasn't made to take advantage of the card, and with an actual visual improvement I can barely notice. But the good news is I smoke 'em when I run a benchmark utility.

Which card for Linux? (0)

Grendel Drago (41496) | more than 8 years ago | (#13970931)

I have an ATI Rage 128, which makes X run quite slowly---dragging windows is laggy and much worse than on the equivalent Windows machine next to it. (I'm running a new install of Ubuntu 5.10.) Were I to get a budget graphics card, what should I pick up? I was told that nvidia tends to have better acceleration support in Linux; is there a good list of this sort around?

Re:Which card for Linux? (1, Offtopic)

IntergalacticWalrus (720648) | more than 8 years ago | (#13971002)

Unless closed drivers bother you, nVidia is the only sane choice. Just pick up the cheapest one you can find.

Re:Which card for Linux? (1)

AndyG314 (760442) | more than 8 years ago | (#13971007)

The nvidia Linux driver is pretty good, and I have had great luck with it on several cards. The downside is that the driver is closed source. For ATI I have had good and bad results, depending a lot on the driver used. Generally I stick with nvidia on Linux, as I have found that it leads to the fewest issues.

Re:Which card for Linux? (1)

halltk1983 (855209) | more than 8 years ago | (#13971135)

The newest version of X has ati-drivers built in for the 7500 and up. Very painless install. Thanks to the guys at Xorg.

Re:Which card for Linux? (1)

The Warlock (701535) | more than 8 years ago | (#13971359)

I don't think those have 3D support, though. It's the closed-source fglrx drivers that have that, and those drivers suck.

Re:Which card for Linux? (1)

halltk1983 (855209) | more than 8 years ago | (#13971385)

They do have 3d acceleration. I game on it, and get better framerates than I do in Windows on the games I play. Definite 3d acceleration there.

Re:Which card for Linux? (1)

yfkar (866011) | more than 8 years ago | (#13971758)

Last I heard, they had 3D acceleration for R1xx and R2xx Radeons (from the Radeon 7500 up to the Radeon 9250).

Re:Which card for Linux? (4, Informative)

Tyler Eaves (344284) | more than 8 years ago | (#13971016)

Nvidia is really the only way to go for 3D in linux. If you really only need 2D, I've heard good things about the old Matrox cards, but good luck finding one.

Re:Which card for Linux? (0)

Anonymous Coward | more than 8 years ago | (#13971171)

I've had to dig my old Matrox G400 from storage (thank gods I didn't ditch it) after my ATI Radeon 9800 Pro blew. Drivers work just fine and 1280x1024 with 24bpp is no problem. But like the previous poster said, playing anything modern 3D accelerated before upgrading (which will happen when finances allow) is out of the question.

Your comment is woefully obsolete (3, Insightful)

FreeUser (11483) | more than 8 years ago | (#13971566)

Disclaimer: I make extensive use of both nvidia and ati hardware under GNU/Linux.

Nvidia is really the only way to go for 3D in linux. If you really only need 2D, I've heard good things about the old Matrox cards, but good luck finding one.

Not true. The proprietary ATI drivers (currently version 8.18.8) work as well as the nvidia drivers on both my amd64 and x86 boxes. Nvidia works fine (except for incessant flickering at 1920x1200 on one machine), as does ATI (with no flicker on that machine). ATI works better at 1920x1200@60Hz, but nvidia draws specular highlights on a celestia-rendered hi-res Earth better than ATI. In short, it's a wash, with each manufacturer/driver having strengths and weaknesses the other does not.

The choice these days is one of personal preference. Your comment is at least a year behind the current state of the art, at least in the GNU/Linux world.

Re:Which card for Linux? (0)

Anonymous Coward | more than 8 years ago | (#13971045)

If you want a budget card, then you could probably live with an ATI 9200 which has open source drivers for it.

Re:Which card for Linux? (2, Interesting)

slashedmydot (927745) | more than 8 years ago | (#13971141)

I don't think your graphics card is the problem. My test machine has an ATI Rage Pro 4MB PCI graphics card and it runs perfectly with the latest X. So I doubt that an ATI Rage 128 isn't good enough.

Your CPU or the amount of RAM is the most probable cause of the slow performance.

I doubt it. (1)

Grendel Drago (41496) | more than 8 years ago | (#13971318)

I'm running a blank GNOME desktop (just put on the default install; haven't customized anything) on an Athlon T-bird 750 with half a gig of memory. It's not grinding disk or anything. It really looks to me like a lack of 2D acceleration.

Also, while being impressed by the Xscreensaver demos, I noticed that some of them displayed artifacts (triangles with one vertex stuck to the left side of the screen). I figured this was due to bad OpenGL support on the card, which also led me to blame it.

(Incidentally, does anyone know where to send screenshots of buggy OpenGL drivers? I assume I can just screenshot them as I would anything else. I'm using the "r128" driver; does this go to the people?)

Re:I doubt it. (1)

MBGMorden (803437) | more than 8 years ago | (#13971454)

Are you sure that your driver is actually functioning and that you're not using VESA drivers? I've used Linux on all sorts of cards (ATI, S3, Trident, Nvidia, 3dfx, and probably a few more), and I've never had one result in laggy X11 in 2D (3d is another story entirely though).
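One quick way to check is to see what X actually loaded. These are the standard X.org/Mesa tools and the usual default log path, so treat both as assumptions about your particular setup:

```shell
# See which driver modules X actually loaded (default X.org log path assumed);
# if you see "vesa" instead of "r128", acceleration is not in use
grep 'LoadModule' /var/log/Xorg.0.log

# Check whether OpenGL gets direct rendering (glxinfo ships with the Mesa utils)
glxinfo | grep 'direct rendering'
```

If the log shows only the vesa module, or glxinfo reports "direct rendering: No", that would explain laggy 2D regardless of the card.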

Re:Which card for Linux? (0)

Anonymous Coward | more than 8 years ago | (#13971246)

I'm assuming you're not running the latest greatest 3d games and whatnot with that card. If you just want good solid 2d desktop performance and don't care about the latest greatest fancy pants 3d, then you just can't beat running a good Matrox card under Linux. I've tried nVidia cards before and have had random crashes in X here and there and whatnot. Whenever I've used a Matrox card though it's been rock solid.

Ah, should've run x11perf. (1)

Grendel Drago (41496) | more than 8 years ago | (#13971377)

I clearly should have discovered x11perf before writing this post, so I could at least have some numbers to complain about. I can't even tell if the problem is GNOME or X in general. I suppose I can run some x11perf benchmarks and compare them to... something.
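For the record, a minimal run looks something like this (test names are from the standard x11perf that ships with X.org; the grep just keeps the rate-summary lines):

```shell
# Benchmark a few common 2D operations: rectangle fills, window-to-window
# copies, and scrolling. Each result line reports a rate ending in "/sec)".
x11perf -repeat 3 -rect100 -copywinwin100 -scroll100 | grep '/sec)'
```

Comparing those numbers against another box (or against the same box under the vesa driver) should show whether 2D acceleration is actually working.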

Re:Which card for Linux? (0)

Anonymous Coward | more than 8 years ago | (#13971392)

Actually, it's probably GDM. I wondered why my really nice box was showing Xorg with about 20-40% CPU when idling, and 60%+ when moving windows around. Heck, I'd get sound crashes randomly when browsing and playing music. Everything seemed too sluggish.

It turns out that PLENTY of people report the sluggishness, but the Ubuntu/Debian Unstable boards don't have a clue as to why. All it affected was GNOME though. So, I took a clue from my playbook when I was trying to figure out why my laptop ATI card would drop DRI and GLX when logging in. GDM would somehow kill my direct rendering (a known issue in XFree86).

So, last night I installed the Kubuntu packages and switched to KDM. Bam, no more nasty, slow windows, even when I log in with a Gnome session. Also, fewer crashes (the sound still has an issue with my 8100's PCI registers). You'll see Xorg's resource usage drop A LOT if you switch to KDM. Hope that helps!

Thanks! (1)

Grendel Drago (41496) | more than 8 years ago | (#13971612)

I'll make sure to try that out when I get home. So I just apt-get install xdm, and change... some config file to point to xdm instead of gdm? I'm sure it's something in /etc/X11 that I don't know off the top of my head. (I'm not looking to install the whole KDE megillah.)
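On Debian/Ubuntu the display manager is chosen by a one-line config file, so the switch is roughly this (package name and path are the standard Debian ones, but verify on your system before trusting this sketch):

```shell
# Install xdm without pulling in KDE
sudo apt-get install xdm

# Point Debian's display-manager selector at xdm instead of gdm
echo /usr/bin/xdm | sudo tee /etc/X11/default-display-manager

# Then restart X (or reboot); gdm can stay installed but won't be started
```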

Re:Which card for Linux? (1)

farzadb82 (735100) | more than 8 years ago | (#13971408)

I would highly suggest getting a card from the NVIDIA FX series. The linux desktop of tomorrow will require more and more 3D acceleration. IMO, you are simply wasting money if you buy anything cheaper. I would recommend the FX-5200 (which is what I have btw). You should be able to pick one of these up for around $50 (or less) + shipping.

Re:Which card for Linux? (1)

hometoast (114833) | more than 8 years ago | (#13971462)

I, too, have a FX5200 under linux and it has treated me quite well for the price. I'm not running Doom3, but 2D is a flash, and I can play quake3-series and its mods, NFSU, GTA series with (almost) all the goodies turned on.

Re:Which card for Linux? (1)

stanmann (602645) | more than 8 years ago | (#13971468)

The FX-5200 (which I also use in my primary machine) does an admirable job on almost everything, including Doom 3 at a reasonable resolution and framerate. I haven't had a chance to try it against HL2 or Q4, so I can't comment there.

Re:Which card for Linux? (1)

The_Dougster (308194) | more than 8 years ago | (#13971735)

I'm still real happy with my FX 5900, and I'm not planning on upgrading it anytime soon. If you can find one of these they are much better than the other 5xxx series. I've always used Nvidia cards myself, and they work nicely in Linux.

Re:Which card for Linux? (1)

SyntheticTruth (17753) | more than 8 years ago | (#13971749)

If it's just for Xwindows, I'd suggest an nVidia 5200FX card, because right now, they are very cheap/budget friendly. I use it currently at home and it runs UT2004 and other OpenGL apps/games/screensavers beautifully. I'd not rec'd it for windows games anymore, though, but for linux, it's been a beauty of a card.

This is a rebadged 6800 (2, Informative)

amcdiarmid (856796) | more than 8 years ago | (#13970937) []

Nice of them to cut the price. I would have liked them to keep the SKU so I didn't have to keep up with another one, although I suppose if they hadn't rebadged it, everyone who bought the 6800 would be pissed at the price cut.

Re:This is a rebadged 6800 (0, Troll)

BushCheney08 (917605) | more than 8 years ago | (#13971117)

From your link:
Both cases are showing that companies didn't planed things trough and that they had to respond with a new product to stay in the game, it's called bad planning.

The Inquirer showes that they cant proofreaed and that they shouldn publish without edting, it's called bad writing.

You should see the first reviews tomorrow late afternoon today, Greenwich Mean Time.

WTF does that even mean? How can I even take that site seriously?!?

Re:This is a rebadged 6800 (0)

Anonymous Coward | more than 8 years ago | (#13971455)

The expression, spelling, and placement of words is not an indication of other skills a person may or may not have. It does provide a common ground for you to directly compare yourself to that person, which is why many people have a tendency to be critical of others' language skills, but the comparison ends there with that one particular skill and is not related to other skills you or that person may have. If language usage and writing skills were universal indicators of overall knowledge, you could claim that anyone who writes or edits for a living has higher overall knowledge and is a more trustworthy source of information than someone who does not write or edit for a living. In fact, the writers and editors are really only making the material they are given a little more presentable, and probably have little real knowledge of the subject material. You can only make a decision about a person after you separate the content of the material presented (the facts or ideas) from the presentation itself.

Re:This is a rebadged 6800 (1)

BushCheney08 (917605) | more than 8 years ago | (#13971621)

I'm not saying anything about the author's knowledge of the subject matter. What I am commenting on, however, is exactly what you say -- the presentation. I would argue that the point of posting text online (and especially via a news site) is to COMMUNICATE information. If the ideas that one is trying to communicate get lost in gibberish like "You should see the first reviews tomorrow late afternoon today, Greenwich Mean Time," (which is it?) then you have failed at the task of communicating. Knowledge of a subject matter means very little if you cannot properly communicate it to others, especially when you are ostensibly in a communications-related business, such as the news.

Re:This is a rebadged 6800 (1)

dividedsky319 (907852) | more than 8 years ago | (#13971272)

Although I suppose if they hadn't rebadged it, everyone who bought the 6800 would be pissed at the price cut.

Isn't that what happens with technology... prices go down? I got a 6800 for Christmas last year, a black friday CompUSA deal for 200 bucks after rebate... By this time, I'd almost expect it to be down around 100 bucks.

Also, on another topic, on some of these cards you can use RivaTuner to unlock the extra pipes and pixel shader, too... great if it works, but of course it's not guaranteed. Mine, unfortunately, shows artifacts when unlocked. Unlocking the GeForce 6800 []

Re:This is a rebadged 6800 (2, Informative)

ameoba (173803) | more than 8 years ago | (#13971307)

It's not simply a 'rebadged' card. Not only did they bump the clock speeds from the 6800's 325MHz core and 700MHz memory to 425MHz core and 1000MHz memory, they also switched from DDR to DDR3 memory to achieve the new memory clocks. This is as much of a difference as there is between the 6600 and the 6600GT.

It's not so much a price cut on the 6800GT as it is a clock-speed (and price) boost to the vanilla 6800 that brings its performance to the same level as the 6800GT while still keeping a lower price point (the 12-pipe 6800 being cheaper/easier to produce than the 16-pipe 6800GT).
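To put the memory bump in rough numbers: peak bandwidth is effective clock times bus width. The 256-bit bus here is my assumption based on the vanilla 6800's spec; the effective clocks are from the parent comment.

```python
def mem_bandwidth_gbs(effective_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: MHz * bytes-per-transfer / 1000."""
    return effective_clock_mhz * (bus_width_bits / 8) / 1000

# Vanilla 6800: 700 MHz effective DDR; 6800GS: 1000 MHz effective GDDR3.
# Assumed 256-bit bus on both.
print(mem_bandwidth_gbs(700, 256))   # 22.4 GB/s
print(mem_bandwidth_gbs(1000, 256))  # 32.0 GB/s
```

Under those assumptions the memory switch alone is roughly a 40% bandwidth increase, which matches the "more than a rebadge" point.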

A low cost high end card (1)

UberHoser (868520) | more than 8 years ago | (#13970938)

So does this mean that I don't have to spend 400+ for the latest and greatest ?

'thunk' (Falls over)

Of course I can't read the article because of the content filter (surf nazis) here at work. But low cost is how much? High end is what?

Inquiring minds want to know !

Re:A low cost high end card (0)

Anonymous Coward | more than 8 years ago | (#13970960)

Yes! Instead you only have to spend 250! Sounds like a real budget to me.

This is insanse (4, Insightful)

PoderOmega (677170) | more than 8 years ago | (#13970953)

We are often asked "Which video card should I buy?" We always answer with "well how much do you want to spend?" The inevitable reply is that everyone wants to run all the latest graphics-heavy games at high resolutions with all the features enabled, but they only want to spend $100 to $150 to do so. Sorry to say, but that's just not going to happen. The real sweet spot for graphics is in the $250 to $300 price range.

I cannot express how frustrating this is. People, please do not spend more than $150 on a video card. This is just insane. I guess we do need people like this to keep the graphics market hot by paying $300 for a card. I just hope game manufacturers don't think that their games should require $300 cards.

i agree (1, Insightful)

Anonymous Coward | more than 8 years ago | (#13971052)

I have never paid more than $150 for a video card and have gone long periods of time without upgrading. $150 USED to be the sweet spot for price/performance; now it's $250-300, like they point out. Lame. Anyways, don't be fooled by the hype. You don't need a $300 card.

Re:This is insanse (2, Insightful)

IntergalacticWalrus (720648) | more than 8 years ago | (#13971062)

What I always say about this: if it costs more than the current game consoles, it's too much.

Though I guess I might have to change my reasoning soon, seeing as Sony and Microsoft appear to be aiming quite high in their next generation...

Re:This is insanse (3, Insightful)

Buddy_DoQ (922706) | more than 8 years ago | (#13971068)

What else are young gaming geeks going to do with their money? They live at home in mom and dad's basement with 100% disposable income; 300 bucks for a new GPU is nothing. It's a hot-rod culture: rather than Mustang parts, it's computer parts.

Re:This is insanse (3, Insightful)

rovingeyes (575063) | more than 8 years ago | (#13971121)

I just hope game manufactures don't think that their games should require $300 cards

Simple - OEM pressure. I can confirm this because I have a friend who works for Microsoft, and I asked him why it is that every year we are forced to upgrade. Can't you guys do with what is already available? He told me that they could optimize the systems to run far better on existing hardware, but the OEMs don't like that. Dell apparently wants users to upgrade every two years or so. Bottom line - they don't care about the end user. They know that the end user will spend to use the latest and greatest software.

Re:This is insanse (1)

chihowa (366380) | more than 8 years ago | (#13971725)

But Dell has basically zero bargaining power against Microsoft. What, are they going to sell all of their PCs without Windows on them? They'd go under almost instantly. For consumer PC operating systems, Windows is the only game in town right now. That means Microsoft can do whatever they want and Dell just has to take it.

Re:This is insanse (3, Interesting)

CastrTroy (595695) | more than 8 years ago | (#13971128)

This is especially true when the newest console is only $300. I like PC gaming more than console gaming, but in the last year I've switched to consoles because it's just so much cheaper. In about the time that a console stays around, three years, you'll upgrade your video card a couple of times, or upgrade it once and spend twice as much. That means just the video card(s), not including all the other necessary upgrades, will cost as much as the console. I got tired of trying to keep in my head which video card is good, because there are about 75 models out there, and which one has the proper drivers to support the games I want to play. Also, what bothers me is that if I upgrade my operating system, my video card that is a few years old might not have supported drivers, or if I buy a new card, it may not work in my older operating system, forcing me to upgrade. I really gave PC gaming a chance, but there's just too much hassle. I'd rather put up with games that don't look quite as good, or maybe are a little less fun to play, than deal with the frustrations of playing games on a PC.

Re:This is insanse (-1)

Anonymous Coward | more than 8 years ago | (#13971203)

Nonsense. You're a spoiled brat--go to a library and find old magazine ads from the 90s and tell me $300 is expensive. It's dirt cheap even ignoring inflation.

If anything, we need to get people to spend more money on graphics cards, not less. Chipmaking and designing costs are going up with each generation, not down. 256-bit memory means more RAM chips compared to 128-bit. Go ahead and count them if you don't believe me. The mask sets that used to cost a few hundred K are now millions. Don't be forcing a Wal-Mart effect on the industry or you'll flush Nvidia to China.

Besides, if you aim low you'll never improve. S3 and XGI aim for cheap cards and look where it has gotten them. Nowhere.

Re:This is insanse (1)

ak3ldama (554026) | more than 8 years ago | (#13971766)

mod ac up!

I don't know why he was modded down; he makes a lot of valid points. Find some old Byte magazines and take a look. Hell, back then you had to buy your own C compiler; our new gcc world rocks, doesn't it?

Battlefield 2: Graphically-intensive Warfare (1)

design by michael (924422) | more than 8 years ago | (#13971354)

I think EA Games is under the impression that most gamers own $300 video cards -- their latest installment of the Battlefield series is both processor- and video-card-intensive. I've got a pretty decent system that I built for myself and am using a $200 Nvidia card, and I STILL have to run BF2 at 800x600 (60 hertz) with all the options at LOW if I want optimal performance. I am under the impression that game manufacturers don't build for what the average gamer may own, but rather continually attempt to push the whole technology envelope. When will it ever be enough?

Re:Battlefield 2: Graphically-intensive Warfare (1)

AvitarX (172628) | more than 8 years ago | (#13971482)

I am surprised.

I have a pretty good (lower top-end) machine from a year ago and was able to run at a decent res and decent settings (but not high).

The card is a 6600GT and was $250 at the time (purchased after the computer, so it may not be a year old).

Re:Battlefield 2: Graphically-intensive Warfare (0)

Anonymous Coward | more than 8 years ago | (#13971668)

I have pretty good (lower top end) machine from a year ago and was able to run at decent res and decent settings (but not high).

Wow.. I've got a GeForce 6800 (standard, not GT or Ultra) and an Athlon XP 3000 (Barton) with 1.5G of RAM and I can't get BF2 to run decently on anything higher than 800x600 with most everything at low. Now I'm wondering if I've got a driver issue or something.

Re:This is insanse (0)

Anonymous Coward | more than 8 years ago | (#13971557)

What I don't understand is why people are paying $250 for a 6800nu when they can get a 7800GT for $200.

The Irony! (4, Insightful)

Zemplar (764598) | more than 8 years ago | (#13971009)

Design goals:
  1. CPUs: High cost, low power
  2. GPUs: Low cost, high power

Granted this is a rough approximation, but it seems that GPUs are destined to waste all the power [watts] modern CPUs are saving.

Re:The Irony! (1)

ciroknight (601098) | more than 8 years ago | (#13971199)

Give it time. Remember, graphics co-processors entered the game quite a bit after their general processing counterparts.

Just as desktop CPUs are leaving the era of high-heat, high-power, balls-to-the-wall performance busting, GPUs are entering it. I'm sure when people start to realize their 1GHz graphics card has a cooler bigger than their old P4's solid 400g piece of aluminum, and a fan louder than a train wreck, the industry will come to its senses.

And maybe, just maybe I can get a nice, quiet, low power, high performance box.

Re:The Irony! (1)

Zathrus (232140) | more than 8 years ago | (#13971394)

but it seems that GPUs are destined to waste all the power [watts] modern CPUs are saving

This is largely because of the completely different design methods and timelines in the two fields.

CPUs are designed pretty close to the transistor level. They optimize the crap out of them, and try to do the most work with the fewest transistors. You have a lot of flexibility in changing the die size, the power consumption, and so forth. You can also ramp up the clock speeds to insane levels -- 3-4 GHz currently. This comes at the expense of time -- you generally only produce one new CPU core every 3-4 years, with various tweaks in between to increase speed, add small features, etc.

GPUs are designed at the block level. Need a shader? Plop -- here's the shader block. Need another? Plop. Identical. Sure, you could combine a bunch of transistors between the two, but that's far below the block level. Obviously the downside here is that you're going to have huge transistor counts, lots of waste (in power/heat and die size), and so forth. But the GPU market moves at a rapid pace right now -- entirely new cores every year or so, with a round of fairly mild tweaks on that core 6-9 months down the road. Clock speeds are low (a few hundred MHz, with the fastest now approaching a GHz), but each clock tick is doing a LOT of work (mostly parallelized).

I suspect GPUs will eventually hit the wall with their current design methods, but that won't be until they stop adding new features every cycle. We hit that particular "wall" with CPUs several decades ago -- the features being added now are relatively minor in comparison. Right now if Nvidia or ATI were to change design strategies they'd be run over by the other one.

Video card naming schemes: CONFUSING (5, Insightful)

Work Account (900793) | more than 8 years ago | (#13971011)

I wish video card makers would be more CLEAR when they decide on names for their cards.

We are one step away from having "Nvidia Model 8912347892389110".

For laymen like myself who buy a new video card every few years, it is hard to know what is what in the video card market, since the names are very confusing, e.g. 6800 GS vs. X800XL vs. 6800 GT.


Re:Video card naming schemes: CONFUSING (1)

springbox (853816) | more than 8 years ago | (#13971116)

Yeah, they lost me after the GeForce 4. And by the way, I'm still using a GeForce 4 because of the cryptic scheme and lack of comprehensive reviews for the cards that don't cost a fortune. It's the same with AMD and Intel and their new naming schemes. It's harder to tell now which components are newer than the others.

Re:Video card naming schemes: CONFUSING (1)

Man in Spandex (775950) | more than 8 years ago | (#13971245)

How is that different from cars?

Re:Video card naming schemes: CONFUSING (1)

stanmann (602645) | more than 8 years ago | (#13971515)

Because in 99.99...% of cases you can test drive a car and be reasonably certain it will meet your specifications. In the cases where you can't, you can usually find a review or rental to work out the fine details.

Re:Video card naming schemes: CONFUSING (5, Informative)

StaticEngine (135635) | more than 8 years ago | (#13971319)

If you're confused about what to buy, you should check out this site: []

Specifically, the "Compare Cards" feature on the left. I just upgraded my ATI 9600XT to a nVidia 6600GT AGP (because I'm not yet ready to drop a grand on an all new PCIe 64-Bit system), and that site helped me decide what was "enough" of an upgrade for how much money I was willing to spend.

Re:Video card naming schemes: CONFUSING (2, Insightful)

Txiasaeia (581598) | more than 8 years ago | (#13971524)

Gah! Wish I would have found that site *before* I ordered a new video card on Saturday! Excellent, excellent site! Wish I had mod points.

Re:Video card naming schemes: CONFUSING (1)

MBGMorden (803437) | more than 8 years ago | (#13971599)

What's also EXTREMELY frustrating is that most review sites only benchmark all the new cards against each other. When I'm looking to upgrade, I need a comparison to older cards (i.e., like the one I currently own, an ATI Radeon 9000 Pro) to judge not only how fast a card is versus the competition, but also how much faster it is going to be versus what I currently have. It does me no good to know that the GeForce FX Platinum Value series is half as fast as the normal series at 1/4 the cost if I don't know that it's only 15% faster than what I have (numbers and names just made up; I haven't kept up with the cards recently).

Heck all I want is a card that will play Neverwinter Nights 2 at full detail 1280x1024 (native res for my flat panel). The only strategy I've come up with aside from wading through reviews for days is to wait until the game comes out then buy whatever $200 card Nvidia is pushing at the time.

Does it have Free drivers? (1, Interesting)

mechsoph (716782) | more than 8 years ago | (#13971032)

If they're only offering binary drivers and locking up the specs, I'll be sticking with my aging, but still quite capable, Radeon.

Re:Does it have Free drivers? (0)

Anonymous Coward | more than 8 years ago | (#13971159)

NVIDIA's drivers are free; they're just not open source.

They work damn well (1)

everphilski (877346) | more than 8 years ago | (#13971335)

I have a GeForce 4 at home and a GeForce 6800 GT at work. Both work very well under Linux. No, it's not open source, but the installation program compiles a custom kernel interface if it can't find a prebuilt one that will just work.


Nvidia Launches New Affordable GPU (2, Informative)

springbox (853816) | more than 8 years ago | (#13971067)

This is great, but the title seems like an oxymoron at first (NVIDIA = cheap?). They used to make cheap video cards that were crippled and performed poorly (the GeForce 4 MX cards). A good NVIDIA card used to cost half the price of an affordable computer, around $400. The last time I checked, all the value cards were around the $100 price range. I hope they can actually make something that's cheap and decent.

You can probably get that previously-$400 GeForce 4 card now for around $80. That would probably be more than enough for most people.

Re:Nvidia Launches New Affordable GPU (0)

Anonymous Coward | more than 8 years ago | (#13971613)

The GeForce 2 MX was the last *really* good budget card from Nvidia; after that, the MX cards did not perform at a level good enough for people to play the newest games decently.

What I never understood was why it was so difficult to produce a decent budget graphics card, that is, one that performed well and didn't cost too much. If you just produced a graphics card with half as many pipes, and at the same time used a much cheaper form of memory, you should be able to get decent performance at a fraction of the cost.

Maybe I'm over-simplifying and it's more difficult than that; or maybe they decided the budget market simply wasn't valuable enough.
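The half-the-pipes reasoning above can be sketched with a toy model: peak pixel fillrate scales roughly as pipes × core clock, so halving the pipes at the same clock halves the peak fillrate (real performance also depends heavily on memory bandwidth, which the cheaper RAM would cut further). The pipe counts and clocks below are hypothetical, not real SKUs:

```python
# Toy fillrate model: peak pixel fillrate ~ pipes * core clock.
# All figures below are hypothetical, not real product specs.

def peak_fillrate_mpix(pipes: int, core_mhz: int) -> int:
    """Peak pixel fillrate in megapixels/second."""
    return pipes * core_mhz

full = peak_fillrate_mpix(16, 400)    # hypothetical full-fat part: 6400 Mpix/s
budget = peak_fillrate_mpix(8, 400)   # half the pipes, same clock: 3200 Mpix/s

print(budget / full)  # 0.5 -- half the peak fillrate
```

In practice the budget part would land below even this, since slower memory starves the pipes; that's one reason "half the chip" rarely yields half the performance.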

$250 (5, Insightful)

RCVinson (582018) | more than 8 years ago | (#13971082)

$250 makes for "a new low-cost, high-power graphics processor"?

Agreed WTF? (4, Insightful)

bogie (31020) | more than 8 years ago | (#13971372)

$250 is a new breakthrough in affordability?

I was naively waiting to read about a $100 GPU that performed well enough to play today's games at LCD resolutions.

When you can build a very fast system with everything sans GPU for $400-$500, spending more than half the system cost on a single component sounds fucking stupid.

Re:Agreed WTF? (1)

Jerry Rivers (881171) | more than 8 years ago | (#13971647)

Graphics card company CEO, Dr. Evil: I know! We'll make a card for cheap and sell it for (finger to lip) the HUGE sum of $250!

CEO henchman: Uh, sorry Dr. Evil, but $250 really ISN'T a lot of money these days. Now where'd I put my iPod?

Re:$250 (1)

ameoba (173803) | more than 8 years ago | (#13971594)

I just hopped over to and they were listing the first 6800GS for $209. The lowest priced 6800GT is $269. The lowest priced 256MB 6800 in PCIe is $209 (there are cheaper 128MB cards on AGP, but I wanted to keep the numbers relevant).

With the performance being nearly identical between the GS and the GT, the result is a 20% drop in the price at this level of performance (or a major boost in performance at the $209 level). Either way, I think it's fair to call it low cost, as long as you qualify that by placing it in the context of high-end graphics cards.

I'm just left wondering how well these new cards will overclock compared to the GT. The GT is known for overclocking rather nicely; I have doubts about this card, since it's a higher-clocked version of the 12-pipe model (which normally ships at 325MHz, compared to the GS's 425MHz) and may already be running close to its limits.
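The near-parity between the GS and GT described above is consistent with a rough fillrate estimate. Treating peak pixel fillrate as pipes × clock (a crude proxy that ignores memory bandwidth and shader differences), the 12-pipe GS at 425MHz lands within about 10% of a 16-pipe GT at its commonly cited 350MHz stock clock:

```python
# Crude peak-fillrate comparison: pipes * core clock (Mpix/s).
def fillrate(pipes: int, mhz: int) -> int:
    return pipes * mhz

gs = fillrate(12, 425)  # 6800GS: 12 pipes @ 425 MHz (per the parent post)
gt = fillrate(16, 350)  # 6800GT: 16 pipes @ 350 MHz (commonly cited stock clock)

print(gs, gt)             # 5100 vs 5600 Mpix/s
print(round(gs / gt, 3))  # ~0.911 -- within about 10%
```

This is only a sketch; actual benchmark deltas depend on the workload, but it suggests why the cheaper part benchmarks so close to the GT.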

Old article... (0, Offtopic)

TinBromide (921574) | more than 8 years ago | (#13971104)

"Call of Duty 2 Demo: Infinity Ward has dropped the Quake engine in favor of their own new DX9 code. We recorded our own custom timedemo and ran through it with all the visual quality settings cranked up. We look forward to benchmarking the full game when it is released."

Oh boy oh boy! I can't wait until CoD2 comes out... Wait, I've been playing it for like 2 weeks now. Nothing to see here! Move along people!

Re:Old article... (0)

Anonymous Coward | more than 8 years ago | (#13971231)

Did you bother to see that the frame rate is 30 on top cards?

Interesting, but (-1, Offtopic)

spect3r (909619) | more than 8 years ago | (#13971162)

Did the GPU just come to be??? Or Is it a product of Intelligent Design [] ????????? I'll have to check with my faith consultant before I buy this product. :)

YES! 7P (-1, Offtopic)

Anonymous Coward | more than 8 years ago | (#13971194)

charnel ho0se.

Very nice (1)

GmAz (916505) | more than 8 years ago | (#13971227)

I personally am the kind of person who enjoys having the best of the best in my computer. However, since I purchased my latest equipment at the beginning of the summer, I haven't even looked at all the new hardware. I have reached a point where my computer is so powerful that nothing can really hamper its performance to a noticeable level. As for gaming, I own a 6800GT. I got it on the first day it was available, but I didn't pay the $400+ price tag. I worked at CompUSA and got it on day 1 for $300. Word to the wise: find someone who works at an electronics store, befriend them, and have them buy you a video card with their discount. Chances are you will save in excess of $100. And one more thing: F.E.A.R. runs great on my 6800GT with rather impressive graphics. No, it's not 1600x1200, but it's high enough to make me happy, and that's hard to do with computer performance. And I agree with several of your replies: don't spend a lot on a video card unless you 1) are crazy like me, 2) have money to blow, or 3) want to keep your system for 5+ years and don't want to upgrade. From what I hear, Windows Vista will be rather graphically demanding.

What's a "SKU"? (1)

Ossifer (703813) | more than 8 years ago | (#13971251)

"Stock code unit"? Or is it some type of geekware?

Re:What's a "SKU"? (0)

Anonymous Coward | more than 8 years ago | (#13971306)

Sucky sucKy sUcky

Re:What's a "SKU"? (1)

SlayerDave (555409) | more than 8 years ago | (#13971374)

It took me 17 keystrokes and 2 mouse clicks to Google "acronym finder" and look up SKU to determine what it meant (I'm not going to do your homework for you). You know, Google's not that hard to use - try it some time!

Re:What's a "SKU"? (1)

Ossifer (703813) | more than 8 years ago | (#13971426)

Sorry I caused you actual keystrokes... maybe you should just not respond if it puts you out so...

Re:What's a "SKU"? (2, Informative)

Anonymous Coward | more than 8 years ago | (#13971528)

Three clicks and three keystrokes in Firefox, once you provided the "acronym finder" to highlight, beeotch.

It's Stock Keeping Unit

Warning: story submitted by hardcore gamer (2, Insightful)

Anonymous Coward | more than 8 years ago | (#13971252)

...because no other standard-model human being would consider a $250 video card to be "affordable". Hint: for non-powergamers (including most geeks) "low cost" GPUs stop in the vicinity of $100.

Multi Core / Processor (4, Interesting)

squoozer (730327) | more than 8 years ago | (#13971361)

Is there a technological reason why multiple GPUs can't be put on a card? I freely admit I know very little about graphics cards, but it seems like it might be a cheap way to make a very powerful card. I seem to remember there was a card with two processors that failed dismally, basically because it was twice the price. What about a card with 4 or 8 cheap processors? OK, the power consumption would be silly, but as long as it could be throttled so that only 1 GPU was used when not playing a game, it might work. Just thought I'd share that with you all :o)

Re:Multi Core / Processor (0)

Anonymous Coward | more than 8 years ago | (#13971410)

They can. (It's not the only one, but it was the first one a Google search for 'dual sli single card' came up with.)

Re:Multi Core / Processor (3, Informative)

KitesWorld (901626) | more than 8 years ago | (#13971478)

There's a Dual-GPU version of the 6600 available from Gigabyte. The problem mostly comes down to power consumption and heat.

That's more or less why SLI and X-fire are multiple-card solutions as opposed to expandable single-card solutions - it's that or have a single card with a heatsink so heavy it breaks the PCB.

Re:Multi Core / Processor (1)

grommit (97148) | more than 8 years ago | (#13971499)

No, there's not.

Do a search on slashdot for previous reports of both Nvidia and ATI doing just this.

Old Trick (4, Informative)

Nom du Keyboard (633989) | more than 8 years ago | (#13971376)

Once upon a long time ago I worked for Control Data Corporation (anyone remember them?). CDC had a trick, which wasn't new to them, of re-badging essentially the same system with a new model number and a lower price. An example at the time was their popular CDC 3300 mainframe becoming the CDC 3170. The only difference between the models was that the CDC 3170 had a 1.75µs clock, compared to the CDC 3300's 1.25µs clock. Move one wire (the right wire!) inside and the CDC 3170 became the CDC 3300 in all respects except for the name badge on the equipment bays and console.

Why do this, I wondered? The answer was in government contracts. Once you'd paid back the design costs, additional computers could be pumped out at a cheaper price while still making a profit and remaining competitive. The fly in the ointment was the government, which often bought quantities of the earlier models when cost was not the first concern (when has cost ever been a concern to governments spending tax money?). I was told that the government contracts stipulated that if you ever lowered the price on something you'd sold them, you had to rebate them the entire difference on every system delivered. Of course that would bankrupt any company, so they resorted to this rather transparent subterfuge.

Perhaps some form of that's what's happening here as well.
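The performance gap between the two badges follows directly from the cycle times quoted above, since instruction rate scales inversely with cycle time:

```python
# Speedup implied by the two cycle times quoted above.
slow_us = 1.75  # CDC 3170 cycle time, microseconds
fast_us = 1.25  # CDC 3300 cycle time, microseconds

speedup = slow_us / fast_us
print(speedup)  # 1.4 -- the 3300 is 40% faster on cycle-bound work
```

So moving that one wire bought a 40% speedup, which is exactly the kind of delta a re-badge-and-reprice scheme needs to look like two distinct products.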

A (possibly) silly question (0)

Anonymous Coward | more than 8 years ago | (#13971378)

At what point do these processors become so powerful that we can replace the motherboard with them? The GPU becomes the main board and everything else plugs into it.

Not exactly cheap! (1)

leblin (765931) | more than 8 years ago | (#13971505)

I can't even afford to spend $250 on a new computer :/

Re:Not exactly cheap! (1)

Angelox (764087) | more than 8 years ago | (#13971684)

Tell me about it! I think the most I ever spent on a video card was about $130.00, on a Voodoo 5 card at EB; I later found out the company that made it had just sold out and quit (I got screwed). Since then, I never spend over $60.00, and my last card was an MX4000 for $30.00 at Computergate.

This nVidia dominance has me worried... (1)

gozu (541069) | more than 8 years ago | (#13971521)

While I've been enjoying my 6800GT and 7800GT cards, I'm worried by the fact that ATI can't seem to keep up. Ever since they lost the dominance they had acquired with their 9700/9800 series, they've been behind in performance, street dates, availability AND prices. It's already been 2 generations now. Any gamer knows that, today, nVidia reigns supreme.

I hope that ATi regains the upper hand in the next round, because things are looking grim for them. nVidia is a bigger company with bigger coffers and better marketing skills, so they can withstand bad times more easily than ATi. They handled the whole 5700/5800/5900 debacle very well, considering ATi's offerings ate them alive back then. God forbid ATi should go bankrupt and we end up with a de facto nVidia monopoly!

Re:This nVidia dominance has me worried... (0)

Anonymous Coward | more than 8 years ago | (#13971831)

FYI: ATi is actually the "bigger" company. ATi has somewhere around 1000 more employees than NVIDIA.

How About A Power Consideration? (1)

c_spencer100 (714310) | more than 8 years ago | (#13971635)

It's nice that they're trying to target gamers on a budget, but how about finally targeting people on a power budget? I want to upgrade my graphics card, but my options are limited without having to upgrade my 300 watt power supply. And since it's a small form factor case, my options are even further limited.

If they really want to do something good, how about manufacturing a power-efficient GPU that doesn't excessively sacrifice performance? I know I can't be alone; heck, most of the time my GPU runs hotter than my CPU!
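The power-budget worry above can be made concrete with a rough PSU headroom estimate; every wattage below is a hypothetical placeholder, not a measured figure:

```python
# Rough PSU headroom estimate for a small-form-factor box.
# All wattages are hypothetical placeholders; real draws vary
# by component and load.
psu_watts = 300

draws = {
    "cpu": 90,
    "motherboard_ram": 40,
    "drives_fans": 30,
}
other = sum(draws.values())           # 160 W for everything else
headroom_for_gpu = psu_watts - other  # 140 W left, before any safety margin

print(other, headroom_for_gpu)
```

With numbers like these, a high-end card that pulls well over 100 W under load leaves essentially no margin on a 300 W supply, which is why the small-form-factor crowd ends up shopping by power draw first.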

Re:How About A Power Consideration? (0)

Anonymous Coward | more than 8 years ago | (#13971753)

I want to upgrade my graphics card, but my options are limited without having to upgrade my 300 watt power supply

You could look for fanless video cards; they're probably low-ish power, since a passive heatsink can only shed so much heat.

Or read The Tech Report's power and noise page in their 6800GS review. None of the tested systems topped 300 watts.

Buy this card, get F.E.A.R. free (1)

Dewser (853519) | more than 8 years ago | (#13971675)

It's insane how much one can spend on a card. I picked up F.E.A.R. recently, and stupid me didn't bother reading the video requirements on the package, so naturally my decent video card was no longer adequate (I'd been meaning to upgrade anyhow). But it is pretty ridiculous that every year a new graphics-intensive game is released that requires you to upgrade video, RAM, and soon CPU to run it. They should start running bargains like Get this > and get your choice of > for free. greedy bastards :D Wait, maybe not free, but maybe like a 15% discount. A 50 dollar game could end up costing you like 300 bucks just to be able to play it after you upgrade something. I do love my new card though :D

Re:Buy this card, get F.E.A.R. free (1)

Dewser (853519) | more than 8 years ago | (#13971721)

Damn, no edit feature. Correction, since I added too many carets:

Get this - insert high end GPU of choice here - and get your choice of - insert graphics intensive FPS - for free.