
More on the GeForce 3

CmdrTaco posted more than 13 years ago | from the stuff-to-drool-on dept.

Technology 177

Tom has extensive poop on the GeForce 3. Some history, and a ton of reasons why this card may or may not be worth the $600 price tag. Lots of stuff on the Vertex Shaders (which look to be a truly amazing idea). Very worth a read if you're interested in state-of-the-art 3D, or just want a sneak preview of what next year's FPSes will be doing.


177 comments


nice link (5)

CoreWalker (170935) | more than 13 years ago | (#398716)

I don't care what the context is, I'm always a little apprehensive about clicking on a link labeled as 'extensive poop'.

Buy a video card, and we'll throw in the computer (2)

Redwire (6282) | more than 13 years ago | (#398717)

How much does a basic PC cost these days? Not a whole lot more than the GF3, right? How long will it be before video card manufacturers are throwing in a free PC with the purchase of one of their cards?

NVidia & MS - Too close for comfort? (1)

gdyas (240438) | more than 13 years ago | (#398721)

Don't get me wrong here. I love nvidia's work, have a GeForce2 Pro in my computer as we speak, and think they do a great job. My worry is that MS & nvidia might be collaborating to an extent where they create a problem similar to the wintel situation of not so long ago, where the government had to step in to stop the high level of exclusionary collaboration in that relationship. Software + Hardware = a lock on the market.

All that has to happen is that they become good enough buddies that MS customizes DirectX & their OS to work best with nvidia processors, and nvidia works with MS to customize its hardware to work best with DirectX & Windows. Together they end up creating an "industry standard" that through patents & IP locks out or inhibits competitors. Anyone else thinking something like this is possible or already in the works? I'd hope that nvidia, being as innovative as it is, will examine history and try not to let itself get tied too closely to this beast.

Oh, and as for the price thing, I've vowed not to pay more for my GPU than my CPU. $500-600? As much of a technophile as I am, it's just too dear for me. I'll be sticking to a $250 maximum, thanks.

Re:I'm sick of upgrading! (1)

pod (1103) | more than 13 years ago | (#398723)

Please SLOW DOWN the video card technology! Not everyone can keep up. :( Thanks.

Well, at least this didn't have to be moderated to get to +2... it's just your standard whiny rant.

Whine, whine, whine, I can't afford the new stuff, so therefore no one else can, so all you video chip makers put a sock in it. First of all, nVidia wouldn't be doing this if they weren't going to make any money. Rest assured, sales will be brisk right from the start. Second, upgrades don't come _all_ that often. You have the latest crop of cards (the most expensive of which comes in at around $500-600). Did you whine then? Then you have the generation before that (or half generation), the TNT2 (I myself have a TNT2 Ultra), and that was, what, 12-18 months ago. I bet you whined then too. And of course before that still you had Voodoo 3, Voodoo 2, and a bunch of other also-rans.

You see, this sort of progress waits for no one, except the marketing department. Are you also complaining about the speed race in the CPU market? I upgrade my CPU every 3 years, roughly. I went from 50 to 200 to 500 to probably 2000+ MHz (in another year) with more than enough performance to match. I bet you're complaining about that too, hmm?

Your problem (and that of others like you) is that you have the 'old' generation of cards, which have suddenly been 'obsoleted' by this announcement from nVidia. Seeing as you probably paid a good bundle of money for that privilege, you now feel cheated. Or maybe you just got a 'new' card and the box is still lying in your trash can, in which case you feel cheated as well. I mean, man, here I bought a brand spanking new card for at least $300 and it was just a waste of money! Grow up. The vast majority of people will never buy the latest and greatest; they can't afford it or don't have the other prerequisite hardware to support it. Just wait 3-6 months after release and do what everyone else does: buy reasonable performance at a reasonable price.

Re:Cheaper to yank the video chip out of XBOX? (1)

geomcbay (263540) | more than 13 years ago | (#398724)

Good luck. While the Xbox is generally built around PC architecture, the graphics chip is not going to be sitting on a standard AGP card.

The better question is how hackable the Xbox will be. Would one be able to get a more full-featured OS than the cut-down Win2k kernel running on it? And get support for SVGA monitors and other standard peripherals needed to turn this into a really nice workstation?

GeForce / Doom naming scheme (1)

Professeur Shadoko (230027) | more than 13 years ago | (#398726)

I like what he says about the way nvidia name their products: he says GeForce2 was only a speed bump, and that GeForce3 is something completely new. He wishes GeForce2 had been named something else...

But he did/is doing the same thing with Doom: Doom 2 was nothing more than an add-on to Doom, and Doom 3 will be _completely_ different. Hehe...

Re:I'm sick of upgrading! (1)

Yam-Koo (195035) | more than 13 years ago | (#398727)

If Nvidia can release new products every 6 months, why should they stop? Releasing new cards doesn't make your card any slower. People out there, however, are willing to buy new cards, so nvidia keeps making them.

And there's no way that the GF3 will still be $600 a year from now. I'm guessing ~$250 OEM or less. And the release of the GF3 means all of the other cards out there now will be CHEAPER! Yes, the GF2 card that cost $400 a year ago is far below $200 now.

Fast release cycles may make people feel like they've got outdated technology, but really, they just mean that people get blazing fast products for a fraction of the price.

A perspective on the price. (1)

dwalsh (87765) | more than 13 years ago | (#398729)

If the primary area where you would benefit from improved performance is games (which is true for a lot of us as most office apps have not been CPU limited for a while now), then the benefit of $600 spent on a GF3 is likely to be greater, and the cost less, than upgrading to a P4 and new motherboard.

Not that I'm going to be buying a GF3 at that price...

fill rate vs. memory bandwidth (1)

_|()|\| (159991) | more than 13 years ago | (#398732)

Increasing fillrate is pointless, when things are already so memory-bound.

Exactly. Anand's GeForce3 preview [anandtech.com] puts it this way: "even the GeForce2 GTS with its 800 Mpixels/s fill rate was only capable of a 300 Mpixels/s fill rate in a real world situation."

If you look at the GeForce2 MX reviews, you'll see that it barely edges ahead of the SDR GeForce at higher resolutions, and falls well behind the DDR GeForce. Forget about doing old-fashioned FSAA on an MX.
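For anyone wondering where real-world numbers like that come from, here is a rough back-of-the-envelope sketch. The bandwidth figure and the bytes-per-pixel cost are assumptions, and texture reads (ignored here) push the real number lower still, toward Anand's 300:

    # Memory-bandwidth-limited fill rate, very roughly.
    # Assumptions: 5.3e9 bytes/s is the GeForce2 GTS's advertised memory
    # bandwidth (166 MHz DDR on a 128-bit bus); each pixel costs a 32-bit
    # colour write plus a 32-bit Z read and a 32-bit Z write.
    MEM_BANDWIDTH = 5.3e9        # bytes per second
    BYTES_PER_PIXEL = 4 + 4 + 4  # colour write + Z read + Z write

    pixels_per_sec = MEM_BANDWIDTH / BYTES_PER_PIXEL
    print("%.0f Mpixels/s" % (pixels_per_sec / 1e6))  # ~442, vs. 800 theoretical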

? That's just ridiculous (5)

andr0meda (167375) | more than 13 years ago | (#398733)


So it's OK to be aware of market tactics and to approve or disapprove, but to actually NOT buy a graphics card which is superior to others because it makes the other businesses go slower?

That's definitely NOT the right angle. In fact, if nVidia gets to sell this card as hot as the previous 2 versions, it can set (and raise) the standard in 3D lighting again, and frankly that's what I want. Of course monopolistic situations like e.g. SoundBlaster are absolutely bad for competition (and quality) in the market, but that's because a SoundBlaster (could also have used Windows here) is a good product, just not top of the line. That doesn't mean there are no other soundcards out there which are actually better, only that you'll have to pay more for those and go out and look for them.

I support ATI and, to a lesser extent, Matrox, because they are the only rivals left in the field. But if I had to buy a card today, it wouldn't be either of those 2, because I simply want a 'standard'-compliant, full-fledged, top-of-the-line game experience, not the feeling that I did something good for market competition. In the end I might financially regret that choice, but if nVidia creates the best cards at the end of the day, I'm only happy to spend some cash on them. If someone else can do the same or top them AND has less expensive cards, obviously that's a thing to consider. But today I cheer for nVidia, as I have more pro than con.

Actually, they did improve T&L and fillrate (1)

Gremlin77 (301513) | more than 13 years ago | (#398734)

I'd check out the Anandtech article if I were you. It looks like they put a lot of work into improving T&L and fillrate.

"The GeForce3 is thus the first true GPU from NVIDIA. It still features the same T&L engine from before, however now it is fully programmable."

Programmable is very good, maybe now we'll actually get to see hardware T&L in a game, rather than a press release.

Also mentioned in the Anandtech article: "This is what gave the GeForce2 it's "Giga-Texel" moniker since it was now able to boast a peak fillrate of 1.6 Gtexels/s. The GeForce3 takes this one step further by now being able to do quadtexturing in a single pass, offering a theoretical peak fillrate of 3.2Gtexels/s although it seems like it's definitely time to stop quoting theoretical fillrate numbers in this industry."

I'd say that this is quite an improvement. I'll stop quoting now.
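The theoretical numbers in that quote are simple multiplication. A quick sketch, with the commonly quoted clock and pipeline counts taken as assumptions:

    # Peak theoretical texel fill rate = pipelines x texture units x clock.
    # 4 pipelines and a 200 MHz core are the commonly quoted GeForce2 GTS
    # figures, assumed here; the GeForce3's 3.2 number counts four textures
    # per pipe applied in a single pass.
    def peak_texels(pipelines, textures_per_pipe, clock_hz):
        return pipelines * textures_per_pipe * clock_hz

    print(peak_texels(4, 2, 200e6) / 1e9)  # GeForce2 GTS: 1.6 Gtexels/s
    print(peak_texels(4, 4, 200e6) / 1e9)  # GeForce3 quad-texturing: 3.2 Gtexels/s

As the quote itself admits, these are marketing ceilings; memory bandwidth decides what you actually get.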

Re:I use a Voodoo 3 (1)

Foxman98 (37487) | more than 13 years ago | (#398735)

I agree with you entirely. I too have a Voodoo3 (2000 PCI) and, well, it hasn't had a game thrown at it yet that it hasn't been able to handle. Quake3? Smooth enough for me. Unreal Tournament? Again, runs fine. Hell, my roommate has a Voodoo Banshee and he enjoys Unreal Tournament just fine. But then he doesn't care about how many FPS he's getting. And neither do I. I'll upgrade when everything starts to come to a screeching halt.

On an entirely different note, I'm trying to get my old Voodoo Rush card to work with Linux but am not finding too much info on it. Anyone got any pointers? It would be *much* appreciated, even if the answer is "no".

The sad thing is... (2)

-=OmegaMan=- (151970) | more than 13 years ago | (#398736)

I've heard a lot of people saying "wait a couple months, the price will drop."

There's no reason for that to happen, now that Nvidia has no real competition. Why not keep prices on the GF3 high? People who want it that badly will pay it, those who won't will have to buy some other Nvidia product if they want to keep up with the pack. Nvidia wins either way.

Nvidia embracing and extending? (5)

Stormie (708) | more than 13 years ago | (#398737)

I find it a little worrying that so much of the work that has gone into the GeForce3 has been implementing unprecedented new features such as these vertex shaders, rather than improving more general stuff such as fillrate or transformation and lighting. This leads me to believe that Nvidia's goal with this chipset is not to improve the 3D gaming experience of their customers, but rather to lure developers into using these (admittedly excellent) new features.

How is this a bad thing, I hear you ask? Well, it looks to me like an "embrace & extend" tactic. If the developers use vertex shaders to make their games look cooler, then other 3D chipmakers have to either scramble to provide the same features, or all the cool new games will run like ass on anything non-Nvidia. Only Nvidia can get away with a tactic like this because of their present dominance of the market. Witness ATI's Radeon - they added some very innovative features (like all the z-buffer accelerating tricks), but they were all dedicated to improving performance with current software. They couldn't introduce radical new features because nobody would use them, supported as they were only by a minority chipmaker.

If you don't want to see the 3D industry completely monopolised by a single player, avoid the GeForce3, and avoid any games written to depend on its features. Support chipmakers that are seeking to make everything run better, like ATI and PowerVR.

Re:Oh, great... (1)

pepermil (146926) | more than 13 years ago | (#398738)

I gotta say, I haven't bought myself a real 3D-powerhouse video card. But being the techie I am, I do obsess over them & I just have to give nVidia some immense props for all the work they did on this card. I haven't had a chance to read over all the tech previews (nor have I heard of there being any official benchmarks of actual cards yet), but it sounds like they've definitely put a lot of work into this card.
I do agree that it's really, really insane for a video card to cost $600. But the fact is, if you don't want to pay $600, you don't have to. Right now, a GeForce 2 will more than get you by, & with the release of the GeForce 3, the cost for a 2 should hopefully drop dramatically (I know I have my fingers crossed! :-) And there are people out there willing to pay the $600 for the video cards, otherwise companies like nVidia wouldn't be making them (simple laws of supply and demand). And to those people that do buy the cards, I have to thank them, simply b/c it lowers the price on cards that more than suit my needs.
As for the technology, hopefully once I make it through my classes today, I'll have some time to sit down & really try to grasp all this 3D-techno-wonder stuff that nVidia's apparently whipped up for us all. Even if I don't buy a GeForce 3, I still love the engineering efforts that went into designing it and love trying to learn about what those efforts produced.
So, I guess what I'm trying to say is that these cards do apparently have a place in the market, even if it's not for you or me, and they actually benefit us as well, by bringing more powerful cards down into our budgets. Plus the fact that it just gives all us techies lots of new reading material. So, I don't see any reason to complain. :-)
-pepermil

Sorry but you're wrong (1)

renoX (11677) | more than 13 years ago | (#398739)

First of all, I had a TNT and I'm ordering a Radeon SDR, so I'm not living on the cutting edge either.

But still, you're wrong: processing more than 1024*768 pixels even for a screen at this resolution IS interesting.
Why? Anti-aliasing!

And why use the number of pixels as the number of polygons?? One is a 2D number, while polygons live in 3D.

I don't know how many polygons are needed to render a tree (or worse, a landscape) realistically, but I think it is quite huge!

Re:It's Too Much (1)

Yam-Koo (195035) | more than 13 years ago | (#398742)

And who says the folks who buy these cards don't ALSO give to charity? Do you have any facts to back this up?

There are much richer folks out there than the average hardware geek... why don't you bother them first? Surely the ~$1000 that folks drop on their computers doesn't compare to the $50,000+ that many folks drop on their cars every year, or the $1,000,000+ that some people pay for homes.

-

Additionally, you're forgetting that consumers in this country don't buy directly from third-world laborers. We buy from supermarkets, who buy from distributors, who buy from shippers, who buy from farming distributors, who pay the workers. There's no way a consumer can influence this huge chain of sales. There's no chance to "boycott", as we need food from SOMEONE, and all the supermarkets I know behave this way. Unless you have a solution to break this chain, I suggest we worry about domestic problems first.

And simply sending money over isn't the answer. Most of the aid that goes to other countries gets lost in government, and pouring more money in only makes the gov't richer and more influential.

Anyway, please try to give solutions instead of crying about the problems.

Re:Carmack on GeForce3 (1)

great throwdini (118430) | more than 13 years ago | (#398744)

Carmack has quite a bit to say on the subject as this .plan update is rather long [...]
(Score: 2, Interesting)

Amazing. Moderate up a link to Carmack's .plan.

It only formed the basis of yesterday's story, entitled: Carmack on D3 on Linux, and 3D Cards [slashdot.org].

Re:Nvidia embracing and extending? (1)

esw (247639) | more than 13 years ago | (#398745)

The problem with accelerating current games is that it is very difficult to do better at high resolutions without faster RAM. The GeForce3 does implement several new features to accelerate current games, such as the Z compression, fast Z clear and so forth.

But, what the new hardware does give you is the ability to do more per memory cycle, which is key in the future.

If you automatically say that anyone who innovates is trying to take over the market, then how can we ever get change? In this case, since the API is part of DX8, any video card manufacturer can implement the features. The exact same situation occurred with multi-texture on the Voodoo2, and no-one complained about anti-competitive influences then.

Re:Nvidia embracing and extending? (1)

thopo (315128) | more than 13 years ago | (#398746)

If the developers use vertex shaders to make their games look cooler, then other 3D chipmakers have to either scramble to provide the same features, or all the cool new games will run like ass on anything non-Nvidia. Only Nvidia can get away with a tactic like this because of their present dominance of the market

Doesn't this sound familiar to you? /me ..oO($MS)

Re:Nvidia embracing and extending? (2)

tringstad (168599) | more than 13 years ago | (#398747)

This is like Car and Driver saying "Chevy has made a new car that is only 2 grand, gets 110 mpg, and goes from 0-60 in 4.7 seconds! It's the greatest car we've ever seen, full of new features... but we shouldn't buy it, because that will put Ford out of business!"

No, it's more like saying "Chevy has made a new car that is only 2 grand, gets 110 mpg, and goes from 0-60 in 4.7 seconds, but it requires a special kind of road to drive on, and if we all buy one then the Department of Transportation will upgrade all the roads, but other cars won't be able to use them, so we'll have to buy Chevys forever, and once we're dependent on them they can lower their quality and service and we'll have to accept it!"

I hate poor analogies. But you sounded real cool.

-Tommy

Re:The sad thing is... (1)

Tiroth (95112) | more than 13 years ago | (#398748)

They will lower the cost because they won't make money at the $600 price point.

$600 is really cheap for a professional card... but it is damn expensive for the much broader consumer market. So Nvidia makes a few bucks off of the prosumer market, creates a buzz surrounding the product, then sells cheaper versions (making up for the lower price in volume).

Re:The sad thing is... (1)

Yam-Koo (195035) | more than 13 years ago | (#398749)

Well, they've got to compete against better budget cards. A GF2 can be had for ~$150 these days, and it's not as though it's only 1/4 of the speed of the GF3. Only a small fraction of people are willing to pay that much extra for whatever increase the GF3 provides.

And nvidia still has competition from the Radeon2, and possibly BitBoys, if they ever release something.

Probably not worth the price . . . (1)

raptwithal (134137) | more than 13 years ago | (#398750)

Probably not worth the price... yet. You don't really need such a high-end card to play today's games, and if you wait a couple of months the price should drop to a more affordable $400-$500.

Oh, great... (3)

InfinityWpi (175421) | more than 13 years ago | (#398751)

Another massive, expensive upgrade, that all the latest games will require you to use (after all, they won't run on old cards 'cause they can't be programmed)...

Screw it. I'm not paying more than two hundred for a video card. Anyone who'd shell out six hundred for one of these is insane. You can get another box for that much, pre-rebate.

Re:The sad thing is... (1)

-=OmegaMan=- (151970) | more than 13 years ago | (#398752)

The Radeon2 is competition?

Re:Probably not worth the price . . . (1)

pallex (126468) | more than 13 years ago | (#398753)

What? Sir, are you suggesting that spending $600 on a graphics card to play all those 3D car games/shooters/etc. is a waste of money? You'd only waste it on a PS2 and a bunch of DVDs!

Re:I use a Voodoo 3 (1)

Foxman98 (37487) | more than 13 years ago | (#398754)

On another note - anyone ever own one of these? I'm talking about the Intergraph Voodoo Rush Extreme. You know, I don't give a crap about what kind of performance it gave; the thing was HUGE. I remember opening up the box and thinking damn, this thing is gonna rock. And of course it did for the time :-)

Re:Oh, great... (1)

pallex (126468) | more than 13 years ago | (#398755)

"And there are people out there willing to pay the $600 for the video cards, otherwise companies like nVidia wouldn't be making them "

Does that go for the DreamCast too? (I know it was cheaper than $600 but the principle remains)

Support? (1)

lordmacmoose (319187) | more than 13 years ago | (#398756)

We all (or most of us) go into drool mode when we read about cool new hardware, but has anyone thought to find out about alt.OS support? Specifically Linux? Apple is now an OEM buyer of NVidia cards, which would lead one to believe they will have Mac OS X drivers for the GeForce3. And since OS X is based on Unix, it seems that it would not be too incredibly difficult to port the OS X drivers to Linux if NVidia is too stingy to do it for us. I hope that programmers with more knowledge than me can soon get their hands on a reference board or two. Penguin huggers should no longer be forced to suffer from cut-rate graphics because commercial companies don't want to expend the resources to port drivers for what is a growing and largely under-marketed group of knowledgeable computer users.

Yuck (1)

Fervent (178271) | more than 13 years ago | (#398757)

from the stuff-to-drool-on dept.
Tom has extensive poop

I don't know about you guys, but this certainly made the article seem more than a bit unappetizing.

Re:I use a Voodoo 3 (1)

NecroPuppy (222648) | more than 13 years ago | (#398758)

Another problem with video cards is that the performance is becoming optimal anyway. There are 768000 pixels on a screen (a 1024x768 screen, that is). At 50fps this is approximately 37 million pixels per second. So it is intuitively obvious to all that a video card with a performance in excess of 37 million polygons per second will not provide any better performance under those conditions. Why pay extra for something you can't see?

Especially since all the uber-gamers will turn off as many options as they can to get extra fps, in the hopes that they can frag the other guy first.

Re:I use a Voodoo 3 (1)

Evernight (165829) | more than 13 years ago | (#398759)

Judging a video card based on framerates is as flawed as judging a processor based solely on the marketed clockspeed. The GPU also handles common effects such as anti-aliasing, lighting, fog, textures, and bump maps. High polygon counts and framerates are great; having extra cycles left over to make them look good is even better.

(But I do agree with you in principle. My $90 GF2 MX card will serve me well for the next 2 years.)

Neurosis

we all love rambus (1)

thopo (315128) | more than 13 years ago | (#398763)


In this case I hope that NVIDIA has applied for a patent early enough, because otherwise Rambus may follow its tradition, copy NVIDIA's design, patent it and then sue NVIDIA.

no comment needed.

Video Cards (1)

ecid (320184) | more than 13 years ago | (#398765)

All I really want to understand is what the difference is between a $600 GeForce3 card http://www.tomshardware.com/graphic/01q1/010227/index.html and a $1000 Oxygen GVX 420 card http://62.189.42.82/product/card/oxygen_gvx420/oxygen_gvx420_specs.htm . Why couldn't or shouldn't I pay $1000 for the higher-priced card if it can do the same things a $600 card can do? And if an Oxygen card can't do what a GeForce 3 card can do, why not? See, the problem here is that I simply don't know enough about video cards, and apparently I'm too busy to RTFM.

Re:Nvidia embracing and extending? (1)

/dev/niall (1043) | more than 13 years ago | (#398767)

If you don't want to see the 3D industry completely monopolised by a single player, avoid the GeForce3, and avoid any games written to depend on its features. Support chipmakers that are seeking to make everything run better, like ATI and PowerVR.

This has to be one of the dumbest things I've heard all month.

They have improved "general stuff" like fillrates and T&L.

Where do you think new features come from? This card will run all your old games, and better than any other card out there. On top of that, it will run all the new games coming out that support features exposed in DirectX 8.0 - which, in case you haven't figured it out yet, is what developers will be using to create those new games.

And who is to say you have to use vertex lighting? Granted, it won't look as good, but you can keep your old card and use light mapping instead.

ATI didn't pack any "new" features into their current crop of cards, because the APIs weren't there to take advantage of them when they were being developed. You can bet your ass they have their R&D departments all over the new functionality exposed in DirectX 8.0 and are busy creating new cards to go head-to-head with the GeForce3.

This is a good thing. NVidia has raised the bar; now the others must try and top them. That's how we get better hardware, folks. It's not a bad thing.

Re:I use a Voodoo 3 (1)

mdw2 (122737) | more than 13 years ago | (#398768)

I believe he meant kilohertz, since that's what sampling rates are measured in, not kilobits, which is what we measure (obviously) bitrates in. Which would make much more sense.

Re:Nvidia embracing and extending? (3)

f5426 (144654) | more than 13 years ago | (#398770)

> This leads me to believe that Nvidia's goal with this chipset is not to improve the 3D gaming experience of their customers, but rather to lure developers into using these (admittedly excellent) new features

I disagree. Their programmable vertex shaders are a very good idea. Of course developers may want to directly access these features, and maybe make games that require a GeForce3. But there are very good sides too.

First, having programmable vertex shaders can help them implement better OpenGL drivers (for instance glTexGen). This will help existing programs.

Second, a lot of new tricks can be experimented with. 128 SIMD instructions is huge. I, for one, would love to hack on this for a few weeks. My mind blows at all the tricks that can be done basically for free. Creative uses of the vertex shader will undoubtedly be implemented by the competition, and would end up as "standard" OpenGL extensions.

(Btw, I don't see any noise function, or texture lookup ops. Maybe I should check more closely.)
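For readers who haven't seen what a vertex program does: the trivial case is the fixed-function transform it replaces, a 4x4 matrix applied once per vertex. A plain-Python illustration of that case (real vertex shader code is a SIMD assembly language, so this is only a sketch of the idea):

    # The fixed-function T&L transform, as the kind of per-vertex routine
    # a programmable vertex shader replaces. Illustration only, not real
    # shader code.
    def transform_vertex(mvp, v):
        # mvp: 4x4 model-view-projection matrix (list of four rows)
        # v: object-space vertex as (x, y, z, 1.0)
        return tuple(sum(row[i] * v[i] for i in range(4)) for row in mvp)

    identity = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
    print(transform_vertex(identity, (1.0, 2.0, 3.0, 1.0)))  # (1.0, 2.0, 3.0, 1.0)

The point of programmability is that this per-vertex routine can be swapped for one that also warps positions, computes custom lighting, and so on, all on the card.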

> avoid the GeForce3, and avoid any games written to depend on its features

I don't see a reason to avoid the GF3. Of course, avoiding games that *only* support it is a good idea. In the same vein, we should avoid games that only support DirectX, and games that only run on Windows.

Not very realistic...

Cheers,

--fred

The damn thing (1)

hrieke (126185) | more than 13 years ago | (#398771)

is a real time Ray Tracer.

Re:Blurring of truth and virtual reality (1)

SnowDog_2112 (23900) | more than 13 years ago | (#398777)

I'm not sure your doomsday vision is really that bleak, though. We already have online communities where all the physical artifacts of your existence are missing. You don't know if I'm a one-eyed midget with Parkinson's disease. You don't know if I'm a young black male or an old silver-haired grandmother. You only know me for my thoughts, and my ability to express those thoughts.

Some people may say that we're losing something by interacting in this way -- but what we're gaining is so much better. It used to be people were forced to form communities with those around them -- purely due to geographic coincidence. Now I can form communities with people who think like I do, who appreciate what I appreciate, and who value what I value. All from the comfort of my home. I haven't been shut in -- I've opened up even more!

Surely as our technology improves, this will continue (note I'm not suggesting better graphics cards will lead to an increase of this effect, just that it's already a beneficial phenomenon and this can't harm it in any way). Sure -- if we all had this virtual world, and we all could look however we wanted, you might see some physical prejudices creeping back in.... On the other hand, imagine the joy a wheelchair-bound or paralyzed person might get from moving their avatar around a truly interactive artificial world....

$600 clams (1)

jmu1 (183541) | more than 13 years ago | (#398778)

OK, I may be out of line here, but I think that any fool who would spend more on a video card than they spent on an entire system should be institutionalized... or perhaps that is the problem? Maybe, just maybe, there are folks out there so caught up in this techno-blizzard of equipment that they will pay any price just so they can say "My X is bigger than yours!" I say fooey. This is a prevalent ideology here, just as it is anywhere in the "real world". Consider most sects of the religious world: they all hate each other as they worship a bigger, better imaginary friend. The OS jihad, the inter-Linux factions, the Intel/AMD war... the list goes on and on. Will it ever end? Sure, most likely once man has eradicated his own existence.

Driver obsolescence (3)

swb (14022) | more than 13 years ago | (#398779)

I think the biggest problem is driver obsolescence. I have an "ancient" ATI Rage 128 video card (an AIW, to be precise) and ATI has never delivered more than a "beta" set of drivers and applications (TV, etc.) for the AIW 128 cards under Windows 2000. I doubt there will EVER be another set of drivers or tuner software for this card from ATI.

The video card people seem to have like three people who write drivers, and they're always busy writing drivers for the "current" crop of cards, until the cards actually are available, at which point they switch to writing drivers for the "next" slate of cards, and the "old" cards simply do not get new or improved drivers written for them. A new OS often means getting stuck with the OEM drivers provided in the OS.

I'm perfectly happy with my ATI 128's performance in the games I've played on it. I've toyed with hunting down a $120 GeForce2 card, but for the reasons you stated I'm missing the "why" part, other than getting more modern and optimized drivers than I'll ever see for my existing card.

Re:Support? (1)

ranessin (205172) | more than 13 years ago | (#398797)


nVidia has been very good for the past six months when it comes to Linux support (barring any debates over whether or not closed source drivers constitute good support), and given their relationship with SGI, that doesn't seem likely to change.

Ranessin

GeForce2 MX PCI (1)

Ella the Cat (133841) | more than 13 years ago | (#398798)

I'm prepared to risk being moderated as offtopic, or being flamed for not trawling every last corner of the WWW, but I've come across a case of the technology not quite surviving the cost reduction process...

I acquired a GeForce2 MX PCI for an old homebox to see if the hardware T&L would make a difference with a tired old WinChip3D-class processor - the box is so old it doesn't have AGP.

Thing is, the card doesn't even display the usual BIOS message; the (Linux-only) box boots up OK but with a black screen. I've fiddled with zillions of BIOS settings, no luck at all. What has happened that breaks things so fundamentally? At least I learned about repairing the file system I accidentally trashed when I unthinkingly turned the box off to replace the old TNT board...

Re:I use a Voodoo 3 (2)

SnowDog_2112 (23900) | more than 13 years ago | (#398799)

How did you get "insightful" mod points from that??

A: You can tell the difference between 30 and 200 fps. Maybe not between 70 and 200, but 30 and 200, yes. And a system that gives you 30 fps in one place will bog down to 10 fps in another. If you can get 70 fps, it will likely only bog down to 30 fps when things get ugly.

B: Getting tech like this out there allows game developers to push the boundaries even further. Now granted, we didn't need the explosion of colored lights that happened when the voodoo2 came out, but still, the point is good. As the tech grows, the developers can use a toolset much richer than they had before. Look at the differences between Quake 1 and Quake 3. The difference between a Voodoo 1 and GeForce2. Imagine that level of difference from what we have today....

C: Your example uses 1024x768. Why should we settle for such a lame approximation of reality? My desktop is 1600x1200. I drop to 1024x768 for my gaming, because anything higher causes noticeable performance degradation. I used to settle for 512x384. Now I can't imagine going back to that. And in a few years, I won't imagine being forced to go back to 1024x768.

Nobody's forcing you to buy these new toys. Not everyone needs them. Personally, I can't see spending 10 grand on a home stereo -- after a certain level, they all sound the same to me. But I surely don't say it's "against all common sense" that someone might. I buy my toys, you buy yours, and we'll all live happily ever after.

Re:Support? (1)

jameson (54982) | more than 13 years ago | (#398800)

I beg to differ. I have yet to see /any/ nVidia drivers working on Alpha/Linux, perhaps excluding the utah-glx drivers for some of their more ancient cards.

Tom already caters for this kind of troll :) (2)

mav[LAG] (31387) | more than 13 years ago | (#398801)

When I clicked through, the first banner ad was a Unicef-Please-Donate-Money-to-Save-The-Children-Fund.

Pretty impressive really. Anyone who might feel a twinge of conscience when following a link to a $600 video card they're thinking of buying is almost immediately comforted with a charity banner where they can assuage said conscience.

Re:It's Too Much (1)

NineNine (235196) | more than 13 years ago | (#398802)

You probably call those ads with Sally Struthers pleading to send money 'for the children', don't you?

Here's a little lesson in economics: those video cards are probably made somewhere in Southeast Asia. By buying those video cards, you're providing money for the people actually manufacturing all of the parts for that card, and the card itself. NVidia or whoever makes the damn things is providing relatively high-paying jobs for those people. What could be better? Sending money to some fake group, or building their economy?

Re:I use a Voodoo 3 (1)

ranessin (205172) | more than 13 years ago | (#398803)

Well, at least the V3 has 32-bit color (right?)

Not for 3D acceleration, it doesn't.

Ranessin

Re:Nvidia embracing and extending? (1)

jmu1 (183541) | more than 13 years ago | (#398804)

I hate to post a short reply, but... Hear hear!

Re:I use a Voodoo 3 (2)

Malc (1751) | more than 13 years ago | (#398805)

"I often wonder why people spend an absolute fortune buying the lates video cards when the simple fact is that the card will not be used to its utmost capability for several years"

Why do people buy SUVs or luxury cars when a Geo Metro or Mini will do? Why do people buy the latest fashions of clothes when they could get last years in sales?

A lot of people want to play the latest games when they first come out, and have the best machine there. If you don't feel that pressure, then I congratulate you: you're in a sensible minority.

"Noone can tell the difference between 30fps and 200fps anyway"

That's so untrue. I can tell the difference between 50 and 60, no problem. After playing at 60 or above, a drop to 50 or below is very noticeable, definitely not as smooth, and hampers playing ability until one adjusts.

"Another problem with video cards is that the performance is becoming optimal anyway"

No. There's a very long way to go yet. With more power, there are so many features they can add. Go and read something about 3D graphics, and you will realise how limited these cards still are.

"At 50fps this is approximatelt 37million pixels per second. So it is intuitively obvious to all that a video card with a performance in excess of 37million polygons per second will not provide any better performance under those conditions. Why pay extra for something you can't see?"

You can't base polygon count on pixel count. Some polygons get rendered, but then are obscured by polygons in front of them. So yes, you do have to pay for something you can't see ;)

"It is like insisting on a 500kbit sampling rate, when 70kbit sampling rates are perfect to the human ear"

Not all sounds are sensed via the ear.

In general, I do agree with your sentiment. I bought an original GeForce 256 DDR when it first came out. I'm still trying to justify the expense. If I had waited, I could have got a better GeForce for less. I'll do that again. I'm sure I'll buy a GeForce 3 eventually, but I'll wait until the Ultra model is cheap enough (I just know that there will be several generations of the card).

Re:Probably not worth the price . . . (1)

Heutchy (73751) | more than 13 years ago | (#398806)

a more affordable $400 - $500

That's affordable? Why not just buy an XBox? It'll have a similar chip in it, and be cheaper than the GF3. $200 seems like a decent price to pay for a graphics card....so it'll be a couple years till I get one of these I suppose.

glnormalize for free? (1)

pixel fairy (898) | more than 13 years ago | (#398807)

Supposedly you don't ever want to enable GL_NORMALIZE, since doing that calculation yourself ahead of time is cheaper. But, looking at the instruction set, it seems like the right driver will just do it in the hardware for free.

The article crashed Mozilla (0.8), so I couldn't read the rest to find out what else might be useful without using special extensions...
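For context, the calculation in question is just rescaling each transformed normal back to unit length, which costs a square root per vertex if done on the fly; that's why precomputing it for static geometry is cheaper. A purely illustrative sketch of what enabling GL_NORMALIZE asks the pipeline to do:

    # What GL_NORMALIZE requests per vertex: rescale the transformed
    # normal to unit length (needed when the modelview matrix scales).
    import math

    def normalize(n):
        length = math.sqrt(n[0] ** 2 + n[1] ** 2 + n[2] ** 2)
        return (n[0] / length, n[1] / length, n[2] / length)

    print(normalize((0.0, 3.0, 4.0)))  # (0.0, 0.6, 0.8)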

I might buy one (1)

graveyhead (210996) | more than 13 years ago | (#398808)

just to have the honor of implementing Mesa-based drivers for this beast. Everyone else seems to have the attitude "don't waste your money", but I'm a geek. I want to figure out how it works and make it scream on my OS of choice. It would be neat to implement GL vertex position, lighting, and normal transformations in the hardware vertex shader mentioned in the article.

Moving into 3DLabs territory (2)

ashpool7 (18172) | more than 13 years ago | (#398809)

The whole point of the modern "3D accelerator" was to bring 3D graphics to the consumer at modest prices as compared to 3DLabs, SGI, and their ilk. Now, it looks like nVidia is either knowingly or unwittingly attempting to enter that territory by increasing prices to the "professional" 3D range.

Nobody is going to program exclusively for this card until it saturates the user base. Which, at this price level, ain't gonna happen soon.

Wonder if the "professionals" will strike back :)

Re:Nvidia embracing and extending? (2)

nexthec (31732) | more than 13 years ago | (#398810)

I never thought I would see the development of a new feature shot down because it was good. This is stupid. Nvidia is innovating new features and products that people like John Carmack [planetquake.com] want. They have the fastest XFree86 setup now, even if it is closed source. People are so paranoid, but most forget that most of the profit is made in OEM, and a 600 dollar card won't make OEMs happy.

This is like Car and Driver saying "Chevy has made a new car that is only 2 grand, gets 110 mpg, and goes from 0-60 in 4.7 seconds! It's the greatest car we've ever seen, full of new features... but we shouldn't buy it, because that will put Ford out of business!"

Re:I use a Voodoo 3 (1)

PluHigh (320445) | more than 13 years ago | (#398821)

I often wonder why people spend an absolute fortune buying the latest video cards when the simple fact is that the card will not be used to its utmost capability for several years

Some people want to have the biggest and best. Why do you think video card technology has advanced the way it has? Also, the word development comes to mind.

Another problem with video cards is that the performance is becoming optimal anyway. There are 768000 pixels on a screen (a 1024x768 screen, that is). At 50fps this is approximately 37 million pixels per second. So it is intuitively obvious to all that a video card with a performance in excess of 37 million polygons per second will not provide any better performance under those conditions. Why pay extra for something you can't see?

Last I checked, there was a difference between a pixel and a polygon.

Re:Driver obsolescence (1)

Malc (1751) | more than 13 years ago | (#398822)

Buy Matrox! Their drivers are open source... you could optimise them yourself for once!

$600 too much for you? Can't afford one? (2)

Enonu (129798) | more than 13 years ago | (#398823)

I plan on buying five of them, and then having fun by taking them out with my shotgun while sailing on my 120 ft. yacht. Later on, myself and five gorgeous women will laugh about the poor geeks, drink a fine wine, and perhaps top the night off (well, not really) with a nice Cuban cigar.

Re:Support? (1)

ranessin (205172) | more than 13 years ago | (#398824)


Perhaps I should have clarified and stated that I meant x86 Linux... However, the average home user (the ones these cards are being aimed at) isn't going to be using an Alpha.

Ranessin

Remember the universal constant on upgrades. (1)

jbuilder (81344) | more than 13 years ago | (#398825)

Yes, when a card is new it will always cost a considerable chunk of change. But what happens 6 months after any of these new cards ships? They drop in price by half. If you don't want to pay $600, then wait a few months and pay $350. And that $350 will probably include a free game or two....

$600 too much? Not if you have work to do... (3)

aibrahim (59031) | more than 13 years ago | (#398826)

Some people use all the neato features of a card like this to get faster OpenGL performance out of Lightwave or Maya or some such.

It is great that we can use it for games too, but that isn't the point for many. I am sure there will be an even more expensive version of this in Nvidia's Quadro line; it'll have greater throughput and more processing power... so it'll get bought. It'll make DOOM 3 scream, but that isn't why you buy it.

Unless you are a "soul of independent means."

But I know the difference v3 - gf2 (1)

popoutman (189497) | more than 13 years ago | (#398827)

Reason why higher fps is good:
Personally, I can easily see the difference between 30 and 70 FPS, for example while turning quickly (say, to aim at somewhere behind you). At 30 fps, you get 15 frames drawn over 180 degrees of movement, assuming that it takes a half second to turn that far. That is a spread of 12 degrees, or 1/10th of a screen width (FOV 120), per frame. Very noticeable gaps.
At say 120 fps - a common setup for Q3 - this spread between frames goes down to 3 degrees, or 1/40th of a screen width. This difference comes across subjectively as 'smoothness'.

For those who consider >200 dollars a lot of money to spend on something like that, think of those who mod their cars, getting things like dump valves, large-bore exhausts, stuff like that. These give maybe 10-15 percent performance improvement for a similar outlay (high-quality components; not too sure of actual costs, but the comparison is still valid). People will always spend money on things that improve their experience. If you are at the **phile end of the hobby, the cost will always be considered worth the return.

Personally, I used a P3-500 and TNT2 for a year. It played fairly well, getting reasonable framerates. After upgrading to a P3-800, it was a bit better, but not that noticeable a difference. Then I got a GeForce2, and my gaming experience became so much better. If you get the use of the new hardware, what does the cost matter, if you can afford it and are willing to pay for it - even if I managed to get the card on an employee discount..... :)
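The turning arithmetic above, spelled out as a quick sketch (the half-second 180-degree turn and the 120-degree field of view are the poster's assumptions):

    # Angular gap between successive frames during a fast turn.
    def degrees_per_frame(turn_degrees, turn_seconds, fps):
        return turn_degrees / (fps * turn_seconds)

    for fps in (30, 120):
        gap = degrees_per_frame(180, 0.5, fps)
        print("%d fps: %.0f degrees/frame = 1/%.0f of the FOV" % (fps, gap, 120 / gap))
    # 30 fps: 12 degrees/frame = 1/10 of the FOV
    # 120 fps: 3 degrees/frame = 1/40 of the FOV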

Re:Oh, great... (2)

gmhowell (26755) | more than 13 years ago | (#398828)

Ditto, but my price point is $100. Of course, running a measly 500 PIII is going to leave me in the dust anyway.

Hence my purchase of an N64 (and probably a PS2 in another year). It was cheap, I can play it on a 35 inch screen from my couch (my desk chair is comfy, but nothing like my couch). Things rarely crash (weird problem with Gauntlet Legends). I can buy tons of cheap games on eBay and at FuncoLand.

Did I mention that it's cheap and I can play on a big screen?

Hey, I love PC games. I got a Voodoo II VERY early. But spending $1000 per year on upgrades is nuts. I've got too many hobbies to keep up this computer upgrade business (ever seen the NOS prices on old Honda motorcycle parts?)

If that is your thing, by all means, go for it. But as for me, I'll be excusing myself from the party now.

Re:I use a Voodoo 3 (1)

dewboy (22280) | more than 13 years ago | (#398829)

You're right, it _is_ like insisting on a 500kbit sampling rate over a 70kbit sampling rate. 500kbits is overkill, and includes frequencies clearly outside the range of human hearing. Furthermore, comparing the Voodoo3 to 70kbit sampling rates is also a valid comparison.

However, that's where the analogy fails: each can be improved upon. 30-50fps is sub-par and _can_ be differentiated from 60+fps (trust me, I have a Voodoo3 and my roommate has a superior card). Also, you need ~225-256kbit sampling to include the full range of frequencies the human senses can pick up. Listening to music sampled at 70kbps is just painful if you're expecting CD-quality audio.

Re:Driver obsolescence (1)

swb (14022) | more than 13 years ago | (#398830)

Sure, and my car's engine designs are pretty well known as well. I'm about as likely to spend the time optimizing either one.

Re:Nvidia embracing and extending? (3)

Phinf (168390) | more than 13 years ago | (#398831)

Keep in mind that the features the GeForce3 is using are things that make it compatible with DX8. These aren't Nvidia's features, they are DX8's. Any card developer can make a card that supports these features.

I think that this is a good thing. If these features are standard in DX8, then different cards will support the same features, and maybe we can get game devs to support them if they can be implemented on different cards.

Re:I use a Voodoo 3 (1)

Namarrgon (105036) | more than 13 years ago | (#398832)

Why not just get a cheap card from yesteryear, that will provide the same perceived performance on today's bunch of games?

Good idea, if you can't perceive the difference.

No one can tell the difference between 30fps and 200fps anyway

That's been done to death already. In short, a *minimum* of 60fps *is* important.

So it is intuitively obvious to all that a video card with a performance in excess of 37 million polygons per second will not provide any better performance under those conditions. Why pay extra for something you can't see?

Far too simplistic a view. Overdraw alone can easily chew 10x the polys you can actually see. Offline renderers now use micro-polygons smaller than an individual pixel, to better approximate reality. And of course, who wants to be limited to 1024x768 anyway? I'd hate my eyes to be stuck at that rez when driving my car in the Real World.

It is like insisting on a 500kbit sampling rate, when 70kbit sampling rates are perfect to the human ear.

Not my ear, and I still class myself as human. 70 kb/s will barely get you FM radio quality, and that's compressed as well as possible. At uncompressed rates, which is what you should be talking about, a stereo 96kHz/24-bit stream gives quite good (though still not perfect) quality, and clocks in at about 4500 kb/s. Even a CD uses 1378 kb/s.
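Those uncompressed figures are easy to check; they're just sample rate times bit depth times channels (the 1378 and 4500 above are the same numbers divided by 1024 rather than 1000):

    # Uncompressed PCM bitrate = sample rate x bit depth x channels.
    def pcm_kbits(sample_rate_hz, bits, channels):
        return sample_rate_hz * bits * channels / 1000.0

    print(pcm_kbits(44100, 16, 2))  # CD audio: 1411.2 kb/s
    print(pcm_kbits(96000, 24, 2))  # 96 kHz / 24-bit stereo: 4608.0 kb/s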

When "blazing fast!" becomes "who cares?" (1)

jvmatthe (116058) | more than 13 years ago | (#398833)

One has to wonder how close to the "who cares?" horizon NVIDIA is getting. The PC world, as a whole, seems to be slowing down because people just don't need the power that the cutting edge tech can offer any more.

I'll use my father-in-law as an example: he bought a P166 about six years ago and after three years he bought a P2/450 because the P166 seemed slow. Now, three years later, his P2/450 is humming right along and he doesn't need an upgrade. Even if he buys Win2k, he'll probably not need anything much more spiffy. The only upgrade we could find that he wanted (and which he got this past Xmas) was a burner. Now he's set for at least another two years, by my estimate, if not longer.

What does this have to do with NVIDIA? Good question, but the answer should be obvious. Another generation or two of graphics cards and we may be at the point that AMD and Intel are at now. No matter how they try to trump up their processors, practically no one really needs one running at greater than 1.5GHz. Even marketing will fail after people start to realize that blue men dancing on their TV screens are just trying to sucker them into a needless upgrade.

Perhaps the games market can push the graphics cards another one or two generations, but I think we're already reaching the limits of what human artists can produce in a limited timeframe (i.e. to meet deadlines to get a game published). Some other revolution besides "faster!" will be needed.

Re:Moving into 3DLabs territory (1)

jbuilder (81344) | more than 13 years ago | (#398834)

Nonsense. 3D accelerators for professional use used to cost in the 5-10k range. And some still do (tho why, I have no idea). The GeForce3 is nowhere *near* that price range. And as for SGI, the 'professional' Indigo graphics workstations went from 5k to 50k depending on the configuration you purchased. Again, the GeForce3 isn't anywhere near that.

Vertex Shaders vs. Pixel Shaders (1)

Eoli (320216) | more than 13 years ago | (#398836)

In point of fact, it's going to be a bit before cards have the oomph for pixel shaders to replace vertex shaders for all lighting work, so it's not going to be unnatural to perform lighting in vertex shaders for some time.

Eventually vertex shaders may evolve to be completely geometry-oriented, but I doubt that goal will be reached, since some per-vertex lighting operations are acceptable. Why pay the cost to perform those at the pixel level if you don't have to?

Carmack on GeForce3 (3)

mr.nobody (113509) | more than 13 years ago | (#398843)

http://www.bluesnews.com/plans/1/ [bluesnews.com]

Carmack has quite a bit to say on the subject as this .plan update is rather long (a little too long for a /. comment I think).

The Other Big Reviews (4)

pepermil (146926) | more than 13 years ago | (#398844)

In case anyone wants a quick link to the other big reviews...
Sharky Extreme: http://www.sharkyextreme.com/hardware/articles/nvidia_geforce3_preview/ [sharkyextreme.com]
AnandTech: http://www.anandtech.com/showdoc.html?i=1426 [anandtech.com]
HardOCP: http://www.hardocp.com/articles/nvidia_stuff/gf3_tech/i [hardocp.com]
-pepermil

Re:Probably not worth the price . . . (1)

thegrommit (13025) | more than 13 years ago | (#398848)

Actually, I've seen two previews (anandtech and 3dgpu) that mention a MSRP of US$500. It seems the Apple folks are being ripped off again.

Re:Support? (1)

jameson (54982) | more than 13 years ago | (#398857)

Uh, sorry; that one was intended as a reply to the reply which is now above it :->

Re:Driver obsolescence (1)

demaria (122790) | more than 13 years ago | (#398858)

If you have to buy more video cards, ATI/nVidia/whoever makes more money.

So it's in the graphics card industry's best interest to force you to buy the latest. And lots of people sucker themselves into this little game. We can also blame the game developers for not writing backwards compatibility or reduced graphics modes, but I think that was covered a few articles ago.

This is why there are so many Nintendos et al out there.

Re:I use a Voodoo 3 (2)

Emil Brink (69213) | more than 13 years ago | (#398859)

Well, at least the V3 has 32-bit color (right?), so we don't have to go into that particular neck of the woods. Sigh. About your math and the related issues, I have only two things to add right now:
  • One-pixel polygons are good. Perhaps not in yesteryear's games, which feature huge flat surfaces, but various forms of higher-order surfaces with curves are definitely the trend today. Pixar [pixar.com] render their movies using subdivision surfaces tessellated until each polygon is less than one pixel in the final image. We want that.
  • Multi-pass [lysator.liu.se] rendering is good. Many effects in games are achieved by rendering each pixel more than one time. 3dfx realized this back with the Voodoo2, and added support for multiple textures per pixel. That is good, because it allows you to send a triangle to the hardware once, but get it textured twice. This saves one pass of geometry transform. As soon as an effect requires an additional pass, that reduces your effective polygon throughput quite a lot, of course. So, my point is that even though it might sound excessive based on simple "# of pixels on screen" arguments, huge polygon and pixel fillrates are good, because they allow more passes and thereby more flexibility and coolness in effects.
I don't think quality issues can be compared straight across between audio and video. Or, rather, I'm not good enough with audio to see where seemingly superfluous performance (oxymoron?) can be put to use.
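To put rough numbers on the multi-pass point: overdraw and extra passes multiply the pixels actually drawn, so seemingly huge fillrates shrink fast. The overdraw and pass counts below are illustrative assumptions; the 300 Mpixels/s is the real-world GeForce2 estimate from the Anandtech quote earlier in the thread:

    # Frames/s achievable once overdraw and multi-pass effects multiply
    # the pixels drawn per frame. Illustrative numbers only.
    def max_fps(fill_rate, width, height, overdraw, passes):
        return fill_rate / (width * height * overdraw * passes)

    print(max_fps(300e6, 1024, 768, overdraw=3, passes=1))  # ~127 fps
    print(max_fps(300e6, 1024, 768, overdraw=3, passes=3))  # ~42 fps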

Re:I use a Voodoo 3 (2)

Emil Brink (69213) | more than 13 years ago | (#398860)

Oh darn, I knew I should have looked that one up. ;^) Well then, I guess I can add a bullet:
  • Modern 3D hardware supports 32 bits per pixel (RGBA with 8 bits per channel), at least 24 bits per pixel of Z (depth) buffer, and minimum 8 bits of stencil. These are all handy to have around when creating good-looking graphics (although more than 8 bits per channel would be a Good Thing).
Thanks for the heads-up.

from the hype (2)

Stalcair (116043) | more than 13 years ago | (#398861)

Back when nVidia first announced their GPU, they touted it as good not just because it had such amazing rendering capability, but also because it took processor-intensive operations off the CPU. The line I remember, both from them and from proponents, was "from the CPU to the GPU, freeing up the CPU for things such as AI and physics".

Well, I for one would really like to see that become a reality. Now, I know everyone who has ever touched a compiler (and often many who don't even know what a compiler does) will give a different opinion on how this is possible, how it has been done, or how it will never be done, etc. And most say that now programmers need to step up and use the features (I agree with this one). But my question is, are APIs keeping up with it? How is DirectX handling the more advanced features (not just whether they're 'supported', but whether the support is clean and efficient)? How about OpenGL? How are the middleware projects coming, like WINE with DirectX and so forth?

I am really impressed with the graphics detail and performance out there right now. I personally want to see more stability. While I am aware of the argument (and it does hold some water, I admit) that a higher fps gives better all-around performance, it is still common for intense graphics scenes to chug on your machine. I wouldn't mind seeing the ability to average out the fps better, as set by the user: some method on the hardware to reduce quality selectively if it detects a forest of high-quality polygons and its own slow speed.

And I would REALLY like to see some better AI, and maybe some APIs that make use of hardware-driven AI... OK, just kidding on that one, but perhaps a set of advanced AI libs with an API isn't too far out there. Tie these in with any of the methods for network-distributed processing and you have an amazing LAN party setup. Throw some together on your home server farm and now you have your game server set to go... ahhh. Soon it will be like the old BBS days... but with better graphics and real-time interaction.

Well, end of wish list... maybe the internet fairy will bless this if it is seen and make it real.

Understanding is a good thing (too bad for you) (1)

Viking Coder (102287) | more than 13 years ago | (#398862)

"Absolute fortune" - interresting phrase you use to descirbe these things. Typically, after about a month, they start to drop in price rapidly. You can get a nice GeForce2 MX for about $120. If they didn't develop the GeForce3, prices would never drop. Don't cry about how it's all insane, when you have a Voodoo 3 - you're benefitting from the graphics nuts who always want the best card in the world. The second best card in the world has to sell cheaper. Lower prices mean better cards for everyone. If people listened to arguments like yours, we'd all still be driving Model T's.

"Noone can tell the difference between 30fps and 200fps anyway" Okay, for starters, you're wrong. Any one of the people I play Quake 3 : Team Arena with can easily tell the difference. Also, as every Quake player will tell you, it's not the maximum fps that you feel - it's the lowest fps that you might ever feel in the game. If maxing out at 200fps means that your minimum performance is 100fps as opposed to 10fps, that's a gigantic difference in the game. Also, the less time you have to spend rendering a frame (1/200th of a second, by your argument), the more time you can spend in the processor doing things like Artificial Intelligence for the bot opponents, deformable maps, cool stuff that people "ooh" and "aah" over.

"Performance is becoming optimal"Well, I wish that were true. Listen, "optimality" is going to be when you can't tell the difference between the effects in the latest $100 Million movie and the effects you get from your $100 video card. Until then, we're pretty far from "optimal".

"There are 768,000 pixels on a screen (a 1024x768 screen that is)" Actually, it's 786,432. Also, you're saying that you'd never want to play at 1280x1024? 1920x1200? Things do look better at higher resolutions. Also, such things as anti-aliasing make an enormous difference to the perceived quality.

Also, making more than one rendering pass is a good thing.

"70kbit sampling rates are perfect to the human ear." This is great, you quote numbers all over the place, and you really don't have any idea, do you? I can tell the difference all the way up to 128kb pretty easily, and I have friends who can tell up to around 256kb. Where did you get this 70kb number?

Look, faster, better video cards are a good thing. You're essentially arguing, "Hey, let's never upgrade anything ever again! Computers are good enough!" Just because you're not upgrade hungry doesn't mean nobody should be. Also, there wouldn't be a "cheap card from yesteryear" unless people like us bought the expensive cards of today! Why mock us and attack us with fake numbers and flawed logic?

Re:Sorry but you're wrong (1)

ErikZ (55491) | more than 13 years ago | (#398863)

Yes! AA is the big thing I want my next video card to do. I was watching my housemate play on his new PS2 and was shocked to see jaggies.

I'm going all out on my next system, but I won't buy one of these cards until they get down around the $300 mark. I'll grab a cheap MX or something until then.

Later,
ErikZ

Re:GeForce2 MX PCI (1)

ranessin (205172) | more than 13 years ago | (#398864)


I also have an MX PCI, though I haven't had the same problem... You might want to ask on #nvidia on irc server irc.openprojects.net. There are usually some very helpful people on that channel.

Ranessin

I'm sick of upgrading! (1)

antdude (79039) | more than 13 years ago | (#398865)

Just my rants...

I am still with a Matrox G400 and it is a little slow, but I don't run a high resolution (1024x768 for games) because of my 17" monitor. Sometimes, I run 16-bit color mode to get decent FPS. Sure, I see pixels, but how often do I pay attention to these details when fragging? :)

My next video card will probably be a GeForce 3 when the price is decent. I am NOT going to pay 600 bucks for it! I am sick of upgrading my video card once a year! I don't mind doing it every other year like processors. I have had a Diamond Stealth 64 3000, Diamond Monster 3D 1 (Voodoo 1), Creative Labs 3D Blaster (Voodoo 2), and then a Matrox G400 (non-MAX).

Please SLOW DOWN the video card technology! Not everyone can keep up. :( Thanks.

Cheaper to yank the video chip out of XBOX? (1)

WDHQ (238915) | more than 13 years ago | (#398866)

So I've been reading this article, and one part says: "The currently known Xbox specifications lead to the conclusion that NVIDIA's Xbox chip will come with two parallel vertex shaders." So that's basically twice the vertex throughput of the GeForce3? Why would I want to buy a GeForce3? It's starting to sound like I should buy an Xbox, yank the chip out of that, and put it in my PC. :)

The PC vs. X-Box (1)

Aggrazel (13616) | more than 13 years ago | (#398867)

I for one wonder how a casual gamer could justify spending $600 on what looks to be a kickass video card when a kickass game console with much the same technology is just around the corner.

And as this story [wired.com] notes, the fact that the CPU and the GPU (I guess) share memory lets the X-box outpace the crap out of PC technology.

And I'm also wondering how long it's gonna be before someone hacks this thing so that it can run like a regular PC, since it is kind of a cousin anyway. Sure, it would suck for apps, but man, what nifty graphics power...

Re:Oh, great... (2)

Namarrgon (105036) | more than 13 years ago | (#398868)

But spending $1000 per year on upgrades is nuts.

Why is it that people complain about "having" to upgrade twice yearly at great expense (supposedly because it's required to run the latest games), and at the same time complain that the features won't be used by the latest games for 12-18 months anyway?

Come on people, make up your mind. If you like it now, buy it now. If you can wait, buy it later and save some money. I just don't understand all this moaning about it.

Re:Support? (1)

gerddie (173963) | more than 13 years ago | (#398869)

Okay, their drivers have gotten better and better from release to release, and since 0.9.7 they are usable even on SMP machines (though there are still things you should not do). But I really wonder why they don't get the AGP support right; we can't get it to work even on machines with BX-based boards, let alone the new ones with the Via Apollo chipset.
IMHO, stable drivers for the cards already available are more important than the newest card with feature XYZ and unstable drivers.
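For what it's worth, the usual workaround for flaky AGP is to tell the driver which AGP path to use, or to turn AGP off entirely, in the Device section of XF86Config-4. I'm going from memory of the driver README here, so treat the exact option name and values as an assumption:

    Section "Device"
        Identifier "GeForce"
        Driver     "nvidia"
        # 0 = no AGP, 1 = NVIDIA's internal AGP code,
        # 2 = the kernel's agpgart module
        Option     "NvAGP" "0"
    EndSection

You lose AGP transfer rates, of course, but a stable PCI-mode card beats a fast lockup.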

Re:The Other Big Reviews (Correction) (1)

pepermil (146926) | more than 13 years ago | (#398873)

Oops, sorry guys (& gals)...the HardOCP link I copied was wrong (they didn't have one of them set up quite right on their page). Here's the working one:
http://www.hardocp.com/articles/nvidia_stuff/gf_3tech/index.html [hardocp.com]
-pepermil

Re:Oh, great... (1)

eXtro (258933) | more than 13 years ago | (#398876)

Early adopters always pay the most. Every nVidia product is first released in an insanely expensive configuration up front. People with the available funds will purchase these, most other people won't.

Every nVidia product is next released in a feature-reduced version, often available in white-box OEM versions, for a fraction of the cost. Just like the GeForce 2 MX, there will be a GeForce 3 MX.

The performance won't be as good, but it'll be available for a more reasonable price.

Re:Support? (1)

ranessin (205172) | more than 13 years ago | (#398880)


I'm certainly not going to argue with your points, but they are still development drivers (hence the pre-1.0 version numbers). Nor am I condoning their practice of using closed source drivers, but for many people, nVidia cards are the best way to go (even under Linux).

Ranessin

Re:Nvidia embracing and extending? (3)

Namarrgon (105036) | more than 13 years ago | (#398881)

The programmable shaders are very cool, but don't forget the other features:

- "Lightspeed Memory Architecture", similar to ATI's HyperZ (but more effective), with an interesting crossbar memory controller & Z compression, requires no support, and makes your existing games run faster.

- "Quincunx multisampling FSAA", a high-quality, more efficient AA method makes your existing games look nicer at considerably less performance cost than previously possible.

Increasing fillrate is pointless, when things are already so memory-bound. T&L is improved, in the way that developers have been asking for most: programmability. And as mentioned elsewhere, these are all standard DirectX8 features, so you're not required to be "nVidia compatible", just DX8 compatible, which is expected anyway.
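As a rough illustration, quincunx resolves a 2x multisampled buffer with a five-tap filter: each pixel's own sample gets half the weight, and four samples shared with neighboring pixels get an eighth each. A sketch of that resolve step in C; the sample layout and weights here are my reading of the public description, not NVIDIA's actual hardware path:

    #define W 1024
    #define H 768

    /* center[y][x]: the sample at each pixel's center.
       corner[y][x]: the second sample, offset to the pixel corner and
       shared between the four pixels that meet there. */
    float resolve_quincunx(const float center[H][W],
                           const float corner[H + 1][W + 1],
                           int x, int y)
    {
        return 0.500f *  center[y][x]
             + 0.125f * (corner[y][x]     + corner[y][x + 1] +
                         corner[y + 1][x] + corner[y + 1][x + 1]);
    }

The point of sharing the corner samples is that you get a five-tap filter while only storing and shading two samples per pixel.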

Re:I'm sick of upgrading! (1)

Namarrgon (105036) | more than 13 years ago | (#398884)

Please SLOW DOWN the video card technology! Not everyone can keep up. :( Thanks.

If you can't keep up, then get out of the fast lane. Some of us are going places :-)

The only REAL review of GF3... (1)

Anonymous Coward | more than 13 years ago | (#398885)

Want to see what a graphics card review should look like? http://www.ixbt.com/video/geforce3.shtml Well... it's in Russian, but it's worth a look at least for the benchmark charts. Still hoping the iXBT guys will translate it into English soon.

Re:Oh, great... (1)

Bilestoad (60385) | more than 13 years ago | (#398886)

Another massive, expensive upgrade, that all the latest games will require you to use (after all, they won't run on old cards 'cause they can't be programmed)...

Clueless. Tried to play a new game on a TNT2 or a Voodoo II lately? Apparently not, because if you had, you'd know it's no problem. Let's take a look at some minimum system requirements: Oni - "3D graphics card (OpenGL compatible)". Mechwarrior 4 - "Super VGA, 16-bit color monitor or better". Diablo II - "DirectX compatible video card".

You have failed to consider that to some people $600 is hardly worth getting out of bed for. Unfortunately I'm not one of those, but $600 is a reasonable price for performance like Carmack showed off at MacWorld Tokyo. Relatively speaking it's a bargain. Games are not the only use for a 3D card.

And if you're buying the kind of lame system that needs a rebate to move it, the video card is hardly your biggest problem. You get only what you pay for, and you always pay for what you get.

Re:$600 clams (1)

Namarrgon (105036) | more than 13 years ago | (#398888)

any fool that would spend more than they spent on an entire system should be institutionalized...

I would guess most semi-serious graphics developers would pay double to get these features. John Carmack isn't particularly foolish (he got his for free, didn't he? ;-) and he's already recommended that every developer rush out & buy one now. And he's not known for non-impartiality (is that "partiality"? "non-anti-dis-apartiality"?)

Re:When "blazing fast!" becomes "who cares?" (1)

Calamere (318591) | more than 13 years ago | (#398889)

Perhaps the games market can push the graphics cards another one or two generations, but I think already we're reaching the limits of what human artists can produce in a limited timeframe (i.e. to meet deadlines to get a game published). Some other revolution, besides "faster!", will be needed.

So what you're saying is we have to move gaming to the next level, and I think that's AI. So we create an AI chip that specifically controls non-player characters in games. Makes the bad guys smarter, act like a real person would. Game designers program their games to work with this chip, and games would be a helluva lot more fun... Just an idea. Think about it.

Yeah, we should have stayed with 8 bit CPUs... (1)

Namarrgon (105036) | more than 13 years ago | (#398890)

I remember an article in a computer mag back in 1980 that was concerned at the growing trend of making faster CPUs.

The author felt that his 8 bit Z80 ran his text-based CP/M wordprocessor & spreadsheet quite fast enough, thank you, and 16 bit CPUs were just overkill. "And now they're even talking about 32 bit CPUs... when will it end? What's the point?"

You're right, of course. We really shouldn't have bothered going further, once we achieved 9600 baud terminals. Information can be conveyed perfectly well as text, and 9600 baud is faster than any human eye can read, after all...

Gee... (2)

SpanishInquisition (127269) | more than 13 years ago | (#398891)

that's the kind of upgrade that could really increase my productivity. I mean, all those xterms would render way faster with that card in my box. I have to talk to my boss about this.
--

Re:Oh, great... (1)

Ella the Cat (133841) | more than 13 years ago | (#398892)

Anyone who'd shell out six hundred for one of these is insane.

So programmers are insane? $600 gets you time to develop your game ready for when such cards are more affordable. Or time to hone your skills to get a better job. Or just for the fun of messing with the card.
