
Tom's Hardware: Win, Lose or Ti - 21 GeForce Titan Tests

Hemos posted more than 12 years ago | from the comparing-them-all dept.

Graphics 109

msolnik writes "Got a huge wad of cash burning a hole in your pocket? Why not spend it on a fancy new video card... Uncle Tom has reviewed 21 different cards so you can make a well educated decision. This is by far the most best Geforce comparison out there. A definate read for all you hardcore graphics guys."


bowl? (-1)

Anonymous Coward | more than 12 years ago | (#2725385)

damn, i wish i could stack all my geforce3's carelessly in a bowl! I really hope they did that carefully.

Re:bowl? (-1)

Fucky the troll (528068) | more than 12 years ago | (#2725418)

That's not a bowl, you uneducated cuntfish. It's a colander.

Bah... (2)

Junta (36770) | more than 12 years ago | (#2725398)

I'd rather have the ATI All-In-Wonder 8500DV. Sure, it might not have the performance of some of the GeForce3, but for Video capture and playback, it is great (even under linux soon, given the track record of the All-In-Wonders of the past). Of course, there isn't really any card I know of with *good*, well supported TV-out (yeah, there are tricks to use the framebuffer and unhooking the monitor, but that's ugly).

nVidia isn't the only game in town, particularly not for those of us who do video playback and editing more than play games.

Re:Bah... (2)

brunes69 (86786) | more than 12 years ago | (#2725491)

Yeah, I'd rather have a video card that reduces video quality in my favorite games on purpose just to get better benchmark info [hardocp.com] too. Oh wait, no I wouldn't

I'm sick of ATI's bullshit; they've been pulling this kind of crap with their drivers for years.

Re:Bah... (2)

Junta (36770) | more than 12 years ago | (#2725537)

Yeah, that was quite underhanded, but they retracted that. Though it was very bad, it does make us stop and think about how all these expert sites evaluate cards. It's always "x card achieves y more FPS than z card, so it is clearly better." Price and quality are ignored (which is why ATI got away with their trick so long; quality sucked, but no one paid any attention...).

Also, you could mention how ATI, by omitting the facts, kinda led sites to assume that the good anti-aliasing was a well-implemented multi-sampling solution when it was actually an inefficient super-sampling solution. True, ATI didn't ever claim that it wasn't supersampling, but they didn't seem to want to mention it one way or another until confronted. Of course this strategy of misinformation through omission is common in corporations, but it still seems a little devious to watch all these sites mention Smoothvision in the context of multi-sampling without offering corrections...

On the other hand, if you are using ATI-written drivers, then you are running a Microsoft OS, and therefore you are already a customer of a company that has pulled some dirty tricks in its day. Pretty much all successful companies pull dirty tricks, and in comparison to other corporate acts, this one isn't that high on the sleaze scale. In this case, they cheated, but released fully functional drivers when called on it. It was really deceptive, but people and review sites need to learn not to judge a card mostly on a single game's achieved framerate so that companies won't get away with this sort of cheating.

My primary interest is Linux, and the GeForce cards use unsupported chips for TV Capture/Playback. I'm not about to sacrifice functionality just so I can give money to a company that has probably pulled similar nasty tricks in its time, but got away with them.

Re:Bah... (2)

brunes69 (86786) | more than 12 years ago | (#2726600)

My primary interest is Linux, and the GeForce cards use unsupported chips for TV Capture/Playback

Umm... hello? What planet do you live on? NVidia has official Linux driver support for all their cards, Twinview and TV out included. There aren't really any NVidia TV capture cards available anyways, but if you really care about video quality you'd be using a separate vid cap card anyways, so this is irrelevant to your argument. NVidia's functionality under Linux far surpasses ATI's, including features such as full screen anti-aliasing (FSAA). When NVidia has a superior product, I fail to see why you would support a downright dirty company such as ATI.

Re:Bah... (0)

Anonymous Coward | more than 12 years ago | (#2727294)

he might be referring to the Asus Deluxe cards, which use a Philips decoder/encoder chip. There are no Linux drivers for this chip that I have been able to find.

Re:Bah... (1)

DoomHaven (70347) | more than 12 years ago | (#2728056)

Ummm...hello? What planet do you live on? Apparently, one where all video cards have only one chip on them. NVidia chips have Linux drivers, true, but the other chips used on the same video cards do NOT have Linux drivers.

Re:Bah... (1)

dinivin (444905) | more than 12 years ago | (#2725575)

And if you think that nVidia and the other major video chip/card manufacturers (are there any other major ones at the moment?) don't pull this kind of shit too, you are truly naive.

Dinivin

Re:Bah... (2)

brunes69 (86786) | more than 12 years ago | (#2726647)

And if you read this article [gamers.com] (read through it now, don't just "skim"), you'll see that the tom-foolery that ATI has pulled with their mip-mapping in Quake3 has nothing to do with FSAA, and everything to do with cheating the user.

Re:Bah... (1)

dinivin (444905) | more than 12 years ago | (#2727172)

I've read the article..

And if you honestly think that none of the other major video chip/card manufacturers pull that kind of shit, you are still truly, and foolishly, naive.

Dinivin

Re:Bah... R8500 (2)

leuk_he (194174) | more than 12 years ago | (#2726085)

That's flamebait; I'll bite:

If you read the article you would have seen a reference to this article [tomshardware.com] at THG:

it resolves the Quake 3 "issue;"

also

offers SmoothVision FSAA; and, enables 16tap anisotropic filtering. On top of that, it improves performance.

in the conclusion it states:

Nvidia also has some work to do in regard to FSAA

Re:Bah... R8500 (2)

brunes69 (86786) | more than 12 years ago | (#2726625)

And if you read this article [gamers.com] (read through it now, don't just "skim"), you'll see that the tom-foolery that ATI has pulled with their mip-mapping in Quake3 has nothing to do with FSAA, and everything to do with cheating the user.

Re:Bah... R8500 (0)

Anonymous Coward | more than 12 years ago | (#2727981)

...did you not know that they changed that? You sir... are an idiot...

Re:Bah... (0)

Anonymous Coward | more than 12 years ago | (#2725502)

That is if you can keep it running for 10 minutes. ATI has the worst drivers in the industry.

Re:Bah... (0)

Anonymous Coward | more than 12 years ago | (#2725531)

Worse than nVidia's drivers? That's going to be pretty tough. Read any number of posts here [hercules.com] , and you'll see what I'm talking about.

Re:Bah... (1)

13Echo (209846) | more than 12 years ago | (#2725567)

I actually prefer the Kyro 2 cards to the GeForce line. Those benchmarks don't reflect the real gaming performance of the Kyro 2 cards. They are excellent for the price.

Re:Bah... (1)

shymog (531012) | more than 12 years ago | (#2726074)

Actually, even for gaming, the 8500 is awesome. It may not have the same level of speed, but on most games, the framerate'll still be above 60, which is all that is needed.

The 8500 also has much higher image quality than any GeForce card on the market.

TV-out question (2)

Brento (26177) | more than 12 years ago | (#2725400)

The article whines a lot about inadequate tv-out capabilities for these cards. Call me crazy, but why would somebody blow the big bucks on something so high-powered as a Titanium, and then hook it up to a crummy TV? Seems like anybody who'd buy these things would rather use a big quality monitor instead. Even if you're going to use one of the nice big plasma flat panels from Pioneer or Sony, they come with VGA inputs anyway. You certainly wouldn't want to use TV outputs. What am I missing here?

Re:TV-out question (2)

Mike Connell (81274) | more than 12 years ago | (#2725424)

Sometimes I need 1920x1200 resolution on the desktop, but other times I just want to display video on the TV.

Re:TV-out question (2)

Brento (26177) | more than 12 years ago | (#2725435)

Sometimes I need 1920x1200 resolution on the desktop, but other times I just want to display video on the TV.

Yeah, I've got a TV card for that, though. That way, you don't have to replace it every time you upgrade your main video card. It's not like TV cards are getting more and more functionality with each new version.

Re:TV-out question (2)

Mike Connell (81274) | more than 12 years ago | (#2725461)

Well, that's an entirely different point. If you're asking "Why do they bother putting a TV-out there at all?", the answer is simply that the added cost is low compared to the added functionality of not making people buy a second board for no other reason than to output onto a TV.

Same reason we no longer need both a 2D and a 3D graphics card.

Re:TV-out question (1)

BeyondALL (248414) | more than 12 years ago | (#2725474)

Well, people can't have 10 PCI cards in their computer, and a combined TV-out and graphics board works really well...

Who uses their 19" screen instead of a 29" TV for DivX ;-) anyway..?

Re:TV-out question (0)

Anonymous Coward | more than 12 years ago | (#2725425)

Watching DivX on your 32" widescreen tv, instead of a 19" monitor.

Re:TV-out question (1)

JanneM (7445) | more than 12 years ago | (#2725428)

You might want to watch movies, especially if you have a DVD player in your computer and don't feel the need for a separate DVD player for your television. It may come as a surprise, but there are quite a few people who prefer to watch movies on their couch rather than at their desk.

I've gone in the other direction; I've sold my tv and bought a tv card instead. After all, my desk chair is the most comfortable seat in my home and I spend a lot of time there anyway.

/Janne

Re:TV-out question (-1, Offtopic)

Anonymous Coward | more than 12 years ago | (#2725475)

I'm guessing you don't date much. "Would you like to come back to my apartment and watch a movie on my computer?"

Re:TV-out question (1)

JanneM (7445) | more than 12 years ago | (#2725791)

I'd rather bring a girl to the movies. At home there are better things to do than staring at a television...

Humm (1)

da5idnetlimit.com (410908) | more than 12 years ago | (#2725432)

First, some of us have the money for the card, but not for the Trinitron 22" Monitor.

Second, do you prefer your DVDs on TV (with armchair and Family) or on Monitor (on the bed with a coke)?

+ Bleem (RiP)
I know. But Tekken 2-3 are still Very Good Games
As are most PS1 Games.

And you can play Tekken at a reasonable speed, with Friends (I mean dual PIII - 1Gig + small Ge2-64DDr IS overkill. But I never played so fast 8)

Re:Humm (2)

Brento (26177) | more than 12 years ago | (#2725449)

First, some of us have the money for the card, but not for the Trinitron 22" Monitor.

If you don't have the money for a decent monitor, why would you blow $300 on a video card? That's like getting a 2GHz P4 or Athlon, but stifling it with 64MB of RAM. You don't blow your whole wad on a single component; you spread it around so you can get a decent system.

Second, do you prefer your DVDs on TV (with armchair and Family) or on Monitor (on the bed with a coke)?

I prefer them on TV, and that's why I use a DVD player. PC's don't have remotes, and I don't want to have to get up and go to the PC every time I want to pause or jump around to different features. (Then again, I like watching every extra feature on a DVD, and most people probably don't.)

Re:Humm (1)

RazzleFrog (537054) | more than 12 years ago | (#2725492)

"If you don't have the money for a decent monitor, why would you blow $300 on a video card?"

All-in-Wonder Radeon goes for $140-$150. The cheapest 22" monitor I found was $528. The Sony Trinitrons are upwards of $1K.

I don't want to have to get up and go to the PC every time I want to pause or jump around to different features.

I use a wireless mouse and mini keyboard. They stash away in the coffee table when not in use.

Re:Humm (2)

Brento (26177) | more than 12 years ago | (#2725536)

The cheapest 22" monitor I found was $528. The Sony Trinitrons are upwards of $1K.

Pricewatch shows 21" Sony Trinitrons for $650 from fly-by-night guys, and CDW has them for $799. I got my used one for $300 from a CAD shop that was switching over to big LCD's.

All of this is irrelevant, though - what I was asking is, why do people want a TV-out on a high-end video card? If you're putting together a machine to play DVDs and Bleem, you certainly don't need a GeForce Ti. Like you said, an AIW Radeon goes for $150, and that's more than good enough. This particular article was talking about $300 cards that don't even do video capture. For those cards, a TV out is almost useless.

Re:Humm (1)

RazzleFrog (537054) | more than 12 years ago | (#2725551)

I agree with you on that then. I had the TNT Ultra with TV out and it really wasn't that great of quality. I'm not sure I ever even used it. I am glad you found the Trinitron price. I was looking all over pricewatch for 22in and didn't even stop to think that Sony doesn't make a 22in.

Yes / Yes / No (1)

da5idnetlimit.com (410908) | more than 12 years ago | (#2725674)

Ok.

Thanks Razzlefrog, you just answered the same way I'd have 8)
I (also) have that TNT2 Ultra +Tvout, and I'm still using it to this day... on a PII350 that makes DVD+Divx+Mp3+TV Net browsing (It was such a fad at the time 8)

I also put Logitech Radio Kb + Mouse, and those are nice remotes (104+Keys remote ! Wow 8)

But back to the point...
I took that old PC because turning it into a DVD player cost me less than a standalone; it can do all the things a DVD player does + the rest (Divx, Neorage, Browsing, Porn on Tv 8)

Also, some people (me at the time) have the budget either for Pc or for DVD. This allowed me to take both, with some problems (drivers for DVD card, W98 stability against How The F**K Do I Get Linux on TV Out ?)...

Today this is a W98 box (simpler), stable (=> so hacked that MS wouldn't recognize its Registry 8) and Ghosted.

I have no concerns, it works flawlessly; I play DVDs of all zones, Games (BroodWar: old; slow; thousands of players online, and VERY nice on TV), I have Internet and Mp3 on the HiFi, and I'm thinking about the Videoprojector and 5.1 speakers.

Of course definition IS terrible, but my TV is the student model (Big&Cheap) and can accommodate 800*600 without problem. It's even better than regular PAL, so 8))

Sorry for the 22" Trinitron, I got carried away 8|

Re:Humm (2)

Mike Connell (81274) | more than 12 years ago | (#2725642)

what I was asking is, why do people want a TV-out on a high-end video card?

You might be asking, but you're obviously not listening [slashdot.org] ;-)

For those cards, a TV out is almost useless.
That's clearly false.

0.02

Re:Humm (2)

Tackhead (54550) | more than 12 years ago | (#2727909)

> why do people want a TV-out on a high-end video card?

I went that route, but I think you're right.

Look at a 19" monitor from a couple of feet away.

Look at a 33" TV from 10 feet away.

About the same angle (field of view) in your eyes. Hellaciously more pixels on the 19". Better sound with a good pair of headphones and a 19".

A TV tuner on your video card makes having a TV obsolete. (And the rest of your computer can then obsolete your VCR and DVD-player.)

Re:Humm (2)

Chanc_Gorkon (94133) | more than 12 years ago | (#2725790)

I am planning on using the TV out on my Geforce 2 MX400 to watch DVDs on my couch. The "computer DVD does not have a remote" thing is just an excuse. I bought a TV card that came with a remote for 30 bucks (as soon as I get my rebate back... :) Pinnacle Studio TV Pro, dbx Stereo TV and FM radio, $49.99 at CompUSA and a $20 dollar rebate... no brainer there! :)). The remote that comes with this card is nice and it will work, for the meantime. It works off of the serial port, which means you should be able to hack something together for Linux or any other OS to make it work (execute a keyboard macro when it receives a certain code on the serial port). I do want to get a wireless (RF ONLY) keyboard for surfing the net on TV from the recliner built into my couch. I plan on using TV out for visualizations too (xtace on Linux, Winamp on Windows). If you use the Nvidia drivers for Linux, you can get the TV out to work pretty easily, although I have yet to get the cable I need for it. That said, any self respecting geek questioning the inclusion of a thing like TV out on a video card has GOT to be on drugs. It's just cool!
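(A minimal sketch of that serial-port hack, assuming pyserial is installed; the byte values and the player commands here are invented for illustration - the real codes would come from watching the port while pressing buttons:)

    # Hypothetical remote-to-macro daemon: read one byte per button press
    # from the serial port and run a mapped command.
    import subprocess
    import serial  # pyserial, assumed installed

    BUTTON_CODES = {                      # invented byte-to-action mapping
        0x10: ["xine-remote", "pause"],   # hypothetical player commands
        0x11: ["xine-remote", "next"],
    }

    port = serial.Serial("/dev/ttyS0", baudrate=9600)  # default timeout blocks
    while True:
        code = port.read(1)[0]             # wait for one byte from the remote
        action = BUTTON_CODES.get(code)
        if action:
            subprocess.run(action)         # fire the macro for that button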

One other thing: You're DAMN STRAIGHT this crap should work. It ain't hard! At least TV has a standard! Unlike some things on computers, like MUSIC! MP3, OGG, MP3Pro, Real Audio, WMA - which one is THE standard? I know the default is MP3, but it's not a standard, to me, until it's the only thing used or even talked about; then we'll have a new standard for digital music. MP3 is close but we still hack and work on OGG, right? Computers now have so many so-called standards that, to me, nothing is standard anymore. This is, to me, the main reason some people never buy computers: there's so many frickin choices that they have no idea if this one will play the game they want or do what they want at an acceptable speed. This is why MACS are good for newbies, cuz there's fewer choices (decent Apple built-in audio, Geforce 2 MX currently the default) and other things Apple does right. I don't own an Apple and I am not saying they are better than PC's. Sometimes they are not. But at least you can buy a Mac and count on it being able to run about any game you buy for the Mac. PC's, it's a friggin crapshoot.

Re:TV-out question (2)

Howie (4244) | more than 12 years ago | (#2725437)

I wish more reviews covered the quality of the TV-out. I'm trying to put together a system to run as a jukebox through my TV, and I have yet to find a single card that produces acceptable quality video on the TV.

The TV is OK (a Sony Trinitron), and my Dreamcast and PS2 both produce razor-sharp, rock-solid text and graphics on it. Why can't a PC video card do that?

(Besides, the Ti-200 is priced not much higher than some GeForce 2MX - about 130UKP. It's not a high-end card. After christmas it'll be the standard 'okay I guess' card).

Re:TV-out question (2)

Junta (36770) | more than 12 years ago | (#2725442)

Easy, because the cash value of a "big bucks" GeForce is a *lot* lower than the "big bucks" of those really big monitors, plasma, or flat panels. For example, even a 36" traditional CRT TV runs 800 dollars or so, over twice what one would pay for a good GeForce. The really nice, big HDTV-type monitors with VGA or component connectors run *at least* two thousand dollars. Sure, people get nice monitors for their systems, but too small to really enjoy with a group of people. That 36" CRT-TV with S-Video connection may work well for you, and TV-out to those is very useful.

Now if I had all the money I could ever want, I'd be hooking up my computer through composite connectors to a 30,000 dollar front-projection system in a really nice home theater room, but for now I'll be using that S-Video with a large CRT TV.

Re:TV-out question (0)

icrooks (227741) | more than 12 years ago | (#2725485)

Just as a side note:

I just got my Infocus LP335, which natively supports 1024x768.

I now do my computer work and play my games on my wall at well over 50". Pretty awesome. The picture is amazing.

Re:TV-out question (1)

RazzleFrog (537054) | more than 12 years ago | (#2725508)

What did that run you? $4K? It is a great idea though. I teach classes sometimes at work, and before the class I will pop a DVD into my laptop and watch it on the big screen. The sound sucks but the picture's awesome. I assume of course that you have your computer hooked up to your digital surround sound stereo?

Huh? (0, Offtopic)

no_nicks_available (463299) | more than 12 years ago | (#2725404)

"This is by far the most best..."

most best?

Mod Parent Up! (-1, Offtopic)

Anonymous Coward | more than 12 years ago | (#2725408)

LOL! Hemos really is a MiniTaco.

Re:Huh? (-1, Offtopic)

Anonymous Coward | more than 12 years ago | (#2725426)

duhh, should read "most bestest"

Re:Huh? (0, Offtopic)

ndogg (158021) | more than 12 years ago | (#2725427)

Isn't it sad seeing how bad the grammar on /. has become? Well, actually, it's always been at a low point.

Re:Huh? (1)

DgWatters0 (46011) | more than 12 years ago | (#2725496)

"This is by far the most best..."

most best?

Definately!!

Re:Huh? (1)

Zapaanese.Whore (315742) | more than 12 years ago | (#2725547)


Gotta love that ;) Oh and last time I checked, "definately" was spelled "definitely".

I hate to be a grammar/spelling Nazi, but it's a pet peeve of mine when people can't be bothered to check their own spelling; it certainly diminishes the importance of anything they might have to say.

- Z

Re:Huh? (1)

killmenow (184444) | more than 12 years ago | (#2725867)

Maby the speeling is rong becuase Eenglish isn't the posters frist langauge...

But seriously, if the point of language is to convey a thought, and he conveyed his thought, his language is fine. Perhaps it conveyed to you a little more than his intended thought...like that he's an ass too lazy to check his own spelling. I give the benefit of doubt...to me it conveyed that he didn't know the dictionary spelling of a few words and is perhaps a bit redundant (most best), but then I've known some friggin' smart people who couldn't speel they're weigh oot uv a payper baag.

BTW, anyone intent on helping others understand language should read The Language Instinct [fatbrain.com] by Steven Pinker. There's a lovely chapter on people who insist on correcting the grammar/spelling of others.

At any rate, while I agree that presentation says a lot about you, I hardly think /. is a place to be miffed about another's grammatical and spelling errors.

I hate being a grammar/spelling Nazi too...so I stopped.

Re:Huh? (0)

Anonymous Coward | more than 12 years ago | (#2726010)

I hate being a grammar/spelling Nazi too...so I stopped.

"I hated being" ...

I'm not quite there yet I'm afraid ;-)

Re:Huh? (1)

killmenow (184444) | more than 12 years ago | (#2726219)

Whatever...

Your interpretation is one possible way to read my statement. But it is just as technically correct to speak in the present tense with an implied "have" in front of the word stopped.

As it is also correct with the ellipsis serving as a separator of two statements and indicating something omitted. The two statements stand on their own and do not necessarily require being of the same tense.

Also, while switching tense is often taught as improper, in common language it is done with great regularity and is not in any sense improper. Partly, my point is that /. is not a forum confined to "American English" or any other specific, formal, codified system of language but is rather an informal medium much like common spoken language between friends.

Suggesting to someone that they might be taken more seriously or be viewed as more intelligent if they took the time to spell-check their posts is fine. Deriding someone for not doing so is condescending and only makes the corrector look more of an ass than the correctee.

Re:Huh? (1)

Zapaanese.Whore (315742) | more than 12 years ago | (#2726800)


Well, if you ask me, presentation says almost as much as the content of your words. If you can't even be bothered to check something as rudimentary as spelling, I'm sorry, but anything you have to say is moot, because your opinion and the validity of your words come into question.

I'm not saying you need to be perfect, everyone makes the occaisional spelling meestakes ;) But when there will be great numbers of people viewing your words, it says something that you couldn't even be bothered to try. It doesn't matter what the forum is. I miss the days when at least *TRYING* to sound coherent was the norm, rather than the exception.

As for English not being the writer's native tongue, that's entirely possible, but if so, then the onus is still on the writer to make that fact clear so that the reader doesn't infer anything. Again, just another laziness/sloppiness factor if you ask me.

- Z

Re:Huh? (1)

killmenow (184444) | more than 12 years ago | (#2726926)

I understand where you're coming from and agree you need to make sure your message suits your audience.

However, after reading that book I mentioned, I have to agree with Mr. Pinker's assessment that language is *supposed* to change. If it didn't, and words like "upcoming" (which used to be a pet peeve of mine) or derivative spellings of words were not allowed to become a part of the accepted language, we'd all still be speaking in some ancient tongue.

Trying to force language to adhere to the same set of rules forever is impossible. So complaining about it changing (which people have been doing for centuries) is rather pointless. So I learned to just roll with it. I always try to be as correct (according to the rules I learned in school) with my spelling and grammar as I can, and I don't think less of people who do otherwise in informal settings, but I still take issue with misspellings or grammatical errors in things like resumes.

IMHO, the word "definitely" is mis-spelled SO OFTEN as "definately" that it is only a matter of time before lexicographers add it to the dictionary.

Re:Huh? (1)

Zapaanese.Whore (315742) | more than 12 years ago | (#2727279)


I completely agree that language is not forever, nor are (or should they be) the rules so inflexible that they require precise usage to garner any merit. A perfect example of this is just as you pointed out. Another is the induction of the phrase "D'oh" into Oxford's (I think it was Oxford's, I could be mistaken here) next revised dictionary.

What I cannot agree with, nor do I consider it acceptable, is the sloppy or lazy use of written language, when one attempts to convey information to a large number of people. It's one thing to instant message a friend and another entirely to post to /. where thousands of people will read your words.

The shift of mentality that it's OK to be sloppy and lazy in presenting information is the problem I have. Change is fine, as long as it's acceptable in my eyes, I guess ;)

Oh, and I definAtely shudder at the thought of *that* day coming ;)

- Z

Why Read the test ? (1)

da5idnetlimit.com (410908) | more than 12 years ago | (#2725410)

1 / As usual, see the most powerful, expensive and complete video card (which specs look slightly like my last computer, btw)

2 / As my parents don't budget me anymore (Alas !), stop daydreaming and get a Geforce 2Mx, which is MORE than enough for now (ok, let's say enough)

=> I mean the day you have more than 2 softs that can use Geforce 3, maybe then...

Until that date, Ge2Mx is more than enough for Quaking.

I mean, for the price of a GE3Ti, I could buy a Desktop computer 8| This isn't the rat race, it's just a game race...

Hoping to Frag you Soon 8)

Re:Why Read the test ? (0)

Anonymous Coward | more than 12 years ago | (#2726199)

wahh, wahh, wahh, mummy and daddy don't pay my way. Ahhh, poor likkle ickums.

Re:Why Read the test ? (1)

toofast (20646) | more than 12 years ago | (#2728075)

From your intelligent and mature post, I can see that your mummy and daddy still pay for your goodies.

SPECviewperf numbers? (2)

4of12 (97621) | more than 12 years ago | (#2725411)

I suppose the Quake 3 numbers are some indication of OpenGL performance for these mass market cards, but I was curious how these stacked up against the traditional high end OpenGL cards (Oxygen, FireGL, etc. or even a whole SGI system) so that a price/performance comparison could be made. If CPU's are any indication, the market size for these cards could drive their performance to almost acceptable levels in more professional OpenGL applications and certainly at a lot less cost.

Any references?

Re:SPECviewperf numbers? (0)

Anonymous Coward | more than 12 years ago | (#2725540)

The latest GeForce cards blow away the FireGL, Oxygen, etc. in terms of pure viewperf numbers, but where they cannot compete is in accuracy of drawing.

If you want to see the hottest chip out there, check out www.rvu-inc.com. This former E&S chip is capable of handling multiple HD-resolution video streams in real time, it has the OpenGL performance of a GeForce 2 (but is quite a bit more accurate), and it processes natively at 12 bits per component, so the image quality is definitely there.

Re:SPECviewperf numbers? (1)

Larson E. Whipsnade (544970) | more than 12 years ago | (#2726314)

Bull f**king shit. FireGL2 annihilates any GeForce/Quadro based card out there when it comes to SPECview (nearly a factor of 1.5 better on AWADVS, e.g.). I know because until this month I worked for a company that sells PC-based 3D workstations to several major studios. Moreover, FireGL's drivers won't lock up your dual AMD 760 box under Linux like Nvidia's will.

Re:SPECviewperf numbers? (2)

Namarrgon (105036) | more than 12 years ago | (#2727662)

nVidia have a popular mid-range line of "professional" 3D chips, the Quadro series (sold by Elsa in its Gloria series). These are basically GeForce chips with a couple of extra features enabled [geocities.com] , like hardware anti-aliased lines & two-sided lighting. They're quite a bit more expensive than a consumer GeForce, but a LOT cheaper than most workstation cards.

There's a few [spec.org] places [amazoninternational.com] you can look for benchmarks on GeForce, Quadro and mid- to high-end workstation gfx cards. Currently the Wildcat 5110 pretty much rules the roost (at around $3k), with the Quadro2 Pro (under $1k) & FireGL4 (over $1k) competing hotly below that. Lesser cards (FireGL 1 & 2, Quadro, Quadro2 MXR & EX, and the older Oxygen models) can be had for well under $1k. Prices are only from memory, and are probably wildly inaccurate.

Even a standard GF2/GF3 or Radeon does pretty well, impressively so for the price. Rendering quality has been compared (for the GeForce at least), and is roughly equivalent - no major texture or polygon errors, all cards generating the occasional off pixel.

Bottom line: The majority of my customers (2D/3D FX) are switching to GeForce or sometimes Quadro cards - sought-after features include decent (not necessarily superlative) 3D app performance, dual monitor support (WITH hardware accel on both monitors!), and bang for the buck. Good DirectX support doesn't hurt either (very few cards from 3Dlabs support DirectDraw well, and some serious apps do need this).

Re:SPECviewperf numbers? (1)

Larson E. Whipsnade (544970) | more than 12 years ago | (#2728165)

FireGL2 is not a "lesser" card. In the dozens of SPECview tests I've run on every imaginable AGP chipset, it totally kicks Gloria III and Gloria DCC's ass. The Linux drivers are a *lot* more stable, too. For pure OpenGL performance and quality I just don't think anything from NVidia compares. Wildcat 5110 is another matter altogether. Too bad they won't release a Linux driver, though I see Xi Graphics is trying to fill the gap.

Video Cards (1)

sinnerDOTcom (544925) | more than 12 years ago | (#2725433)



Compared to my 512k Trident VL-bus video card, I've actually had fewer problems with that than with my Geforce256 DDR. Go figure.

So it begins... (3, Informative)

Gannoc (210256) | more than 12 years ago | (#2725460)

We tested Gainward's new GF2 Ti bearing the confusing Ti500 moniker, as well as the GeForce 3 Ti500 board carrying the equally inaccurate name Ti550 TV. Obviously, Gainward is trying to create an impression of technological superiority for its products. Nonetheless, these cards carry the same NVIDIA chips as the competition and not some newer version, as the name might imply to less informed buyers.



Technological superiority? Try fraud. They name their board the "Ti500" when it has the regular Ti, and NOT the Ti500 chip, then call their Ti500 board the "Ti550". If I were reviewing that, I'd certainly point it out a little more plainly than as a "technological superiority" attempt.

Uncle Tom?? (0)

Anonymous Coward | more than 12 years ago | (#2725467)

Uncle Tom has reviewed 21 different cards so you can make a well educated decision.

Uncle Tom? Am I missing something here? Can these video cards only display black and white?

Re:Uncle Tom?? (-1, Offtopic)

Andy (2990) | more than 12 years ago | (#2725513)

Racial jokes will usually not get you moderated up on /.

Time to upgrade! (-1)

BankofAmerica_ATM (537813) | more than 12 years ago | (#2725500)

All right! The benchmarks show that the card is .00001% faster than my current card, time to drop $350! No, I'd never buy a console. Who needs a dedicated machine for games when I can just use my PC? Besides, it's way too expensive!

Stereo Glasses (5, Informative)

soboroff (91667) | more than 12 years ago | (#2725504)

I find it pretty interesting that some of these cards (according to the review) are being bundled with LCD shutter glasses... the glasses are synchronized with the screen, blacking out one eye while your monitor displays the view for the other eye. Refresh that at 120Hz, provide a slightly parallaxed view for each eye, and presto - it's better than Jaws 3D.

I used to work with these things a while back... it's ok as long as you don't move much, but if you like to move your head around you'll get headaches pretty quick, since the view doesn't change based on where you're sitting. We used head-tracking to accomplish this, but none of that stuff here. Another problem is screen distortion, which doesn't mean much when you're playing Quake, but if you're thinking of a really nice interface for Blender or Maya, this can make a big difference in being able to actually point the mouse where you think it's pointing.

Without calibration to your personal interocular distance and eye-to-screen distance, and good correction for screen distortion, you can use these for at most 30 minutes before getting eyestrain or just a plain headache. Add poor head-tracking and you can get seasick, too!

Last thing: there is more to seeing 3D than depth cues: good lighting and shadow effects, _accurate_ perspective views, and use of color all come into play. These glasses are a lot of fun, and if a lot of folks have them then maybe the state of the art will go forward a bit.
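(A rough sketch of the geometry behind that calibration point, under the usual off-axis stereo assumptions: for eye separation $e$, screen distance $d$, and a point at depth $z$ from the viewer, the on-screen disparity between the two eyes' projections is roughly

    $$p = e\,\frac{z - d}{z}$$

so points on the screen plane ($z = d$) fuse at zero disparity and far points approach a disparity of $e$. Render with the wrong $e$ or $d$ and every depth in the scene is skewed at once - hence the eyestrain. And at a 120Hz refresh each eye only sees 60Hz, which is why these setups also tend to flicker and look dim.)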

OFFTOPIC: Seasickness (1, Funny)

NRAdude (166969) | more than 12 years ago | (#2726924)

I've noticed a lot of people who play games get "sick." When I'm in Fry's, EB's, or wherever else, I'll always meet the strange lad who got sick while playing Mech Warrior, Quake3, or whatever other 3D worlds you can throw at him. Each and every time, I ask them if they get sick while playing a side-scrolling game like an actual pinball machine, and they'll say, "Ya, I get sick of pinball real quick, especially when it costs four quarters for three balls."

Me, on the other hand - I've only been sea-sick one time, and it was way back in my childhood... I was about 5 years old and was fishing with my dad about 15 miles out in the ocean, just south of the Catalina Island Channel. I blew chunks simply because I was angry at my dad, didn't want to fish, and was pouting in the cabin. So, I got sick because I didn't focus on fishing and mainly because I went into the cabin.

Now let me tell you about the ocean and a boat's cabin to keep from getting sea-sick... STAY OUT OF THE CABIN! STAY AS FAR AWAY FROM IT AS POSSIBLE! AND FOR THE LOVE OF GOD, WHEN A SWELL MOVES UNDER YOU, AND THE BOAT BEGINS ROLLING OVER IT, YELL "YEEEHAWWWWW", AND YOU WILL NOT GET SEASICK. I think it has to do with not LOOKING at one object at any one time. A person who doesn't get sea-sick and has a great time will be a little dizzy when he gets back on shore. That would be a good study... compare the amount of dizziness of a person who doesn't get seasick to that of a person who always gets sea-sick. It takes a good 30 minutes for your ears to adjust on land because they no longer have to compensate for the swells rolling and shifting the boat.

...That was the only time I ever got sea-sick, and no videogame or Japanese cartoon's flashing lights ever made me sick. I can play those $5-a-pop flight simulators all day, but I started boycotting Disneyland ever since I discovered they hide porn and evil-inspired messages in the animated movies they make for children.

Re:OFFTOPIC: Seasickness (1)

cornjones (33009) | more than 12 years ago | (#2727055)

ok, I'll bite. I had heard something about the porn in lion king or something but what about the evil-inspired messages? can you fill in a few details?

thnx
ej

Re:Stereo Glasses (1)

WayneGayle (107802) | more than 12 years ago | (#2727311)

Have any info/links on getting these to work in Maya? I'm a Maya user, and I've never really put any thought into using LCD glasses to get a good 3D interface (or any other method, for that matter). Since I use Maya all day, every day, this is something that I'd like to toy with. Has anyone tried getting this working in Maya? Of course, playing AvP2 with these might be fun too...

Jesus Christ (1, Flamebait)

ellem (147712) | more than 12 years ago | (#2725515)

Uncle Tom
...most best...
definate

Get a friggin' copy editor.

Uncle Tom is just wrong on a lot of levels
...most best... WTF does that even mean?
definate - perhaps in several hundred years the word will be spelled the way it is pronounced by dullards, but for now it is definite. The opposite of infinite.

Guys, use Word... it will fix things like this automatically.

"most best" (0, Offtopic)

Refrag (145266) | more than 12 years ago | (#2725544)

It's mo' better!

Past the point of v ideo cards mattering? (5, Insightful)

swb (14022) | more than 12 years ago | (#2725546)

I have a 2+ year old (ancient in tech terms) ATI Rage 128 based card (AIW-128) running under XP with the newest ATI drivers, and in the games I've played with it (most recently the Medal of Honor demo), performance is just fine by my eyes @ 1024x768 and 16 bit color.

I've seen nVidia GeForce2 cards going for $100 but I just don't see the point. There was a time when moving from a 2D card to a 3D card like the original Voodoo was really worth the $300 or so it cost -- performance and quality skyrocketed. Similarly the move from the Voodoo I to the II, and from the II to that card's next generation (the ATI 128).

Past that point, unless you have some specific non-gaming application that really needs the 3D performance, it seems like kind of a waste. 3D performance has been pushed beyond the point where it matters, even for gaming, and the features being added seem trivial -- just TV out?

All new cards, it seems, should come not only with good 3D, but video in and out, TV tuners, and the ability to do hardware MPEG2 compression of full-frame video at zero cost to the CPU. At that point the video card arms race would make more sense.

Re:Past the point of v ideo cards mattering? (3, Interesting)

Howie (4244) | more than 12 years ago | (#2725846)

All new cards, it seems, should come not only with good 3D, but video in and out, TV tuners, and the ability to do hardware MPEG2 compression of full-frame video at zero cost to the CPU. At that point the video card arms race would make more sense.

But I don't want to pay for a TV tuner with my video card any more than I want an Instant Messaging app with my OS [microsoft.com] or Browser [netscape.com].

What I would expect is that if they are going to offer these features, then they should at least be of some reasonable quality - see my other post about quality of picture on TV-outs.

I'd also expect to be able to trade off features/performance for either price or power consumption (and therefore heat/noise), but I'm apparently the only person who cares about that. Or PCI for a second-head.

Re:Past the point of v ideo cards mattering? (2)

pointwood (14018) | more than 12 years ago | (#2726169)

You are NOT the only one that cares about that!

What I care about is: passive cooling (no noise), good 2D image quality (that is what you will be using most of the time) and good drivers of course (both for Windows and Linux).
3D performance is last - I don't play games very much.

Re:Past the point of v ideo cards mattering? (2, Funny)

tarkin (34045) | more than 12 years ago | (#2725877)

Yes, your performance in 16bit color will be very good. But all the newer games, including the aforementioned Medal of Honour, are optimized for 32bit color AND 32bit textures (putting even more strain on your setup).
That is another thing altogether. You cannot expect to run Medal Of Honour / Return To Castle Wolfenstein / ... in full 32bit color AND 32bit textures with some medium detail setting on your ATI Rage128 or, for example, a TNT2.
And since you'll be losing out on most of the graphic details in modern games in 16bit mode (look at the sky and smoke in RTCW), a Geforce-x upgrade seems imminent ;-)
I know what I'm talking about cause my overclocked p3-450@558 and TNT2-125@155Mhz can't keep it up for much longer ;(

Re:Past the point of v ideo cards mattering? (2)

leuk_he (194174) | more than 12 years ago | (#2725906)

You forgot to mention 64-bit color [theinquirer.net] as one of the final features. (No, I am not making this up.)

But I think video cards will stop evolving when they reach realtime real-life quality.

It is, however, the case that a lot of games (I don't know about Medal of Honour) don't use all the latest features, in order to reach a larger customer base.

Re:Past the point of v ideo cards mattering? (2)

Codifex Maximus (639) | more than 12 years ago | (#2726563)

Aren't human eyes limited to seeing color of no greater quality than 24bit color? 64bits seems to be quite a bit of overkill.

The site you referred to said that 64bit color helps out in rendering of shades and complex images in non-realtime for movies and such. I guess I can accept that - but for video cards on PC's, I feel it is just too much cost and complexity for the minimal gains in quality.

>But I think video cards will stop evolving when they
>reach realtime real-life quality.
Yeah, I agree with you. They (the card makers) need to work on optimising cards to remove inefficiencies. For example: the addition of hardware shaders and texture units like in the GF3 series is a major step ahead in card evolution. The optimisations in the Kyro II cards that don't draw or texture unseen triangles are another example.

Maybe after they have hit the wall on how much they can improve 3D graphics on a 2D surface, they will begin to put more research into a more 3D display mechanism. Comfortable, affordable, easy-to-use, reliable and functional 3D glasses or headsets - with spatial feedback and stereo sound. WOHOO!

Re:Past the point of v ideo cards mattering? (1)

barjam (37372) | more than 12 years ago | (#2727093)

24 is, but they use the other channels for effects... like alpha blending etc. Not sure what new effects 64 bit gets you though.
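(For reference, the classic use of that extra channel is the standard "over" blend:

    $$C_{\text{out}} = \alpha_{\text{src}}\, C_{\text{src}} + (1 - \alpha_{\text{src}})\, C_{\text{dst}}$$

Each blend rounds to the framebuffer's precision, so stacking many blended layers is where extra bits per channel pay off - the 64-bit push is presumably mostly about accumulated rounding error, not about colors you can see directly.)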

Re:Past the point of v ideo cards mattering? (2)

Howie (4244) | more than 12 years ago | (#2727179)

Aren't human eyes limited to seeing color of no greater quality than 24bit color?


Not quite. From memory, the number of colours shown (16.7 million) is close to the number of distinguishable colours, but the two sets of colours are not in the same colourspace, so it isn't actually good enough that you can't see the difference between adjacent colours in RGB-8bit space, even though the number of colours is right.

Also, the shades within the RGB-8bit space are distributed evenly amongst red, green and blue, whereas the eye is more sensitive to green, then red, then blue.

Look up 'gamut' in a decent graphics book, like Foley & Van Dam.
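(You can see the uneven sensitivity in the standard Rec. 601 luma weights, which approximate the eye's brightness response:

    $$Y = 0.299R + 0.587G + 0.114B$$

Green contributes roughly five times the perceived brightness of blue, yet RGB-8bit hands each channel the same 8 bits.)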

Re:Past the point of v ideo cards mattering? (3, Interesting)

swillden (191260) | more than 12 years ago | (#2726102)

Past that point, unless you have some specific non-gaming application that really needs the 3D performance it seems like kind of a waste.

You should try some different games.

I have a GeForce2 and I've been thinking the same thing for a while, but I just bought the EverQuest expansion "Shadows of Luclin" and now I'm looking for a new video card. My GeForce2 (on a 1.3GHz Athlon with 1GB of RAM) can't draw the new 512x512 pixel textures and high-polygon character models quickly when I get into areas with lots of other players or lots of vegetation, even at 1024x768 resolution.

EQ has never been the most efficient game in terms of power required to render its displays, but the approach EQ takes is what games *should* be able to do: EQ describes its world in terms of polygons, texture maps and light sources and lets the computer/video card do the rest. Not spending a lot of developer time on making nice-looking graphics render quickly on low-end (or even not-so-low-end!) hardware means more developer time that can be spent on enlarging the virtual world (and Norrath/Luclin is *huge*).

I hear that with some of the $300+ cards, SoL action is smooth at 1600x1200 resolution with all of the bells and whistles turned on... too bad my wife already bought my Christmas presents :(

Re:Past the point of v ideo cards mattering? (3, Funny)

Howie (4244) | more than 12 years ago | (#2727194)

You play EQ enough that you are considering buying a $300+ video card to support your habit, and you still have a wife? ;-)

Re:Past the point of v ideo cards mattering? (0)

Anonymous Coward | more than 12 years ago | (#2726416)

You must be on drugs. I don't believe you. How many frames per sec do you get in a Q3 engine game running at that resolution?

Re:Past the point of v ideo cards mattering? (0)

Anonymous Coward | more than 12 years ago | (#2726878)

Liar, liar, pants on fire. Sorry, but I don't believe you. Rage 128? Sure, 16bit, all details off, and 20fps maybe with a fast CPU. But you really expect anyone to believe that you're a gamer?

Re:Past the point of v ideo cards mattering? (1)

Evro (18923) | more than 12 years ago | (#2726913)

performance is just fine by my eyes @ 1024x768 and 16 bit color.

You just answered your question. by my eyes isn't good enough for hardcore gamers. There are valid reasons to need 125 fps in Q3A. My girlfriend currently gets 200.9 fps in Q3A with a Geforce 2 GTS, Athlon 1900+, and she's getting a Geforce 3 Ti500 from me for xmas (shhhhhhh). At that point I can only guess that her FPS will be in the 300+ range. Definitely more than necessary, but having 90 horsepower in your car is probably more than necessary too. Doesn't mean I wouldn't rather have 300.

Additionally, does your video card have things like full scene anti-aliasing? That's one of the major selling points of the gf3, as it improves image quality a lot.

I recently built a computer for my grandmother. I put a geforce 2 mx in for $60. Sure, you can find acceptable video cards for $30, but for another $30 you get one that you really don't have to worry about. Plus, you never know, grandma might decide to play CS or Q3.

Graphics boards? (0)

Anonymous Coward | more than 12 years ago | (#2725552)

I've never understood why there's so much interest in the latest and great high-end graphics boards. Probably 99% of users would be quite comfortable with a $60 board and stock drivers. Yes, I know, there are hardcore gamers out there who "need" every last fps they can get, but why spend more for a board than either a PS2 or XBox costs? Why not just buy a console and use your PC as a PC, not as a gaming machine?

TV-out on non-nvidia cards (1)

Krilomir (29904) | more than 12 years ago | (#2725616)

Speaking of tv-out, which video card should I invest in if I want really good tv-out? I need the video card for games as well. I currently have a Matrox G400, but the drivers don't work too well with Windows XP (the tv-out part), and the 3D performance is a bit lacking. I've been looking at the Kyro II (it seems to perform like a Geforce 2 GTS in most cases), but I haven't been able to find anything on its tv-out capabilities.

I want a card that is able to output high-quality video to my tv while my normal monitor is showing my desktop in another resolution and higher refresh rate.

Re:TV-out on non-nvidia cards (0)

Anonymous Coward | more than 12 years ago | (#2725729)

I have a similar question -- I'm looking for a not too expensive card with good tv-out functionality. I'll be using it primarily for things like mame and a music jukebox displayed on the tv (and maybe web browsing, but that's secondary/not as important.)

Any hints?

I've been considering one of those mainboards with essentially everything (video, network, sound, etc) on board; comments?

Re:TV-out on non-nvidia cards (1)

ChiPHeaD23 (147491) | more than 12 years ago | (#2726150)

Voodoo 3 3500's can be had for cheap. Great 2D quality, decent 3D speeds (you can play N64 or Playstation emulators fine on it, as well as most "somewhat older" games). Drivers could be a bitch, but hey... they're cheap.

Me am can't wait... (4, Funny)

BigJimSlade (139096) | more than 12 years ago | (#2725655)

to get me a most bestest video card for crissmas. Geforce am a very goodest chipset for me to play em my bestest games.

For Great Justice!

Funny you would mention "well educated"... (1)

SumDeusExMachina (318037) | more than 12 years ago | (#2725784)

This is by far the most best Geforce comparison out there. A definate read for all you hardcore graphics guys.

For the sake of someone who couldn't pass 3rd grade spelling or grammar, I sure hope you aren't in the market for an expensive new grapics card...

Observations on an Old System + GeForce MX200 PCI (3, Interesting)

DG (989) | more than 12 years ago | (#2726101)

My primary system is a Pentium I 233MMX, 64 MB RAM, Linux 2.4.14 box. It's based on a Baby AT format case, so any processor upgrades are a case + motherboard + processor deal, and I've been just too damn lazy & cheap to bother.

The graphics card built with this system was a Matrox Mill II - so no 3D acceleration to speak of.

Playing Quake and Quake 2 on this system was Just Fine, but anything more modern was just not possible. I tried playing the Quake 3 demo, but was getting something on the order of 1 FPM, so I've been pretty well shut out of all the 3D stuff.

Then the other day, I noticed that the price on an XTacy GeForce MX400 PCI card (no AGP!) was like $150 CAN - so what the hell, I bought it.

It turned out to be DOA (system would not POST), so I exchanged it for the only other PCI card they had in stock, an XTacy MX200 card (which was like $120 CAN).

They also happened to have Quake3 (in the tin box, no less), SoF, and Descent3, all the Loki ports, in the bargain bin for like $10 each, so I got those too.

Stick in the card, grab NVidia's drivers, configure XFree to use them, fire up Q3 - and bam! Playable! Just like that.
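(For anyone else trying this, the relevant XF86Config-4 bits look roughly like the sketch below - identifiers are made up, and NVidia's README is the authoritative reference:)

    Section "Module"
        Load "glx"                 # NVidia's GLX module, shipped with the driver
    EndSection

    Section "Device"
        Identifier "XTacy MX200"   # any name; referenced from your Screen section
        Driver     "nvidia"        # the binary driver, not the stock "nv"
    EndSection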

Things get a little choppy if more than about 10 people are in a room shooting at each other, and SoF and Descent3 (played in 800x600 with full textures) will "skip" once in a while, but for the most part, the game experience has been just fine.

Interestingly enough, when I turned on the frame rate display in Q3, I was getting anywhere from 10 fps to about 27 fps, with an average of about 15 - and the play experience is just fine. Faster framerates would be nice, but this IS old hardware, and really, it'd just be gravy. I don't particularly find myself wishing that the framerate was higher than it is - in fact, before I turned on the fps display, I thought I was making 30 fps. To see the average was about half that was a real surprise.

I can't help but wonder if the processor or bus is the bottleneck, or if the display might have been a touch faster had the MX400 card worked - but it doesn't really matter. The MX200 is "good enough".

So overall, I'm a happy camper.


Re:Observations on an Old System + GeForce MX200 P (1, Funny)

Anonymous Coward | more than 12 years ago | (#2726925)

So overall, I'm a happy camper

I fucking hate campers in Q3 -- douche!

HDTV TV-Out converter. (1)

barjam (37372) | more than 12 years ago | (#2726184)

Has anyone used the VGA-HDTV (component video) converter? That seems like the way to go if you want to use your TV as a monitor. (You all have HDTV, don't you?)

Barjam

Most Best, Eh? (1)

CodingFiend (236675) | more than 12 years ago | (#2726504)

I like mo' better comparisons.

Re:Most Best, Eh? (0)

Anonymous Coward | more than 12 years ago | (#2726669)

Most bestest.

Re:Most Best, Eh? (1)

CodingFiend (236675) | more than 12 years ago | (#2727488)

Lol :-)

Bah. (1, Insightful)

Anonymous Coward | more than 12 years ago | (#2726529)

I think this is a bit sad, really. Once upon a time, the test would have been between 21 different cards from 21 different manufacturers, with 21 different chipsets. Now, the vast majority of people just go for an nvidia gfx card.

A similar thing happened in the computing world - these days, most people just get an x86 PC. Once upon a time, you could choose with relatively equal ease between Amiga, Acorn, Atari, Mac, PC, etc. Each had different advantages and disadvantages. Now we get generic boxes based on the mediocre x86 architecture that are differentiated by marketing and hologram badges on the cases...

We did the GeForce 3 a few weeks ago (2)

Animats (122034) | more than 12 years ago | (#2727262)

The only real news here is that the GeForce 3 technology is available for about half the original price point. All these "titanium", "speed bump", and "overclocked" versions are within 25% of the base GeForce 3.

There's still not much out there that actually uses the vertex shader capability in the GeForce 3, anyway. NVidia's chameleon demo is beautiful, but that's about the only impressive vertex shader app. So the GeForce 2 technology is good enough for most gamers right now.

NVidia does a great job; their boards work well, the drivers are reasonably solid, and their ELSA business unit, which sells boards, offers a six-year warranty, rare in this industry. And they support OpenGL seriously. Now that they have the price down to a more affordable level, go for it.

The best choice is the smart one, not nVidia (0)

AnonymousCowheard (239159) | more than 12 years ago | (#2727323)

By far, the best choice for a videocard is based on your goals for the system. If you deal with webpages, pictures, movies, and whatnot, the best choice for such high-resolution performance is either Matrox or ATI. Simple: Matrox and ATI excel at 2D graphics performance and quality. ATI is the best all-around videocard for consumer 2D and 3D graphics, with better 3D performance than nVidia, but who do you choose? Look at the track record for Matrox and ATI in Linux and you will see that Matrox has documented their chips freely and they are most stable, while ATI doesn't release the data on their chips 100%, meaning ATI will not be the best choice for stable usage.

How does nVidia fit into all of this between ATI and Matrox?
nVidia develops and distributes its own Linux drivers. nVidia certifies their Linux drivers as they do their Apple and MS Windows drivers. nVidia stands behind their development and offers excellent technical support over phone and eMail. nVidia and ATI don't offer resolutions and picture quality up to Matrox's level, however.

What about 3D graphics, performance, and stability?
Matrox is the best choice for 2D graphics performance and stability. ATI, their drivers being produced free and open source like Matrox's by the DRI developers, is the best choice over nVidia simply because it offers the most compatibility, documentation, and tweaking ability among the many different platforms, CPUs, and Linux operating systems. ATI has more potential in 3D graphics than Matrox, but Matrox's chipsets offer more features that consumers will use on a daily basis. Matrox is available on other platforms and CPUs with ease.

Those companies are competing furiously for their own niche. What about the disbanded 3Dfx video accelerators?
3Dfx videocards are supported best in Linux and nowhere else on earth as well. The Voodoo2 graphics chipset, when used in `SLI` mode, has 3D performance higher than nVidia's GeForce. The Voodoo2 is the best addition to a system with a Matrox videocard, which lacks good 3D performance. The 3Dfx Voodoo Banshee, 3, 4, and 5 are all the best supported videocards for XFree86-4.1+DRI in hardware-accelerated OpenGL and are the most stable of all the supported videocards in Linux.

What about the 3DLabs `workstation` graphics cards for higher quality 3D modeling in Maya for Linux?
To date, all 3DLabs videocards using the Glint MX and Glint Gamma chipsets are supported in Linux for hardware-accelerated OpenGL. By videocard model, off the top of my head, such videocards are the Elsa Gloria XXL, the 3DLabs GMX2000, and the 3DLabs GVX1. These are the best performing videocards for 3D modelling, higher than nVidia and ATI. The drivers are maintained by the XFree86 DRI developers and not 3DLabs. Driver support is complete and stable; no more work needs to be done, so this makes the 3DLabs videocards the best choice for 3D modelling in Linux.

What about a computer with two or more monitors at once?
Matrox's videocards are the best supported for using 2 or up to 16 monitors on a Linux system via additional Matrox "secondary" videocards. You can have more desktops in X Windows, configure each additional videocard to display a console login, or configure your system to allow multiple users on your system at the same time, each with an additional keyboard, mouse, and X Windows login.
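(Roughly what that looks like in an XFree86 4.x config - a sketch with invented identifiers; the BusID for the second card would come from your own machine, e.g. /proc/pci:)

    Section "Device"
        Identifier "Matrox2"
        Driver     "mga"            # XFree86's Matrox driver
        BusID      "PCI:1:0:0"      # the second card's slot
    EndSection

    Section "ServerLayout"
        Identifier "dualhead"
        Screen  0  "Primary"
        Screen  1  "Secondary" RightOf "Primary"
    EndSection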

The choice is yours. What can Linux do for you? Anything.
