
Affordable Workstation Graphics Card Shoot-Out

kdawson posted more than 6 years ago | from the how-not-to-spend-four-digits dept.

Graphics 141

MojoKid writes "While workstation graphics cards are generally much more expensive than their gaming-class brethren, it's absolutely possible to build a budget-minded system with a workstation-class graphics card to match. Both NVIDIA and ATI have workstation-class cards that scale down below $500, a fraction of the price of most high-end workstation cards. This round-up looks at three affordable workstation cards, two new FireGL cards from AMD/ATI and a QuadroFX card from NVIDIA, and offers an evaluation of their relative performance in applications like Cinema 4D, 3D StudioMax, and SpecViewperf, as well as their respective price points."



Workstation class?? (2, Funny)

CranberryKing (776846) | more than 6 years ago | (#22318248)

Um.. I use my 'workstation' for spreadsheets and web browsing. The dell integrated shares-sys-memory controller is fine and didn't cost me no 500 bucks.


Re:Workstation class?? (2, Interesting)

Broken scope (973885) | more than 6 years ago | (#22318298)

Well, I guess you really don't have a clue what they're referring to when they say "workstation".

Re:Workstation class?? (1)

palegray.net (1195047) | more than 6 years ago | (#22318352)

I'd hazard a guess that by "workstation" they might just mean people who make a living with graphics ;).

Re:Workstation class?? (4, Informative)

bytesex (112972) | more than 6 years ago | (#22319198)

And I daresay he isn't the only one. The write-up is confusing, at best; it had me going for a bit, anyway. To ordinary people, even 'ordinary' slashdot-readers, a 'workstation' is some 'station' (a desk with a computer) that you do your 'work' on. That thing will usually contain a graphics controller that is on-board these days, the cost of which is discounted in the price of the board, and it certainly isn't so expensive that a gaming-person's graphics controller would be a 'fraction' of its cost.

Chagrin or no chagrin about lay (non-graphics) people reading topics that aren't meant for them, to act as if this is logical, implicit or otherwise self-explanatory is disingenuous, and not much different from those slashdot write-ups that start off describing some event in Second Life as if it happened in real life and pretend that everybody knows what they're talking about. Clarity is king, and no man is an island, and that sort of thing.

Re:Workstation class?? (1)

slawo (1210850) | more than 6 years ago | (#22319886)

I don't understand how this post became informative. It brings no information whatsoever and explains nothing.

a 'workstation' is some 'station' (a desk with a computer) that you do your 'work' on.
Did the people vote without reading or did they consider this bit informative without reading further?
As posted below, a workstation is a high-end computer with a professional (heavy on OpenGL) graphics card. If not, it costs nothing to Google "workstation graphics card".

Re:Workstation class?? (1)

drsmithy (35869) | more than 6 years ago | (#22320378)

To ordinary people, even 'ordinary' slashdot-readers, a 'workstation' is some 'station' (a desk with a computer) that you do your 'work' on.

Truly, then, much has been lost...

Re:Workstation class?? (4, Insightful)

merreborn (853723) | more than 6 years ago | (#22318304)

applications like Cinema 4D, 3D StudioMax, and SpecViewperf
You may not use those applications on your "workstation", but there are thousands of professionals who do.

Note that the term workstation usually means a high end system used for something a little more complex than web browsing and spreadsheets:

http://en.wikipedia.org/wiki/Workstation [wikipedia.org]

I believe the progression, marketing-wise, goes:
Desktop -> Workstation -> Server

You're thinking of desktop hardware/software.

Difference? (5, Interesting)

AdeBaumann (126557) | more than 6 years ago | (#22318252)

I'm sure I'm not the only one who doesn't know... so can anybody explain the difference between a high-end workstation card and a high-end gaming card?

Re:Difference? (4, Insightful)

Broken scope (973885) | more than 6 years ago | (#22318324)

I think it works like this.

Game cards are designed to render stuff as fast as possible, many times a second.

Workstation cards are designed to render everything in the desired quality, and take as long as it needs.

Re:Difference? (4, Insightful)

prefect42 (141309) | more than 6 years ago | (#22318706)

I'd say it's more complicated than that. Gamer cards push game graphics around fast: that often means high memory bandwidth for texturing, fast full-screen anti-aliasing, and these days fast shader performance. Workstation cards are often better at line anti-aliasing, much better with high-polygon-count work, and much better at working with multiple windows. Quadros always used to support more clipping planes in hardware, for example. How much of this is a real hardware difference, who knows.

We've got a home-grown application rendering a 4 million polygon model. Quadro 4500 is an order of magnitude faster than a 7800 GTX. You wouldn't guess that from the tech specs.

Re:Difference? (1)

somersault (912633) | more than 6 years ago | (#22318918)

Quadros also have 'full' OpenGL compliance, whereas GeForces apparently don't (well, that's what one of the engineers here claimed). I remember a tool a few years back for fooling your computer into thinking your GeForce was a Quadro, but it wouldn't run at the same speed as a real Quadro, so there must be more parallelism for basic geometry rendering in the Quadros (as you point out). But when I tried Quadros for actual gaming in the Source engine a few years back, they sucked compared to my GeForce at home. That was probably because of all the shaders.

Re:Difference? (1)

paganizer (566360) | more than 6 years ago | (#22319322)

Or it could have been because you were on a "graphics workstation", not just a regular "workstation". Took me forever to figure out what everyone was talking about.
I'm a semi-pro CGI guy, and I'm getting a kick out of...
sorry, wrong board.
It's getting really weird in the world of CGI; most of the major, and some of the minor, graphics apps are letting you use your GPU during rendering, and these cards HAVE to be designed with that in mind.
But I'll admit I don't get it. Outside of rendering, most major and minor apps don't really need more horsepower than you get in a mid-level gaming card with good OpenGL compliance. And most rendering is done via farm anyway.
I would be happy to be educated by someone who is more than semi-pro and knows the answer.

Re:Difference? (1)

somersault (912633) | more than 6 years ago | (#22319408)

Well, we do a lot of 3D CAD rather than CGI, though our machines are okay with 3DSM too, and you can use Inventor to render shiny images of your designs if you wish. We used to always get in Dell workstations, but more recently I found an independent company down in England who do nice workstations. They were basically the only place I found with a choice of AMD processors back when Athlons were on the rise (these days you have a choice of Athlon X2s or Core 2 Duos for the basic workstations), and they also offer a few choices of Linux distro preinstalled alongside/instead of Windows if you wish.

Basically I just consider a workstation to be the same as a normal desktop but with a better graphics card (of course, if I built a desktop it would involve a decent mobo, processor and RAM anyway, none of your Celeron/Sempron crap, thankyouverymuch). Anyway, rambling. I also would like to hear an opinion on this stuff from a professional CAD/graphics guy who is interested in the actual hardware he uses as well as his craft (most of the engineers here wouldn't have a clue; they just use what I provide).

Re:Difference? (5, Informative)

kcbanner (929309) | more than 6 years ago | (#22318330)

The workstation cards tend to have very low error tolerance, while the real-time graphics cards allow for quite a bit of error in the name of speed. This is fine unless you're rendering something.

Re:Difference? (3, Informative)

Psychotria (953670) | more than 6 years ago | (#22318538)

I am not sure "error tolerance" is the correct term; there is no "tolerance" (you are correct, though; I am just debating the term), it's just that the high-end workstation cards sacrifice speed for accuracy. To say "error tolerance" implies that both types of card have errors (which they may or may not have, and may or may not compensate for), and that one tolerates them more than the other. This, strictly, isn't true. A better analogy would be something like: high-end gaming cards have (for example; I'm making the figures up) 24-bit precision and the high-end workstation cards have 64-bit precision. There is no "tolerance" involved; one just does the math for accuracy and the other does the math for speed.
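To put a number on that precision gap (using CPU float widths as a stand-in, since the 24/64-bit card figures above are made up anyway), here's a quick Python sketch:

```python
import struct
from decimal import Decimal

def to_f32(x: float) -> float:
    """Round a 64-bit Python float to 32-bit float precision."""
    return struct.unpack("f", struct.pack("f", x))[0]

# 0.1 has no exact binary representation at any precision; the question
# is only how large the rounding error is at each bit width.
err32 = abs(Decimal(to_f32(0.1)) - Decimal("0.1"))  # ~1.5e-9
err64 = abs(Decimal(0.1) - Decimal("0.1"))          # ~5.6e-18

print(f"32-bit error: {err32:.1e}")
print(f"64-bit error: {err64:.1e}")
```

Each individual result is only slightly off at the lower precision, but renderers chain millions of such operations per pixel, which is where the gap shows up.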

Re:Difference? (2, Informative)

0xygen (595606) | more than 6 years ago | (#22318824)

Error tolerance refers to pixel errors in the output image compared to a reference rendering.

E.g., the fast texture-sampling methods on gaming cards lead to aliasing errors, where a pixel is wrong compared to a reference rendering.

There are also a lot more factors to this than just floating-point precision: for example, how the edges of polys are treated, how part-transparent textures are treated, and how textures are sampled and blended.
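As an illustration of what "pixel error against a reference rendering" could mean in practice, here is a hypothetical sketch; the tolerance and the toy image data are invented for the example:

```python
def pixel_error_rate(rendered, reference, tol=2):
    """Fraction of pixels where any channel differs from the
    reference rendering by more than `tol` (0-255 scale)."""
    assert len(rendered) == len(reference)
    bad = sum(
        1 for p, q in zip(rendered, reference)
        if any(abs(a - b) > tol for a, b in zip(p, q))
    )
    return bad / len(rendered)

# 8-pixel toy "images": the fast sampler mis-filters one texel.
reference = [(10, 10, 10)] * 8
fast_path = [(10, 10, 10)] * 7 + [(15, 10, 10)]
print(pixel_error_rate(fast_path, reference))  # 0.125
```

Real conformance tests work on full frames and defined scenes, but the idea is the same: count pixels that deviate from the reference beyond a tolerance.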

Re:Difference? (0)

Anonymous Coward | more than 6 years ago | (#22318900)

It's like a Ferrari versus a Rolls-Royce: the Ferrari (/gaming card) goes fast but sacrifices a lot of comfort (/precision), whereas the Rolls (/workstation card) doesn't go as fast but is much more comfortable and better engineered (/precise).

Re:Difference? (2)

citizenr (871508) | more than 6 years ago | (#22318728)

>This is fine unless you're rendering something.

You don't render anything on the GPU (at least not yet); video cards are only for visualisation. That means your theory is not valid.

Re:Difference? (1)

snl2587 (1177409) | more than 6 years ago | (#22318332)


Re:Difference? (1)

AvitarX (172628) | more than 6 years ago | (#22318838)

This post had an ad as a reply.

I like that, though I hate the inline ads.

Whatever happened to the tasteful extra ads when they first put them in (the square ones)?

I wish whoever had the genius idea to interfere with discussions gets a plague.

Re:Difference? (0)

Anonymous Coward | more than 6 years ago | (#22319896)

I recommend Firefox with AdBlock Plus. I see no ads!

Re:Difference? (1)

tezbobobo (879983) | more than 6 years ago | (#22318334)

I can. You tell the boss you need a high-end workstation card. What you mean is you need a high-end gaming card. It's not a difference in the physical card you hold in your hand - it is the difference of being worked into the budget. It is pure coincidence that your nifty workstation card plays the latest and greatest games.

I really apologise that this is so unhelpful. I know what you meant, and I know what I've been telling the powers that be for years. I guess in some regards I am being truthful and correct - but it's not the answer you want.

Re:Difference? (4, Informative)

TheSpengo (1148351) | more than 6 years ago | (#22318368)

The basic answer: high-end gaming cards specialize in pure speed, while high-end workstation cards specialize in extreme accuracy and precision. They are incredibly accurate with FSAA and sub-pixel precision. Workstation graphics cards also have other features, such as overlay-plane support, which really helps in things like 3dsmax.

Re:Difference? (2, Informative)

acidream (785987) | more than 6 years ago | (#22318378)

Workstation cards typically have certain features enabled that their gaming counterparts do not. Some are just driver features; others are in silicon. Hardware overlay planes are a common example; they are required by some 3D applications like Maya in order to display parts of the GUI properly.

Hardware Difference? (0)

Anonymous Coward | more than 6 years ago | (#22318632)

If memory serves, it can be just a firmware change to get one vs. the other; you gain some hardware features and lose others in the process.

BTW wouldn't a workstation card be more appropriate for doing CUDA?

Re:Hardware Difference? (0)

Anonymous Coward | more than 6 years ago | (#22318932)

wouldn't a workstation card be more appropriate for doing CUDA?
No, it works the same on both AFAIK. In the future NVIDIA plans to introduce double-precision floating point support for Quadro and Tesla cards; this may be a differentiator going forward. However, I don't believe it makes economic sense for NVIDIA to fabricate two versions of their chips, so my bet is GeForce will get double precision too, though NVIDIA may disable it in software, making it just another "workstation" feature that can be enabled on gaming cards with a firmware hack. Sigh... Can't fault them for trying to make a buck, I guess...

Memory, Screen Resolution and Accuracy (1)

linzeal (197905) | more than 6 years ago | (#22318422)

Memory and accuracy in rendering. The reason most of these cards have 1 GB+ is the need to display assemblies with thousands to tens of thousands of parts. A consumer-grade card in most cases simply does not have the memory to display at the resolutions CAD operators work at, which often go beyond common resolutions into QXGA [wikipedia.org] monitors that cost a pretty penny [amazon.com].

The cards are often the same GPUs you find in gaming cards with two important differences, drivers and chip quality. These are the best of the best that come out of the fab plant and even if they do have different hardware logic the real quality comes in the drivers. These drivers are made to specifications that have nothing to do with games and are more in line with quality, repeatability and robustness.
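A back-of-envelope sketch of the memory claim; every figure here (vertex reuse ratio, attribute layout, part and triangle counts) is an assumption for illustration, not measured CAD data:

```python
def assembly_bytes(parts, tris_per_part):
    """Rough GPU memory for an assembly stored as indexed triangle
    meshes with positions + normals in 32-bit floats."""
    verts = tris_per_part * 3 * 0.6        # assume ~40% vertex reuse
    vertex_bytes = verts * (3 + 3) * 4     # xyz + normal, 4 bytes each
    index_bytes = tris_per_part * 3 * 4    # 32-bit triangle indices
    return int(parts * (vertex_bytes + index_bytes))

# Hypothetical assembly: 10,000 parts at 5,000 triangles each.
gib = assembly_bytes(10_000, 5_000) / 2**30
print(f"{gib:.1f} GiB")  # ~2.6 GiB, well past 1 GB
```

Even with generous simplifications, a big assembly blows past the memory on a typical consumer card of the era, before textures or framebuffers are counted.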

Re:Memory, Screen Resolution and Accuracy (0)

Anonymous Coward | more than 6 years ago | (#22318752)

That's such bullshit it's amazing. I worked on CAD in the early 90s, and though we didn't have 1GB in anything, somehow we still saw our parts on the screen.

Re:Memory, Screen Resolution and Accuracy (1)

graphicsguy (710710) | more than 6 years ago | (#22319162)

And neither CAD nor graphics cards have changed much since the early 90's! Thanks for the laugh.

Re:Difference? (0)

Anonymous Coward | more than 6 years ago | (#22318636)

The most important difference is the support you get with the card. Basically, the pro cards are guaranteed to work with all the top-end software like Maya/Max/XSI etc. If you buy any other card, it's highly likely that something won't work and won't be fixed by a driver update in the future, whereas the pro cards are much more likely to work in the first place, or to receive a driver update if something breaks in the latest software. Paying 4x as much for a GPU seems cheap when you look at the downtime caused in your $10 million project when hardware rendering breaks down. Check out the Maya hardware requirements webpage: http://usa.autodesk.com/adsk/servlet/item?siteID=123112&id=9683256/ [autodesk.com] Notice how the latest software only tells you about FireGL/Quadro cards.

Of course, a gamer isn't going to need a heavily tested driver for Maya, which is why these cards are seen as expensive for the power they have. Their strength is in the driver.

Re:Difference? (1)

Latinhypercube (935707) | more than 6 years ago | (#22318680)

Seeing how absolutely amazing my NVIDIA 8800GT is in Crysis and Gears of War, the bottleneck is the 3D software like Max and Maya, not the card. 3D animation software is definitely lagging behind games, but NVIDIA's recent purchase of mental ray might change that... I hope... I think the difference between a pro card and a high-end gaming card is minimal. It would be great to see comparative benchmarks, though.

Re:Difference? (1)

Spy Hunter (317220) | more than 6 years ago | (#22318852)

can anybody explain the difference between a high-end workstation card and a high-end gaming card?
About $2000 (check Quadro 5600 vs GeForce 8800)

Seriously, there is no difference in the hardware any more, and anyone who tells you different has no idea what they're talking about. The only substantive difference is in the drivers: for gaming cards, NVIDIA and ATI omit certain driver features that games tend not to use, and omit performance tweaks for modeling programs, in a blatant attempt at price discrimination [wikipedia.org]. Nothing more than that. Quadro/FireGL cards are for funneling cash to NVIDIA/ATI when an "enterprise" user specs out a computer with the best of everything because it's not his money he's wasting (I've watched it happen).

Actually, I say that now, and it's been true for a long time, but in the near future NVIDIA plans to release a Quadro card with double-precision floating point support, which may not make it into gaming hardware for some time (we'll see though). It will be better than a GeForce for the small subset of the small number of people doing GPGPU programming who also need double precision. Everyone else can keep using GeForces and saving beaucoup bucks.

Re:Difference? (1)

graphicsguy (710710) | more than 6 years ago | (#22319178)

For many applications, you are correct. It's a crime that some PC vendors will only sell you a "workstation class" graphics card with a "workstation class" PC. But if you need the differences, you need them. Line anti-aliasing, overlay planes, quad-buffered stereo, hardware synchronization across cards for tiled displays, etc.

Re:Difference? (2, Informative)

White Flame (1074973) | more than 6 years ago | (#22318978)

Another difference that at least existed in the past, and probably still holds today, is that workstation cards have more geometry pipelines, whereas gaming cards have more pixel pipelines. The gamer stuff puts out very pretty but fewer polygons, whereas workstation apps often just push kajillions of tiny untextured polygons. It's a tradeoff that affects silicon size and internal chip bandwidth, and it explains why games and workstation apps run slowly on the wrong 'type' of card, given their different demands.
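That tradeoff can be sketched with a toy bottleneck model; the triangle and pixel rates below are invented round numbers, not real card specs:

```python
def frame_time(tris, pixels, tri_rate, pixel_rate):
    """Seconds per frame when the slower of the two pipelines,
    geometry or pixel, sets the pace (a deliberate simplification)."""
    return max(tris / tri_rate, pixels / pixel_rate)

# CAD-style frame: huge geometry, tiny untextured fill.
# Game-style frame: modest geometry, heavy shaded fill.
cad_on_gaming  = frame_time(40e6, 2e6,   tri_rate=100e6, pixel_rate=10e9)
cad_on_workst  = frame_time(40e6, 2e6,   tri_rate=400e6, pixel_rate=2e9)
game_on_gaming = frame_time(1e6,  500e6, tri_rate=100e6, pixel_rate=10e9)
game_on_workst = frame_time(1e6,  500e6, tri_rate=400e6, pixel_rate=2e9)

print(cad_on_gaming, cad_on_workst)    # 0.4 vs 0.1: workstation card wins
print(game_on_gaming, game_on_workst)  # 0.05 vs 0.25: gaming card wins
```

With these made-up rates, each card is several times faster on its own workload and several times slower on the other, which matches the "wrong type of card" effect described above.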

Re:Difference? (3, Informative)

Molt (116343) | more than 6 years ago | (#22319148)

This doesn't hold true any more, the latest generation of hardware are all using the Unified Shader Model [wikipedia.org]. This removes the distinction between a pixel pipeline and a vertex pipeline as a unified pipeline is used which can be switched between pixel and vertex processing as the scene demands.

Re:Difference? (1)

MichailS (923773) | more than 6 years ago | (#22319142)

Workstation cards are chip for chip identical to their gaming brethren, except that the drivers identify them as such and tweak settings accordingly. You more or less buy a cheap GeForce or Radeon but with a fancier label and a pricetag oriented around corporate budgets.

3D engineering applications use somewhat different rendering techniques, from what I've heard: fewer polygons and more high-order math of some sort. Better visual fidelity at the cost of performance. Also, CAD doesn't need high framerates.

Gaming cards are intentionally nerfed in pro apps to "encourage" you to pay for the workstation card.

Re:Difference? (1)

graphicsguy (710710) | more than 6 years ago | (#22319194)

Workstation cards are chip for chip identical to their gaming brethren, except that the drivers identify them as such and tweak settings accordingly.

I'll bet that's true when they come out of the fab, but I'll also bet that certain crucial bits are burned out afterwards to prevent mere software modifications from converting a GeForce to a Quadro (there was such a hack maybe 8-10 years ago, I think).

Re:Difference? (0)

Anonymous Coward | more than 6 years ago | (#22319180)

I have been told workstation cards have improved OpenGL support.

Re:Difference? (2, Interesting)

prisoner-of-enigma (535770) | more than 6 years ago | (#22319594)

"Workstation" generally means you're using some sort of 3D application, pushing hundreds of millions (or billions) of textured, lit triangles around. I have a 2S/2P Opteron workstation with 8GB of RAM and two Quadro FX 3500 cards, and I use it with 3DSMax.

The difference in cards is subtle. Most gaming cards are tuned for ultimate speed (framerate), but perhaps not as much accuracy or quality. Workstation cards have things like hardware anti-aliasing of wireframes, a great feature when you're working with a huge model in wireframe mode. Textures are handled differently as well: gaming cards tend to support smaller textures (again, for speed) than high-end rendering for film or video requires. It's all in how the card is tuned. That's why gaming cards tend to perform poorly at workstation tasks and high-end workstation cards tend to perform badly (or just plain hideously) at games.

Note how an NVIDIA Quadro FX 5600 (a card costing nearly $3,000) gets spanked by an 8800GT (costing a little over $200) in games. Some people buy workstation cards for gaming thinking they are faster than gaming cards. It's the old "it costs more so it should run faster!" argument. A fool and his money are soon parted.

Anyway, there's one last dirty little secret in the workstation graphics card department: there is almost no hardware difference between a gaming card and a workstation card. In some cases there is no difference at all except the BIOS. There's a fellow called "Unwinder" over at www.guru3d.com who writes a program that will "softmod" a gaming card into a workstation card by strapping the BIOS. Benchmarks seem to show that a $200 gaming card softmodded into a $3,000 workstation card actually performs identically to the real $3,000 workstation card. This further bolsters the claim that NVIDIA and ATI are charging a ridiculous premium for their workstation cards.

I softmodded some gaming cards to workstation cards a few years back. Worked like a charm. However, it got to be more trouble than it was worth because NVIDIA kept trying to break the softmods with driver updates. You'd have to wait for Unwinder to update his program for the new driver before it would work again. For my next rig I bought real Quadros, just not the $3,000 ones :-).

It's a shame they don't test them against 'game cards' (4, Interesting)

TheSunborn (68004) | more than 6 years ago | (#22318278)

It's a shame they don't test them against 'game cards'. It would be really interesting to find out how these cards differ from the normal gaming cards when doing realtime 3D.

Re:It's a shame they don't test them against 'game cards' (1)

TheSpengo (1148351) | more than 6 years ago | (#22318458)

Not really, they are meant for completely different purposes. In a rendering competition between the latest geforce and the latest quadro in maya or 3ds max or something, the quadro would completely obliterate the geforce, but in the gaming scene the geforce would have a definite and very noticeable advantage. People who spend over $2k on their workstation card probably aren't too interested in how well it runs crysis though, hehe.

Re:It's a shame they don't test them against 'game cards' (2, Interesting)

TheSunborn (68004) | more than 6 years ago | (#22318576)

In a rendering competition between the latest geforce and the latest quadro in maya or 3ds max or something, the quadro would completely obliterate the geforce
I know that's how it's supposed to be, but I have never seen a benchmark between 'workstation cards' and 'gaming cards' that included example images from the different cards to show the difference.

This benchmark doesn't even include any example images, which I don't understand, because image quality might be the biggest difference between the cards. Having a benchmark of 'workstation cards' that are supposed to look better than the gaming cards, and then not including anything about image quality, is weird.

Re:It's a shame they don't test them against 'game cards' (2, Insightful)

neumayr (819083) | more than 6 years ago | (#22318586)

I think the OP meant that a test against consumer cards would be very interesting for 3D artists on a budget.
As in, do I stick to this GeForce and get that quadcore CPU in order to speed up my test renderings or does it make more sense to spend my money on a Quadro and stick to my slower CPU?

Re:It's a shame they don't test them against 'game cards' (1)

TheSpengo (1148351) | more than 6 years ago | (#22318634)

Ah that is a good point. Unfortunately I have not seen any comprehensive comparisons of this kind so I guess the best you can do is look at the performance of the cpus and gpus separately and take a best guess. :/

in short... go for the CPU (4, Informative)

Animaether (411575) | more than 6 years ago | (#22318968)

Sorry, but a graphics card does not speed up your rendering unless your renderer can take advantage of the graphics card; hint: not many can, and those that do only use it for very limited tasks.

The only reason you should have for upgrading your graphics card within the 'consumer' market is if your viewport redraws are being sluggish; this will still allow you to play games properly* as well.
The only reason to upgrade to e.g. a FireGL or a QuadroFX is if you're pushing really massive amounts of polys and want a dedicated support line; e.g. for 3ds Max, there are the MAXtreme drivers for the QuadroFX line - you don't get that for a consumer card.

* On the other hand, do *not* expect to play games properly with a QuadroFX. Do not expect frequent driver upgrades just to fix a glitch with some game. Do not expect the performance in games to be similar to, let alone better than, that of the consumer cards.

For 3D artists dealing with rendering, the priorities should always be, in order:
1. A faster CPU / more cores (= faster rendering**).
2. More RAM (more fits in a single render; consider a 64-bit OS and 3D application).
3. A faster bus (tends to come with the CPU) / faster RAM.
4. A faster drive (if you -are- going to swap, or read and write lots of data, you don't want to be doing it on a 4200RPM drive with little to no cache).
5. Another machine to take over half the frames, or half of the image being rendered (** 'more cores' only scales up to a limited point; a second machine overtakes this limit in a snap), as long as you don't have something slow like a 10Mbit network for data transfer.
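The "another machine" point is easy to see with a sketch, since final frames render independently (the shot length and per-frame time below are made up):

```python
import math

def wall_clock_minutes(frames, mins_per_frame, machines):
    """Frames render independently, so split them across machines;
    the busiest machine sets the wall-clock time."""
    return math.ceil(frames / machines) * mins_per_frame

# Hypothetical 240-frame shot at 10 minutes per frame:
print(wall_clock_minutes(240, 10, 1))  # 2400 minutes
print(wall_clock_minutes(240, 10, 2))  # 1200: a second box halves it
print(wall_clock_minutes(240, 10, 4))  # 600
```

Unlike adding cores to one box, this scaling is nearly linear, because the machines share nothing but the scene files shipped over the network.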

Re:in short... go for the CPU (1)

neumayr (819083) | more than 6 years ago | (#22320032)

.o(How on earth could that have been modded insightful?)

Barely (or rather, not really) suppressing the urge to just reply "no shit, Sherlock", let me ask you this: who claimed that the rendering takes place on the GPU (ignoring NVIDIA's Gelato; I don't know how relevant it is)?

Also, the major bottleneck in rendering is the CPU, followed by memory and bus. Bandwidth of the storage system and the network is a very distant third.

Re:It's a shame they don't test them against 'game cards' (1)

Sique (173459) | more than 6 years ago | (#22318770)

It's time for the confusing and wrong car analogy.

Normally you also don't see tests of vans against trucks, even though they may be built on the same frame and engine, and both are designed to carry more than a car.

Re:It's a shame they don't test them against 'game cards' (0)

Anonymous Coward | more than 6 years ago | (#22318788)

Test against game cards? whatcouldpossiblygowrong with that?!?

What is a workstation-class graphics card? (0)

Anonymous Coward | more than 6 years ago | (#22318292)

What kind of work do you have to do on this workstation?

Quadro FX5700 vs 8800 GTS OC? (5, Interesting)

alwaystheretrading (750171) | more than 6 years ago | (#22318294)

I'd really like to see a low end workstation card like one of these compared to a high end consumer card. When I'm working with half a million polys in 3DS Max 2008 is it really going to be worth the extra money to get the workstation card?

Re:Quadro FX5700 vs 8800 GTS OC? (1)

Broken scope (973885) | more than 6 years ago | (#22318338)

I think workstation cards typically have more memory and some optimizations. If you want to make the best and most accurate image, then yeah, get the workstation card.

Re:Quadro FX5700 vs 8800 GTS OC? (1)

acidream (785987) | more than 6 years ago | (#22318396)

I'm not too sure about 3DS Max, but I know that in Maya a Quadro 5700 would blow the 8800 GTS away. We have some at the studio and they're ridiculously fast.

Re:Quadro FX5700 vs 8800 GTS OC? (1)

CodyRazor (1108681) | more than 6 years ago | (#22318762)

Are you sure of this? Have you tested any 8800GTS cards, or are you just guessing because yours is fast?

I was learning Maya last year, and they recommended we get new Quadro cards for our computers to handle it. I figured the two 8800GTS cards I already had would do the job just fine.

The performance on my computer totally blew away all the lab computers. The teacher initially didn't believe me when I told him the performance I was getting, because I was using "just a cheap gaming card", not like his shiny expensive Quadro. The cards they recommended started at $2000AU; Maya on my machine was only using one card, which cost me $600 at the time. Sorry, I don't recall exactly which Quadros they were, but they were from the range of about March last year.

Also, IANA3DA, but isn't all the work of the finished rendering done with CPU power, like ray tracing etc.? As far as I'm aware, GPUs in 3D are only for the fancy real-time graphics in the dev environment so you can see what's going on, so pixel-perfect quality isn't an issue. So if perfect GPU quality isn't relevant, what actual benefit am I getting from spending $3499 on a Quadro?

I spoke to a couple of people from a game developer that's working on a PS3 game, and one employee said they had never had any need for a Quadro and preferred to use dual 8800GTXs. I know that Animal Logic (the company that made Happy Feet) uses only CPU power for their rendering process. As far as I'm aware, render farms are pretty much CPU-only.

If I'm wrong about any of this please correct me, but GPU companies have to make money somewhere, and with the bargain-basement prices of gaming cards, my money's on workstation graphics.

Re:Quadro FX5700 vs 8800 GTS OC? (1)

EnsilZah (575600) | more than 6 years ago | (#22319396)

I bought an 8800 GTS because I wanted to both play games and do some 3D and compositing work.
It seems, though, that the combination of 8800 GTS, Windows XP x64 and Maya is not a very good one: my viewports tend to freeze after orbiting around for a while, and sometimes it gets sluggish.
Since only the Quadro line is certified for this sort of work, nVidia doesn't seem too eager to fix those issues.

If I could, I'd return this card and get a Quadro.

Gaming vs Workstation Cards (0)

Anonymous Coward | more than 6 years ago | (#22318316)

From my understanding in laymen terms, a gaming card doesn't have to be perfect. It needs blazing speed and to support all the newest nifty tricks to make eye candy, not much unlike those in console systems.

Meanwhile for the Workstation market, those cards need to be able to render exactly correctly. While most won't mind a stray jaggie in say CoD4 on our PC, in a CAD design that could possibly mean 'Great job Bill!' vs 'Oh shit, your building fell over and crushed the preschoolers on a field trip.'

Perhaps I'm completely off, but as I said, from my understanding it's a difference of accuracy.

Anyone else care to explain in better detail?

-MikeTheCannibal (Can't remember my damn PW)

Re:Gaming vs Workstation Cards (2, Informative)

SynapseLapse (644398) | more than 6 years ago | (#22318416)

Not exactly.
Gaming-grade video cards tend to be very fast at special types of pixel shaders and excel at polishing the image to look better. Where they tend to be inaccurate is in how they clamp textures, and even then it's fuzzy estimation that only becomes an issue at extreme angles.
This only affects the way data is displayed and wouldn't cause a CAD design to "fall over."

Workstation cards are primarily high polygon crunchers. Games are rendered entirely in Triangles, whereas rendering programs use Triangles, Quadrangles and honest to goodness polygons (5+ sides).

Re:Gaming vs Workstation Cards (1)

ettlz (639203) | more than 6 years ago | (#22319304)

While most won't mind a stray jaggie in say CoD4 on our PC, in a CAD design that could possibly mean 'Great job Bill!' vs 'Oh shit, your building fell over and crushed the preschoolers on a field trip.'
That's what stuff like finite element analysis is for, not CAD or graphics cards. Anyone doing this won't be worried about the jaggies since they'll be putting in the numbers by hand.

All I can say is... (4, Informative)

snl2587 (1177409) | more than 6 years ago | (#22318322)

...if you're planning on using a Linux workstation, don't buy an ATI card. I don't mean this as flamebait, just practical advice. Even with the new proprietary drivers or even the open source drivers, there are still many, many problems. Of course, I prefer ATI on Windows, so it all depends on what you want to do.

Re:All I can say is... (3, Informative)

doombringerltx (1109389) | more than 6 years ago | (#22318504)

It depends. I bought a new ATI card after they opened up the 2D driver specs. When booted into Linux I haven't had any problems with my day-to-day activities. It's only when it tries to render anything in 3D that it shits bricks. To be fair, there may be a problem besides the driver that I haven't found yet, but right now all signs point to driver/card problems. Honestly, it's not a big deal to me; I just don't use any fancy compositing manager, and I never played games in Linux anyway. While I'm on the subject: I know when they released the 2D specs they said the 3D specs were on their way, but I never heard anything of that again. Does anyone know if or when that will happen, if it hasn't already?

Re:All I can say is... (1)

snl2587 (1177409) | more than 6 years ago | (#22318540)

It's only when it tries to render anything in 3D that it shits bricks.

Mine too...as well as everyone else's on the seemingly endless discussion boards.

While I'm on the subject, I know when they released the 2D specs they said the 3D specs were on their way, but then I never heard anything out of that again.

Actually, they released a driver in January that was supposed to correct all of the issues. Apparently that claim didn't hold any water, and so last I heard they were trying to push out a new one by March.

Re:All I can say is... (1)

neumayr (819083) | more than 6 years ago | (#22318616)

Hardly relevant, seeing as this is a discussion about an article about workstation GPUs and their respective performance. In 3D.
If all this were about 2D performance, we'd probably still be using Matrox cards...

Re:All I can say is... (1)

zIRtrON (48344) | more than 6 years ago | (#22318984)

Do Matrox still make cards?
All I want is excellent 2D dual-monitor support under Linux.

Re:All I can say is... (1)

neumayr (819083) | more than 6 years ago | (#22320492)

They do. Just check http://www.matrox.com/ [matrox.com].
They seem to have given up on the consumer market though, making their cards a little expensive (starting at around 100 Euros in .de).

Re:All I can say is... (0)

Anonymous Coward | more than 6 years ago | (#22320560)

If you don't care about 3D graphics, why did you bother spending the money on an ATI card? You could get something for $20 from Intel or Matrox that would have done the job just as well and have had open source drivers.

Re:All I can say is... (1)

palegray.net (1195047) | more than 6 years ago | (#22318572)

Although I would generally agree that avoiding ATI is a good idea for Linux systems, I think this kind of misses the spirit of the article. Most graphics professionals are using a platform, not just a random mix of computing hardware and software. For these people, it usually winds up being what their software vendor will directly support.

Re:All I can say is... (1)

neumayr (819083) | more than 6 years ago | (#22318648)

But then, this article is about "Affordable Workstation Graphics Card[s]".
Whoever cares about the price of that single component is not in the market for a platform.

We're next (3, Funny)

Hansele (579672) | more than 6 years ago | (#22318354)

As soon as the shootout's over, they'll come gunning for us. I, for one, welcome our new graphical overlords.

do we care? (1)

TimFenn (924261) | more than 6 years ago | (#22318390)

Whatever - all I want to see is open specs on the cards, and support for open drivers a la Intel [intellinuxgraphics.org]. Then I'll start thinking about buying ATI/NVIDIA.

Re:do we care? (1)

Broken scope (973885) | more than 6 years ago | (#22318530)

Do Intel, or any GPU makers besides NVIDIA/ATI, even make a workstation card capable of high-end 3D modeling?

If they don't, then yes, we do care, because people need these cards to do their jobs, regardless of how much they want open source drivers.

Re:do we care? (1)

splutty (43475) | more than 6 years ago | (#22318750)

As far as I'm aware, there aren't any open source projects that would have any use for a workstation graphics card, so your sentiment about open source drivers is really nice, but somewhat beside the point.

They're specifically in the market for 3D CAD, 3DS, Maya, that sort of stuff, of which there really isn't a heavy weight open source equivalent.

So, although in principle I agree with you, I don't think it's even remotely important. I'd much rather see open source drivers for the gaming cards, since those *are* useful.

Re:do we care? (3, Interesting)

TimFenn (924261) | more than 6 years ago | (#22318892)

They're specifically in the market for 3D CAD, 3DS, Maya, that sort of stuff, of which there really isn't a heavy weight open source equivalent.

I don't do 3D CAD, but being a biochemist type, I actually hang out with lots of folks who work with all kinds of 3D data, such as molecular models and volumetric MRI datasets. Workstation cards are especially useful for their stereo support, which many bio-folks find helpful when modelling. Most of the development is done on Linux using stuff like VTK [vtk.org] or VMD [uiuc.edu]; it's not just the engineering guys doing CAD in Windows who want workstation cards.

As a scientist that uses linux daily for 3D applications, I would like to see an open source workstation card for 3D graphics, dangit.

Re:do we care? (1)

splutty (43475) | more than 6 years ago | (#22319004)

I wasn't aware of that, and that does look and sound rather interesting. In which case I have to retract my earlier statement and go picket the doors at ATI for open source drivers :)

Re:do we care? (1)

Verte (1053342) | more than 6 years ago | (#22320362)

The other thing these would be great for is as a GPGPU: open drivers and hooks for libraries such as the GSL could let visualisation workstations double as mini-supercomputers after hours.

Superficial Market Creation (3, Insightful)

ludomancer (921940) | more than 6 years ago | (#22318510)

There was a time when you could purchase a 3D card that worked excellently for both work and play. These new "workstation" cards are a farce: a solution in search of a problem. I am a professional 3D artist and can attest to this from personal experience over the last 15 years. Don't buy into this crap. They DO perform better for workstation tasks, but only because gaming cards are intentionally crippled in this area to push the alternative product. Luckily, most gaming cards currently on the market work well enough for 3D workstations, so I encourage everyone to ignore this desultory attempt at market creation as much as possible; it's perfectly possible to get great performance out of a gaming card for both purposes.

Work stations eh (1)

MSDos-486 (779223) | more than 6 years ago | (#22318622)

Time to whip out my old SPARCstation, and maybe my SGI. Anyone have a copy of IRIX I could borrow? I thought the whole concept of workstations being a separate thing from high-end PCs went out when you could have 8 cores, RAID, and gigabit Ethernet on the family PC in the living room. Actually, I am looking for some hardware X servers, if anyone has one of those.

Re:Work stations eh (0, Troll)

jacquesm (154384) | more than 6 years ago | (#22318694)

Sure, would you like your distribution to come on CD, QIC tape, or DAT? And what about the devkit, or will you be using that open source stuff? The last release I've got, I think, was IRIX 6.2; after that I switched to KDE/Linux and gave all my SGI stuff away.

Can't say I miss it either :)

Re:Work stations eh (1)

MSDos-486 (779223) | more than 6 years ago | (#22318774)

CD. That open source stuff is UNIX for the peasants, and it will never be used in REAL workstations. Why RISC it when you have quality software developed for the superior race, by the superior race?

An even more affordable solution (1)

WaroDaBeast (1211048) | more than 6 years ago | (#22318670)

Even better: buy a gaming card, then change its PCI deviceID and unlock the professional capabilities. Ta-dah!

The Biggest Scam of the Graphics Industry! (5, Interesting)

MindPrison (864299) | more than 6 years ago | (#22318700)

The biggest curtain that has ever been pulled over artists' eyes is the "PRO" graphics card fad! You're paying to feel "pro": you don't get more "pro" for your money at all, you just get to feel like a pro, with very little extra to justify the real bucks you're spending on the Quadro & FireGL series.

I know this; I'm a "graphics pro" myself who makes a living designing 3D models and prototyping every day, and I've used nearly every card known to mankind.

Here's my advice, take it or leave it:

Buy a gaming NVIDIA card! The difference between the gaming series cards and the Quadro series cards is just some extra driver software optimized for your insert-favorite-3D-app-here. Yes, there are somewhat fewer pixel flaws, but this will never affect your final render unless you're using NVIDIA's Gelato (which has, by the way, proven in many cases to render less efficiently than modern multi-core CPUs doing software rendering).

You will save up to THOUSANDS of dollars by not buying into the "PRO" hype, and you'll be one happy puppy you didn't, and work just as efficiently (I know, we do) as the ones with the "PRO" cards. The game cards actually use the same chipsets (remember the Quad-Mod you could perform on their cards; it ain't fake, you know!). It would make absolutely NO SENSE for them, business-wise, to produce two different cards when their cards can in fact do the same thing... and actually use the same chips.

Re:The Biggest Scam of the Graphics Industry! (4, Interesting)

prefect42 (141309) | more than 6 years ago | (#22318736)

Problem is, your advice sounds reasonable even though it's not.

Looking at the hardware spec sheets, I'd agree with you. But when it came to it and I compared what at the time were the top cards (Quadro 4500 vs. 7800GTX), the difference was night and day. If you wanted to play games, buy the 7800GTX; it was waaaay faster. Want to do your own OpenGL apps that are quite demanding (high polygon count, multiple clipping planes, lots of transparency), and it's clear that not only is the 4500 faster, but it gives almost twice the bang for the buck. That's pretty impressive for a 1500 UKP card, where you're not expecting value for money...

What you need to see are benchmarks of a Quadro 1700 against a similarly priced 8800. I'd be tempted to call it in favour of the Quadro for the things that matter to me, but short of buying some to test, it's hard to get decent figures.

Re:The Biggest Scam of the Graphics Industry! (2, Interesting)

Spy Hunter (317220) | more than 6 years ago | (#22318992)

Clipping planes are one of the features NVIDIA cripples in the gaming drivers (as games hardly use them); it has nothing to do with hardware. Buy a GeForce and flash it with the Quadro firmware if you really care about clipping planes, but honestly features like clipping planes, hardware overlays, etc are better implemented in your application and in your shaders anyway, where they will run equally fast on gaming and workstation cards.
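To illustrate the point, the math behind a user clip plane is just a signed point-to-plane distance per vertex, which is trivial to evaluate in your own shader code; here is a toy Python sketch (names and numbers are mine, purely illustrative) of the same test:

```python
def clip_distance(point, plane):
    """Signed distance from a point to the plane (a, b, c, d),
    i.e. ax + by + cz + d = 0. Negative means the vertex is clipped."""
    a, b, c, d = plane
    x, y, z = point
    return a * x + b * y + c * z + d

# Clip everything below z = 2, i.e. against the plane z - 2 = 0:
plane = (0.0, 0.0, 1.0, -2.0)
points = [(0, 0, 1), (0, 0, 3)]
visible = [p for p in points if clip_distance(p, plane) >= 0.0]
# Only (0, 0, 3) survives the clip; (0, 0, 1) is on the negative side.
```

In a real GLSL vertex shader you'd write that value to gl_ClipDistance[0] and let the hardware discard geometry on the negative side, no workstation driver required.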

Re:The Biggest Scam of the Graphics Industry! (1)

geonik (1003109) | more than 6 years ago | (#22318842)

You have just solved a mystery that has been puzzling me for years, thanks for the great advice!

The 'pro' cards may not be meant for you (2, Insightful)

Anonymous Coward | more than 6 years ago | (#22319012)

You may think that your 3D modeling and prototyping is professional work, and I'm sure it is.

However, you should be thinking of people using CATIA to build an entire car, or even more exotic pieces of software for building entire airplanes. We're not talking the piddly few million polies that the average Disney/Pixar movie ponders in Maya etc., even though those would benefit as well; we're talking a few hundred million polies. Now we're talking "pro". Now we're talking the kind of people who used to buy SGI workstations at a couple $10k a piece, then switched to "generic" workstations but fitted them with E&S (Evans & Sutherland) cards that were so big (similar in design to the dual-GPU cards people are messing with now) they had to keep the casings off their machines or the things wouldn't even fit, and who are currently salivating at the NVIDIA QuadroPlex solutions in both desktop and rackmount form ( http://www.nvidia.com/page/quadroplex.html [nvidia.com] ), before crying as even they think that's a mite too pricey and going back to the souped-up PNY QuadroFX offerings ( http://www2.pny.com/category_buymulti.aspx?Category_ID=329 [pny.com] ).

Consumers, prosumers and small business need not apply. As you do say, it's not worth the extra money (and it -is- a good chunk of extra money) for those groups.

Re:The 'pro' cards may not be meant for you (1)

graphicsguy (710710) | more than 6 years ago | (#22319212)

What you're saying about big models and big projects is true. However, getting a single workstation card is not going to allow you to render 10X more polygons/sec than a single gaming card. It is more useful for features like anti-aliased lines, tiled displays / multi-card rendering, etc.

Re:The Biggest Scam of the Graphics Industry! (2, Interesting)

Anonymous Coward | more than 6 years ago | (#22319562)

You are exactly right. Many GeForce series cards can be made to function exactly the same as the Quadro series with RivaTuner and the NVStrap driver. I have actually done this myself on one of my cards.

The only people who buy Quadros are non-savvy artist types. Those of us who know better can have the exact same thing for a fraction of the cost.

Dual Monitor Support For X (0, Redundant)

zIRtrON (48344) | more than 6 years ago | (#22318958)

I have not struggled with an XFree86 config since I was a noob, and I think now there's X.Org to check out.
The only thing I want is dual-monitor support under Linux.

I'm on osx most of the time except when I code.

I have a spare P4 lying around and the only thing that makes me stop setting it up is that I don't know what the state of dual monitor support under linux is.

I would like to know of an equipment-voting website that answers, to the point and in forum-style discussions, simple questions like:

1. What cards are supported under X for dual-monitor support? Free or non-free; then I'll get into the politics of it.
2. How much memory is needed to run normal 2D stuff, like editors and IDEs, on a couple of big screens?
3. What's the difference for getting 3D onto that same card, or its higher-up cousins?

A timeline/graph on this fabled website would be great too, so I can learn the background of the graphics industry and consider what will be hot in the next 12 months...

Any pointers?

Free/Open Source workstation graphics card needed (2, Interesting)

SST-206 (699646) | more than 6 years ago | (#22319160)

What we need for our audio workstations is a fanless (silent) graphics card that will do OpenGL nicely, using Free/Libre/Open Source drivers. Affordable is helpful, but not essential.

I've been watching the gradual progress of the Open Graphics Project [duskglow.com] (and now Open Hardware Foundation [openhardwa...dation.org]) with interest and hope they can release something good before the major manufacturers get a clue - quite likely considering their years of promises (ATI) and proprietary drivers (nVidia). It seems that Intel [intellinuxgraphics.org] are doing good things, although IIUC those cards aren't so powerful; I know: power, silence, freedom (choose TWO only)... but progress? Is the ATI Radeon 8500 still the best fanless card with open drivers?

Please wake me up when we get to the 21st Century. I'd happily read a whole page of adverts for news on such a product.

Utter baroufes (1)

giorgist (1208992) | more than 6 years ago | (#22319548)

"High-end gaming cards specialize in pure speed, while high-end workstation cards specialize in extreme accuracy and precision" is the basic answer.

What accuracy? You are just rendering a complex assembly; you don't need accuracy, the human eye compensates. When you spin it, it drops bits, then renders accurately again when you stop. I used workstation cards. A QuadroFX, and one day it died. I needed to do a job quickly, so I rushed out and bought a cheap gaming card. I mean cheap. It worked just as well; I COULD HARDLY TELL THE DIFFERENCE. This workstation business is a scam. Try this test, as long as you have a fast CPU (i.e. less than a couple of years old) and about 2 GB of RAM: download a demo of Pro/ENGINEER. You get 30 days. Open the most complex assembly and spin to your heart's content. I did this with an assembly of many thousands of parts. It flies. Graphics cards have advanced to a point; this software worked well with top-end cards in 1998, and we have advanced 10 years since. The budget cards work great. Giorgis

How do gaming cards perform in OpenGL? (1)

Nomaxxx (1136289) | more than 6 years ago | (#22319884)

Interesting, OpenGL benchmarks at last! But I wonder how gaming card results compare to these? Unfortunately, in every recent computer magazine I buy, all the benchmarks are about DirectX performance only, with no word about OpenGL... and since I'm running Linux, only the latter figures are of any interest to me... At least in the past they used to have both OpenGL and DirectX tests...

Re:How do gaming cards perform in OpenGL? (1)

neumayr (819083) | more than 6 years ago | (#22320300)

Just check the benchmarks for OpenGL games. Like any Doom3 based ones.

Thard? (1)

beavioso (853680) | more than 6 years ago | (#22319908)

What does this tag mean? And on a related note why does whatcouldpossiblygowrong show up every third post? /rant

Riddle me this - subpixel accuracy? (1)

Overzeetop (214511) | more than 6 years ago | (#22320868)

Okay, I can appreciate the ability to render using OpenGL in hardware for programs that are (a) on non-Windows workstations or (b) have no support for DirectX. However, why do you give a rat's ass about sub-pixel, or even pixel-level, accuracy for cards that are rendering in real time for workstation graphics? It's not like users can actually see, or need to see, that type of effect. Unless you're recording off the video output (why?) instead of rendering to a completed file, it won't matter; and you would always render to a file instead of capturing the video out for a final presentation.

Now, if you're talking real time broadcast...why are you even looking at affordable - that's mission critical kinds of stuff.

Actually, I'm fighting with this now. My CAD tech needs a new workstation, and I really don't want to spend the money on a card which is expensive simply because it's a lower-volume business application. Besides, we use AutoCAD, which is developing their DirectX engine instead of their OpenGL implementation (That and we do almost exclusively 2D work.). If there's a business reason to spend an extra $300, I'm all ears.