
ATI Introduces FireGL V5000

Zonk posted more than 9 years ago | from the shiny-can-be-expensive dept.

Graphics

karvind writes "Folks at Tom's Hardware are running a review of ATI's new FireGL V5000. The card's X700 processor, code-named R410GL, is based on a 110-nanometer process, and the card sports eight pixel pipelines, six geometry engines, 128 MB of GDDR3 memory, dual DVI connectors for multi-display applications, and dual-link support for 9-megapixel displays. Anandtech has also posted a review."


GL? (1)

darklingchild (726827) | more than 9 years ago | (#11789761)

Does this mean it's going to be more Linux-compatible? (OpenGL)

Re:GL? (1)

northcat (827059) | more than 9 years ago | (#11789935)

All major cards accelerate OpenGL (support OpenGL). And no, OpenGL support does not necessarily mean support for Linux.

Re:GL? (2, Informative)

tonsofpcs (687961) | more than 9 years ago | (#11791447)

The FireGL series has been around since it was a workstation graphics card line owned by Diamond. ATI bought Diamond's graphics card business a while ago, and then started to make its own FireGLs. The new ones are more gaming-oriented than the old ones were, but they are still decent workstation graphics cards. They are supported in Linux using the default ATI driver, as far as I am aware. These cards are called FireGL because of their amazing OpenGL acceleration.

My Diamond FireGL 3000 is sitting around waiting for a new machine (the old one died); until then, I can't really tell you much about Linux support.

Wow (2, Funny)

TheKidWho (705796) | more than 9 years ago | (#11789766)

The video card for the rest of us...

Re:Wow (1)

thedustbustr (848311) | more than 9 years ago | (#11790538)

The card's X700 processor, code named R410GL, is based on a 110-nanometer process and the card sports eight pixel pipelines, six geometry engines, 128 MB of GDDR3 memory...

The question is: can it run Doom 3?

Re:Wow (1)

beardz (790974) | more than 9 years ago | (#11791759)

More importantly, who cares? :)

Re:Wow (0)

Anonymous Coward | more than 9 years ago | (#11793012)

More interesting: will the driver a) run on the next operating system from Microsoft, or will I have to buy a new card then? b) ever run on Linux at decent speed? c) run stably at all on any OS, current or future?

Man! (1)

Faust7 (314817) | more than 9 years ago | (#11789773)

It has "Fire" in its name and it's red! Is ATI trying to create its own ill omens?

Gotta catch 'em all (5, Funny)

tepples (727027) | more than 9 years ago | (#11789864)

Will ATI go on to make a LeafGL card that's green?

Re:Gotta catch 'em all (4, Funny)

TheKidWho (705796) | more than 9 years ago | (#11789879)

only after they make WindGL, EarthGL, and WaterGL.

Then Captain Planet can come and save the day from the evil corporations!

Re:Gotta catch 'em all (2, Funny)

tehshen (794722) | more than 9 years ago | (#11789978)

You're missing HeartGL! Spread the love! Embrace ATI!

Re:Gotta catch 'em all (1)

xoboots (683791) | more than 9 years ago | (#11790002)

"Will ATI go on to make a LeafGL card that's green?"

Since they are based in Toronto, if they made a LeafGL card, it would be blue.

*NHL is dying (0)

Anonymous Coward | more than 9 years ago | (#11790066)

The NHL is dying, and Google doesn't give me any references for Pokemon LeafGreen (for GBA) being changed into LeafBlue for the Canadian market.

Re:Man! (1)

game kid (805301) | more than 9 years ago | (#11789968)

No, but if it can render Lindsay Lohan [taod.com] or Angie Everhart [celebritybattles.com] with fiery red hair in 3D then I'm saving up.

Obviously anything that's fire red is fair game for me, especially gfx cards. (And girls.) I wonder how it compares with the Radeon Xxxx's though; I don't think workstation and home cards should be put in different classes, and it just seems a bit weird that Radeons and GeForces aren't even mentioned. Since I don't work on a workstation, I guess it's not my business though.

Re:Man! (1)

FuzzyBad-Mofo (184327) | more than 9 years ago | (#11792577)

Yeah, but does it run Linux?

Re:Man! (1)

metricmusic (766303) | more than 9 years ago | (#11793243)

as does Firefox and it's a raging success. :)

Only exciting for Windows users and Linux Experts (1, Troll)

wcitech (798381) | more than 9 years ago | (#11789778)

...because us linux n00bs have no idea how to get ATI cards to work in Linux. I love you nVidia!

One man's mid-range is another man's budget.... (2, Insightful)

erick99 (743982) | more than 9 years ago | (#11789780)

Since this product is aimed at the mid-range market with its price-tag of $699...

I do understand that is a mid-range price for a mid-range card, but, damn, I just bought my son a very nice computer with a very serviceable video card for less than that.

Re:One man's mid-range is another man's budget.... (5, Informative)

TheKidWho (705796) | more than 9 years ago | (#11789790)

It's a workstation graphics card, not a gaming card...
Take a look at the other FireGLs or Quadros; they go for $2,000 and above!

Re:One man's mid-range is another man's budget.... (5, Insightful)

be-fan (61476) | more than 9 years ago | (#11789852)

Does your son by any chance model jet engine compressors on that thing? It's a total apples to oranges comparison! It's like saying that a 777 is more expensive than your Toyota. Strictly, it's true, but it's a meaningless statement.

Re:One man's mid-range is another man's budget.... (5, Funny)

rtaylor (70602) | more than 9 years ago | (#11789902)

Does your son by any chance model jet engine compressors on that thing? It's a total apples to oranges comparison! It's like saying that a 777 is more expensive than your Toyota.

Not only that, but the Toyota is easier to parallel park and handles tight corners better.

Re: Toyota (1)

MachDelta (704883) | more than 9 years ago | (#11791275)

Not only that, but the Toyota is easier to parallel park and handles tight corners better.
I take it you've never driven a Toyota Tundra?

*Insert foghorn sound here*

Re:One man's mid-range is another man's budget.... (1)

Hektor_Troy (262592) | more than 9 years ago | (#11791604)

Not only that, but the Toyota is easier to parallel park and handles tight corners better.
Sure, but its bandwidth when fully loaded with digital storage is nothing compared to that of the 777!

Re:One man's mid-range is another man's budget.... (0)

Anonymous Coward | more than 9 years ago | (#11789919)

I was sort of hoping they would model jet engine compressors with a CPU... I often wonder how people got anything in the air with paper and pencil, when it takes 50 engineers and 1200 computers to design one turbine blade today.

Re:One man's mid-range is another man's budget.... (0)

Anonymous Coward | more than 9 years ago | (#11789958)

Primary design is based on the capacity to exhibit certain effects; refinements increase the efficiency of those effects. The first gas turbines required more fuel to operate; the refined designs required less because the engine had been made with lighter and stronger materials, decreasing weight and thus fuel consumption by that degree for the same load.

Slashdotted - Google cache link (0, Redundant)

DigitalBubblebath (708955) | more than 9 years ago | (#11789782)


Link to cached printable version: FireGL V5000 [64.233.183.104]

Re:Slashdotted - Google cache link (1)

TheKidWho (705796) | more than 9 years ago | (#11789842)

Tom's Hardware and Anandtech getting slashdotted??!?!

Sorry but you are overestimating the slashdot affect here.

Re:Slashdotted - Google cache link (1)

DigitalBubblebath (708955) | more than 9 years ago | (#11789889)

Haha yeah maybe! Server wasn't responding for 5 minutes or so after the story was published. Bleh.

Re:Slashdotted - Google cache link (1, Funny)

Anonymous Coward | more than 9 years ago | (#11789921)

affect, effect, whatever....

128MB? (0)

Anonymous Coward | more than 9 years ago | (#11789791)

Isn't that a bit skimpy for a high end graphics card designed to run dual dual-link displays?

9 megapixel * 3 bytes per pixel * 2 displays = 54 MBytes just for the display
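The arithmetic above can be sketched quickly; this minimal Python check uses only the comment's own figures (9 megapixels per display, 3 bytes per pixel, 2 displays):

```python
# Rough framebuffer-memory estimate for two 9-megapixel displays
# at 24-bit color (3 bytes per pixel), as in the comment above.
MEGAPIXELS_PER_DISPLAY = 9_000_000
BYTES_PER_PIXEL = 3  # 24-bit RGB
DISPLAYS = 2

framebuffer_bytes = MEGAPIXELS_PER_DISPLAY * BYTES_PER_PIXEL * DISPLAYS
framebuffer_mb = framebuffer_bytes / (1024 * 1024)

print(f"{framebuffer_mb:.1f} MiB just for the front buffers")
# The comment's "54 MBytes" uses decimal megabytes (54,000,000 bytes);
# in binary mebibytes that's about 51.5 MiB. Double buffering or any
# depth/stencil surfaces would push the total higher still.
```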

Re:128MB? (1)

TheKidWho (705796) | more than 9 years ago | (#11789862)

That's assuming you're going to be running the T221 or another high-end monitor with that kind of resolution. Plus, if you can afford a T221, you can easily afford a 256 MB or 512 MB FireGL/Quadro.

Pointless benchmark? (3, Insightful)

rokzy (687636) | more than 9 years ago | (#11789810)

They take an OpenGL workstation card, the only type of ATI card with proper Linux support, and benchmark it on XP SP2?

Actually (0)

Anonymous Coward | more than 9 years ago | (#11789891)

Actually, believe it or not, it's not always about Linux. Many workstation-class programs only run on Windows. Maybe the target application isn't designing engines but digital content creation.

Re:Pointless benchmark? (3, Interesting)

Rufus211 (221883) | more than 9 years ago | (#11790138)

(-1, Troll)

The first linux drivers ATI released were for their firegl line of workstation cards. You could hack them to work with the normal cards, but for quite a while now ATI has provided drivers that work with all the cards. In fact, you can read anandtech's review of ATI and nVidia cards under Linux here [anandtech.com] .

Re:Pointless benchmark? (1)

rokzy (687636) | more than 9 years ago | (#11790851)

(-1, Troll) to you too.

I said only the FireGL has proper Linux support. This is true. I'm not talking about hacked drivers released once or twice a year for out-of-date versions of X. I'm talking about support printed on the card box and backed up with full customer service.

Re:Pointless benchmark? (0)

Anonymous Coward | more than 9 years ago | (#11790604)

I am considering getting an ATi card for an upgrade and I want something on par with a Radeon 9600, but I want to set it up in a dual-boot system with Windows. The FireGL 9600 is the equivalent card and can be had for less than $300. Has anyone tried gaming on a FireGL under Linux? Did you get decent gaming performance, particularly in Neverwinter Nights and UT2k4? The rest of the system will have a 2.0 GHz Pentium 4 (Northwood core) and 768 MB of RDRAM. Alternatively, have ATi's Radeon Linux drivers stopped offering poor performance yet?

Folks, WORKSTATION card, not gaming (4, Insightful)

MightyPez (734706) | more than 9 years ago | (#11789811)

These cards are meant to be used for workstation uses like 3D editing and creation. These aren't gaming cards. I realize you bought your gaming card for far less, but these are a completely different product.

well (5, Informative)

FidelCatsro (861135) | more than 9 years ago | (#11789868)

You can do a small modification to some ATI Radeons to make them FireGL cards: http://www.rojakpot.com.nyud.net:8090/default.aspx?location=3&var1=185&var2=0 [nyud.net]

Re:well (0)

Anonymous Coward | more than 9 years ago | (#11790106)

It's basically the same card, just with some transistors turned off. It's all marketing: the people who use CAD apps have enough money to spend on these things, so they bump the price up, because it's not a (home) consumer market. NVIDIA does this also.

I assume their drivers are more optimized for CAD/CAM applications to better interact with the software.

Re:well (1)

enosys (705759) | more than 9 years ago | (#11790748)

Very interesting! Thanks for posting that.

I've been trying to find out what actually changes. It doesn't seem like any extra circuitry is enabled when "upgrading" a Radeon 9800 to a FireGL X2. Benchmarks [rojakpot.com] show an impressive increase in performance in CAD-type applications, but the 3DMark score actually decreases. It seems like this may just be a change from a driver optimised for gaming to a driver optimised for professional use.

I also found the FORSAGE [hardwarelab.ru] driver which should supposedly allow one to "upgrade" to FireGL without any soldering or a new BIOS.

Re:Folks, WORKSTATION card, not gaming (1)

sgant (178166) | more than 9 years ago | (#11789962)

Yes, we realize this...and no, the FireGL wouldn't make a kick-ass gaming card either. Gaming cards and workstation cards address two different areas.

But I have to ask, why this story in the gaming section of Slashdot?

Re:Folks, WORKSTATION card, not gaming (1)

MightyPez (734706) | more than 9 years ago | (#11790672)

Yes, we realize this...

Talk to the people a few topics above this one.

Re:Folks, WORKSTATION card, not gaming (1)

johannesg (664142) | more than 9 years ago | (#11790157)

these are a completely different product.

Are they? Both NVidia and ATI base their workstation boards on their gaming boards, and AFAIK it is generally understood they are damn near identical. Is there any difference worth mentioning between these cards and their gaming equivalents? If so, what and why?

Irony (-1)

Anonymous Coward | more than 9 years ago | (#11789812)

A name like FireGL for, what is probably, the hottest card on the market.

Sometimes I get confused... (2, Insightful)

aendeuryu (844048) | more than 9 years ago | (#11789814)

Since this product is aimed at the mid-range market with its price-tag of $699 (630), potential customers can't expect the full feature set.

Hold the friggin' phone. $700 is mid-range? What, do you have to take out a second mortgage to get top-of-the-line stuff?

Anyway, it's good to see that ATI is going with V**** enumerations to match NVidia's Quadro FX ***** enumerations. Those X700/X800 and 6600/6800 patterns were too easy to remember, IMHO. It's not a free market unless you're confusing the hell out of your customer base with numbering schemes.

Re:Sometimes I get confused... (0)

Anonymous Coward | more than 9 years ago | (#11789834)

If you can't afford $700 for a professional card, you certainly can't afford the software that needs it.

Re:Sometimes I get confused... (3, Insightful)

TheKidWho (705796) | more than 9 years ago | (#11789856)

omg omg omg the corporations are out to get us!!!! run for cover!!!

You are being overly ignorant; these video cards are workstation graphics cards. The higher-end versions usually cost somewhere in the range of $2,000 and above. Not to mention the software that actually benefits from these cards costs on the order of $1,000-$10,000+.

Yes they certainly are gouging the engineers because you know, engineers can't keep track of numbers...

Re:Sometimes I get confused... (2, Informative)

delirium of disorder (701392) | more than 9 years ago | (#11789874)

$700 is still mid-range. You want high end....check this out:

http://www.sgi.com/products/visualization/prism/

Re:Sometimes I get confused... (4, Insightful)

Jeff DeMaagd (2015) | more than 9 years ago | (#11789883)

If you are paying engineers and designers $60k or more a year, it makes sense to provide them a product that maximizes their productivity.

Workstation cards are optimized, validated, and supported for specific products. Companies that make the software these things run heavily test their products against specific driver revisions. Compared to the annual wage of the people who use them, that's peanuts. Think Avid, SolidWorks, RenderMan and such. Don't think Blender or other consumer or hacker software.
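The productivity argument above is easy to put in numbers. This is a hypothetical back-of-the-envelope sketch using the thread's figures ($60k salary, the card's $699 price); the 2,000 working hours per year is my own assumption for illustration:

```python
# Hypothetical amortization sketch: how many saved work-hours
# pay for the card? Figures come from the thread except where noted.
salary_per_year = 60_000        # from the comment above
card_cost = 699                 # FireGL V5000 list price, from the summary
working_hours_per_year = 2_000  # assumption: ~50 weeks x 40 hours

hourly_rate = salary_per_year / working_hours_per_year
breakeven_hours = card_cost / hourly_rate

print(f"At ${hourly_rate:.0f}/hour, the card pays for itself "
      f"after about {breakeven_hours:.0f} hours saved")
```

Under those assumptions the card is paid for by roughly half a week of saved engineer time, which is the "peanuts" point in a nutshell.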

I stand corrected... (1)

aendeuryu (844048) | more than 9 years ago | (#11789985)

Okay, I'll have to take your word for it, but $700 still seems steep, considering it's been shown possible to manually hack some of the gaming cards with the equivalent hardware into FireGL cards.

As stated in someone else's post that covered the hack -- "As many of you already know, the GPUs that ATI use in their desktop graphics cards are the same GPUs used in their workstation-grade graphics cards. The reason for the performance differences between desktop and workstation graphics cards lie in the driver."

Seems like you're paying an extra few hundred dollars for software, not hardware.

Re:I stand corrected... (0)

Anonymous Coward | more than 9 years ago | (#11791160)

But you ARE paying for software... you're paying for validated drivers with extra code intended to ensure they'll work as expected in your 'mission critical' video application. A misrendered pixel might be okay when playing a videogame, but for most of the guys doing high-end CAD/modelling it could very well botch the job they're doing and, unnoticed, cost lost time and revenue when it eventually shows up and needs to be fixed (the pixel is just an example; I don't have a better one atm).

Besides, if nobody's paying for the new drivers/firmware, how are you going to be able to make your gaming Radeon into a FireGL card? Firmware has to exist before you can reflash something to 'upgrade' it, ya know :)

Re:I stand corrected... (1)

snuf23 (182335) | more than 9 years ago | (#11791184)

Seems like you're paying an extra few hundred dollars for software, not hardware.

Exactly. High quality OpenGL drivers optimized for professional applications are expensive to develop. They also are not necessarily going to give you the performance you would want for OpenGL games. So ATI doesn't just use the same drivers for both.
I think on the consumer level, ATI is primarily concerned with DirectX and creates a tuned OpenGL driver that implements features required by popular OpenGL gaming engines (i.e. Quake and Doom 3).

Not only software, also hardware (1)

Harry Balls (799916) | more than 9 years ago | (#11792633)

The ATI V5000 has a dual-link DVI output, and apart from other FireGLs and a recently introduced Macintosh card there IS NO video card for PCs on the market that features a dual-link DVI output.

Why should anybody care?

If you want to hook up the 30" Apple LCD monitor, you NEED a dual-link DVI interface, and boy, have I been drooling over the 30" monitor ever since it was introduced.
(Not that I could afford it at its $3000 list price, but that's a different topic.)

Re:Not only software, also hardware (0)

Anonymous Coward | more than 9 years ago | (#11793009)

What are you talking about?

I purchased a video card for around $200 a year and a half ago that has dual DVI out. Gainward's "Ultra 780/XP"; an Nvidia FX5600 part with 256 megs of RAM.

I'd have to imagine that dual DVI parts are much cheaper today.

Re:Not only software, also hardware (3, Informative)

TheRaven64 (641858) | more than 9 years ago | (#11793522)

Dual Link DVI is not the same thing as simply having two DVI ports. Dual Link DVI ports (of which this card has two) have twice the signal bandwidth of standard DVI ports, and so can drive higher resolution displays.
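The bandwidth difference can be made concrete. Single-link DVI tops out at a 165 MHz pixel clock; dual link doubles that. The 20% blanking overhead below is a rough assumption (real display timings vary), so treat these as ballpark figures rather than exact ones:

```python
# Approximate refresh rates achievable over single- vs dual-link DVI.
SINGLE_LINK_MHZ = 165            # DVI single-link pixel-clock cap
DUAL_LINK_MHZ = 2 * SINGLE_LINK_MHZ

def max_refresh_hz(h, v, pixel_clock_mhz, blanking_overhead=1.20):
    """Rough achievable refresh rate; 20% blanking is an assumption."""
    return pixel_clock_mhz * 1e6 / (h * v * blanking_overhead)

# 2560x1600 is the Apple 30" Cinema Display's native resolution.
print(f"single link: ~{max_refresh_hz(2560, 1600, SINGLE_LINK_MHZ):.0f} Hz")
print(f"dual link:   ~{max_refresh_hz(2560, 1600, DUAL_LINK_MHZ):.0f} Hz")
# Single link falls well short of 60 Hz at this resolution;
# dual link clears it comfortably.
```

This is why the 30" displays discussed elsewhere in the thread require a dual-link card: a single link simply cannot push 2560x1600 at a usable refresh rate.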

Ati Schmati (1, Informative)

darth_silliarse (681945) | more than 9 years ago | (#11789831)

Bet the drivers suck for a year as usual, just in time for the next product line....

drivers... (1)

Penguinoflight (517245) | more than 9 years ago | (#11791918)

Bet the drivers suck for a year as usual, just in time for the next product line....

Which were you talking about, the 777 or the Toyota?

January Called... (0, Offtopic)

Hack Jandy (781503) | more than 9 years ago | (#11789835)

They want their headline back.

Why wait a month to post this now?

Re:January Called... (0)

Anonymous Coward | more than 9 years ago | (#11789943)

because you touch yourself at night.

Dell news and hardware reviews on Slashdot? (0, Offtopic)

boingyzain (739759) | more than 9 years ago | (#11789841)

Slow day, I suppose.

and a great marketing department (1)

exoir (826214) | more than 9 years ago | (#11789878)

"based on a 110-nanometer process and the card sports eight pixel pipelines, six geometry engines..." How many geometry engines does your card have?

Re:and a great marketing department (0)

Anonymous Coward | more than 9 years ago | (#11790360)

I dunno, but it does have Blast Processing.

How do these compare (3, Interesting)

xRelisH (647464) | more than 9 years ago | (#11789880)

to regular video cards? I've always been curious what exactly these cards offer (other than more raw power) over regular video cards, aside from the dual DVI setup.

Are there any benchmarks comparing regular video cards against these graphics workstation cards on modelling? Also, how do these cards do in games? Do they perhaps do worse (optimizations toward different types of rendering, like more photo-realistic hardware rendering that isn't that distinguishable in games but is for 3D work)?

Re:How do these compare (5, Informative)

TheKidWho (705796) | more than 9 years ago | (#11789905)

The drivers are more optimized for the tasks they perform. And yes, there are benchmarks, and no, they are not better than gaming-specific cards. Usually the gaming-specific video cards beat the living shit out of the workstation graphics cards.

Here [anandtech.com]

Re:How do these compare (1)

Holi (250190) | more than 9 years ago | (#11790594)

Actually workstation cards tend to do better at applications with higher polygon counts, while gamer (consumer) cards handle textures much better.

Re:How do these compare (1)

mikael (484) | more than 9 years ago | (#11791138)

I've always been curious what exactly these cards offer (other than more raw power) over regular video cards, aside from the dual DVI setup.

Workstation cards have more hardware for switching between rendering contexts and for multi-window overlap tests. Along with faster clock speeds and more pixel pipelines as well as support for overlay and underlay planes.

Since games run in full-screen mode, you only need one rendering context, can skip the multi-window overlap tests, and dump the overlay/underlay planes.

Re:How do these compare (0)

Anonymous Coward | more than 9 years ago | (#11791597)

Huh? More hardware where? It's the same chip.
It's only drivers.

Re:How do these compare (1)

mikael (484) | more than 9 years ago | (#11792118)

Device drivers can be used to emulate features not implemented in hardware. Ever heard of the OpenGL software implementation? Remember when Nvidia boasted that they were the first to implement T&L (Transform and Lighting) in hardware? And there is nvEmulate for emulating vertex and fragment shaders.

Although, saying that, I've noticed that an FX5600 laptop supports the OpenGL Shading Language (with the exception of conditional looping) under Linux, but not under Windows XP, using the exact same chip.

Re:How do these compare (1)

greazer (165571) | more than 9 years ago | (#11793166)

Actually, clock speeds are often lower than those of the gaming-card counterparts. Mostly they're just enabling workstation-specific features.

Re:How do these compare (1)

greazer (165571) | more than 9 years ago | (#11793153)

These workstation-class cards offer more than just speed. They have special features enabled that are useful only for workstation applications: features like anti-aliased line drawing, useless for games but critical for CAD and similar workstation apps.

Actually, strictly speaking, these cards are often slower versions of their gaming counterparts. FPS is not as important for workstation purposes; most cards are fast enough to display the datasets needed in most workstation apps. When you buy one of these cards, you're paying for the extra effort that went into building special features into the GPU and drivers for this small market.

At the end of the day, it's not that much money for the companies who need these cards. The average salary of the engineer or artist using it far outweighs the cost of the card, especially considering these same companies were buying SGI workstations costing $30k+ less than ten years ago.

In other news... (0, Funny)

Anonymous Coward | more than 9 years ago | (#11789888)

massive power outages in areas where a new GPU card has been sold.

Re:In other news... (0, Funny)

Anonymous Coward | more than 9 years ago | (#11789955)

i, for one, welcome our new troll-modding overlords

Re:In other news... (0)

Anonymous Coward | more than 9 years ago | (#11790092)

uh, well, at least this proves the theory right..

Aren't FireGLs the same as regular cards? (4, Informative)

BigBuckHunter (722855) | more than 9 years ago | (#11789967)

If I remember correctly, ATI FireGL cards use the same chip as their normal line, with one or two resistors added/removed from the external chip packaging. All you have to do is:

1: Remove/add the resistors and change the BIOS.
or

2: Use a readily available hacked driver to recognize your stock card as a FireGL.

All in all, there is no market for a 128MB solid modeling card. We had 128MB video cards in 1996 (Glint based). This card would be a huge step backward for a number of engineers.

BBH

Re:Aren't FireGLs the same as regular cards? (0)

Anonymous Coward | more than 9 years ago | (#11790022)

I'm surprised people don't mention 3Dlabs' line of cards. I got a VX1 for a friend back in '98 for his CAD box and he swears that it still rips the latest GeForce and Radeon offerings in OpenGL for AutoCAD.

link here [3dlabs.com]

Re:Aren't FireGLs the same as regular cards? (0)

Anonymous Coward | more than 9 years ago | (#11790048)

I seriously doubt your friend, even if he's talking about a 1998 workstation card vs. a 2005 gaming card.

The 1998 card might have accelerated OpenGL line draws, but you'll get that with a current-era workstation card too.

Re:Aren't FireGLs the same as regular cards? (0)

Anonymous Coward | more than 9 years ago | (#11790094)

Of course... but the point is that it was 1998, 7 years ago, and it was a 400-dollar card!

Mmmmmm (0, Offtopic)

Anonymous Coward | more than 9 years ago | (#11790006)

That's lovely dear. Would you like a cup of tea?

512MB Goodness (0, Offtopic)

krunk4ever (856261) | more than 9 years ago | (#11790047)

On a side note, ATI will also be releasing a 512MB version [slashdot.org]. Even though ATI acknowledges there will probably be no performance benefit to bumping up the memory, they believe there will still be a large market of "people who don't know better" who'll purchase it.

Damnit! When will they stop? (4, Funny)

multiplexo (27356) | more than 9 years ago | (#11790104)

I just sold my left kidney so I could afford an nVidia 6800. I'm not selling a testicle just so I can upgrade my video card! Unless it gives me a 3fps framerate increase in Doom III, then I might consider it.

Trust me. (0)

Anonymous Coward | more than 9 years ago | (#11791614)

You won't need it.

So yeah, it's worth it.

Re:Damnit! When will they stop? (1)

mnmn (145599) | more than 9 years ago | (#11793298)

I'm at pains to wonder who will buy it.

Nice and timely article (0, Redundant)

Rufus211 (221883) | more than 9 years ago | (#11790151)

from the anandtech article:
Date: January 31st, 2005

Re:Nice and timely article (0)

Anonymous Coward | more than 9 years ago | (#11790216)

So what .. whey didn't u post the article on time lazy ass.

Re:Nice and timely article (0)

Anonymous Coward | more than 9 years ago | (#11790286)

Why didn't u post story lazy ass.

code numbers galore (1)

spywarearcata.com (841806) | more than 9 years ago | (#11790210)

Wow! This must be a great product with all the X's and four digit numbers in all the coded names!

I've got to tell Boddicker before he uses the Cobra gun on the SUX3000!

Can you believe this? (0)

Anonymous Coward | more than 9 years ago | (#11790379)

I really and honestly cannot believe that all of you morons would argue over the dumbest shit.

You people are in dire need of a life.

They told me that morons hang out here and I just had to see it for myself.

meh (0)

Anonymous Coward | more than 9 years ago | (#11790526)

Still doesn't come close to the 3Dlabs Wildcat Realizm 800.

Good try, ATI, but no cigar.

Why is this in games? (1)

Holi (250190) | more than 9 years ago | (#11790554)

Why is a mid-range workstation FireGL video card being discussed in slashdot/games? This is not your gamer's video card; it's meant for OpenGL apps in a workstation setting.

Re:Why is this in games? (1)

the angry liberal (825035) | more than 9 years ago | (#11792971)

Probably because there is no slashdot/workstation section.

Thanks for posting this... It's not like people who actually read the thread didn't get to see virtually the same post fifty times. >:)

I don't friggin care! (0)

Anonymous Coward | more than 9 years ago | (#11790689)

I have long since lost interest in video card marketing exercises.

Alphabet soup of model numbers & meaningless tiny increments in performance requiring massive expenditure of money to keep up with - NO THANKS!

Not really made for Games... (0)

holymoo (660095) | more than 9 years ago | (#11791042)

This card is made for high-end ray-tracing renders, which accounts for the high price tag. For instance, most users would not buy a Xeon for playing HL2, since a Prescott Pentium 4 would provide better gaming performance. As for Linux support, it looks like you're going to have to wait till they get around to releasing decent drivers, which are still a pain to work with under Linux.

Re:Not really made for Games... (0)

Anonymous Coward | more than 9 years ago | (#11793399)

Equally, you wouldn't use a Prescott if an AMD FX-55 was within your grasp.

It's been proven time and again: Intels are NOT GAMING CPUs. AMD has only one CPU on the 939 line that underperforms an Intel, the 3000+. (I refer you to http://anandtech.com/cpuchipsets/showdoc.aspx?i=2330 [anandtech.com])
Quote: "Things don't look so good for Intel here, the Pentium 4 570J is the only Intel CPU capable of outperforming the Athlon 64 3000+. Unfortunately for Intel, AMD's Athlon 64 4000+ is about 14% faster at 1280 x 1024.

The Extreme Edition CPUs don't do much for Intel, as Prescott does appear to perform equal or better clock for clock than the older Northwood core."
Given that the 570J is a whole 2GHz faster in clock, that's astonishing. What happened to Intel's product superiority? Out the window(s), that's where.

yes, but.... (0)

coopseruantalon (835573) | more than 9 years ago | (#11791984)

yes, but will it run Linux?

Does it work with the 30" Apple LCD monitor? (1)

Harry Balls (799916) | more than 9 years ago | (#11792568)

The 30" Apple LCD monitor REQUIRES a dual-link DVI card.
The ATI V5000 card has dual link capability on one of its output channels.
Thus, they SHOULD work together.

Now, has anybody tried, do they ACTUALLY work together in real life?
(Not that I have the $3700 lying around that will pay for both the graphics card and the monitor.)

Re:Does it work with the 30" Apple LCD monitor? (1)

JackAxe (689361) | more than 9 years ago | (#11793007)

You get a nice discount as a developer. :) I saved $300 on my 30". The screen and video card were $1100 less than the retail price. When I subtract the $500 ADC Select fee, I saved about $600 total. But now I get Tiger seeds every other month and a support call or something like that. :)

A 30" is worth every penny, if you work with graphics.

Confirmation (3, Informative)

Dixie Flatliner (850959) | more than 9 years ago | (#11792843)

Workstation cards provide almost no performance benefit for games, unless those games are entirely OpenGL-based, in which case they simply provide very poor performance. They do, however, run Maya and other high-end rendering environments, something even your papa's SLI 6800U can't handle. That said, I've tried another FireGL card in this performance range and was less than impressed. Stick with a Quadro FX3000 if you're at all serious about what you do.

And yes, it will work perfectly with an Apple 30" Cinema display.

Apple 30" Cinema
Dual Xeon 3.2GHz
4GB ECC DDR RAM
Quadro FX3400

Sounds ok... (1)

miyako (632510) | more than 9 years ago | (#11793574)

The specs on the card look nice, though I have to wonder why it only has 128MB of memory for a "mid-range" card. Most other mid-range cards tend to have at least 256MB of RAM, and nearly all of the high-end cards have at least 512MB (the card I've been eyeing, though I can't really justify the cost right now, has 640MB of memory). Of course, it's been quite a while since I've used either a FireGL or a Quadro; last I remember, neither offered much bang for the buck or could really compete with cards from the manufacturers who just make workstation cards.