ATI's 1GB Video Card

ScuttleMonkey posted more than 8 years ago | from the choke-on-the-price dept.

Signify writes "ATI recently released pics and info about its upcoming FireGL V7350 graphics card. The card is a workstation graphics accelerator featuring 1GB of GDDR3 memory. From the article: 'The high clock rates of these new graphics cards, combined with full 128-bit precision and extremely high levels of parallel processing, result in floating point processing power that exceeds a 3GHz Pentium processor by a staggering seven times, claims the company.'"

273 comments

use as a cpu? (5, Interesting)

Toba82 (871257) | more than 8 years ago | (#14969791)

Why doesn't ATi (or nVidia for that matter) make CPUs?

They obviously could make some very powerful chips.

Re:use as a cpu? (2, Insightful)

Tyler Eaves (344284) | more than 8 years ago | (#14969806)

Because for most general computing tasks, floating point doesn't matter.

Re:use as a cpu? (1)

VoiceOfRaisin (554019) | more than 8 years ago | (#14969931)

How is this insightful? The parent didn't say their GPUs are CPUs; he asked why they don't make CPUs. Obviously they wouldn't be exactly the same as the chips they make now. Duh.

Re:use as a cpu? (4, Insightful)

Kenshin (43036) | more than 8 years ago | (#14969818)

I'm thinking:

a) Tough market to crack. AMD's been around for years, and they're still trying to gain significant ground on Intel. (As in mindshare.) May as well spend the effort battling each other to remain at the top of their field, rather than risk losing focus and faltering.

b) These chips are specialised for graphics processing. Just because you can make a kick-ass sports car, doesn't mean you can make a decent minivan.

Re:use as a cpu? (2, Insightful)

Segfault666 (662554) | more than 8 years ago | (#14969971)

... Just because you can make a kick-ass sports car, doesn't mean you can make a decent minivan.

http://personal.inet.fi/surf/porschenet/text/fut_plan/varrera/index.html

enjoy...

Re:use as a cpu? (5, Informative)

TheRaven64 (641858) | more than 8 years ago | (#14969836)

Building a GPU is trivially easy relative to building a CPU. Here are a few reasons why:
  • You have an OpenGL driver for the GPU and a JIT for shader language programs. This means you can completely throw out the instruction set between minor revisions if you want to. An x86 CPU must mimic bugs in the 486 to be compatible with software that relied on them.
  • You have an easy problem. Graphics processing is embarrassingly parallel. You can pretty much render every pixel in your scene independently[1]. This means that you can almost double the performance simply by doubling the number of execution units. To see how well this works for general purpose code, see Itanium.
  • The code you are running is fairly deterministic and unbranching. Up until a year or two ago, GPUs didn't even support branch instructions. If you needed a branch, you executed both code paths and threw the result you didn't need away. Now, branches exist, but they are very expensive. This doesn't matter, since they are only used every few thousand cycles. In contrast general purpose code has (on average) one branch every 7 cycles.
GPUs and CPUs are very different animals. If all you want is floating point performance, then you can get a large FPGA and program it as an enormous array of FPUs. This will give you many times the maximum theoretical floating point throughput of a 3GHz P4, but will be almost completely useless for over 99% of tasks.

[1] True of ray tracing. Almost true of current graphics techniques.
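As a rough illustration of what "embarrassingly parallel" means here, a minimal C sketch with a stand-in gradient shader (obviously not ATI's actual pipeline):

    #include <stdio.h>
    #include <stdlib.h>

    typedef struct { float r, g, b, a; } Color;

    /* Stand-in per-pixel shader: just a colour gradient. The point is that
     * every invocation is independent of every other one. */
    static Color shade(int x, int y, int w, int h)
    {
        Color c = { (float)x / w, (float)y / h, 0.5f, 1.0f };
        return c;
    }

    int main(void)
    {
        int w = 640, h = 480;
        Color *fb = malloc((size_t)w * h * sizeof *fb);

        /* No pixel reads another pixel's result, so these iterations could
         * be spread across as many execution units as the chip has. */
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++)
                fb[y * w + x] = shade(x, y, w, h);

        printf("centre pixel: r=%.2f g=%.2f\n",
               fb[(h / 2) * w + w / 2].r, fb[(h / 2) * w + w / 2].g);
        free(fb);
        return 0;
    }

Doubling the number of units working on that inner loop roughly doubles throughput, which is exactly the property general purpose code usually lacks.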

Mod this dude up (0)

Anonymous Coward | more than 8 years ago | (#14969856)

As much as I hate these "mod up" posts, this guy deserves it. It covers everything well. Wish I had mod points to peg this as "Informative"

Re:Mod this dude up (0, Flamebait)

dotgain (630123) | more than 8 years ago | (#14970115)

Wish I had mod points to peg this as "Informative"

Or even "Flamebait" since that's what the blunt, honest truth seems to get these days...

Re:use as a cpu? (1, Interesting)

moosesocks (264553) | more than 8 years ago | (#14969899)

I understand that CPUs and GPUs have dramatically different roles.

However, how difficult would it be to write an operating system that offloaded floating point operations to the GPU, and everything else to the CPU?

Seems like that would be making the most efficient use of available resources... (then again, isn't that the idea of the Cell processor?)

Re:use as a cpu? (1)

Dorothy 86 (677356) | more than 8 years ago | (#14969935)

However, how difficult would it be to write an operating system that offloaded floating point operations to the GPU, and everything else to the CPU?

Because then it wouldn't be Windows, which is what the majority of the target market uses.

Not likely OS level (2, Informative)

HornWumpus (783565) | more than 8 years ago | (#14969962)

But you can write very specific GPU code to solve some parallel FP problems.

For more general purpose FP ops you will have a hell of a time getting enough gain in floating point performance to overcome the overhead chatter between the CPU and GPU that would be required to keep the states in sync.

I'd go so far as to say that if the process can't be nearly completely moved onto the GPU (the CPU will need to feed it data and suck up results), then don't bother.

But I'm talking out of my ass, it could work. I'm just skeptical.

Re:use as a cpu? (1, Informative)

Anonymous Coward | more than 8 years ago | (#14970040)

The overhead of transferring all of this stuff to the graphics card for processing and back would be way too much for almost all FP operations. Even on a general purpose CPU, the overhead of fetching the data is usually more than the time it takes to process it. That's why there are caches. They exist to minimize this overhead.

However, if you have large amounts of data and want to process it over and over, you can afford to transfer it to a dedicated processor. This also happens to be what this card is designed to do.
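A rough back-of-the-envelope version of that argument; all the bandwidth and throughput numbers below are made-up round figures for illustration, not specs of this card or its bus:

    #include <stdio.h>

    int main(void)
    {
        /* Assumed figures, for illustration only. */
        double bytes          = 256.0 * 1024 * 1024; /* data set size        */
        double bus_bw         = 4.0e9;               /* bus, bytes/s         */
        double gpu_flops      = 3.0e11;              /* GPU throughput       */
        double flops_per_byte = 2.0;                 /* arithmetic intensity */

        double transfer = 2.0 * bytes / bus_bw;            /* there and back */
        double compute  = bytes * flops_per_byte / gpu_flops;

        printf("transfer: %.3f s, compute: %.4f s\n", transfer, compute);
        /* With so little work per byte, the bus dominates completely;
         * offloading only pays when the data stays on the card and gets
         * reused many times, as the parent post says. */
        return 0;
    }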

Re:use as a cpu? (1)

Andrzej Sawicki (921100) | more than 8 years ago | (#14970139)

Well, ATI offers software that allows you to use the GPU for computing physics. The method for offloading CPU load is indirect, but there you have it.

Re:use as a cpu? (1)

lostchicken (226656) | more than 8 years ago | (#14969917)

You're absolutely right on, but I've got a minor picky point about the first part. While the instruction set must remain stable pretty much downrange forever in a given architecture, this is only really true for the front-end instruction set, which isn't what the CPU actually executes on any current IA-32[e] design. Only the decode stage of things is bound by the instruction set, and the actual bulk of the work is done in the internal architecture which can, and does, change between different designs.

next thing you know... (1, Interesting)

Anonymous Coward | more than 8 years ago | (#14969991)

ATI might just add a CPU on the card in order to boost gaming performance and deliver the best gaming experience.

you don't render pixels independently... (1)

YesIAmAScript (886271) | more than 8 years ago | (#14970078)

For example, if you need to render a triangle, transforming the coordinates and finding the u,v (for texture mapping), then calculating the normals and interpolating them, all has a lot of setup math. You don't want to do this setup math for each pixel.

In fact, unlike ray tracing, where you proceed pixel by pixel, checking all the geometry, in modern techniques you render the parts of the geometry in order into the frame buffer, and due to the magic of z-buffering, the end result frame buffer is complete for all pixels at the same time.

So pretty much what you say about pixels being independent is nearly completely false.

You can still parallelize a lot of stuff, but it isn't as simple as you say and is done using different techniques.
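A toy sketch of the z-buffering idea mentioned above (generic C, not any particular card's rasterizer): geometry can be submitted in any order, and each pixel still converges to the nearest surface.

    #include <stdio.h>
    #include <float.h>

    typedef struct { float r, g, b; } Color;

    /* Depth-tested write: keep the candidate only if it is nearer than
     * whatever already occupies that pixel. */
    static void plot(float *zbuf, Color *fb, int w,
                     int x, int y, float z, Color c)
    {
        int i = y * w + x;
        if (z < zbuf[i]) {
            zbuf[i] = z;
            fb[i]   = c;
        }
    }

    int main(void)
    {
        enum { W = 4, H = 4 };
        float zbuf[W * H];
        Color fb[W * H] = { { 0, 0, 0 } };
        for (int i = 0; i < W * H; i++) zbuf[i] = FLT_MAX;

        Color red = { 1, 0, 0 }, blue = { 0, 0, 1 };
        plot(zbuf, fb, W, 1, 1, 0.8f, red);   /* far surface drawn first */
        plot(zbuf, fb, W, 1, 1, 0.3f, blue);  /* nearer surface wins     */
        plot(zbuf, fb, W, 1, 1, 0.9f, red);   /* farther one is rejected */

        printf("pixel (1,1): r=%.0f b=%.0f depth=%.1f\n",
               fb[1 * W + 1].r, fb[1 * W + 1].b, zbuf[1 * W + 1]);
        return 0;
    }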

Re:you don't render pixels independently... (1)

vux984 (928602) | more than 8 years ago | (#14970138)

the end result frame buffer is complete for all pixels at the same time.

Actually, that's pretty much the definition of a parallel process. ;)

The point is the completion of one pixel isn't dependent on the completion of others. The fact that so much of the 'setup' is the same for all pixels just means you can achieve the equivalent of a million independently computed pixels with far less actual effort.

In other words, if they -weren't- so easy to parallelize, these optimizations would be impossible to make.

Re:use as a cpu? (1, Insightful)

pherthyl (445706) | more than 8 years ago | (#14970143)

Building a GPU is trivially easy relative to building a CPU

Easier? In some respects. Trivially easy? Not quite.

In contrast general purpose code has (on average) one branch every 7 cycles.

One branch every 5-7 instructions, not cycles.

Other than that, good comment, I didn't know about the (lack of) branch support in GPUs.

Re:use as a cpu? (1)

dcapel (913969) | more than 8 years ago | (#14969864)

You, like almost everyone else seeing stuff like this, think that the first time.

The reason why GPUs aren't uber compared to CPUs is that CPUs are general purpose, while GPUs focus on a few key areas. This allows them to get really good performance at a few things, but suck at most other things. It's the difference between a Jack-of-all-trades and a specialist.

1 gb! (-1, Offtopic)

Anonymous Coward | more than 8 years ago | (#14969799)

yeah, i got the card and it's so darn fast i got the first post!

Time (-1, Troll)

Anonymous Coward | more than 8 years ago | (#14969800)

Time to pwn teh m0ds7 & devs muahahaha

Seven times you say? (0)

Anonymous Coward | more than 8 years ago | (#14969803)

...result in floating point processing power that exceeds a 3GHz Pentium processor by a staggering seven times, claims the company.

P4's double ALUs result in integer processing power that exceeds a FireGL V7350 graphics card by a staggering seven times, claims the company.

hackers quote (1, Funny)

Kwiik (655591) | more than 8 years ago | (#14969808)

"it has a p6 chip. Tripple the speed of the pentium"

Re:hackers quote (0)

Anonymous Coward | more than 8 years ago | (#14969928)

RISC chip.

Re:hackers quote (1)

j79 (875929) | more than 8 years ago | (#14969970)

RISC is good

Not bad... (1)

spartacus_prime (861925) | more than 8 years ago | (#14969814)

What market is it directed towards? I can hardly imagine that typical gamers or animators would have the funds for such a card.

Re:Not bad... (1)

ksheff (2406) | more than 8 years ago | (#14969866)

High detail flight simulators or virtual reality apps.

I would have loved having a graphics subsystem with 1GB for texture maps when I was doing GL work 12 years ago.

Re:Not bad... (3, Interesting)

KajiCo (463552) | more than 8 years ago | (#14969911)

3D Artists, Game Developers, Scientists who do three dimensional simulations, Weather forecasting models, Engineers who do real time simulations.

They're not really for gaming as much as they are for developing stuff.

Re:Not bad... (1)

AgNO3 (878843) | more than 8 years ago | (#14970051)

Well, it is a FireGL card, and hell yes animators would love this card. The only problem with this card is that it is not an NVIDIA card, so right now animators won't want it.

My wife gave me two thumbs up... (5, Funny)

Anonymous Coward | more than 8 years ago | (#14969815)

...when I told her that I would buy an ATI card that would allow us to decrease the gas bill for our furnace next winter. Guys, you just have to give your better half a good argument and this graphics card is installed in your computer in no time. Just don't mention that you need to buy a better air conditioner for the summer... she'll discover that one. ;)

Re:My wife gave me two thumbs up... (1, Funny)

Anonymous Coward | more than 8 years ago | (#14969873)

Guys, you just have to give your better half a good argument and this graphics card is installed in your computer

I really don't think most slashdotters' cats are going to care what kind of graphics cards they have as long as the computer keeps the cat warm.

Re:My wife gave me two thumbs up... (1)

eclectro (227083) | more than 8 years ago | (#14970142)

Just don't mention that you need to buy a better air conditioner for the summer... she'll discover that one. ;)

And when she does you're either going to ebay that card or be very lonely at night.

Hot hot hot! (5, Funny)

Anthony Boyd (242971) | more than 8 years ago | (#14969817)

It's called the FireGL because it puts out heat at levels equivalent to a large fire. -T

so... (4, Funny)

Anubis350 (772791) | more than 8 years ago | (#14970007)

Are you saying it renders flames *very* realistically? :-P

Re:so... (1)

Baloo Ursidae (29355) | more than 8 years ago | (#14970087)

Heheh, Sterno [wikipedia.org] GL...

Awesome! (5, Funny)

Rank_Tyro (721935) | more than 8 years ago | (#14969821)

Now I can upgrade to Windows "Vista."

Re:Awesome! (1)

spartacus_prime (861925) | more than 8 years ago | (#14969847)

Don't waste your time, let Windows Vista crash a lesser graphics card.

Re:Awesome! (4, Informative)

Jozer99 (693146) | more than 8 years ago | (#14969930)

Appreciate the joke, but for folks out there who think he is serious, Microsoft has said that the Intel GMA 900 and ATI Radeon X200 are the minimum graphics cards for using the "new" DirectX GUI. Vista will work on computers with lesser graphics systems, but in a compatibility mode similar to Windows XP's GUI.

Re:Awesome! (0)

Anonymous Coward | more than 8 years ago | (#14970091)

... in a year's time.

Re:Awesome! (1)

weicco (645927) | more than 8 years ago | (#14970103)

But you still can't get more than 40 fps in Counter Strike Source.

So? (4, Insightful)

Tebriel (192168) | more than 8 years ago | (#14969825)

Other than high-end graphics work, what the hell will this mean? Are you seriously saying that we will be seeing games needing that much video memory anytime soon? Hell, they have a hard enough time getting people to buy cards with 256 MB of RAM.

Re:So? (5, Insightful)

Bitter and Cynical (868116) | more than 8 years ago | (#14969897)

Other than high-end graphics work, what the hell will this mean?
Nothing. These cards are not meant for gaming; in fact, if you tried to use one for gaming you'd be very upset. The FireGL line is a workstation card meant for things like CAD or render farms that are very memory intensive and require a high level of precision. It's not meant for delivering a high frame rate, and no gamer would stick this card in his machine.

Re:So? (1)

atrus (73476) | more than 8 years ago | (#14970100)

The GPU is actually identical except for a few features masked off on the Radeon cards which are available on the FireGL. The framerate would be nearly identical. It would be a monumental waste of money.

256 MB is small (4, Interesting)

emarkp (67813) | more than 8 years ago | (#14970074)

Try rendering medical image data as a 3D texture (well three textures actually, one for each primary image). With 300 images, 256KB per image, x3 textures, that comes out to 225MB just for the textures. I deal with datasets like these routinely, and more video memory is a welcome development.
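Spelling out the arithmetic in that comment:

    300 slices x 256 KB/slice = 76,800 KB = 75 MB per 3D texture
    3 textures x 75 MB        = 225 MB of texture data alone

which already crowds a 256MB or even 512MB card once the framebuffer and everything else is counted.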

Re:256 MB is small (-1, Flamebait)

Anonymous Coward | more than 8 years ago | (#14970110)

He's talking about gaming, not medical rendering uses. Please read and understand posts before responding.

follow Nvidia into Physics? (5, Interesting)

arjovenzia (794122) | more than 8 years ago | (#14969826)

With all that beef behind them, I sure hope they will follow Nvidia (I actually have no doubt that they will) in offloading physics to the GPU. http://www.rojakpot.com/showarticle.aspx?artno=303&pgno=0 [rojakpot.com]

It would be nice not having to purchase a top-notch CPU, GPU, and PPU (Physics Processing Unit) in the future, rolling the PPU and GPU together.

Re:follow Nvidia into Physics? (1)

thedletterman (926787) | more than 8 years ago | (#14969956)

I'm definitely a far bigger fan of Havok [havok.com] 's SLI physics for Nvidia than a 1GB video card. I wish ATI would spend half as much effort on not making CrossFire such a piece of crap.

what... teh.....fuk (1)

Spy Handler (822350) | more than 8 years ago | (#14969827)

This graphics card has MORE RAM than my entire computer, and faster too, and a faster processor, and probably a bigger heat sink.

Is it just me or are graphics cards getting ridiculously insane? I know I don't need this thing cus the last game I bought for my comp is 2002's Star Wars: GB. Maybe I'm just a lamer and l00ser...

Re:what... teh.....fuk (5, Informative)

TheRaven64 (641858) | more than 8 years ago | (#14969861)

This is a workstation card, not a games card. The people buying this are likely to be either CAD/CAM people with models that are over 512MB (the workstation it plugs into will probably have a minimum of 8GB of RAM), or researchers doing GPGPU things. To people in the second category, it's not a graphics card, it's a very fast vector co-processor (think SSE/AltiVec, only a lot more so).

Re:what... teh.....fuk (2, Informative)

temojen (678985) | more than 8 years ago | (#14969985)

Stream processor, not vector processor. The programming models are different.

Width of the floats? (2, Interesting)

mosel-saar-ruwer (732341) | more than 8 years ago | (#14970014)


Signify: full 128-bit precision

TheRaven64: or researchers doing GPGPU things. To people in the second category, it's not a graphics card, it's a very fast vector co-processor (think SSE/AltiVec, only a lot more so)

Traditionally, ATi floating point numbers were only 24 bits wide [i.e. only "three-quarters" of single precision, which is 32 bits].

nVidia, IBM Sony Cell, and Altivec support only 32-bit floats.

MMX supported no floats whatsoever. SSE supported 32-bit floats. SSE2 & SSE3 support 64-bit floats. x86 supports 80-bit floats.

So what is this 128-bit stuff all about?

I don't suppose there's a chance in hell that these could be quad-precision floats, could they?

Re:Width of the floats? (1)

temojen (678985) | more than 8 years ago | (#14970056)

Most likely 4D single-precision vectors, or 2D double-precision vectors if they want to market it to researchers who had been cooking their own stream processors from FPGAs.
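In other words, the "128-bit" figure most plausibly describes the width of a whole 4-component vector register rather than a single float. A sketch of that reading (an interpretation of the marketing term, not something the article confirms):

    #include <assert.h>

    /* A pixel or vertex is processed as a 4-component vector of 32-bit
     * floats, e.g. (R, G, B, A) or (X, Y, Z, W). */
    typedef struct { float x, y, z, w; } Vec4;

    int main(void)
    {
        /* 4 x 32 bits = 128 bits per vector -- "full 128-bit precision",
         * not quad-precision scalars. */
        assert(sizeof(Vec4) * 8 == 128);
        return 0;
    }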

Re:what... teh.....fuk (2, Interesting)

dbIII (701233) | more than 8 years ago | (#14970044)

the workstation it plugs into will probably have a minimum of 8GB of RAM
Will it really need that much system memory? The first SGI workstation I saw had twice as much video memory as system memory.

Re:what... teh.....fuk (1)

Telvin_3d (855514) | more than 8 years ago | (#14970088)

Oh, yeah. 8 GB is common.
Part of it is "why not?" Even if you could get away with 4 or 6 GB, if you are building the type of workstation that would use this chip, bringing the RAM up to 8GB is a drop in the bucket as far as money goes. Even ECC 1 and 2 GB sticks of RAM are cheap compared to what this card will run.

Re:what... teh.....fuk (1)

c_forq (924234) | more than 8 years ago | (#14969867)

This isn't a gaming chip, this is part of their pro line. These are for CAD and rendering purposes.

Re:what... teh.....fuk (1)

pizpot (622748) | more than 8 years ago | (#14970102)

Uhhh, I did CAD for 10 years at an office designing buses. 30,000 parts per bus. We didn't put textures on the parts, and quite frankly that card is extreme overkill for CAD IMHO. Hell, any game card did fine as long as you had lots of RAM. It was a Unigraphics shop.

Obligatory... (0)

Landshark17 (807664) | more than 8 years ago | (#14969831)

Yes, but will it run Duke Nukem Forever?

Re:Obligatory... (1)

aurb (674003) | more than 8 years ago | (#14969860)

Not only will it run Duke Nukem Forever, it will run Duke Nukem Forever on Windows Vista... Or maybe not...

Re:Obligatory... (2, Funny)

atari2600 (545988) | more than 8 years ago | (#14969921)

You mean, it will run Duke Nukem Forever, forever

Re:Obligatory... (0)

Anonymous Coward | more than 8 years ago | (#14969875)

Hey you idiot, not even the guys at 3D Realms run Duke Nukem Forever!!

No (1)

temojen (678985) | more than 8 years ago | (#14970073)

But it will run Duke Nukem really smoothly until enough silicon decays into phosphorus to make it stop running, or the owner gets tired of Duke Nukem, whichever comes first.

Useful or hype? (1)

Digital Pizza (855175) | more than 8 years ago | (#14969837)

What I wonder is, can current software and hardware make use of those massive specs, and is the memory bandwidth high enough for the GPU to benefit from a gig of video RAM, or is it all just a gimmick?

Is it worth someone's money to buy such a card?

Re:Useful or hype? (0)

damsa (840364) | more than 8 years ago | (#14969886)

insert joke about Phantom console, PS3 and Duke Nukem here.

Re:Useful or hype? (1)

Arramol (894707) | more than 8 years ago | (#14969949)

Two very valid questions. I remember when the first 512mb cards came out, the benchmarks showed minimal performance gains. Occasionally, they even performed slightly below their 256mb equivalents (though never significantly). Since we're only just beginning to see 512mb cards come of age, this could be history repeating itself. That said, this is a FireGL, not a Radeon, so it's not really designed with gamers in mind. It is possible then that it might actually be worth it provided you're using it for its intended purpose.

Re:Useful or hype? (0)

Anonymous Coward | more than 8 years ago | (#14970001)

One application of this might be volume rendering. Current medical CAT angiography scans have a resolution of >= 512x512x1024 with 2 bytes per voxel -- too much for a 512MB card considering that you still need some other data on there (lookup tables, etc.).

If you don't have enough GPU memory your only options are a CPU-based approach (slow), custom hardware like the VolumePro board (bloody expensive), or bricking. Bricking means that you divide the volume into sub-cubes and upload & render each of them separately for every frame, which essentially kills performance.

But increasingly powerful cards like this one might make interactive volume rendering possible using off-the-shelf GPUs. Pretty neat!
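For scale, using the numbers in that comment:

    512 x 512 x 1024 voxels x 2 bytes/voxel = 536,870,912 bytes = 512 MB

so a single scan already fills a 512MB card before lookup tables and the framebuffer are counted, but leaves plenty of headroom on a 1GB card.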

$ 2000 ! yikes.. give me two (0)

Anonymous Coward | more than 8 years ago | (#14969838)

anyway... this will be half the price by the time Vista ships.

call me again in year 2007

Re:$ 2000 ! yikes.. give me two (1)

Penguinoflight (517245) | more than 8 years ago | (#14970022)

It's a high-end workstation card; the price on these things drops only about 10% for the first 5 years simply because they aren't mass produced, and they aren't mass marketed.

Too bad its still an ATI... (3, Informative)

Jackie_Chan_Fan (730745) | more than 8 years ago | (#14969844)

ATI's OpenGL drivers are flaky on their non-FireGL line of cards. Some suspect that's by design.

Graphics card makers should get with the program and stop releasing FireGLs and Quadros. Just release really kick-ass 3D accelerators for all.

That way we can all have full OpenGL support and not the lame OpenGL game drivers by ATI. Nvidia's gaming card OpenGL drivers are better than ATI's.

Re:Too bad its still an ATI... (1)

Kagenin (19124) | more than 8 years ago | (#14969964)

These cards tend to be for game and other 3D developers, not gamers themselves. Developers need hardware that's about a year ahead of what consumers are currently using. These also tend to be a little more optimized for windowed 3D apps (CAD/CAM apps, Maya, etc...) than for full-screen apps (games). Consumers don't need that much punch, usually (until the game is released, that is).

I've been using an ATI AIW 9800 for some time. I don't know what you're talking about regarding OpenGL implementation. I'd been using NVIDIA boards up until my AIW (my last was a GF2MX, cheap but punchy, but I'd been using NVIDIA chips since the Riva128), and ATI's implementations aren't as bad as you might think after reading your post.

If you wanna cite something, that might help your argument...

Either way, this card isn't for you or me. But if you're making Maya movies or something, it might be up your alley.

Re:Too bad its still an ATI... (1)

Baloo Ursidae (29355) | more than 8 years ago | (#14970127)

That way we can all have full opengl support and not the lame opengl game drivers by ATI. Nvidia's gaming card opengl drivers are better than ATIs

Not by enough to matter, and will probably never see the developers it needs to work right because nVidia's too much of SGI's bitch to do anything about the situation so they can open-source their drivers...

Great, but... (0)

Anonymous Coward | more than 8 years ago | (#14969852)

is there an OpenBSD driver?

ATI/nVidia make gen purpose chips ? (1)

cyberfunk2 (656339) | more than 8 years ago | (#14969855)

If these chips are so powerful, and they do seem to be somewhat general purpose (at least judging by people making things like pi calculators and other small examples utilizing the graphics hardware), why isn't Intel/AMD using these same techniques with their main chips?

Re:ATI/nVidia make gen purpose chips ? (1)

Helios1182 (629010) | more than 8 years ago | (#14969868)

They are; it is just that things occur on different scales. Graphics processing is hugely parallel; most other code isn't, unfortunately. There are fundamental differences that must be dealt with, and a general purpose CPU is much more difficult to design than a dedicated GPU.

Re:ATI/nVidia make gen purpose chips ? (1)

drivekiller (926247) | more than 8 years ago | (#14969988)

Looking forward to some really fast GPU-based software for breaking encryption or brute-forcing passwords, then...

ATI/nVidia make special purpose chips ? (0)

Anonymous Coward | more than 8 years ago | (#14969887)

"why isnt intel/amd using these same techniques with their main chips ?"

MMX, SSE, AltiVec. Of course one could ask why SATA, sound, or any other specialized task isn't integrated into main processors either. The reason is that the GP in GPU doesn't stand for general purpose. Despite the work on GPGPU, the GPU's domain is narrower than what a CPU addresses.

ATI Sucks for driver support with Linux (1, Informative)

Vskye (9079) | more than 8 years ago | (#14969865)

In a nutshell, see the subject.
I really don't give a flying *uck if any company, be it ATI or Nvidia, comes out with the latest and greatest video card if it does not have proper driver support! Anyone who's run Linux for a while knows the drill.

Re:ATI Sucks for driver support with Linux (2, Informative)

kitejumping (953022) | more than 8 years ago | (#14969871)

That's why I buy Nvidia... if the performance is the same, may as well have driver support.

ATI's Linux Zealotry Support Sucks (0)

Anonymous Coward | more than 8 years ago | (#14969948)

"I really don't give a flying *uck if any company, be it ATI or Nvidia comes out with the latest and greatest video card if it does not have proper driver support! Anyone who's run linux for awhile knows the drill."

Gee. Zealots don't give a flying fuck about non-zealots. Film at eleven. The people who will be using this card aren't going to be using Linux, so your don't-care attitude is rather irrelevant. And in the domains where Linux is used, e.g. clusters, video cards aren't being used. So once again your don't-care attitude is irrelevant.

ATI works great for me with Linux (2, Interesting)

deek (22697) | more than 8 years ago | (#14970006)

Flogging generic statements like "ATI sucks for Linux" is not very accurate. A better way of putting it is "ATI sucks for some cards under Linux".

I can certainly say that my laptop, with its ATI Radeon Xpress 200M chip, works wonderfully under Linux. Yes, I'm talking about their binary driver distribution. Using the latest version of their drivers. I'm also using the Xorg 6.9 xserver. It's fully 3D accelerated, as shown in the following command:

$ glxinfo | grep OpenGL
OpenGL vendor string: ATI Technologies Inc.
OpenGL renderer string: RADEON XPRESS 200M Series SW TCL Generic
OpenGL version string: 2.0.5695 (8.23.7)

I'm aware that the binary driver doesn't work with some ATI cards, especially some of the top range ones. But for what I use, it's brilliant. The installer is a little easier than the Nvidia one too. Thanks ATI, you've done a great job, from my perspective.

Re:ATI works great for me with Linux (2, Informative)

yellowcord (607995) | more than 8 years ago | (#14970085)

Try turning on the whiz-bang hardware accelerated effects in KDE. You get a nice garbage screen if you turn on "Use translucency/shadows" in Window Behavior under Desktop in the Control Center. Apparently it works great on the FOSS drivers for the Radeon 8500 and below, but I get a screen full of garbage on my 9800. Probably not a big deal for most people, but it's very annoying to me.

Also frustrating is the lack of support for games played through Cedega. I just installed the Cedega time demo tonight and Guild Wars almost but not quite works. I have characters missing heads and the frame rate is about 2 frames per second. I can get decent frame rates with a different setting, but it crashes if I go to another district. This may be fixable, but so far my next computer is going to have an nVidia GPU.

Re:ATI Sucks for driver support with Linux (1)

barefootgenius (926803) | more than 8 years ago | (#14970059)

It does seem that they cheat somewhere, doesn't it? Tonight I'm putting my 9600 Pro into a friend's computer because my old GeForce 4 works just as well. It doesn't get the framerates that the Radeon does, but it's much smoother and the drivers aren't a bitch to install.


On the other hand, at least ATI does Linux drivers.

O.O~! (0, Redundant)

KajiCo (463552) | more than 8 years ago | (#14969898)

...

If only I didn't have to sell my car to buy one :/

Whoa. (0, Redundant)

RoffleTheWaffle (916980) | more than 8 years ago | (#14969910)

Okay, if this thing is seven times more powerful than a three-gigahertz Pentium processor, why the hell aren't these guys making CPUs alongside their GPUs? Seriously, GPUs have gotten to the point at which they are just as powerful as, if not more powerful than, standard CPUs, and with quantities of RAM to match a whole PC. The news of this new and extremely powerful GPU - should it stand up to the hype - alongside news that NVidia has developed a physics coprocessing technique using SLI leads me to believe that GPUs may soon no longer be just GPUs, but complete co-computing units to which graphics, physics, and other demanding tasks could be offloaded. That might be interesting.

Re:Whoa. (1)

atari2600 (545988) | more than 8 years ago | (#14969938)

Here's an analogy:

Company A: Company A makes hand held projectile weapons (insert your favorite weapon here). Their latest and greatest hand-held, one man operable compact weapon is twice as powerful as an Anti-aircraft gun (can punch a 2x bigger hole in a cuboid of lead).

Company B: This company makes anti-aircraft guns that, well, have one purpose.



Kid who knows nothing about the Instruction set and complexity of the process: Why THE HELL doesn't Company A make Anti-Aircraft guns since they make GUNS THAT ARE TWICE AS POWERFUL (when it comes to shooting through lead).

Think about it. (Also read the post near the top of this page/post).

Re:Whoa. (3, Informative)

Jozer99 (693146) | more than 8 years ago | (#14969941)

Explained in detail above. Suffice it to say that CPUs and GPUs are radically different. With GPUs, ATI can throw out old architectures and create new ones whenever they want (quite often). Since the hardware is accessed by a driver, the user isn't limited in what programs they can use. With CPUs, everyone is stuck with x86, which was invented in the late 1970s. You can't break compatibility with x86. GPUs do mostly simple floating point calculations. Therefore, they are basically massively parallel FPUs. If they need to do a non-floating point calculation, they are quite slow. CPUs can do floating point calculations, but also many other types of calculations, and are about equally good at everything. For the sake of heat, power consumption, size, and cost, the FPU on a CPU is not nearly as large as a GPU. If each processing unit on a CPU was the size/power of a specialized processor (GPU, etc.), the chip would be gigantic, and so would be hard to make, expensive to buy, consume massive amounts of power, and emit unimaginable heat.

New for 2010! (1)

FlyByPC (841016) | more than 8 years ago | (#14969918)

A fire-breathing, liquid-helium-cooled 256-core graphics engine, complete with 2KW independent power supply, temperature throttling, and VR interfaces.

(Oh, yeah -- and we think there may be a CPU in there somewhere, too.)

damn. (0, Redundant)

popeguilty (961923) | more than 8 years ago | (#14969945)

Now I have to change my pants.

"workstation cards" what r they 4? (1)

2ms (232331) | more than 8 years ago | (#14969955)

I use Pro/E at my job right now, but have only ever used it on this 4 year old Dell "workstation" with some kind of 4 y/o Fire card in it. But though this equipment is old, I have never for a second felt like anything was slow at all while using Pro/E. So what are these insane cards for? I'm sure they are for something; I just would genuinely like to know what it is. I would have thought they were for CAD, but now I see other people running Pro/E just fine on laptops with integrated Intel graphics.

Re:"workstation cards" what r they 4? (3, Interesting)

Mithrandir (3459) | more than 8 years ago | (#14970037)

I work a lot with the visualisation end of the market and recently have been working with NASA on the CEV project(s). Some models that we deal with are gigabytes in size just for the geometry of a single subassembly. This card would make viewing some of these things far easier, as you can preprocess and schlepp almost all the geometry to the video card as a VBO and never have to pass it over the bus again. Makes for tremendous performance gains.
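A minimal sketch of that "upload once, draw many times" pattern with OpenGL buffer objects (generic GL 1.5-style calls; on 2006-era drivers the same entry points exist with an ARB suffix, and the data here is a placeholder):

    #define GL_GLEXT_PROTOTYPES
    #include <GL/gl.h>
    #include <GL/glext.h>
    #include <stddef.h>

    /* Cross the bus once: copy the static geometry into video memory. */
    GLuint upload_geometry(const float *vertices, size_t n_floats)
    {
        GLuint vbo;
        glGenBuffers(1, &vbo);
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glBufferData(GL_ARRAY_BUFFER, n_floats * sizeof(float),
                     vertices, GL_STATIC_DRAW);
        return vbo;
    }

    /* Every frame after that only issues draw calls; the vertex data
     * itself never travels over the bus again. */
    void draw_frame(GLuint vbo, GLsizei n_vertices)
    {
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glEnableClientState(GL_VERTEX_ARRAY);
        glVertexPointer(3, GL_FLOAT, 0, (const void *)0);
        glDrawArrays(GL_TRIANGLES, 0, n_vertices);
        glDisableClientState(GL_VERTEX_ARRAY);
    }

The more on-card memory there is, the more of those gigabyte-scale subassemblies can sit resident at once, which is where a 1GB card earns its price.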

Re:"workstation cards" what r they 4? (3, Interesting)

LookoutforChris (957883) | more than 8 years ago | (#14970122)

As a fellow Pro/ENGINEER user this is not my experience. What version are you using and how big are your models? The latest version is a hog (as always). I can't imagine using it on an old Dell with a FireGL and doing anything very complicated. I have to admit I'm not a fan of ATI cards; their OpenGL support seems to be very flaky. But I like the larger memory on these new cards and the price is good. Price-wise this card would seem to compare favorably to a top model WildCat Realizm or a top model nVidia Quadro.

innovation (-1, Offtopic)

Anonymous Coward | more than 8 years ago | (#14969957)

In software, the area with the most innovations and cutting edge technologies is always that of gaming and entertainment. The same goes for hardware.

Now, imagine if the R&D money went to medical instead. . .

Saweet...GIS time (1)

The GIS Guy (962775) | more than 8 years ago | (#14969975)

Now to light a fire under the IT dept's rear to get me one of these.

Can i reallocate that memory as system memory? (2, Interesting)

mOOzilla (962027) | more than 8 years ago | (#14970009)

Can I reallocate that memory as system memory?

Yes, but... (-1, Redundant)

SilentOneNCW (943611) | more than 8 years ago | (#14970015)

Will it be able to handle Duke Nukem Forever?

Ars Technica discussion on why so much GPU memory (1)

Brett Johnson (649584) | more than 8 years ago | (#14970019)

Last year, Ars Technica talked about how newer OS's are leveraging fast GPUs for advanced graphics. The main problem is the bottleneck between system memory and GPU/VRAM. One solution is to move the bottleneck to the other side of the backing store.

http://arstechnica.com/reviews/os/macosx-10.4.ars/13 [arstechnica.com]
http://arstechnica.com/reviews/os/macosx-10.4.ars/14 [arstechnica.com]
http://arstechnica.com/reviews/os/macosx-10.4.ars/15 [arstechnica.com]

now if only they knew how to make drivers (2, Insightful)

xamomike (831092) | more than 8 years ago | (#14970047)

Sounds great and all, but have they gotten around to paying their own programmers to make drivers that actually work, and install off the CD the card comes with, instead of outsourcing it to a few guys in their basement?

Seriously, I've owned 6 different ATI cards of differing lines this year, and only 2 of them installed properly with the drivers that came on the CD. That just ain't right.

Yes but (2, Funny)

Anonymous Coward | more than 8 years ago | (#14970048)

Does it get you banned in World of Warcraft?

Not for gaming, for graphics workstations!! (3, Informative)

Anonymous Coward | more than 8 years ago | (#14970124)

Ok, I cannot believe the absurd number of posts I am seeing from lamers who think this thing is for video games. Hello people! Both ATI and NVidia have had separate high-end workstation lines for years now! This is nothing new. Where have you people been?

This card is for people who need serious rendering of highly detailed scenes and 3D objects, not serious frame rates for games. It's for applications where image quality, complexity, and accuracy are much more important than frame rate. The GPUs in these high end workstation cards are geared in a totally different manner and actually suck for video games! These are great for CAD/CAM, medical imaging (like from CAT and EBT scanners), chemical modeling, and lots of other hard-core scientific and 3D development type stuff.