
NVIDIA Makes First 4GB Graphics Card

ScuttleMonkey posted more than 5 years ago | from the making-everything-else-cheaper dept.


Frogger writes to tell us NVIDIA has released what they are calling the most powerful graphics card in history. With 4GB of graphics memory and 240 CUDA-programmable parallel cores, this monster sure packs a punch, although, with a $3,500 price tag, it certainly should. Big spenders can rejoice at a shiny new toy, and the rest of us can be happy with the inevitable price drop on the more reasonable models.


Power != memory (1, Interesting)

unity100 (970058) | more than 5 years ago | (#25704075)

Excuse me, but this is total bullshit. Oldest trick in the book: if you are behind in technology, pop out a card with huge RAM and try to get some sales.

Let's face it: NVIDIA has fallen behind ATI in the chip race. You can stack up as many 4870s as you like to match the power of any monolithic NVIDIA card, and they will always kick the living daylights out of that NVIDIA card in cost per unit of processing power.

Re:Power != memory (1)

jgtg32a (1173373) | more than 5 years ago | (#25704181)

Does ATI have some sort of CUDA functionality?
This isn't a gaming card.

Re:Power != memory (5, Informative)

Neon Spiral Injector (21234) | more than 5 years ago | (#25704295)

Yes, AMD's Stream [amd.com] technology. I don't think it is used as much as CUDA in practice.

Re:Power != memory (0)

Anonymous Coward | more than 5 years ago | (#25704979)

It is not used anything like as much. ATI hasn't decided what they want to do. For a while they were touting Brook++. The problem there may have been that one of Nvidia's employees wrote Brook as his PhD thesis :)

Re:Power != memory (2, Interesting)

GNUPublicLicense (1242094) | more than 5 years ago | (#25704811)

Moreover, ATI/AMD specs are open... meaning you can code the hardware directly. That's times more powerful and flexible than CUDA. And there are frameworks in the works to give easy access to low-level GPU interfaces (see the Intel/AMD GEM work in the Mesa project).
Basically, NVIDIA's behavior is generating a lot of hate in the coder community...

Re:Power != memory (4, Interesting)

Grey_14 (570901) | more than 5 years ago | (#25705217)

Coder hate like that brought on by the shitty, bug-filled drivers ATI has a long history of?

I think ATI/AMD is on the right path, but they have a long history of being on the wrong path, while NVIDIA has always been closer to the middle (not completely right, but not too badly wrong). It'll take some time before I jump on the ATI bandwagon as completely as you obviously have.

Re:Power != memory (1)

KasperMeerts (1305097) | more than 5 years ago | (#25705285)

Man, I would love it if you were right, but face it: nVidia still controls the largest part of the market. The reason nVidia isn't opening up their specs is that they don't have to. That's the big problem with a monopoly: you don't have to give a shit. And that's exactly the same thing we've been getting from Microsoft.
However, there is hope. The Vista failure is biting Microsoft in the ass, so hopefully something similar will happen to nVidia and give us some OSS drivers. If everything else fails, there is still Nouveau.

Re:Power != memory (5, Funny)

LearnToSpell (694184) | more than 5 years ago | (#25705411)

That's times more powerful and flexible than CUDA.

I like how statistics are so meaningless we're not even putting the numbers in anymore.

Re:Power != memory (1)

Gewalt (1200451) | more than 5 years ago | (#25705895)

I ACCIDENTALLY THE WHOLE COMMENT!!!

*insert lowercase letters here to defeat lameness filter that doesn't know how to take a joke.

Re:Power != memory (4, Insightful)

ardor (673957) | more than 5 years ago | (#25705559)

meaning you can code the hardware directly

Guess what CUDA and Stream have been designed for? Yes: for programming the hardware. What you suggest is pure insanity. NEVER EVER touch hardware directly from a userland app. And once you start writing a kernel module, you end up with something like CUDA/Stream anyway.
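
To make that concrete, here is a minimal sketch of what programming the hardware through a CUDA-style toolkit looks like, written against Python's Numba CUDA bindings purely for illustration (the kernel, array sizes, and launch configuration are assumed examples, not anything NVIDIA ships with the card):

    # Minimal CUDA-style kernel sketch (assumes numba and a CUDA-capable GPU).
    import numpy as np
    from numba import cuda

    @cuda.jit
    def saxpy(a, x, y, out):
        i = cuda.grid(1)                # global index of this thread across the grid
        if i < out.shape[0]:            # guard the last, partially filled block
            out[i] = a * x[i] + y[i]

    n = 1 << 20
    x = np.ones(n, dtype=np.float32)
    y = np.arange(n, dtype=np.float32)
    out = np.empty_like(x)

    threads_per_block = 256
    blocks = (n + threads_per_block - 1) // threads_per_block
    saxpy[blocks, threads_per_block](np.float32(2.0), x, y, out)  # transfers handled for you
    print(out[:4])                      # [2. 3. 4. 5.]

The point is that the toolkit owns the driver-level plumbing; the application never pokes GPU registers itself.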

I am a coder, and quite frankly I couldn't care less about nvidia drivers being closed source. They are MUCH better than the ATI ones, especially in the OpenGL department. nvidia whipped up a beta GL 3.0 driver less than a month after the GL3 specs were released. ATI? Nope. New standardized feature X is added to the registry. nvidia adds it pretty quickly; ATI adds it months, even years later. nvidia drivers are also pretty robust; I can bombard them with faulty OpenGL code, and they remain standing. With ATI's fglrx, even CORRECT code can cause malfunctions.

THESE are the things I care about. Not the license.

Re:Power != memory (-1, Flamebait)

Mister Whirly (964219) | more than 5 years ago | (#25705479)

Yes, ATI came out with its new SaraCUDA card. It comes decked out in a really expensive case (that somebody else paid for), has lipstick, and doesn't take any crap from Nvidia cards. If the device drivers don't handle it correctly, it has been known to throw tantrums. Comes packaged with the game "Big Moose Hunter". CAUTION: adding this card to your computer may cause your peers to have less support for your system.

Re:Power != memory (4, Informative)

rogermcdodger (1113203) | more than 5 years ago | (#25704425)

Or maybe there are companies that need high-end cards with 4GB of RAM. This isn't some trick to get consumers to pay more for a low-end card. This is now Nvidia's highest-end workstation card.

Re:Power != memory (1, Interesting)

Anonymous Coward | more than 5 years ago | (#25704441)

This is not a gaming card; this is a CAD/compute card. What I mean to say is, this card is basically a supercomputer that fits in a slot, and you can put more than one in a machine. The idea being that you can use these GPUs to do more than just graphics. This card's shader performance is horrible and would not play any graphics-intensive game well. It would be a waste of money for a gamer.

Re:Power != memory (2, Informative)

ccool (628215) | more than 5 years ago | (#25705503)

You're absolutely right, but it would be amazing with any CUDA apps right now. Hell, you could probably use it to encode your H.264 movies more than 18x faster!!! See http://www.nvidia.com/object/cuda_home.html [nvidia.com]

Re:Power != memory (5, Informative)

Ephemeriis (315124) | more than 5 years ago | (#25704697)

Excuse me, but this is total bullshit. Oldest trick in the book: if you are behind in technology, pop out a card with huge RAM and try to get some sales.

Let's face it: NVIDIA has fallen behind ATI in the chip race. You can stack up as many 4870s as you like to match the power of any monolithic NVIDIA card, and they will always kick the living daylights out of that NVIDIA card in cost per unit of processing power.

In case the $3,500 price tag didn't tip you off, this isn't a gaming/enthusiast card. This is a Quadro - a professional card for high-end 3D rendering. Stuff like generating film-grade 3D or insane CAD stuff. Actually, due to the design of the card, it'd be pretty horrible at playing games.

This thing is aimed at high-end scientific calculation and professional-grade rendering.

ATI may, or may not, have something comparable. ATI may even have something better. I don't know, I don't follow the GPU industry very closely. But claiming that they're just slapping a bunch of RAM on a card to drum up sales is just plain wrong. Hell, the blurb here on Slashdot even mentions the fact that it has 240 cores.

Re:Power != memory (3, Funny)

doti (966971) | more than 5 years ago | (#25705467)

In case the $3,500 price tag didn't tip you off, this isn't a gaming/enthusiast card. This is a Quadro - a professional card for high-end 3D rendering. Stuff like generating film-grade 3D or insane CAD stuff.

C'mon, we are all grown-ups here. You can say it clearly:

It's for high-detailed 3D virtual porn.

Re:Power != memory (1)

Mister Whirly (964219) | more than 5 years ago | (#25705647)

You don't want to see most of those "actors" in that great of detail.

Re:Power != memory (3, Insightful)

GleeBot (1301227) | more than 5 years ago | (#25705581)

But claiming that they're just slapping a bunch of RAM on a card to drum up sales is just plain wrong. Hell, the blurb here on Slashdot even mentions the fact that it has 240 cores.

Umm, the GeForce GTX 280, a gamer card released last summer, also has 240 "cores" (as Nvidia counts them; actually stream processors).

This workstation card, as you might expect, is essentially the same thing as the consumer card, just tweaked towards the professional market (more RAM, different drivers). It's nothing especially innovative.

Re:Power != memory (2, Insightful)

pak9rabid (1011935) | more than 5 years ago | (#25705221)

Excuse me, but this is total bullshit. Oldest trick in the book: if you are behind in technology, pop out a card with huge RAM and try to get some sales.

Are you some kind of idiot?

With 4GB of graphics memory and 240 CUDA-programmable parallel cores

That alone should be a plain indicator that this ISN'T a consumer-level card, nor is it even remotely close to being targeted as such by nvidia.

Re:Power != memory (1)

Hatta (162192) | more than 5 years ago | (#25705309)

Let's face it: NVIDIA has fallen behind ATI in the chip race. You can stack up as many 4870s as you like to match the power of any monolithic NVIDIA card

But can you get decent drivers for it?

Re:Power != memory (4, Interesting)

mikael (484) | more than 5 years ago | (#25705415)

There is no upper limit on the amount of memory required for tasks like volume visualisation, where you have a nice big 3D cube of data in 16-bit format. A cube 1024 voxels in each dimension with a single channel of 16-bit data (2 bytes) is going to be 2 Gigabytes. You will need at least two such cubes to do any sort of image processing work.
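
As a quick sanity check on that arithmetic, here is a minimal sketch (plain back-of-the-envelope numbers, nothing card-specific):

    # Memory footprint of a single-channel 16-bit volume.
    voxels_per_side = 1024
    bytes_per_voxel = 2                              # one 16-bit channel
    cube_bytes = voxels_per_side ** 3 * bytes_per_voxel
    print(cube_bytes / 2**30, "GiB per cube")        # 2.0 GiB per cube
    print(2 * cube_bytes / 2**30, "GiB for source + destination")  # 4.0 GiB

So two working copies of a 1024^3 volume already fill the 4GB on this card.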

Even a digital movie can be considered to be a cube if you consider time as the 3rd dimension.

Rather than having cards with a fixed amount of VRAM, why can't manufacturers just put a bunch of memory sockets on the card and allow users to add memory when they want?

Re:Power != memory (1)

vadim_t (324782) | more than 5 years ago | (#25705597)

They used to do that. You can find sockets on ancient cards, like an S3 ViRGE.

The problem in my understanding is that current cards push memory as far as it will go, and a socket would impose a limit on it. Besides, it's hard to put a heatsink on the RAM then.

Re:Power != memory (1)

GleeBot (1301227) | more than 5 years ago | (#25705713)

Rather than having cards with a fixed amount of VRAM, why can't manufacturers just put a bunch of memory sockets on the card and allow users to add memory when they want?

They used to, at least on some of the better ones (way back when "Windows accelerators" that drew rectangles faster were hot stuff).

You need to realize, though, that graphics cards are on the absolute bleeding edge of memory technology. It's not electrically feasible to deliver the sort of bandwidth and latency a modern GPU requires (which is literally orders of magnitude faster than what your CPU gets) while at the same time allowing expandability (especially if you want to make it a standard socket).

Video card design takes a systems approach to integrating a GPU with the right RAM and other components. It's one of the main reasons why GPU technology can advance much more quickly than CPUs do. How much benefit are you really going to get when your blazingly fast GDDR3 RAM gets replaced the very next year by the latest hot new GDDR4 technology? (This is at least one of the reasons why VRAM upgrades never really took off.)

And really, the cost of RAM is a pretty minor portion of the total; compare the price differential between otherwise comparable, say, 256 MB and 512 MB cards.
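
For a rough sense of the bandwidth gap being described: peak memory bandwidth is roughly bus width times effective transfer rate. The figures below are assumed, ballpark numbers for a circa-2008 GPU and desktop CPU, not specs from the article:

    # Peak bandwidth (GB/s) ~ (bus width in bits / 8) * transfer rate (GT/s).
    def peak_bw_gbs(bus_bits, transfer_gts):
        return bus_bits / 8 * transfer_gts

    gpu = peak_bw_gbs(512, 2.2)   # ~141 GB/s: 512-bit GDDR3 at ~1.1 GHz (double data rate)
    cpu = peak_bw_gbs(128, 0.8)   # ~12.8 GB/s: dual-channel DDR2-800
    print(f"GPU ~{gpu:.0f} GB/s vs CPU ~{cpu:.0f} GB/s ({gpu / cpu:.0f}x)")

A soldered-down, wide bus is a big part of how that gap is achieved, which is exactly what a socket would give up.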

Re:Power != memory (1)

icedcool (446975) | more than 5 years ago | (#25705543)

Yeah. This card is made for CAD work, not shading or any kind of gaming. The card is more for workstations.

Just what I always wanted! (5, Insightful)

Jaysyn (203771) | more than 5 years ago | (#25704119)

A video card I can't use on XP32 since it can't properly allocate that much VRAM & system RAM at the same time.

Re:Just what I always wanted! (1, Troll)

TheSovereign (1317091) | more than 5 years ago | (#25704301)

You mean you aren't on x64 yet? Pfft, get with it, gramps!

Re:Just what I always wanted! (1)

altloser (1113413) | more than 5 years ago | (#25704437)

Nobody uses XP64.

Re:Just what I always wanted! (1)

TheSovereign (1317091) | more than 5 years ago | (#25704495)

Believe it or not, on my home PC, I DO.

Re:Just what I always wanted! (3, Funny)

smussman (1160103) | more than 5 years ago | (#25704683)

I do as well. And *both* of us can't be nobody.

Re:Just what I always wanted! (0)

Anonymous Coward | more than 5 years ago | (#25705163)

As do I. It is the greatest thing since Windows 2000.

Re:Just what I always wanted! (1)

Poltras (680608) | more than 5 years ago | (#25705385)

I do as well. And *both* of us can't be nobody.

you both ain't somebody either.

Re:Just what I always wanted! (1)

flape (1114919) | more than 5 years ago | (#25705437)

Get real... really good choice for a desktop OS ;) More stable and faster than XP32, since XP64 is based on the Windows Server 2003 core ;)

Re:Just what I always wanted! (1)

lorenzo.boccaccia (1263310) | more than 5 years ago | (#25704741)

I shouldn't feed the troll, but I feel the urge to add some Vista folklore: the first Vista x64 release reserved as much address space as needed to match the RAM on the video card, providing a linear address mapping. Mind you, not RAM, but address space, which is a different concept. Still, since cards didn't handle being mapped to a high address range very well, they were mapped under the 3GB mark. 32-bit legacy applications would then see a reduced address space and could not use any RAM that fell in the range addressed for the graphics card.

Now, throw a 4GB card into the fray and watch the hilarity ensue.
To be fair, this was all resolved in SP1, of course.
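
A rough sketch of the address-space squeeze being described; the aperture and MMIO sizes are assumed, illustrative figures, since the real numbers vary by chipset and driver:

    # Under a 32-bit address map, the GPU aperture and other MMIO sit below 4 GiB,
    # and whatever they occupy is no longer visible as RAM to a 32-bit OS or process.
    GIB = 2**30
    addressable  = 4 * GIB      # total 32-bit address space
    gpu_aperture = 1 * GIB      # assumed window reserved for the card
    other_mmio   = 0.5 * GIB    # assumed: chipset, other PCI devices, firmware
    visible_ram  = addressable - gpu_aperture - other_mmio
    print(f"RAM left visible to a 32-bit view: {visible_ram / GIB:.1f} GiB")   # 2.5 GiB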

Re:Just what I always wanted! (1, Informative)

Anonymous Coward | more than 5 years ago | (#25704421)

Um, the card in question is in the graphics-workstation Quadro line; it's not for gaming, it's for making money doing CGI animation, 4D modeling, etc. It wouldn't have 32-bit drivers for XP, only drivers for 64-bit Microsoft and Linux OSes.

Re:Just what I always wanted! (5, Insightful)

IanCal (1243022) | more than 5 years ago | (#25704467)

If you're doing scientific computing requiring about 4 gigs of ram, and need the processing power of current-gen graphics cards then you should be able to figure out how to migrate from XP32 to 64 bit.

That you are using an old operating system incapable of dealing with this new hardware is not the fault of nVidia.

Re:Just what I always wanted! (1)

saudadelinux (574392) | more than 5 years ago | (#25705785)

That you are using an old operating system incapable of dealing with this new hardware is not the fault of nVidia.

There's [gizmodo.com]

a FUCKLOAD [wikipedia.org]

of problems that are [nvidia.com]... I'll never buy anything with an Nvidia card in it again.

Especially since (1)

Sycraft-fu (314770) | more than 5 years ago | (#25705793)

You can still run all your 32-bit programs. Windows has an extremely good virtualization layer that allows 32-bit software to run in the 64-bit OS with no problems. We've done extensive testing and use of it at work. So even supposing you did need a big card and a 32-bit app, that'd work.

Of course, if you are doing anything that would need 4GB of video RAM, there's a good chance you need a good deal more than 4GB of system RAM. After all, you probably aren't keeping the data only on the card, never to touch main memory. So there's a real good chance that once you add up the OS + background tasks + your app + your data, you need more than 4GB of RAM.

If you are getting a video card with 4GB of RAM, I'd be surprised if you didn't have 8GB or more of system RAM.

Re:Just what I always wanted! (1)

Lord Ender (156273) | more than 5 years ago | (#25704535)

The sort of person still running a 32-bit OS is not from the same set as those who might spend $3k on the latest and greatest hardware. You don't matter to them.

Re:Just what I always wanted! (0)

Anonymous Coward | more than 5 years ago | (#25704587)

You don't understand memory management on Windows XP.

The OS doesn't map video RAM into its own memory space; DirectX hands those features off to the video card to do its own memory management with DirectX objects. The OS pumps DirectX data to the video card driver, which passes it off to the video card without mapping that memory into system memory space.

Re:Just what I always wanted! (0)

Anonymous Coward | more than 5 years ago | (#25704665)

You don't understand PCI addressing or graphics apertures.

Re:Just what I always wanted! (0)

Anonymous Coward | more than 5 years ago | (#25705805)

Yup. Parent is right, and GP is wrong.

Re:Just what I always wanted! (1)

Hal_Porter (817932) | more than 5 years ago | (#25704875)

A video card I can't use on XP32 since it can't properly allocate that much VRAM & system RAM at the same time.

That's not necessarily a problem. VRAM doesn't have to be mapped into the system address space all at once; it could be banked like it was back in the DOS/ISA days. It's not really clear whether they do that, but I did find this:

http://lists.opensuse.org/opensuse-multimedia-de/2006-04/msg00012.html [opensuse.org]

The graphics component of XFree86-DGA is not supported because it
requires a CPU mapping of framebuffer memory. As graphics boards
ship with increasing quantities of video memory, the NVIDIA X
driver has had to switch to a more dynamic memory mapping scheme
that is incompatible with DGA. Furthermore, DGA does not cooperate
with other graphics rendering libraries such as Xlib and OpenGL
because it accesses GPU resources directly.

Which sounds like graphics memory is banked into the CPU address space on an NVidia card. It doesn't make any difference in the common case, because normally the hardware on the card does the drawing, not the CPU. Using a card like this as a dumb framebuffer for the CPU to draw into is kind of a waste.

Re:Just what I always wanted! (1)

petermgreen (876956) | more than 5 years ago | (#25705023)

The days of the graphics card mapping all its memory into PCI address space at once are over, and have been for some time. IIRC modern cards use a movable window of 256MB or so for access to graphics card RAM from the rest of the system.

How did this retarded comment get upmodded? (3, Interesting)

Nimey (114278) | more than 5 years ago | (#25705095)

Really, people. If you're going to buy such an expensive professional card, you're going to go with a professional-grade operating system, which will of course be 64-bit.

Re:Just what I always wanted! (2, Insightful)

Jackie_Chan_Fan (730745) | more than 5 years ago | (#25705297)

32-bit is dead. It should have been dead 4 years ago...

Any serious computer enthusiast or professional running a 32-bit OS on today's hardware should be ashamed. They're holding the industry back.

Re:Just what I always wanted! (1)

PRMan (959735) | more than 5 years ago | (#25705527)

Actually, it's all the applications (such as Adobe Flash) that don't work on 64-bit that are holding the industry back.

It's much easier for them to recompile than it is for us to work without certain software...

Re:Just what I always wanted! (1)

Jaysyn (203771) | more than 5 years ago | (#25705547)

Yeah, it was a joke, I'm not using modern hardware. Hell, I don't even have a computer with a PCIe bus in it other than my laptop.

Re:Just what I always wanted! (1)

pak9rabid (1011935) | more than 5 years ago | (#25705521)

A video card I can't use on XP32 since it can't properly allocate that much VRAM & system RAM at the same time.

A few things wrong with this statement:

  1. The GPU (which is far beyond 32 bits) is accessing the VRAM, not the CPU
  2. Video rendering/CAD powerhouses are the target audience for this card (not consumer-level gamers/enthusiasts), who are probably NOT going to be running it on a 32-bit version of XP

Benchmarks (1)

toxygen01 (901511) | more than 5 years ago | (#25704121)

Any numbers on how it compares to the rest of the cards, apart from the number of cores and the amount of memory?

History repeats... (2, Informative)

B5_geek (638928) | more than 5 years ago | (#25704143)

I am reminded of old 3dfx advertisements from just before they went belly-up.

Re:History repeats... (1)

CaptainPatent (1087643) | more than 5 years ago | (#25705441)

Hey, I loved my Voodoo 5 6000 [wikipedia.org] until the day it was never shipped to me.

what a revolution (5, Funny)

jollyreaper (513215) | more than 5 years ago | (#25704155)

Does this mean we can finally run Crysis now?

Re:what a revolution (5, Funny)

dreamchaser (49529) | more than 5 years ago | (#25704445)

Oh come on, you can run Crysis with half that number of cores and only 2 gig of video RAM. This card is obviously being built because of the impending release of Duke Nukem Forever.

Re:what a revolution (1)

Albanach (527650) | more than 5 years ago | (#25704723)

Nope, but you can run Vista.

Re:what a revolution (2, Funny)

SimonTheSoundMan (1012395) | more than 5 years ago | (#25704935)

Almost.

misread the subject (-1, Offtopic)

Anonymous Coward | more than 5 years ago | (#25704179)

I read that as 4*MB* video card.

  I fucking hate the beginning of work weeks.

Re:misread the subject (4, Funny)

Bromskloss (750445) | more than 5 years ago | (#25704479)

I read that as 4*MB* video card.

I fucking hate the beginning of work weeks.

Working hard, I see.

Any wagers? (1, Troll)

Duradin (1261418) | more than 5 years ago | (#25704183)

Anyone want to put odds on the next Crysis requiring at least two of these?

All things considered, I'm glad I gave up the PC gaming habit. Consoles may not have the newest-super-duper-double-1337-hyper-lens-flare effect, but they do tend to play any game made for the system without being fed new hardware every six months.

Not a card for gamers. (1)

BoredAtWorkWhatElse (936972) | more than 5 years ago | (#25704385)

It's a Quadro [wikipedia.org], not the kind of card you buy to play Crysis on. It's meant for workstations or server farms.

Re:Any wagers? (1)

jgtg32a (1173373) | more than 5 years ago | (#25704397)

Technically Crysis runs on an ATI 9800; personally I run it on an X1950 Pro (medium settings) and it looks better than most games.

Re:Any wagers? (1)

BrennanM3 (1397275) | more than 5 years ago | (#25704455)

You don't have to upgrade your machine with the latest expensive stuff every six months to play games. I'm still running a machine from 2005 and have played Crysis, Fallout 3, etc. without any problems.

Re:Any wagers? (1)

Arboris Clover (916737) | more than 5 years ago | (#25704473)

Unfortunately, bad design in games can still create slowdowns on consoles. Especially with multiplatform titles.

Imagine... (1)

Anonymous Coward | more than 5 years ago | (#25704221)

a beowulf cluster of beowulf clusters of these

Preemptive strike (0, Troll)

MaxwellEdison (1368785) | more than 5 years ago | (#25704235)

Yes. It runs Crysis. Please stop asking. You have mistaken a short-lived snide remark for an actual joke. It is not funny.

Now, if anyone with an actual understanding of the architecture here would like to describe the actual improvement, go ahead. But if you want to play the Crysis card, please crawl into a drainage pipe and die.

Correction... (0)

Anonymous Coward | more than 5 years ago | (#25704333)

That should be 'in history SO FAR'...

In other news, new things better than old.

Coming up later: this graphics card, but cheaper.

cool (1)

circletimessquare (444983) | more than 5 years ago | (#25704355)

I've always wanted to watch WALL-E as it's rendered in real time.

Re:cool (2, Interesting)

loufoque (1400831) | more than 5 years ago | (#25704651)

Not gonna happen, RenderMan is CPU-only.

no it's not (4, Funny)

hcdejong (561314) | more than 5 years ago | (#25704363)

... "the most powerful video card in history", it's "the most powerful videocard yet".

[/pet peeve]

Re:no it's not (1, Insightful)

Anonymous Coward | more than 5 years ago | (#25704505)

Looks at his history books and current events... Yep, most powerful card in history (so far)

Re:no it's not (4, Funny)

Anonymous Coward | more than 5 years ago | (#25704513)

I dunno, those Germans made quite a powerful video card back in the 1940s.

It certainly had more power than those steam-powered video cards the French made in WWI.

Re: history \notin future (0)

Anonymous Coward | more than 5 years ago | (#25704659)

FYI: history does not include the future.

Your argument is invalid.

Re:no it's not (1)

L4t3r4lu5 (1216702) | more than 5 years ago | (#25704941)

Wrong closing tag.
 
You mean [/pedant].

Re:no it's not (1)

lysergic.acid (845423) | more than 5 years ago | (#25705001)

Maybe you should call the Guinness Book of World Records and tell them that all their records are incorrect. Or you could, you know, stop being such a pedant.

Re:no it's not (1)

Swanktastic (109747) | more than 5 years ago | (#25705275)

"the most powerful video card in history", it's "the most powerful videocard yet".

FACT: The Rebel Alliance used significantly more powerful videocards to render the Death Star in Star Wars Episode IV: A New Hope.

FACT: This event occurred a long time ago in a galaxy far far away.

[/pet peeve]

Re:no it's not (0)

Anonymous Coward | more than 5 years ago | (#25705571)

Well history is the summation of previously recorded events, right? So if it's the most powerful yet, then it is by default the most powerful in history.

A GTX280 with VRAM thrown at it... (0)

Anonymous Coward | more than 5 years ago | (#25704435)

Same number of cores as the GTX 280. Sure, it's got 4x the memory (at 10x the cost), but the extra memory is only useful for some applications.

And what about their chip bonding problems? (1, Informative)

Enleth (947766) | more than 5 years ago | (#25704451)

Are we going to shell out $3,500 for a card that will fail [theinquirer.net] after half a year, or did they correct the problem already?

Re:And what about their chip bonding problems? (0)

Anonymous Coward | more than 5 years ago | (#25704943)

I dunno how that's a troll; aren't a lot of the major OEMs (HP, Dell, Apple) experiencing 15%+ failure rates with certain NVIDIA chips?

Modding that post as troll is kind of implying that assessment is false. I'm inclined to think it's true based on the number of these reports, and the sites they are reported on...

Whoever modded this as a troll should be modded as a troll.

Not for home users (2, Informative)

Bieeanda (961632) | more than 5 years ago | (#25704477)

If the price tag didn't tip you off, this is one of Nvidia's Quadro line. They're not enthusiast boards; they're for intensive rendering work: film-grade CG or simulations. The technology may come down to consumer-level hardware eventually, especially if Carmack's supposition that real-time raytracing is the next big step turns out to be right, but for now this is like comparing a webcam to a real-time frame grabber.

Re:Not for home users (1)

Microlith (54737) | more than 5 years ago | (#25704695)

And if it's anything like their Quadro line to date, the difference between it and a standard gaming card is a couple BIOS settings and the driver setup.

you're all confused (5, Insightful)

DragonTHC (208439) | more than 5 years ago | (#25704509)

I don't believe anyone claimed this was a gaming card.

This is a scientific number cruncher. Its use is in visual computer modeling for anything from weather models to physics models.

How about Folding@home? This does it faster than any computer on the block.

All of you kids making jokes about Crysis are missing the point. This might run games, but it's a science processor first.

Re:you're all confused (0)

Anonymous Coward | more than 5 years ago | (#25704917)

You should know by now that even when explained in simple terms, most of Slashdot still won't understand. Just look back at the government CTO threads to see the number of people who STILL think that it's someone who will dictate what technology everyone in America can and can't use. When simpletons choose not to understand something, it's better to leave them be.

Re:you're all confused (0)

Anonymous Coward | more than 5 years ago | (#25705201)

Actually, the Tesla [nvidia.com] is for scientific computing. The Quadro line is for graphics.

Re:you're all confused (1)

644bd346996 (1012333) | more than 5 years ago | (#25705403)

And the difference between a Tesla card and a Quadro card is a DVI port...

Re:you're all confused (2, Informative)

sa1lnr (669048) | more than 5 years ago | (#25705487)

folding@home.

My 3GHz C2D gives me 1920 points every 30/33 hours. My Geforce 8800GT gives me 480 points every 2.5 hours.
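
Worked out to a common rate (straightforward arithmetic on the figures quoted above, taking the midpoint of the 30-33 hour range):

    # Folding@home points per hour from the figures above.
    cpu_pph = 1920 / 31.5       # C2D: 1920 points per ~30-33 hours
    gpu_pph = 480 / 2.5         # 8800GT: 480 points per 2.5 hours
    print(f"CPU ~{cpu_pph:.0f} pts/h, GPU ~{gpu_pph:.0f} pts/h ({gpu_pph / cpu_pph:.1f}x)")
    # CPU ~61 pts/h, GPU ~192 pts/h (3.2x)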

HA! At last! (1)

rarel (697734) | more than 5 years ago | (#25704533)

Sony batteries worldwide now shake with fear at the prospect of meeting a most worthy opponent!

Programming specs? Where are they? (0, Troll)

GNUPublicLicense (1242094) | more than 5 years ago | (#25704559)

NVIDIA is still lagging behind Intel/AMD/VIA... where are the programming specs? Keeping such specs hidden is a hate generator...

That's Awesome. (1, Insightful)

Petersko (564140) | more than 5 years ago | (#25704661)

In two years I'll be able to pick it up for $149.

That's the great thing about video cards. Even a card that's two generations old is a terrific card, and they're fantastically cheap.

Re:That's Awesome. (1)

rogermcdodger (1113203) | more than 5 years ago | (#25704783)

The Quadro FX 5500 is two and a half years old and still goes for $2000+ new so I wouldn't count on it.

Games must be optimized (1)

Starturtle (1148659) | more than 5 years ago | (#25704673)

Doesn't matter how much RAM they pump into these things; the game ultimately has to be optimized to leverage the new memory. If you check most benchmarks on Tom's Hardware, you'll see no significant gains, and potentially a loss, if the game or drivers aren't optimized to take advantage of the new memory.

Re:Games must be optimized (1)

Eddy Luten (1166889) | more than 5 years ago | (#25705097)

Doesn't matter how much RAM they pump into these things; the game ultimately has to be optimized to leverage the new memory. If you check most benchmarks on Tom's Hardware, you'll see no significant gains, and potentially a loss, if the game or drivers aren't optimized to take advantage of the new memory.

That's because most games these days are made for last-generation SM 3.0 hardware and severe memory limitations.

More RAM means a larger framebuffer (higher resolutions) and higher-quality or more textures per frame. It also allows for the caching of shader programs, and so on. All in all, higher clock speeds, more RAM and more processing units are never bad, IMO.

Also, a Quadro is targeted at professional applications (Maya, Max, Cinema4D, scientific work, etc.), not games. But it will probably be only a short while before GeForce cards carry 4GB of RAM on board.

Re:Games must be optimized (1)

_Shad0w_ (127912) | more than 5 years ago | (#25705393)

If someone is playing games with one of those cards in their system, it's because they're killing time in their lunch break before going back to do some seriously complex computational work. That's not a gaming card.

Headlines (1)

Nom du Keyboard (633989) | more than 5 years ago | (#25704849)

It's all about the headlines, that's all.

Old news (3, Informative)

freddy_dreddy (1321567) | more than 5 years ago | (#25705033)

These [nvidia.com] were being sold in the first half of August for $10,500, containing two of these cards. Only three months late.

TFLOPS? (1)

wjh31 (1372867) | more than 5 years ago | (#25705155)

As people have pointed out, lots of RAM isn't really relevant compared to the TFLOPS rating, especially considering its intended use. A quick Google search finds this: http://fxvideocards.com/PNY-Nvidia-Quadro-FX-5800-4GB-PCI-E-VCQFX5800-PCIE-PB-p-16430.html [fxvideocards.com], which seems to suggest it comes in at about 1 TFLOPS. Compared to the ATI HD 4870 X2 2GB, capable of 2.4 TFLOPS at only £350 (~US$550), that makes for apparently quite poor value.
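
Run as FLOPS per dollar, using only the rough figures quoted above:

    # GFLOPS per dollar from the figures in the parent comment.
    quadro = 1000 / 3500        # ~1 TFLOPS at $3,500
    radeon = 2400 / 550         # ~2.4 TFLOPS at ~US$550
    print(f"Quadro FX 5800: ~{quadro:.2f} GFLOPS/$")   # ~0.29
    print(f"HD 4870 X2:     ~{radeon:.2f} GFLOPS/$")   # ~4.36

On raw throughput per dollar, the consumer ATI part comes out roughly 15x ahead, which is the parent's point; the 4GB of RAM and the professional drivers are what the premium buys.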

The ultimate power (0)

Anonymous Coward | more than 5 years ago | (#25705249)

... for cracking WPA keys using GPU-assisted decryption.

It's the concept car of video cards. (1)

Kaptain Kruton (854928) | more than 5 years ago | (#25705491)

Car manufacturers occasionally make concept cars that look neat but have no real purpose other than drawing public attention and possibly testing ideas. They are not practical and do not serve any real purpose for the general public. This card is simply the concept car of video cards. It draws public attention to the company (being on Slashdot definitely draws attention) and it tests the idea of packing that much processing capability onto one card for movie makers and CAD users (and it tests the market for that field). However, for the general public and most companies, it is pointless and has no real value. My $0.02.

Our incremental history (1)

Captain Spam (66120) | more than 5 years ago | (#25705587)

[...] NVIDIA has released what they are calling the most powerful graphics card in history.

...until three months from now when the "GeForce".(++$ver_num) is released. Just like three months ago. And before that, and before that, and...

Does it run Linux? (1)

argent (18001) | more than 5 years ago | (#25705857)

No, really, could it?

Forgetting Nvidia Tesla (0)

Anonymous Coward | more than 5 years ago | (#25705893)

The Nvidia Tesla is a little more powerful: http://www.nvidia.com/object/tesla_c870.html
