
AMD-ATI Ships Radeon 2900 XT With 1GB Memory

Zonk posted about 7 years ago | from the shiny-happy-hardware-holding-hands dept.


MojoKid writes "Prior to AMD-ATI's Radeon HD 2000 series introduction, rumors circulated regarding an ultra-high-clocked ATI R600-based card that featured a large 1GB frame buffer. Some even went so far as to say the GPU would be clocked near 1GHz. When the R600 arrived in the form of the Radeon HD 2900 XT, it was outfitted with 'only' 512MB of frame buffer memory and its GPU and memory clock speeds didn't come close to the numbers in those early rumors. Some of AMD's partners, however, have since decided to introduce R600-based products that do feature 1GB frame buffers, like the Diamond Viper HD 2900 XT 1GB, in both single-card and CrossFire configurations. At 2GHz DDR, the memory on the card is also clocked higher than on AMD's reference designs, but the GPU remains clocked at 742MHz."


But... (5, Funny)

gQuigs (913879) | about 7 years ago | (#20791433)

Does it run (on) Linux yet?

That could be viewed as a serious question (3, Informative)

MSRedfox (1043112) | about 7 years ago | (#20791713)

Hopefully, it will run well under Linux in the near future given AMD's recent actions. As was covered here previously, AMD has already released quite a bit of detail to improve Linux support. http://hardware.slashdot.org/article.pl?sid=07/09/24/053252 [slashdot.org]

Re:But... (5, Funny)

dascritch (808772) | about 7 years ago | (#20791859)

A lit'l more RAM and it can run Windows Vista. By itself.

Re:But... (4, Funny)

hitmark (640295) | about 7 years ago | (#20791983)

but don't expect it to run any office software at the same time...

Re:But... (4, Funny)

UnderDark (869922) | about 7 years ago | (#20793159)

Why would I want a card, whose only purpose in life is to make lots of calculations quickly and accurately, to run MS Office Excel 2007?

Re:But... (0, Offtopic)

postmortem (906676) | about 7 years ago | (#20792713)

None yet e'er drank a honey'd draught Unmixed with cup of bitter gall, And cup of gall for honey equally doth call, That so, the mixture one may easier drink. Beg Ivan-beg of ancestry heroic, Like tawny lion fought against the Turks, On every side, and deep in gory woods: Half of his lands the Turks did take from him, The country delug'd was with blood, These Moslems slew his doughty brother, - Ferocious dragon, Urosh Voivoda! - On the broad fields of Tchèmovo. Ivan his only brother mourn'd. Mourn'd him more, - the Voivoda Urosh; - Than were he mourning both his sons; Mourn'd him more, the Voivoda Urosh, Than he could mourn a whole lost land; Mourn'd him more, the Voivoda Urosh, Than he could mourn the loss of both his eyes; Not dearer they to him than brother Urosh. Full many a time and oft the hero may Excite high heaven unto mighty laughter! Ivan with cup on high vow'd direful vengeance, Drinking the toast with consecrated wine. He lets his white hair fall upon his shoulders. His white beard curling down unto his waist; With his old hands he grasps his sword and lance; Blood-sprinkled both his weapons and his arms, At every step he fells a Turkish foe; The old man bounds as were he nimble youth! O dear my Lord, it sure must be a dream, That on this wise an aged man can leap! Good fortune past returns to him again: At Karoutché upon Tsrmnitsa's boundary, Of whole band of fifteen thousand Turks, Not one of them escap'd alive; Their marble tombs, which men still see, Attest the glory of Prince Tsrnoyevitch: God grant mercy to the soul of Urosh. Wondrous offerings made men to his memory!

also, more vespene gas (5, Funny)

User 956 (568564) | about 7 years ago | (#20791435)

When the R600 arrived in the form of the Radeon HD 2900 XT, it was outfitted with 'only' 512MB of frame buffer memory and its GPU and memory clock speeds didn't come close to the numbers in those early rumors.

Well, that's because when they tried to build the 1GB units, a loud voice was heard saying "We require more minerals", and production was blocked.

Re:also, more vespene gas (0)

Anonymous Coward | about 7 years ago | (#20791483)

mod parent frickin hilarious

Re:also, more vespene gas (1)

weirdcrashingnoises (1151951) | about 7 years ago | (#20791535)

thank YOU parent, for that late-night mt. dew all over my computer screen!

Re:also, more vespene gas (1)

digitalchinky (650880) | about 7 years ago | (#20792105)

Seriously, I don't get the joke.

Re:also, more vespene gas (0, Redundant)

WhatAmIDoingHere (742870) | about 7 years ago | (#20793387)

To get the joke you require additional vespene gas.

Considering 32-bit OSes are still mainstream.. (3, Insightful)

Chas (5144) | about 7 years ago | (#20791439)

These cards are ridiculous. ESPECIALLY in Crossfire installs.

Wow! Now that 4GB of main system memory I installed has been pared back down to a more manageable 2GB!

WHEE!

Until 64-bit becomes more mainstream, cards like this will only become more and more detrimental to the systems they're installed in.

Re:Considering 32-bit OSes are still mainstream.. (1)

Graphic_Content (1047676) | about 7 years ago | (#20791473)

So get the 64-bit Windows XP/Vista OS. Then you won't "lose" your precious "minerals".

Re:Considering 32-bit OSes are still mainstream.. (1)

ichigo 2.0 (900288) | about 7 years ago | (#20791685)

When did the Orz start posting on slashdot?

Re:Considering 32-bit OSes are still mainstream.. (2, Funny)

Spikeles (972972) | about 7 years ago | (#20792099)

"Who are you? You are not Orz! We are Orz! Orz are happy *people energy* from the outside. Can you come together with Orz for *parties*?" - The Orz, Star Control II

Re:Considering 32-bit OSes are still mainstream.. (1)

Chas (5144) | about 7 years ago | (#20791823)

Note: Wasn't talking about for ME. I'm already running 64-bit, as I've chosen hardware that's fairly well supported driver-wise.

Re:Considering 32-bit OSes are still mainstream.. (3, Funny)

Schemat1c (464768) | about 7 years ago | (#20792895)

Note: Wasn't talking about for ME. I'm already running 64-bit, as I've chosen hardware that's fairly well supported driver-wise.

Sorry, but even 64-bit isn't going to help ME run any better.

Re:Considering 32-bit OSes are still mainstream.. (2, Funny)

ichigo 2.0 (900288) | about 7 years ago | (#20792977)

Lies! Windows ME runs exactly as designed!

Too bad it was designed by left-handed monkeys on crack.

Re:Considering 32-bit OSes are still mainstream.. (1)

UnderDark (869922) | about 7 years ago | (#20793175)

I am left-handed, you insensitive clod!

Re:Considering 32-bit OSes are still mainstream.. (2, Funny)

ichigo 2.0 (900288) | about 7 years ago | (#20793655)

In Soviet Russia, left hand is always right!

Re:Considering 32-bit OSes are still mainstream.. (1)

Jeff DeMaagd (2015) | about 7 years ago | (#20791587)

These cards are ridiculous. ESPECIALLY in Crossfire installs.

You mean like doubly so? Like the point that the money spent on the doubling the money is almost completely wasted, the money spent on doubling the graphics chips is almost completely wasted?

Re:Considering 32-bit OSes are still mainstream.. (1)

Jeff DeMaagd (2015) | about 7 years ago | (#20791603)

I mean "doubling the memory". Stupid sleepiness.

Re:Considering 32-bit OSes are still mainstream.. (1)

Chas (5144) | about 7 years ago | (#20791841)

No. I mean that since this memory has to be mapped within a 32-bit address space, you wind up wasting address space that could be better allocated to system memory.

Sure, for anything that remains strictly on the graphics card, it's great. But for anything else (non-graphics game functions like AI, or non-gaming applications), stealing that address space degrades overall system performance.
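
To put rough numbers on that, here's a minimal sketch of the 32-bit address-space arithmetic in Python. The aperture sizes are illustrative assumptions only; real cards often map just part of their VRAM at a time, so the real-world loss is usually smaller:

# Back-of-the-envelope: how MMIO apertures eat a 32-bit address space.
# All aperture sizes below are assumptions for illustration, not measurements.
ADDRESS_SPACE = 4 * 1024**3          # 4 GiB of 32-bit addresses
installed_ram = 4 * 1024**3          # 4 GiB of physical RAM

mmio_apertures = {
    "video card #1": 1 * 1024**3,             # hypothetical: full 1 GiB mapped
    "video card #2 (CrossFire)": 1 * 1024**3,
    "chipset/PCI/APIC and friends": 256 * 1024**2,
}

usable = ADDRESS_SPACE - sum(mmio_apertures.values())  # addresses left for RAM
print(f"RAM a 32-bit OS can reach: {min(installed_ram, usable) / 1024**3:.2f} GiB")
# -> 1.75 GiB of the installed 4 GiB remains addressable in this scenario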

"Framebuffer memory" (0)

Anonymous Coward | about 7 years ago | (#20791641)

XBox 360 has a quarter of a gig of "framebuffer memory". So this is really not a big deal, since you really want a good card for next-gen development.

Re:"Framebuffer memory" (2, Interesting)

Chas (5144) | about 7 years ago | (#20791873)

That's great. No. Really!

I'm not talking about an XBox. I'm talking about a PC.

An XBox has half a gig of memory, half of which is dedicated to graphics at a relatively low-res output.

I'm talking about a gaming PC with 2+GB of RAM in it and how graphics cards with multiple gigs of memory are detrimental to overall system performance (including gaming) in a 32-bit memory map.

Re:"Framebuffer memory" (4, Funny)

AHuxley (892839) | about 7 years ago | (#20791915)

640p is enough for anyone.

Re:"Framebuffer memory" (1, Offtopic)

Chas (5144) | about 7 years ago | (#20791971)

If I hadn't started this sub-thread, I'd have modded that up for funny. ;-)

Re:Considering 32-bit OSes are still mainstream.. (2, Interesting)

Doppler00 (534739) | about 7 years ago | (#20791709)

Yeah, this is why I'm waiting before upgrading my computer. I need to see better 64-bit support in the future. I always plan on doubling everything at next major upgrade: from 2GB -> 4GB, 2 cores -> 4 cores. Until there is operating system and application support, though, I don't think I'm going to go there.

Re:Considering 32-bit OSes are still mainstream.. (1, Insightful)

Colin Smith (2679) | about 7 years ago | (#20791851)

Right, because Linux hasn't been 64-bit and running on SMP systems for years...

Oh wait. You meant Windows. Sorry, I do apologize... Well, you'll have to wait out the traditional three-year Microsoft lag behind the state of the art.

Re:Considering 32-bit OSes are still mainstream.. (0)

Anonymous Coward | about 7 years ago | (#20792041)

Since when does Linux support any high-end graphics card?

Granted, this is not a direct failing of Linux itself, but still. I really don't see how Linux is an option for gamers, Wine/Cedega notwithstanding.

(Posting as AC as I will almost certainly be moderated into oblivion.)

Re:Considering 32-bit OSes are still mainstream.. (1)

level_headed_midwest (888889) | about 7 years ago | (#20793331)

Linux has support for almost every ATi and NVIDIA GPU, including the NV 8800 and ATi 2900 series. You can look at NVIDIA's website and AMD's ATi website for the drivers. The ATi R600 support is new, but Linux has supported high-end graphics cards for some time.

Re:Considering 32-bit OSes are still mainstream.. (1)

Tim C (15259) | about 7 years ago | (#20792181)

So has Windows; the 64-bit build of Win XP was released a couple of years ago.

The problem isn't the OS, it's the hardware support. In my case, USB wi-fi dongles: none of the ones I have kicking about the place have driver support from the manufacturer (thanks for that, Netgear!) and I don't fancy trailing Cat5 cabling through my house again.

And yes, Linux is great, and yes, it was my primary desktop OS for a couple of years, but it simply doesn't support all the software I want to run (and yes, that includes games, and no, I don't fancy hoping that WINE will run something I've just shelled out cold, hard cash for).

Re:Considering 32-bit OSes are still mainstream.. (1)

Chas (5144) | about 7 years ago | (#20791877)

With multi-core support, I don't think that's really going to change. You're not necessarily going to see a huge "performance boost" from massively parallel processing.

However, you'll still have the luxury of running multiple processor intensive apps without bringing the whole system to a standstill.

Re:Considering 32-bit OSes are still mainstream.. (1)

Lorkki (863577) | about 7 years ago | (#20791891)

What are the specific problems with OS and application support you're having? Windows may be an issue if you still need applications with 16-bit components or if you have bad luck with driver support. In Linux there's trouble with closed-source browser plugins, which can be partly alleviated with nspluginwrapper, although Java 6 can still be a pain. Other than that, the support is about as good as it can be expected to get, and I've been running the AMD64 builds of both Ubuntu and XP on my desktop since I first got a machine with support for it in 2005.

Re:Considering 32-bit OSes are still mainstream.. (0)

Anonymous Coward | about 7 years ago | (#20792031)

I always plan on doubling everything at next major upgrade.

Exactly. That's why I'm holding out for 128-bit CPUs for my next upgrade.

Re:Considering 32-bit OSes are still mainstream.. (1, Funny)

Anonymous Coward | about 7 years ago | (#20792521)

I always plan on doubling everything at next major upgrade.

Exactly. That's why I'm holding out for 128-bit CPUs for my next upgrade.

I'm holding out for 65-bit CPUs for my next upgrade.

Re:Considering 32-bit OSes are still mainstream.. (0)

Anonymous Coward | about 7 years ago | (#20794147)

Am I missing something? How does installing a graphics card with 2 GB of RAM mean you can only address 2 GB of system memory? By that logic, wouldn't just the 1 card in a non-Crossfire install mean you only have 3 GB of the 4 available? Is this some kind of Crossfire weirdness?

UEI++ (2, Interesting)

thatskinnyguy (1129515) | about 7 years ago | (#20791455)

Hmmm. This might mod my Vista User Experience Index up to 3.0.

Finally some effects! (4, Funny)

The-Pheon (65392) | about 7 years ago | (#20791477)

With this new hardware, will I be able to run vim with some colors for syntax highlighting? :)

Re:Finally some effects! (1)

chuckymonkey (1059244) | about 7 years ago | (#20791537)

It's not vim you have to worry about, it's emacs.

Re:Finally some effects! (1)

callinyouin (1138469) | about 7 years ago | (#20791637)

I wasn't going to go into the comments for this one due to the topic, but I'm glad I did. For some reason this made me laugh, but not because I use vim... because I don't.

Useless! (5, Insightful)

ynososiduts (1064782) | about 7 years ago | (#20791533)

Unless you are running quad 32" screens at some insane resolution, there is no need for 1 GB of frame buffer RAM. I think this is more for the "OMG MI VIF CARD HAZ 1 GIGGBYTES OF MEMORYIES!11!" type.

Re:Useless! (2, Informative)

Clay Pigeon -TPF-VS- (624050) | about 7 years ago | (#20791553)

I take it you've never gamed at very high resolutions with ALL the eyecandy turned on.

Re:Useless! (2, Informative)

ynososiduts (1064782) | about 7 years ago | (#20791563)

My 8800 GTS with 320 MB runs all games fine at 1680x1050 with max settings. That's pretty much one third of one gigabyte. I seriously doubt you need one, let alone two, gigabytes of video RAM.

Re:Useless! (1, Interesting)

Clay Pigeon -TPF-VS- (624050) | about 7 years ago | (#20791717)

Maybe in a DirectX 8.1 rendering path. No way you're getting consistent 80+ fps playing TF2 or other DirectX 10-capable games.

Re:Useless! (1)

Poromenos1 (830658) | about 7 years ago | (#20793293)

I have the same card and res as the GP and TF2 gets 80ish fps, with spikes up to 100.

Re:Useless! (2, Informative)

MikShapi (681808) | about 7 years ago | (#20792355)

I second that. I run an 8800GTS/320 on a triple-17'' 1280x1024 setup (using a Matrox TripleHead2Go Digital to split the DVI signal in 3). The card pushes out 3840x1024, which amounts to about 4MP, and it's been happy so far in Gothic, Oblivion, S.T.A.L.K.E.R. and a bunch of other titles, giving very reasonable frame rates with either all or practically all the graphics bells and whistles turned on.

Memory doesn't make a card faster, except at REALLY insane resolutions (way higher than 4MP, I suspect) when you really need all those textures close at hand; and with the PCIe bus nowhere near saturated, putting said textures closer, latency-wise, matters less than it's made out to be. Ton-of-memory cards are just a tax on people who don't understand what the fuck really matters in their system. Sorta like uber-expensive RAM, which gives an entire 2% improvement over what el-cheapo brandless stuff does.

What *does* make a card fast, at least as of the 8th-generation GeForces with their many parallel stream processors, is a LOT of processors. The jump from 32 in the mid-range cards to 96 or 128 in the high-end ones is what makes these cards kick royal ass.

Re:Useless! (2, Insightful)

mikkelm (1000451) | about 7 years ago | (#20792555)

You're going to have to spill the beans on how you manage to run S.T.A.L.K.E.R. at a resolution like that, and how you manage to do it on that kind of hardware.

With an 8800GTS/320, I myself, and most review sites, struggle at times to stay above 60FPS at 1024x768 with all the eyecandy on.

Re:Useless! (1)

billcopc (196330) | about 7 years ago | (#20793155)

That's funny, because I was just contemplating replacing my 320MB card with a 640MB one, because I can make this card chug pretty hard in certain recent games. Mind you, I like my 16x AA+AF, so I'm probably taxing it harder than the more reasonable folks out there, but there is definitely a point to having more video RAM.

We've had 256mb cards for a few years now for "normal" resolutions like 1024x768 and 1280x1024... but that's not hardcore anymore. Hardcore is SLI GTX'es (or HD2900s) driving a 30" Dell at 2560x1600. We now have games that have the rendering chops to look awesome at that rez, whereas the old stuff would be showing its blocky flaws and jumping polys.

Some folks blow $5k on a plasma tv, I prefer to blow mine on a killer PC :) Same diff.

Re:Useless! (3, Informative)

DigiShaman (671371) | about 7 years ago | (#20791577)

It's not just for frame buffering. That memory is also used to store texture maps, Z-buffers, stencil buffers, etc. Basically, almost all of it is used for 3D games/applications. If all you needed was a 2D card, you could get away with just 64MB of on-board RAM.

Re:Useless! (1)

ynososiduts (1064782) | about 7 years ago | (#20791585)

But see, there is no need for 1 GB of RAM for modern gaming. It's useless unless you are running games at big resolutions. Most people I've seen don't go any higher than 1600x1200.

Re:Useless! (3, Informative)

Anonymous Coward | about 7 years ago | (#20791711)

You do realize that texture size is completely independent of screen resolution right? And that you possibly have hundreds of textures loaded at once? And they can't be stored compressed because decompression would take too long?

Basically, other than the framebuffer for what's actually displayed on screen, none of the graphics card memory is dependent on screen resolution.

Anyway, this card isn't useful *now*. That's because video game producers target the cards that are widely available. 2 years from now you're going to need *at least* 1GB to run games at their max settings.

Re:Useless! (1)

holywarrior21c (933929) | about 7 years ago | (#20794011)

Yeah, I get away with my G4 iBook from 3 years ago. It has 32MB of onboard graphics RAM, and I don't find trouble doing anything on my laptop. I do everything except gaming: I use it for school, CS projects, some web programming for my church website. I connect a 19-inch screen at 1280x1024 resolution with the unlock-shell crack. Most of the time I have no trouble having 5 windows open at the same time: Firefox, Office, iTunes, editors, etc.

Quad 32" screens at 1600x1200 fits in 32Mb (2, Informative)

Joce640k (829181) | about 7 years ago | (#20791723)

Do the math. You don't need anywhere near 1GB for that.

What you *do* need it for is texture and vertex data, but even then games aren't really going to use it; they're designed for current hardware.

Nope, the only people who'll buy this are the ignorant with too much money*.

- Not that there's any shortage of those.

[*] ...and medical people who like to look at 3D textures from MRI scans; they can never get enough video memory. 1GB is only enough for a single 512x512x512 texture.
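
A quick check of the parent's math (a sketch assuming plain 32-bit RGBA with no antialiasing; the MRI figure depends on how many bytes each voxel takes):

# Four 1600x1200 displays, one 32-bit RGBA frame each:
quad = 4 * 1600 * 1200 * 4
print(f"quad-screen framebuffer: {quad / 2**20:.1f} MiB")  # ~29.3 MiB, under 32MB

# A 512x512x512 volumetric (3D) texture, e.g. from an MRI scan:
voxels = 512 ** 3
print(f"at 4 bytes/voxel: {voxels * 4 / 2**20:.0f} MiB")   # 512 MiB
print(f"at 8 bytes/voxel: {voxels * 8 / 2**20:.0f} MiB")   # 1024 MiB -- the whole card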

Re:Quad 32" screens at 1600x1200 fits in 32Mb (0)

Anonymous Coward | about 7 years ago | (#20792363)

Triple buffering, HDRI, FSAA, deferred rendering and whatnot might raise the memory requirements somewhat above x*y*4. Plus, if you think 1600x1200 is an insane resolution for a 30" display (32 is probably a thinko), you haven't been following.

Re:Quad 32" screens at 1600x1200 fits in 32Mb (1)

ynososiduts (1064782) | about 7 years ago | (#20794059)

You just proved my point. I was saying that most people only go as high as 1600x1200. I wasn't saying that it is only needed for people that run at 1600x1200.

This is probably redundant but.. (2, Insightful)

Chas (5144) | about 7 years ago | (#20791901)

The memory on a video card is used for more than just simple frame buffering.

Notice how some of the newer games see less performance degradation on some of the 640MB nVidia cards than on equivalently clocked 320MB versions of the same card.

Depends on what side your on... (1)

msimm (580077) | about 7 years ago | (#20792183)

If enough people will pay for it to create a sustainable market, it's needed (period). Not to mention, what I hear sounds like an assumption that this is going to be targeted at the enthusiast market, side-stepping the high-end (and high-cost) graphics shops and their associated market. Unless we assumed Pixar ran all their workstations on... pixie dust?

Anyway, if it finds a market, more power to it. Engineers do their jobs, people get paid. Welcome to the open market. (:

Re:Depends on what side your on... (3, Funny)

Anonymous Coward | about 7 years ago | (#20792305)

Hey, nice backwards smiley there. The smiley emoticon has been around for 25 years, and it looks like this: :)

Re:Useless! (1)

Emetophobe (878584) | about 7 years ago | (#20793397)

Exactly. Looking at the benchmarks, there is no difference between the 512MB version of the 2900 XT and this new 1GB version. In fact, most of the benchmarks show the 1GB version performing slightly worse than the 512MB version for some reason...

When I bought my 1900 XT several months ago, I decided to get the cheaper 256MB version instead of the 512MB version because benchmarks showed there wasn't even a 5% difference in frame rates between the two cards. I play all my games at 1680x1050 with 4x AA and 8x AF, with the highest possible in-game detail settings and everything runs great. I'm not talking about really old games either, I'm talking about Bioshock, Team Fortress 2, etc..

The only case I've seen where the 512MB version of the 1900XT actually performs better than the 256MB version is when using the Folding@home GPU client.

Big screens.... (1)

Joce640k (829181) | about 7 years ago | (#20793405)

How about one of these [digitaltigers.com] ?

Re:Useless! (0)

Anonymous Coward | about 7 years ago | (#20793975)

There might not be that many, but there are games that need 1GB of VRAM. Flight Simulator X, for example, runs like shit even on today's high-end systems. It will actually make use of 4GB of system RAM and 1GB of video RAM.

Useful for 3D animation work. (5, Informative)

Animats (122034) | about 7 years ago | (#20791543)

Sounds useful for 3D animation work, where you need all that memory for textures. Remember, by the time players see a game, the textures have been "optimized"; stored at the minimum resolution that will do the job, and possibly with level of detail processing in the game engine. Developers and artists need to work with that data in its original, high-resolution form.

Re:Useful for 3D animation work. (2, Informative)

Runefox (905204) | about 7 years ago | (#20791599)

Yeah... The FireGL has been doing that for several years. In fact, they have a 2GB version [amd.com] now, the V8650. Don't try it with games, though. Not going to work so well.

Drivers (-1, Flamebait)

Anonymous Coward | about 7 years ago | (#20791561)

Too bad the drivers suck and make the whole card a piece of shit.

Ahh... (5, Funny)

xx01dk (191137) | about 7 years ago | (#20791595)

Question. Where are the ships? I wanted to read about video cards and ships. This article only half-delivers.

Re:Ahh... (1)

xx01dk (191137) | about 7 years ago | (#20791611)

...just pokin' fun atcha shiny :)

Anyhow I just read the review in the newest CPU and they gave it what could best be described as a "meh". Give it a few more iterations and then we might have a respectable competitor to the current top-shelf Nvidia offering, but of course by then there will be something better out...

Story on Slashdot with comments, Posted. (-1, Troll)

vought (160908) | about 7 years ago | (#20791669)

Did you people ever take high school English class?

Re:Story on Slashdot with comments, Posted. (1)

Demablogia (1149365) | about 7 years ago | (#20791831)

Maivi

Frame buffer? You mean video ram? (5, Informative)

chrisl456 (699707) | about 7 years ago | (#20791697)

Umm, not to sound like a tech jargon-nazi, but "frame buffer" to me has always meant just the part of video RAM that "mirrors" what you see on screen. A 1GB frame buffer would give you 16384x16384x32-bit color, so unless you're doing some kind of huge multi-screen setup, 1GB of frame buffer is a bit overkill. ;)
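
The parent's figure checks out; a minimal sketch of the arithmetic (assuming a single buffer at 4 bytes per pixel, no double or triple buffering):

# One 16384x16384 frame at 32 bits (4 bytes) per pixel:
print(16384 * 16384 * 4 == 1024**3)   # True -- exactly 1 GiB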

Re:Frame buffer? You mean video ram? (5, Funny)

aliquis (678370) | about 7 years ago | (#20792161)

And here he was, trying to be cool using new words he had seen, and you ruined it all :(

Re:Frame buffer? You mean video ram? (1)

StarReaver (1070668) | about 7 years ago | (#20792925)

Multi-monitor? I'll fit that resolution on my single 17" CRT monitor! ...I just won't be able to see anything.

What is the point for most users? (2, Interesting)

polyex (736819) | about 7 years ago | (#20791707)

I understand if you were doing research of any sort that would exploit this hardware: assuming you use ALL of it, or can write the code to do so, the better bandwidth you have, the faster the results, etc. I understand hardware like this being useful in that regard. I also understand it from the perspective of a software developer who may be developing on this hardware for a future product that will be released in a year or so, when this sort of hardware will be more standard and affordable.

But I am sort of baffled by people who spend hundreds upon hundreds of dollars for something that they will not use the bandwidth for until next year or later, and then the thing will be down in price anyway. It's like buying terabytes of drive space, but then only filling the drive up after a year or two. I am sure that people are thinking that they actually use this stuff fully NOW, but I have to wonder if most of it is to play games at a slightly better resolution when a "lesser" card could have solved that immediate problem. Personally I think it's silly to spend so much to play a $60 game, but I understand that it is a hobby and I am not necessarily criticizing that particular form of madness.

I guess I am asking if folks have a practical and immediate need for this with software that is out today and that they personally use every day. I know scalability is built into most games, but the benefit seems narrow relative to the difference in price between this sort of hardware and what is commonly available, outside of specialized apps that demonstrably improve when given more powerful hardware now.

Re:What is the point for most users? (1)

BarneyL (578636) | about 7 years ago | (#20792605)

But I am sort of baffled by people who spend hundreds upon hundreds of dollars for something that they will not use the bandwidth for until next year or later, and then the thing will be down in price anyway.

At the extreme end of the scale it is bad value. However, if you do need a new graphics card it often works out better to go towards the high end. I'd rather spend $300 on one card that keeps up with the latest games for two years than $180 on a mid-range card that will need to be replaced with another $180 card in a year's time.

Disk cache (0)

Anonymous Coward | about 7 years ago | (#20792033)

When most of that memory is unused (i.e. when you're not playing a game), I wonder if it would be possible to use it as a disk cache?

Eventually (1)

LM741N (258038) | about 7 years ago | (#20792047)

The mobo will be a giant video card, and the CPU will reside on a board in one of the slots.

summary (-1, Flamebait)

Anonymous Coward | about 7 years ago | (#20792101)

The Diamond Viper HD 2900 XT 1GB is an overpriced piece of garbage that's easily beaten by most any modern Nvidia card.

How is this different from any other ATI product?

I'm not feeding the trolls... (2, Interesting)

ascendant (1116807) | about 7 years ago | (#20792141)

It was my understanding that ATI hardware was fine; it was the drivers that made it inferior to nVidia for performance gaming. Which would mean that if ATI and nVidia drivers were equal, ATI would win on hardware. On a side note, is it Nvidia, nVidia, or safely nVIDIA like on the website?

Re:I'm not feeding the trolls... (1, Insightful)

afroborg (677708) | about 7 years ago | (#20792257)

Does it matter? As far as I can see it is irrelevant why the system is slow; whether it's slow hardware or slow drivers, it's still slow. If nVidia produce better drivers to squeeze more performance out of their hardware, then you still get more fps in the end. Does it matter where it comes from? It's not like (at the moment) anyone else is writing better drivers than the manufacturers...

And AFAICS, the statement that "ATI's hardware is better, it's just the drivers that let them down" sounds pretty unsubstantiated and unprovable, and more than just a little bit fanboyish...

Shouldn't there be a sign somewhere? (0, Offtopic)

ascendant (1116807) | about 7 years ago | (#20792287)

Note to self: do not feed trolls; more come over to see what's up...

Re:I'm not feeding the trolls... (0)

Anonymous Coward | about 7 years ago | (#20792729)

It is NVIDIA.

But is it? (1, Insightful)

Anonymous Coward | about 7 years ago | (#20793973)

They themselves have recommended "NVIDIA" for some time, but since it's never been an acronym, "Nvidia" makes most sense. Let's not succumb to their corporate perversions of age-old spelling conventions... Otherwise it gets like the way ATI has SMARTSHADER technology enabled by their CATALYST drivers ;-P

Possible to be used as system RAM? (2, Insightful)

i.of.the.storm (907783) | about 7 years ago | (#20792113)

I was just wondering: since games often used system RAM when the graphics RAM was full, do any of you think it would be possible to go the opposite way, i.e. use gfx RAM as system RAM? It's a lot faster, and when you're just sitting there outside of a game it's not doing anything. Ultra-fast system cache ftw? Or am I just crazy? Is PCI-e too slow for that kind of stuff? Maybe with Vista's new driver model that allows GPU virtualization something like this could become true, but I really have no idea of the technical details involved in doing something of this nature.

Re:Possible to be used as system RAM? (0)

Anonymous Coward | about 7 years ago | (#20792189)

It's a lot faster

Probably a lot less reliable, too. Not easy to notice bit errors in video ram, but very easy to notice bit errors in your system RAM.

Re:Possible to be used as system RAM? (1)

brian.gunderson (1012885) | about 7 years ago | (#20793269)

Not easy to notice bit errors in video ram

The pixels turn... Blue???

Re:Possible to be used as system RAM? (0)

Anonymous Coward | about 7 years ago | (#20792641)

You could do this on an Amiga. I mapped three megs of RAM from my S3 Virge card as system RAM. It was slow, and the system put it at the end of the list of RAM to be allocated, but it worked.

Re:Possible to be used as system RAM? (1)

cyanid3 (998026) | about 7 years ago | (#20792989)

At least in Linux, you can use the memory as swap or a ramdisk.
Gentoo guide [gentoo-wiki.com]

How this is newsworthy now (1, Informative)

Anonymous Coward | about 7 years ago | (#20792125)

I've had a 1GB model like this from PowerColor for 2 months now, and since the last Linux driver release it runs flawlessly there too. Slashdot, late as always :/

2Gbytes....anyone? (0)

Anonymous Coward | about 7 years ago | (#20792195)

Going once... going twice... I need a 2GB-to-4GB video card :D

bitchin (5, Funny)

savuporo (658486) | about 7 years ago | (#20792471)

All these years later, and it's still no match for the original Bitchin' fast 3d! 2000 [carcino.gen.nz]. Livin' la Video loca con Puerto Para Garficios Acelerados Gigante!

Yes but (1)

diego.viola (1104521) | about 7 years ago | (#20792673)

When will they release their promised 3D specifications for their GPUs?

Wake me when it's got 8GB (1)

fast turtle (1118037) | about 7 years ago | (#20793169)

then I'll be able to simply heat my entire house without turning the damn heaters on while running the Folding GPU client.

Seriously though, we're already seeing problems with PSUs that can't deliver enough on the 12-volt rail (the 2900XT needs 24 amps by itself), and now they want to push that up to 26-28 amps? So where is the power going to come from? The wicked fairy's curse?

Re:Wake me when it's got 8GB (1)

couchslug (175151) | about 7 years ago | (#20793709)

"Seriously though, we're already seeing problems with PSU's that can't deliver enough on the 12volt rail "

That limitation is a design choice. Beefier supplies are no problem to build.

Whither my RAM on ia32? (1)

thenerdgod (122843) | about 7 years ago | (#20793321)

Considering how much address space MMIO takes up in 32-bit versions of Windows, one can only imagine some poor sap buying two of these and wondering why he only has 1.8GB of RAM available in Windows.

"lawl"

1GB Video Memory != "1GB Frame Buffer" (0)

Anonymous Coward | about 7 years ago | (#20793855)

Frame buffers will only take a fraction of that gibibyte. Sure, they'll take plenty if you are triple-buffering with FP32-per-component pixels, but still, the vast majority of that on-card memory pool goes to local storage of textures; in comparison, a minority goes to shader programs, frame buffers and the Z/stencil buffer.

So please have some clue when posting hardware stuff on a fairly popular tech-oriented site... Technical terms aren't freely interchangeable, you know, because they are, you know, technical terms.
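
For scale, here's a sketch of where the gibibyte might go at a high-end resolution. Every figure is an illustrative assumption, not a measurement of any real game:

# Hypothetical VRAM budget at 2560x1600 with triple-buffered FP32 RGBA color.
W, H = 2560, 1600
MiB = 2**20

color = 3 * W * H * 16   # three color buffers at 16 bytes/pixel (FP32 RGBA)
depth = W * H * 4        # one 32-bit Z/stencil buffer
total = 1024**3          # the card's 1 GiB

print(f"color buffers: {color / MiB:.0f} MiB")   # ~188 MiB
print(f"depth/stencil: {depth / MiB:.0f} MiB")   # ~16 MiB
print(f"left for textures and shaders: {(total - color - depth) / MiB:.0f} MiB")  # ~821 MiB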

Umm.... (0)

Anonymous Coward | about 7 years ago | (#20793971)

Sorry, surely I'm missing something; I have been running an HD2900XT 1GB (by PowerColor) for about 2 months now...

But, wait... (0)

Anonymous Coward | about 7 years ago | (#20794033)

Someone explain this to me -- I thought one of the strong points of the PCI Express bus was that, technically, a PCIe graphics card's memory is limited only by your own computer's memory; it doesn't have to rely exclusively on its own RAM to hold textures, etc.