AMD's New Flagship HD 6990 Tested

CmdrTaco posted more than 3 years ago | from the pixels-rot-the-brain dept.

Graphics 164

I.M.O.G. writes "Today AMD officially introduces their newest flagship GPU, the Radeon HD 6990. Targeted to counter Nvidia's current-generation flagship, the GTX 580, for AMD this is a follow-up to their previous-generation two-GPUs-on-a-single-PCB design, the Radeon HD 5970. It represents the strongest entry AMD will release within their 6000-series graphics lineup. Initial testing and overclocking results are being published at first-tier review sites now. As eloquently stated by Tweaktown's Anthony Garreffa, the 6990 'punches all other GPUs in the nuts.'"


punches all other GPUs in the nuts (1, Informative)

Skarecrow77 (1714214) | more than 3 years ago | (#35420312)

and your wallet too!

$700. ouch.

Hey, if you've got the money to play, lucky you. I'm envious.

Re:punches all other GPUs in the nuts (1)

sandytaru (1158959) | more than 3 years ago | (#35420348)

It's about exactly twice as much as the budget I've set aside for building my next PC...

Re:punches all other GPUs in the nuts (0, Funny)

Anonymous Coward | more than 3 years ago | (#35420462)

It's about exactly twice as much as the budget I've set aside for building my next PC...

So your precise approximation is vaguely on point.

Re:punches all other GPUs in the nuts (1)

Belial6 (794905) | more than 3 years ago | (#35421116)

Just for fun, let's look at his statement and see if it is as silly as it seems. I would guess that his budget is "About $350". And since this card is $700, it is "exactly" twice the $350 number. Thus it is About (his budget isn't exact) Exactly ($350 + $350 = exactly $700) twice as much.

I would say that the sentence is awkward, but accurate. The real question is whether it should be "about exactly" or "exactly about".

Re:punches all other GPUs in the nuts (1)

Ephemeriis (315124) | more than 3 years ago | (#35420834)

It's about exactly twice as much as the budget I've set aside for building my next PC...

Last computers I built were budgeted at $500 apiece... Whole new systems - motherboard, CPU, RAM, HDD, power supply, LCD monitor, keyboard, mouse, all of it. Came in just slightly over after I was done with shipping and handling and whatnot...

We're still using those computers, too. And we do a good amount of gaming. Obviously I can't crank all the settings up as far as they'll go... But I have yet to see a game that didn't play just fine on this machine.

Re:punches all other GPUs in the nuts (0)

Anonymous Coward | more than 3 years ago | (#35420846)

And you'll be replacing that crap machine in about exactly two years because it's already obsolete.

Re:punches all other GPUs in the nuts (2)

John Napkintosh (140126) | more than 3 years ago | (#35421062)

Or, he'll still have it in 5 years because his needs didn't require a $700 video card in the first place.

Re:punches all other GPUs in the nuts (2)

jedidiah (1196) | more than 3 years ago | (#35421276)

Or that $700 video card becomes more reasonable like $10 or $60 and you just upgrade that bit in 2 years.

Re:punches all other GPUs in the nuts (1)

Anonymous Coward | more than 3 years ago | (#35420378)

I've got the money and am in the market for a new graphics card, but I just don't see the point in a card like this. 450W draw is fucking retarded and currently there just aren't games out there that will legitimately make use of a card like this. And in a year or two when the games actually exist, you'll be able to buy a card that can keep up with this one for significantly less money and probably a lower TDP.

Re:punches all other GPUs in the nuts (1)

Skarecrow77 (1714214) | more than 3 years ago | (#35420634)

I admit I haven't kept up to date on the latest PC games. I was a WoWaholic for a long time, and I've been spending most of my gaming time with my PS3 recently. My GTX 470 has had a nice break the past few months. That said, in just about every generation of games and hardware, there's at least one game that still runs like crap even on the best current hardware when you push all the settings to ultra and run it on a 30-inch monitor, be it Mass Effect, Crysis, or what have you. Don't we have one of those this generation?

Re:punches all other GPUs in the nuts (1)

sandytaru (1158959) | more than 3 years ago | (#35420810)

Doesn't have to be a new game. Older games have been having major issues with new cards, mostly because of DirectX 8 incompatibilities. My pet game, FFXI, was utterly broken on GeForce 400s until the latest driver release in January, and is still sub-par on many ATI cards. Nothing like 5 FPS in an 8-year-old game on a $200 card to make you angry!

Re:punches all other GPUs in the nuts (1)

AnonGCB (1398517) | more than 3 years ago | (#35420856)

That's called a 'bad port', and game makers are /SLOWLY/ changing to actually focus on PC development.

I remember when GTAIV came out for PC and it crippled everything by being the worst optimized game I've ever seen.

Re:punches all other GPUs in the nuts (0)

The End Of Days (1243248) | more than 3 years ago | (#35421088)

No, that's called "PC gamers spent the last decade shitting in the well and now they want to know why the water tastes funny."

Re:punches all other GPUs in the nuts (1)

hedwards (940851) | more than 3 years ago | (#35421748)

Because we all know that console gamers never pirate games...

Re:punches all other GPUs in the nuts (1)

Skarecrow77 (1714214) | more than 3 years ago | (#35421128)

Pretty sure Crysis was ported -to- consoles, not from them, although I could be wrong. Mass Effect was released on the Xbox first, but since the 360 is running DirectX 9 (I believe), it can't have been that much of a port job.

As a side point, PC ports of console games often get a bad rep for performance compared to their console brethren. The PC version usually has a lot more eye candy in my experience. Aside from the convenience of using a mouse, I can't play the 360 version of Mass Effect nor the PS3 version of Modern Warfare 2 after seeing the respective PC versions. MW2 especially looks like a blurry piece of low-textured crap on the console compared to the sharp high-res textures on the PC (yes, I was running them on the same monitor so I could A/B them).

Re:punches all other GPUs in the nuts (0)

Anonymous Coward | more than 3 years ago | (#35420694)

The reason you can buy a card in a year or two that has similar performance to this one at a much lower cost is because they have developed this card now...

Re:punches all other GPUs in the nuts (1)

John Napkintosh (140126) | more than 3 years ago | (#35421090)

Not only that, it will probably get a die shrink and only require 250W.

Re:punches all other GPUs in the nuts (1)

Ephemeriis (315124) | more than 3 years ago | (#35420794)

and your wallet too!

$700. ouch.

Hey, if you've got the money to play, lucky you. I'm envious.

Yep.

I was always kind of amazed at these prices... I'd build an entire computer for $700, and then somebody would come along and tell me how they had two of these $700 video cards in their machine.

I mean... If you've got the money, go for it. But I just can't see justifying $1400 in video cards alone. Especially when we're talking about the consumer-grade gaming cards. Are a few more frames per second in Crysis really worth $1400 to you?

Re:punches all other GPUs in the nuts (1)

Skarecrow77 (1714214) | more than 3 years ago | (#35420954)

true that.

Had I that kind of money to waste on a computer, I'm pretty sure I'd be much better off in the long run spending that money on something like a single upper-mainstream (say, GTX 570) card, a 30" monitor, and two solid state drives in RAID 0, but that's just me.

Come to think of it, anybody who spent $1400 on two of these probably has the 30" monitor and the two SSDs as well.

Re:punches all other GPUs in the nuts (2)

chill (34294) | more than 3 years ago | (#35421460)

Much more fun to interrupt their proxy-penis waving with a few well placed headshots.

"Dual $700 cards, huh? How come you still suck?" *BOOM* headshot

Re:punches all other GPUs in the nuts (1)

ChefInnocent (667809) | more than 3 years ago | (#35421188)

Maybe it's more because I'm not a serious gamer, but I've been looking at a bank of these cards for GPU computing. More computing power, less space, faster transfer across cards rather than network, and less overhead costs associated with buying more systems. We bought the Radeon HD 5970 & GeForce GTX 580 two weeks ago for comparisons. This week will tell us which platform is better for our needs since we need to move from theory to reality.

For home use, I'd keep using the GeForce 8800 GTX, but I'm the lucky recipient of the card that doesn't win.

Punches your power supply in the nuts, too (3, Interesting)

drinkypoo (153816) | more than 3 years ago | (#35420320)

375+ watts. That's more than my whole computer. Oddly enough I have plenty of headroom in my power supply and it only requires a single slot so if I felt the need to punch myself in the nuts by loading drivers written by ATI onto my computer, I could slap it right in there.

Re:Punches your power supply in the nuts, too (2, Informative)

Anonymous Coward | more than 3 years ago | (#35420470)

I was ATI-only from 2000-2009. Thought the same thing about their drivers. Then I went Nvidia again because of my dislike and... nope, no difference.

If you're planning on Linux though then yeah, Nvidia is obviously much better. ATI-on-Linux will make you want to hang yourself with a sock.

Re:Punches your power supply in the nuts, too (-1)

Anonymous Coward | more than 3 years ago | (#35420816)

This just in: Hardware runs buggy and poorly as shit on Linux. Film at 11.

Re:Punches your power supply in the nuts, too (0)

billcopc (196330) | more than 3 years ago | (#35420864)

Yeah, NVidia's driver quality has taken a nose-dive in recent years. At least ATI is no better or worse than they've ever been.

Re:Punches your power supply in the nuts, too (4, Insightful)

Beelzebud (1361137) | more than 3 years ago | (#35420868)

Since AMD took over ATI, their drivers have massively improved, even in Linux.

Re:Punches your power supply in the nuts, too (1)

saleenS281 (859657) | more than 3 years ago | (#35421572)

I disagree. I was running a pair of 4850s in Crossfire for almost 2 years. There was a bug that would make the mouse cursor icon go all corrupted [tinyurl.com] that they never bothered to fix despite knowing it was there. I switched to a pair of Nvidia GTX 460s in SLI and haven't had a problem since.

Re:Punches your power supply in the nuts, too (0)

Anonymous Coward | more than 3 years ago | (#35420474)

375+ watts. That's more than my whole computer. Oddly enough I have plenty of headroom in my power supply and it only requires a single slot so if I felt the need to punch myself in the nuts by loading drivers written by ATI onto my computer, I could slap it right in there.

And therein, more than the cost, lies the rub.

Re:Punches your power supply in the nuts, too (1)

Ephemeriis (315124) | more than 3 years ago | (#35420850)

375+ watts. That's more than my whole computer. Oddly enough I have plenty of headroom in my power supply and it only requires a single slot so if I felt the need to punch myself in the nuts by loading drivers written by ATI onto my computer, I could slap it right in there.

Holy hell. I've only got a 500W PSU in my box... I don't think I could even run one of those.
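
A rough back-of-the-envelope check of that worry (the CPU and "rest of system" figures below are assumptions for illustration, not measurements):

    # Rough PSU headroom estimate -- illustrative numbers only.
    card_w = 375    # HD 6990 board power limit discussed above
    cpu_w = 125     # assumed CPU draw under gaming load
    rest_w = 75     # assumed motherboard, RAM, drives, fans
    psu_w = 500
    total_w = card_w + cpu_w + rest_w
    print(f"~{total_w} W load on a {psu_w} W PSU ({total_w / psu_w:.0%} of its rating)")
    # -> ~575 W load on a 500 W PSU (115% of its rating)

So on those assumptions a 500W unit really would be marginal at best for this card, even before accounting for PSU efficiency and aging.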

feels hollow (2)

Stele (9443) | more than 3 years ago | (#35420354)

I don't know - the card is certainly fast, but when all you can do to beat your competition's single-GPU card is to stick two of your slower GPUs on it, it just feels hollow to me. All Nvidia has to do is come back with an $800 card with two 580s on it to decimate AMD's nuts in return. Is this *really* all that amazing?

Re:feels hollow (1)

div_2n (525075) | more than 3 years ago | (#35420388)

You say that as if it's a trivial thing to do.

Re:feels hollow (0)

Anonymous Coward | more than 3 years ago | (#35420424)

it is...

Re:feels hollow (1)

Bergs007 (1797486) | more than 3 years ago | (#35420604)

You don't understand a whole lot about caches, memory controllers, die area considerations, power planes, load balancing, or system I/O, do you?

Re:feels hollow (0)

Anonymous Coward | more than 3 years ago | (#35420718)

On a well made product, from the consumer point of view, it should be trivial.

Re:feels hollow (4, Insightful)

Chris Burke (6130) | more than 3 years ago | (#35420830)

It should be trivial to do something, from the point of view of someone who isn't doing it, and has no idea what is involved in doing it.

Are you a manager, perchance?

Re:feels hollow (0)

The End Of Days (1243248) | more than 3 years ago | (#35421114)

What Slashdot have you been reading lately that this attitude surprises you?

Don't worry, supergenius. It's not only managers who are inferior to you.

Re:feels hollow (-1)

Anonymous Coward | more than 3 years ago | (#35420458)

in comparison to the difference between, say, the radeon p
asodj a'lmc'kcjvl'kmve
  dw
kasd

I just realized I don't actually care.

Re:feels hollow (1)

click2005 (921437) | more than 3 years ago | (#35420444)

All Nvidia has to do is come back with an $800 card with two 580s on it to decimate AMD's nuts in return. Is this *really* all that amazing?

That's the 590. It's out in a week or two.

It makes me laugh that most sites reviewed it on a single screen system, most at 1080p. Most of the current top-end cards can easily do modern games at maximum detail even on 30" screens. These kinds of cards are only really worth it for multi-monitor gaming. The problem is that 3 x 30" screens start to fill that 2GB of video memory quite quickly.
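
For a rough sense of scale, here is a back-of-the-envelope estimate of render-target memory alone at that resolution (the buffer layout and MSAA level are assumptions; real usage is dominated by textures and varies by game and driver):

    # Rough render-target footprint for 3 x 2560x1600 -- illustrative only.
    pixels = 3 * 2560 * 1600    # 12,288,000 pixels across three 30" panels
    bpp = 4                     # 32-bit colour
    msaa = 4                    # assumed 4x multisampling
    color = pixels * bpp * msaa     # multisampled colour buffer
    depth = pixels * bpp * msaa     # 24-bit depth + 8-bit stencil
    resolved = 2 * pixels * bpp     # resolved front/back buffers
    print(f"~{(color + depth + resolved) / 2**20:.0f} MB before any textures")
    # -> ~469 MB before any textures

Add high-resolution texture sets on top of that and 2GB per GPU stops looking so roomy.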

I hope the 7990 has better memory use. Use HyperTransport or some kind of NUMA setup and let the GPUs access all the memory.

Re:feels hollow (1)

Khyber (864651) | more than 3 years ago | (#35420564)

The memory on a GPU card is typically MUCH faster than the system memory.

Re:feels hollow (1)

LoudNoiseElitist (1016584) | more than 3 years ago | (#35420592)

Why does screen size matter? Or did you actually find something that big that wasn't still 1080p?

Re:feels hollow (1)

Skarecrow77 (1714214) | more than 3 years ago | (#35420746)

I wouldn't buy a monitor bigger than 24" that only supported 1080p.

TV sure.

Computer monitor no.

Re:feels hollow (1)

gstoddart (321705) | more than 3 years ago | (#35420828)

I wouldn't buy a monitor bigger than 24" that only supported 1080p.

I can only guess at what something like that would cost and where you'd buy it.

I've never seen such a beast. That's not to say they don't exist, but it seems a fairly exotic thing.

Re:feels hollow (0)

Anonymous Coward | more than 3 years ago | (#35420926)

Wut? Must have been imagining all those 1920x1200 and 2560x1600 monitors.

Re:feels hollow (1)

Anubis350 (772791) | more than 3 years ago | (#35421014)

I wouldn't buy a monitor bigger than 24" that only supported 1080p.

I can only guess at what something like that would cost and where you'd buy it.

I've never seen such a beast. That's not to say they don't exist, but it seems a fairly exotic thing.

Exotic? Really?

On the consumer/normal workstation end of things, off the top of my head you've got the Dell U2711, IPS, res. 2560x1440 [dell.com] (list $1k, but frequently on sale for ~$700), plus Apple's *only* display, in the same price range with essentially the same panel (glossy though, and LED backlight).

On the true high end Eizo, NEC, and others make even better displays. Not to mention that with slightly lower DPI you can get the same 2560x1440 resolution on nearly every 30" computer monitor made in the last few years (including Apple, Dell, HP, Samsung, etc.)

Re:feels hollow (0)

Anonymous Coward | more than 3 years ago | (#35421022)

At least 9 higher than 1080p here
http://www.newegg.com/Store/SubCategory.aspx?SubCategory=20&name=LCD-Monitors# [newegg.com]

4 here
http://www.tigerdirect.com/applications/Category/guidedSearch.asp?CatId=12&name=Monitor-LCDs [tigerdirect.com]

15 here
http://www.buy.com/SR/SearchResults.aspx?tcid=3494 [buy.com]

Not really a rare and exotic thing. It's becoming more commonplace all the time.

Re:feels hollow (1)

Skarecrow77 (1714214) | more than 3 years ago | (#35421040)

I've got an i-inc (rebranded Hanns-G) 28" 1920x1200 on my desk that cost me $250 from compusa/tiger direct?

I really do prefer the 16:10 ratio for computer monitors. The thing about 1080p is that the vertical resolution is really about the same as the 19" CRT I had on my desk 10 years ago; it's just wider.

I will agree that getting to 2560x1600 does seem to take a big paycheck though.

Re:feels hollow (0)

Anonymous Coward | more than 3 years ago | (#35421160)

I'm not sure they're that exotic. I've always assumed they're just mass produced TV panels with modified inputs etc.
I've had a 28" 1920x1200 monitor for a year or two. It cost ~£270 at the time and if you're not bothered about super accurate colour reproduction it's very easy on the eyes to use all day. It's great for games and films too (again, if you're not a purist and can cope with a bit of uneven backlighting etc)
It's a HannsG HG281.

Of course I'd probably rather have something with a 10000x8000px res at the same screen size, but hey for that money it was a good buy.

Re:feels hollow (1)

Jeffrey_Walsh VA (1335967) | more than 3 years ago | (#35421292)

I found six with "Recommended Resolution" of 2560 x 1600 from $1-3k.

http://www.newegg.com/Product/ProductList.aspx?Submit=Property&Subcategory=20&PropertyCodeValue=1099%3A25153 [newegg.com]

Re:feels hollow (1)

gstoddart (321705) | more than 3 years ago | (#35421400)

I found six with "Recommended Resolution" of 2560 x 1600 from $1-3k

Don't know about you, but I consider a "$1-3k" monitor to be exotic and pricey.

Sure, they sound absolutely awesome, but you're talking about more money than I'd be willing to spend on a computer.

Definitely 'niche' market kinda stuff.

Re:feels hollow (1)

Tr3vin (1220548) | more than 3 years ago | (#35420766)

He is going off the sort of standard resolutions for monitors of that size. Once you get past 24", most decent monitors increase in resolution past 1920x1200 or 1920x1080. A good 30" monitor is normally 2560x1600. This of course assumes you are buying a computer monitor and not a TV.

Re:feels hollow (1)

the_banjomatic (1061614) | more than 3 years ago | (#35420854)

I assume he was referring to the 30" lcd's that run at 2560x1600 resolution... which are awesome for the record

Re:feels hollow (1)

gstoddart (321705) | more than 3 years ago | (#35421004)

I assume he was referring to the 30" lcd's that run at 2560x1600 resolution... which are awesome for the record

*drools on keyboard*

Wow! Seriously, wow! How much does something like that cost? This seems like you're way beyond a gaming rig here -- and, if you're really talking about running 2 or 3 of these for a gaming machine (like some people are), well, then I strongly suspect you don't really care what your video card(s) cost. You've already spent a small fortune on monitors.

Re:feels hollow (1)

the_banjomatic (1061614) | more than 3 years ago | (#35421122)

Yeah, I can't imagine having 3 of these... 1 takes up a good amount of desk real estate as it is. I got my Dell 3008 refurbished for about $1200, I think, a year and a half ago, with full warranty, etc.

Re:feels hollow (0)

Anonymous Coward | more than 3 years ago | (#35421500)

There are a few screens larger than 1080p, but not many are suitable, and when you're looking at triple screens, 48:10 is much nicer than 48:9. At least one of your screens needs DisplayPort (unless you use an active adapter), so unless you want more problems matching panel types and colours, you'll want all 3 with DP. There are NO larger 1200p monitors with DP, so you have to look at the 2560x1440 & 2560x1600 large screens.

Also, the 2560xnnnn screens mean you can't use HDMI audio, as there don't seem to be any AV receivers yet that can handle over 1200p and there are no dual-link DP to HDMI adapters.

There is an interesting thread about a guy doing 3 x 30" 1600x2560 monitors running off four Nvidia 580 cards.

Re:feels hollow (1)

gstoddart (321705) | more than 3 years ago | (#35420664)

It makes me laugh that most sites reviewed it on a single screen system, most at 1080p. Most of the current top-end cards can easily do modern games at maximum detail even on 30" screens.

If they're both at 1080p, then the size of the screen doesn't matter, does it? It doesn't take more memory if the pixels are bigger but the same in number.

Or, are you talking about running at resolutions higher than 1920x1080? I didn't think you could easily get monitors at much higher resolution.

Re:feels hollow (1)

TrancePhreak (576593) | more than 3 years ago | (#35421002)

3 screens would be (3 * 1920) * 1080. So technically running at a much higher resolution.
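
The arithmetic checks out (trivial, but spelled out for completeness):

    # Pixel count: one 1080p screen vs. three side by side.
    single = 1920 * 1080          # 2,073,600 pixels
    triple = (3 * 1920) * 1080    # 6,220,800 pixels
    print(triple / single)        # 3.0 -- three times the pixels to render each frame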

Re:feels hollow (2)

Nemyst (1383049) | more than 3 years ago | (#35420456)

AMD's offerings usually have lower power consumption and heat generation. While I'm sure nVidia could come up with something, they'd probably have a hard time using the 580 as a basis, because it runs so hot already. I mean, the 6970 consumes a whole ~140W less than the 580 (!), yet they still had to notch it down so it fit in the standard and add that clever switch. AMD's current offerings are just far more power efficient than nVidia's, which means they'd need to underclock their dual-GPU card more than AMD had to. Heat would also be a concern requiring underclocking.
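
The "fit in the standard" point comes down to the PCI Express power budget. A quick sketch (the connector limits are per the PCIe spec; the per-GPU draw is an assumed round number for illustration):

    # Why dual-GPU boards get clocked down: the in-spec power envelope.
    slot_w = 75          # PCIe x16 slot
    eight_pin_w = 150    # per 8-pin PEG connector
    budget_w = slot_w + 2 * eight_pin_w    # 375 W for a card with two 8-pin plugs
    one_gpu_w = 250      # assumed draw of one full-clocked high-end GPU plus memory
    print(f"Two full-speed GPUs: ~{2 * one_gpu_w} W vs. a {budget_w} W in-spec budget")
    # -> Two full-speed GPUs: ~500 W vs. a 375 W in-spec budget

Hence the lowered clocks, and presumably the "clever switch" mentioned above for pushing past the in-spec envelope at the owner's risk.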

Re:feels hollow (0)

Anonymous Coward | more than 3 years ago | (#35420538)

That doesn't really matter if you do 2x 6990 (or 5970) in Crossfire. Then the dual-GPU is an advantage, not some gimmick to one-up their competitor.

Re:feels hollow (0)

Anonymous Coward | more than 3 years ago | (#35420612)

Nvidia is already planning a dual-GPU 500-series card, the 590. It's supposed to be out sometime this quarter, priced somewhere between $700 and $1000. AMD is simply beating them to the punch, which is a completely valid marketing move. Especially since nVidia seems to be encountering problems making the 590 - probably because those chips are massive, nearly twice the size of AMD's equivalent.

ATI and AMD have never really been the "top-of-the-line" manufacturer. They're more of the "most bang for your buck" company - while usually slightly less powerful than the competition, they are generally cheaper, more reliable, and released on schedule.

PS: I'm going to make a little prediction: within a few years, we will see quad-GPU boards coming out. Probably made from middle-of-the-line chips, not the high-end; a quad-GPU based on the 6870, for instance, would provide 33% more power than the dual-GPU 6990 for only a 20% increase in cost (by my estimates).

Re:feels hollow (1)

billcopc (196330) | more than 3 years ago | (#35421218)

The big problem with dual 580s is that peak power draw would be around 750W, just for the GPUs. They would have to make certain sacrifices to fit any reasonable power envelope.

If these GPUs keep sucking more and more power, they will have to start seriously considering making them external. You'll have your PC, a GPU box beside it with its own kilowatt power supply, and just an interface board and cable between the two. There is simply no sense in cramming more heat and power into the PC chassis, just to play random games or join some hippy-dippy folding project.

Wonder how long before someone realizes. (0)

Anonymous Coward | more than 3 years ago | (#35420362)

Wonder how long before someone realizes clock speed ain't everything. I think the company which wises up to this by offering better drivers/architecture/ecosystem for development and more optimization will survive these vicious bouts.

Case in point: when Intel was pumping P4 clock speeds, the Opteron came along and caught Intel with its pants down, and peeps quickly realized a 2.6GHz core was faster than a 3.2GHz Pentium.

-S

Re:Wonder how long before someone realizes. (0)

Anonymous Coward | more than 3 years ago | (#35420690)

Welcome to 5 years ago?

Re:Wonder how long before someone realizes. (1)

Skarecrow77 (1714214) | more than 3 years ago | (#35420700)

ATI and Nvidia have been doing this dance since 1998 I believe. I don't think they're going to change anything soon.

Also, I'm a bit confused by your logic, as GPU clock speeds haven't advanced anything like CPU clock speeds in the same time period. GPUs have mostly been going for the massively parallel multi-core architecture design ever since the Voodoo2, which CPUs have only really started doing in the past few years. Hell, I think my GTX 470 has something like 448 cores or something like that, clocked -lower- than the 8800 GTS I had before it.

why? (2, Insightful)

Charliemopps (1157495) | more than 3 years ago | (#35420420)

My $150 card I bought a year ago can play every game on the market right now. Why do I need a $700 card?

Re:why? (1)

Gr33nJ3ll0 (1367543) | more than 3 years ago | (#35420516)

Exactly! Mod parent up. Right now the video game market is being driven largely by the consoles that have video cards from ten years ago. There's really not much out there to max out an ePenis card like this.

Re:why? (1)

TeknoHog (164938) | more than 3 years ago | (#35420682)

IMHO, the only sensible use for these monster GPUs is parallel computing (OpenCL etc). For many problems they are the best bang per buck, as well as per power consumption. It seems that the HD5000 series maintains the lead in this sense; for example the HD6990 has fewer stream units than the HD5970, and the extra texture units are not generally used in computing.
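
A crude way to put numbers on "bang per buck / per watt" (the peak single-precision figures, launch prices, and board power below are approximate, and peak FLOPS overstates what real OpenCL kernels achieve):

    # Rough GFLOPS per dollar and per watt -- approximate, illustrative figures.
    cards = {
        #           (~peak GFLOPS, ~price $, ~board power W)
        "HD 6990": (5100, 699, 375),
        "HD 5970": (4640, 599, 294),
        "GTX 580": (1581, 499, 244),
    }
    for name, (gflops, price, watts) in cards.items():
        print(f"{name}: {gflops / price:.1f} GFLOPS/$, {gflops / watts:.1f} GFLOPS/W")

On those rough numbers the older HD 5970 still edges out the 6990 per watt and per dollar, which is the point about the HD 5000 series keeping its lead for compute.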

Re:why? (1)

The End Of Days (1243248) | more than 3 years ago | (#35421324)

Your opinion needs a little more humble in it. The only use you can see and the "only sensible use" have absolutely nothing in common. You just aren't that important.

Re:why? (1)

Gaygirlie (1657131) | more than 3 years ago | (#35420738)

Exactly! Mod parent up. Right now the video game market is being driven largely by the consoles that have video cards from ten years ago. There's really not much out there to max out an ePenis card like this.

It isn't about some specific need per se; overclocking and tuning is a hobby, an expensive and not always such a smart hobby, but nevertheless there are some even worse hobbies in the world. It just happens to be fun to see how far you can push your PC, how much more you can squeeze out of it. Is it useful? No. Does anyone need such power for anything? Not really, at least home users don't. And games simply have trouble taking advantage of it all even as it is. It STILL is fun.

That said I personally would not buy such hardware even if I could afford it -- and I definitely can't -- and rather just buy cheap parts that I know will overclock well and/or are easily modified and thus still get a decent gaming rig.

Re:why? (1)

Skarecrow77 (1714214) | more than 3 years ago | (#35420844)

It isn't about some specific need per se; overclocking and tuning is a hobby, an expensive and not always such a smart hobby, but nevertheless there are some even worse hobbies in the world.

As an example from another of my hobbies, the price of this card would get you halfway towards a top-of-the-line set of headphones... not counting of course the top-of-the-line amp to go with it... which put together are WAY cheaper than an equivalent speaker setup... which in turn is WAY cheaper than an offshore boat... which is of course way cheaper than manipulating the world financial markets for shits and giggles.

That last hobby scares me.

Re:why? (1)

gstoddart (321705) | more than 3 years ago | (#35421224)

As an example from another of my hobbies, the price of this card would get you halfway towards a top-of-the-line set of headphones... not counting of course the top-of-the-line amp to go with it... which put together are WAY cheaper than an equivalent speaker setup

And, of course, the rest of us are convinced you're daft to spend that much money on a set of headphones.

You may actually be able to hear the difference, or at least believe you can. To most of us, it seems like you're spending several times more, for a tiny, almost non-existent difference.

Seems like diminishing returns on investment.

But, hey, I've known audiophiles, and they seem to be mostly OK spending outrageous amounts of money on such things.

Re:why? (0)

Anonymous Coward | more than 3 years ago | (#35420832)

Future proofing.

Until recently, any hardware I put inside my computer was in there until it absolutely had to be replaced. At the time, I was running a Radeon from the 1XXX series (a 1650, I think) in 8x AGP. I upgraded to a 4870 when I went to a PCIe-compatible board and just recently upgraded to a 6870, passing my 4870 to my brother so he could run it in Crossfire with the matching one he has.

I'm going to be set for a long time - this card maxes out everything I have and will hold me for at least a year or two minimum before I start to consider upgrading.

Re:why? (0)

Anonymous Coward | more than 3 years ago | (#35421290)

Why not buy a currentGen-1 card that can still play everything currently out there and save some $? Then when you find a game that needs this 6900 card, buy it when it's not bleeding-edge expensive?

Re:why? (2)

Krazy Kanuck (1612777) | more than 3 years ago | (#35420836)

Well, I would imagine you are not running your $150 card at 5760 x 1200 (across three 24" monitors) with 4x AA and 16x AF now, are you?

There IS a market for this performance, and granted it may not include you, but some people are more than able to bring cards with these specs to their knees.

As for console ports, granted there are quite a few, but I seriously doubt my GeForce 3 Ti500 (2001) could have run any of today's games.

Re:why? (1)

Gr33nJ3ll0 (1367543) | more than 3 years ago | (#35420984)

And exactly how many titles support this please? I'm going to guess very few. Frankly, I wish it was the other way around, but for now it appears that this is just bragging rights.

Re:why? (1)

asto21 (1797450) | more than 3 years ago | (#35421396)

You don't seem to get the point? If there is even ONE title that supports such a configuration and if I want to play that title in such a fashion, I would need a graphics card like this. Yes?

Re:why? (1)

Gr33nJ3ll0 (1367543) | more than 3 years ago | (#35421598)

I think you and I disagree on the definition of the word "need".

IMHO, it's not worth it, or a "need" since it's far, far outside my normal usage, or most of the people I'm familiar with.

Unfortunately, this is a consumer electronics component, so they need more than a very few fanatical people with 3-monitor setups to sell these cards. However, in recent years the number of titles that support these extremes has dwindled, shrinking the pool of people who could potentially be interested. We've gone from "You need this card to play" to "You need this card to get good quality graphics" to "You need this card to max out the settings on three monitors for a few titles", with the pool shrinking at each step. When you're down to one or two titles, which could be played on much, much cheaper equipment with little loss in quality, you've got a very, very small pool.

Seems like a very poor decision when you want to sell thousands or millions of units.

Re:why? (1)

MozeeToby (1163751) | more than 3 years ago | (#35420912)

There's a difference between being able to play a game and running the game on ultra-high settings. My laptop can run any game on the market right now, but I wouldn't say it would be the most pleasant experience, and it certainly wouldn't be at anything more than medium to medium-low settings. Some people like the new shiny that PC games offer. While I (and apparently you) don't think it's worth the extra money just to be able to run the latest Crysis expansion across three monitors with the graphics up to ultra-high, other people apparently disagree, and them being willing and able to pay the price improves innovation for everyone down the road, which is kind of a nice bonus.

Re:why? (1)

Gr33nJ3ll0 (1367543) | more than 3 years ago | (#35421008)

I've got a $200 video card that appears to run everything on the ultra settings, including the original Crysis. That being said, even the reviewers are forced to run the same 5-6 titles again and again because there are so few titles that really stress video cards anymore. So why pay $500-1500 for less than a half dozen titles?

Re:why? (1)

VGPowerlord (621254) | more than 3 years ago | (#35421548)

I've got a $200 video card that appears to run everything on the ultra settings, including the original Crysis. That being said, even the reviewers are forced to run the same 5-6 titles again and again because there are so few titles that really stress video cards anymore. So why pay $500-1500 for less than a half dozen titles?

You don't have to pay $500-1500... the low end cards in this generation sell for as low as $250. Those cards are the AMD Radeon HD 6950 [newegg.com] and the nVidia GTX 560 Ti [newegg.com].

Re:why? (1)

Gr33nJ3ll0 (1367543) | more than 3 years ago | (#35421640)

You don't have to pay $500-1500... the low end cards in this generation sell for as low as $250.

I was referring to the mythical three monitors + video card setup. As I stated earlier, I've got a $200 video card that I'm very happy with.

Re:why? (2)

ustolemyname (1301665) | more than 3 years ago | (#35420658)

Triple display + 3D. You need > 120fps with at least 3 times the resolution your monitor has. And if you consider the cost of such a setup, $700 is a reasonable proportion of the cost. Not saying it's a good use of money, just saying there are systems that can use this power.
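
Rough pixel-throughput arithmetic for that case (assumes 1080p panels and shutter-glasses 3D at 120Hz; ignores AA and overdraw):

    # Pixels per second: single 60 Hz screen vs. triple-screen stereo 3D.
    base = 1920 * 1080 * 60                  # one 1080p panel at 60 Hz
    stereo_triple = 3 * 1920 * 1080 * 120    # three panels at 120 Hz for shutter glasses
    print(f"{stereo_triple / base:.0f}x the pixels per second of a single 60 Hz display")
    # -> 6x the pixels per second of a single 60 Hz display

So a setup like that genuinely asks for several times the fill rate and shading throughput of an ordinary single-monitor rig.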

Re:why? (0)

Anonymous Coward | more than 3 years ago | (#35421254)

$700 is not proportional to that type of mid-range setup. What you're talking about (3 screens, etc) is more like a $300-400 video card.

$700 would be proportional to a setup with 4 to 8 screens and a bleeding edge CPU + 16GB or more RAM.

Re:why? (1, Funny)

clintonmonk (1411953) | more than 3 years ago | (#35420750)

If you have to ask, then you're not the target audience.

Re:why? (1, Funny)

jedidiah (1196) | more than 3 years ago | (#35421304)

....if you can't answer, then you're not the target audience either.

Re:why? (1)

Ephemeriis (315124) | more than 3 years ago | (#35420878)

My $150 card I bought a year ago can play every game on the market right now. Why do I need a $700 card?

Hell, the $150 card I bought about two years ago still works fine.

Obviously I can't crank all the settings up as high as they'll go... But I have yet to run into a game that doesn't run well.

Just finished playing through Dead Space 2 - it ran fine and looked great.

Re:why? (0)

Anonymous Coward | more than 3 years ago | (#35421064)

It's for those who have big monitors (2560x1440).

Re:why? (1)

Anonymous Coward | more than 3 years ago | (#35421098)

{{cite}}

I want to know what $150 card from two years ago plays your current games at max settings.

Re:why? (1)

stms (1132653) | more than 3 years ago | (#35421200)

My $150 card I bought a year ago can play every game on the market right now. Why do I need a $700 card?

Because now your epenis is tiny and girls will laugh.

Re:why? (1)

Kjella (173770) | more than 3 years ago | (#35421246)

Wait, I think I heard the exact same thing yesterday in the Intel Extreme CPU comments. Why? Because you can. This is luxury, like drinking a $70 wine over a $15 wine; nobody needs to do it, but it's to spoil yourself. It's not necessary to be able to crank the quality settings all the way up to enjoy a game, but if you can afford it, it's that little extra.

Re:why? (1)

Anonymous Coward | more than 3 years ago | (#35421408)

Because mine does it with higher quality settings and with more than enough FPS to not have any slowdowns during high intensity moments, while yours doesn't.

Flagship (1)

sexconker (1179573) | more than 3 years ago | (#35420464)

This is not the flagship.
This is the super aircraft carrier.

The flagship is the one that defines the generation.
The flagship is almost always the one that is launched first.

The AMD flagship for this generation is the 6970.
The 6990 is simply two of them on a single PCB.

People still buy stuff like this? (0)

Anonymous Coward | more than 3 years ago | (#35420508)

I understand having a flagship to the product line, it gets attention and bragging rights. However, I can't imagine anyone owning one of these cards. Cost, power and noise just make it too much. Maybe the vendors understand this too, which is why they went to dual gpu set ups for the high end, just too expensive to develop for the ultra high end, which ends up just being a marketing tool rather than true ROI...

Re:People still buy stuff like this? (2)

John Napkintosh (140126) | more than 3 years ago | (#35421214)

There are people that buy stuff like this just to say they have it, so they can go around on interwebforums posting their synthetic benchmark results and bragging about it.

Some of them will probably never actually play a game with it.

Ready for Doom 4? (1)

cstanley8899 (1998614) | more than 3 years ago | (#35420710)

Honestly.... what games are even going to stress this card in the foreseeable future?

Re:Ready for Doom 4? (1)

Ephemeriis (315124) | more than 3 years ago | (#35420890)

Honestly.... what games are even going to stress this card in the foreseeable future?

The obvious joke is Crysis 2...

But, seriously, something like this is pure overkill.

I've got a two-year-old video card that I bought for $150 at the time, and it still plays everything just fine.

Ouch... (1)

Mortiss (812218) | more than 3 years ago | (#35420778)

"the 6990 'punches all other GPUs in the nuts." ...and steals your wallet at the same time. Aside from the epeen factor, realistically which currently available games require such a hardware. AFAIK, all the currently released games (e.g. Bulletstorm) run comfortably on the Nvidia 260, 280 cards at the highest settings (1920x1080 resolution) So a simple question, why bother...

If it were only priced in line with GTX 560 SLI (0)

Anonymous Coward | more than 3 years ago | (#35420918)

it would be worth it. The 33% price premium seems excessive.

What ever happened to VR? (4, Interesting)

wisebabo (638845) | more than 3 years ago | (#35421472)

Whatever happened to VR? (Virtual Reality) A decade or two ago, it seemed to be (short of direct neural interfaces) where user interfaces were heading. I even remember going to a Disney mini-theme park where they had some true VR rides (you wore a tracking headset) so that you could ride Aladdin's carpet.

Back then it seemed as if the main thing holding this technology back was the room-sized SGI supercomputer required to render a reasonable scene in real time. I remember a presentation by the CEO of SGI saying that all they needed to get to was 60M triangles/sec, and then VR would achieve mass appeal. (Then again, he also dismissed delivering video from computers by saying computers wouldn't become video "jukeboxes", so maybe he wasn't so good at predicting the future.) Anyway, I don't know the latest specs, but I'm sure a modern video card could blow away one of those old SGI "Reality Engines".

So why aren't we all wearing goggles (and wearing spandex) and looking like the characters in "The Lawnmower Man"? Is it because micro-displays never got good enough? Or something else?

Re:What ever happened to VR? (1)

DigiShaman (671371) | more than 3 years ago | (#35421606)

Two major problems in order.

1. Socially unacceptable. Not a technical issue, but a social/psychological one. It's hard to interact with friends in a home where everyone decides to blind themselves from reality. Ironic, I know.

2. The HUD visor or helmet was (and still is?) exceedingly expensive due to the tiny LCDs spec'd at SVGA and XGA resolutions. Proper marketing and economies of scale could resolve this.

Re:What ever happened to VR? (1)

sandytaru (1158959) | more than 3 years ago | (#35421728)

Honestly, if I stare at 3D virtual reality for longer than a few minutes, I get nauseous. I think augmented reality is more likely to take off. Instead of viewing a fake world, view additions to the real world via the glasses.