Slashdot: News for Nerds


The First Quad SLI Benchmarks

Zonk posted more than 8 years ago | from the oggle-drool-drool-oggle dept.

An anonymous reader writes "X-bit labs have a preview of NVIDIA's Quad SLI system based on two GeForce 7900 GX2 cards. Each GeForce 7900 GX2 carries 512 MB of on-board memory and is connected through a special bridge chip, with 16x PCIe lanes, to the other daughter card and the rest of the system. The two GPUs on each card work in SLI mode. The core and memory are clocked lower than on a single-GPU card, at 550 MHz and 1.2 GHz (DDR) respectively. For Quad SLI, NVIDIA has introduced a new SLI mode, AFR of SFR, in which the cards take turns rendering frames and each frame is split between the two GPUs of the active card. The GX2 cards are benchmarked (when possible) at a resolution of 2560x1600 with 32x SLI AA and compared to a CrossFire X1900 XTX system on a variety of games."
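The "AFR of SFR" mode described in the summary can be sketched as a toy scheduling model (illustrative Python only; the function and names are hypothetical, not NVIDIA's implementation): the two cards alternate frames (AFR), and the two GPUs on the active card each render half of that frame (SFR).

```python
# Toy model of "AFR of SFR": cards alternate frames, and the active
# card's two GPUs split each frame into top/bottom halves.
# All names here are illustrative, not NVIDIA's API.

def afr_of_sfr_schedule(num_frames, height=1600):
    """Return (frame, card, gpu, scanline_range) work items."""
    work = []
    for frame in range(num_frames):
        card = frame % 2            # AFR: cards take turns frame by frame
        for gpu in range(2):        # SFR: the card's two GPUs split the frame
            top = gpu * (height // 2)
            work.append((frame, card, gpu, (top, top + height // 2)))
    return work

schedule = afr_of_sfr_schedule(4)  # 4 frames -> 8 half-frame work items
```

Under this model, frame 0 is rendered entirely by card 0 (its two GPUs taking scanlines 0-799 and 800-1599 at 2560x1600), frame 1 entirely by card 1, and so on.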

109 comments

HTIALOA (5, Funny)

yogikoudou (806237) | more than 8 years ago | (#15232804)

Hell That Is A Lot Of Acronyms

Re:HTIALOA (1)

ThePengwin (934031) | more than 8 years ago | (#15234608)

OMFG, you are right! Why don't they stop using them, though? It makes people feel inferior when they don't know what the acronyms stand for...

Re:HTIALOA (1)

Pleb'a.nz (712848) | more than 8 years ago | (#15235502)

Well, you haven't had a look at the photography scene lately, have you? Nikon is a real winner with its lens naming..

There's the NIKON 70-200 F2.8 G IF-ED AF-S, but the winner goes to... the AF-S DX VR Zoom-Nikkor 18-200mm f/3.5-5.6G IF-ED
Eat your heart out :)

Unlike CPU, dual GPU costs double (3, Insightful)

LiquidCoooled (634315) | more than 8 years ago | (#15232817)

Why is this?

It's obvious we expect more processing power, but the prices nowadays are silly.
It's also fucked up the benchmarking, because you can't just look for the card you're interested in; you have to check whether it's in SLI or quad-SLI mode.

Re:Unlike CPU, dual GPU costs double (0)

Anonymous Coward | more than 8 years ago | (#15232833)

you use it because you can

Re:Unlike CPU, dual GPU costs double (1)

imsabbel (611519) | more than 8 years ago | (#15232968)

1st, dual core IS more expensive with CPUs, too. Just compare.

Well, because unlike CPUs, the margins are a LOT lower.
If you disregard the highest end, you pay maybe $200-300 for a card that has $100 worth of memory alone on it.
And that for chips made on the newest processes, with 300-400 mm^2 die sizes.

Meanwhile, CPUs are barely reaching 150 mm^2. That's why one can easily add another die to a CPU without breaking manufacturing boundaries, while this is not possible with GPUs.
Another reason: GPUs are much more bandwidth-starved than CPUs. Having two on one die (thus giving each only 50% of the memory bandwidth) would be suboptimal.

Also, the thermal management is already at its limits with the heat output of a single GPU die; dual dies wouldn't be coolable without water or massive heatpiping.

Re:Unlike CPU, dual GPU costs double (2, Informative)

masklinn (823351) | more than 8 years ago | (#15233514)

The reason may be that dual-GPU setups are not dual-core but two GPUs (usually) on two different PCBs?

In other words, they're merely sticking two full graphics cards together, while dual-core CPUs put the cores and the dual-CPU handling logic in a single physical package.

Dual GPU is twice as expensive to buy because it's twice as expensive to make in the first place.

X-bit found a tit on that one, no? (5, Insightful)

DrunkenTerror (561616) | more than 8 years ago | (#15232825)

Twenty-seven pages? Gimme a fucking break. Think they're milking it a touch?

Re:X-bit found a tit on that one, no? (4, Insightful)

suv4x4 (956391) | more than 8 years ago | (#15232845)

Twenty-seven pages? Gimme a fucking break. Think they're milking it a touch?

They are, and this is why when I see an article with more than 3 pages I just click the last page for the conclusions by habit.

I don't know why the editors think their readers can be arsed to click for a new page after each word, because it definitely doesn't work.

Re:X-bit found a tit on that one, no? (1)

Kjella (173770) | more than 8 years ago | (#15233355)

They are, and this is why when I see an article with more than 3 pages I just click the last page for the conclusions by habit.

Actually, I jumped directly to the Oblivion benchmarks, and didn't bother with the conclusion. It might as well have been a page anchor though.

Re:X-bit found a tit on that one, no? pet peeve 2 (0)

Anonymous Coward | more than 8 years ago | (#15233384)

Yup, it's just them trying to save bandwidth or some nonsense by making sure half the readers stop reading after the 1st page. It's just too much goddamned trouble to click through the rest. Frankly, sites that do this lose my respect. They are sh*tting on my experience for no good reason.

Re:X-bit found a tit on that one, no? (1, Funny)

Anonymous Coward | more than 8 years ago | (#15232901)

It's worth reading all the pages just because of how hilariously badly they're written:
"But is the first implementation of the quad SLI really brings extreme speeds and quality? Let's find out together!"

BTW use the print version, one page + no ads.

Printer Friendly (4, Interesting)

TubeSteak (669689) | more than 8 years ago | (#15232912)

At least the good folks at Xbit Labs have a printer friendly link.

http://www.xbitlabs.com.nyud.net:8090/articles/video/print/geforce7900-quad-sli.html [nyud.net]

Probably good to have a Coralized link anyways, their site was slowing down for me.

Re:Printer Friendly (0)

Anonymous Coward | more than 8 years ago | (#15234474)

That was a MUCH better link. Articles should be submitted like THAT! I'd mod you up if I could, but instead I just wanted to say thank you.

Re:X-bit found a tit on that one, no? (2, Funny)

EmoryBrighton (934326) | more than 8 years ago | (#15232936)

Twenty-seven pages? Gimme a fucking break. Think they're milking it a touch?

here's everything in one page:
http://www.xbitlabs.com/articles/video/print/geforce7900-quad-sli.html [xbitlabs.com]

Read this somewhere:

They like to have a lot of pages so they .. ...NEXT PAGE...
can show a lot more ads and increase their pageviews ...NEXT PAGE...
But the joke is on them really ...NEXT PAGE...
Because most of us block ads anyway

Re:X-bit found a tit on that one, no? (1)

Rufus211 (221883) | more than 8 years ago | (#15233101)

Not really, X-bit has some of the best, in-depth reviews out there. Sure they're long but they have lots of meat to them. It's not /new page/ like Tom's /new page/ reviews which /new page/ are completely useless.

When two is not enough... (5, Funny)

andytrevino (943397) | more than 8 years ago | (#15232826)

Re:When two is not enough... (5, Funny)

Bogtha (906264) | more than 8 years ago | (#15233037)

From the Onion article, February 2004:

Stop. I just had a stroke of genius. Are you ready? Open your mouth, baby birds, cause Mama's about to drop you one sweet, fat nightcrawler. Here she comes: Put another aloe strip on that fucker, too. That's right. Five blades, two strips, and make the second one lather. You heard me--the second strip lathers. It's a whole new way to think about shaving. Don't question it. Don't say a word. Just key the music, and call the chorus girls, because we're on the edge--the razor's edge--and I feel like dancing.

From CNN [cnn.com] , September 2005:

Gillette has escalated the razor wars yet again, unveiling a new line of razors on Wednesday with five blades and a lubricating strip on both the front and back.

Re:When two is not enough... (0)

Anonymous Coward | more than 8 years ago | (#15233761)

This is probably the funniest thing I've ever seen on slashdot.

Old news... (2, Funny)

suv4x4 (956391) | more than 8 years ago | (#15232827)

Who cares about quad SLI, gimme the octet SLI, I just sold my house and I'm ready to buy one.

Re:Old news... (1)

Homology (639438) | more than 8 years ago | (#15232885)

Who cares about quad SLI, gimme the octet SLI, I just sold my house and I'm ready to buy one.

Didn't your mommy tell you that Monopoly money [wikipedia.org] isn't real money?

Re:Old news... (0)

suv4x4 (956391) | more than 8 years ago | (#15232897)

Didn't your mommy tell you that Monopoly money [wikipedia.org] isn't real money?

Shit :( I'm gonna fucking kill that Monopoly banker.

Re:Old news... (2, Interesting)

Firehed (942385) | more than 8 years ago | (#15233172)

Just don't throw a chair at him, Steve.

I wonder (5, Insightful)

masterpenguin (878744) | more than 8 years ago | (#15232873)

I wonder what percentage of the people who will be running quad-SLI 7900s live in their parents' basement.

Although I'm a college student, my experences are that once people graduate college (and are making the money to afford these toys), they generally realize what a waste of money it is to stay on the bleeding edge of PC gaming tech.

I don't know though, perhaps there is a larger market for these than I think.

Re:I wonder (1)

andytrevino (943397) | more than 8 years ago | (#15232959)

There's something to be said for being on the bleeding edge, I suppose; to some people, money is no object, and not all of them live in their parents' basements. This technology is marketed towards the Alienware crowd that has no problem dropping $5,000 on a flashy all-out computer system. The rest of us are probably not going to be gaming on a quad-SLI system any time soon.

I agree, however, that the bleeding edge becomes sub-par so quickly that it's like buying a brand-new car -- it loses some absurd percentage of its value the moment you drive it off the lot.

In my opinion, it's a much better idea to buy from the midrange, then upgrade. Most modern games don't even make full use of the power of the latest graphics card technology anyway. A $200 graphics card now and a $200 graphics card a couple of years down the line will let you play all the games released during that period with relatively good performance while remaining very cost-effective, especially if you take advantage of opportunities like ATI's Trade-Up program [ati.com] .

Re:I wonder (1)

JPribe (946570) | more than 8 years ago | (#15232992)

Last time I looked, the Dell Renegade gaming systems were already *SOLD OUT*, and they come equipped with this graphics setup... and they are what, $10,000 per system??? What does that tell you about the market for these?

Markets (1)

hackwrench (573697) | more than 8 years ago | (#15233270)

That's what really gets me when the Republicans bash the Democrats [newshounds.us] for alternatives to gasoline, saying that it costs the average person more money. It costs people who buy those sorts of things more money. It is an incomplete substitute for the gas market and will take price pressure off of that market.

Re:I wonder (0)

Anonymous Coward | more than 8 years ago | (#15233061)

and people who live in basements should be disregarded and discounted because they're not really people right?

Re:I wonder (1)

DrSkwid (118965) | more than 8 years ago | (#15233145)

Get more experience then

Re:I wonder (0)

Anonymous Coward | more than 8 years ago | (#15234440)

Don't you mean experence?

Re:I wonder: small change (0)

Anonymous Coward | more than 8 years ago | (#15233316)

It's really nothing compared to, say, a Porsche :P and you do see expensive cars being driven around, so the money is out there.

Is it necessary? Perhaps. Look at Oblivion: it totally overwhelms current two-card SLI. With other games one can run FSAA+AF at insane resolutions, whereas in Oblivion normal SLI can't even really handle max settings, let alone with FSAA/AF.

Re:I wonder (4, Interesting)

Kjella (173770) | more than 8 years ago | (#15233347)

Although I'm a college student, my experences are that once people graduate college(and are making the money to afford these toys), generally they realize what a waste of money it is to stay on the bleeding edge of PC Gaming tech

That's one of the phases. But there's another phase in which you find that it's a lot easier to free up money than to free up time. Or, to put it another way, that you'd rather pay to have real fun than spend time having sorta-fun on the cheap. I had a machine (an AMD 2000+) that became unstable. I tried RAM tests, CPU burn-in, 3DMark loops, disk scans & defrags, voodoo and exorcism, to no avail; nothing revealed an actual problem except practical use.

I bought myself a new machine and retired the old one to become one of the world's most overpowered home file servers. Why? Because I'd literally wasted *days* of my spare time on annoyance and grief over surprise reboots. I was so pissed I considered getting a Mac, but the x86 Macs weren't out yet. Why? "Just works(tm)". The time I'd been wasting more than covered the price difference, if I put any reasonable "price" on it.

Another thing I no longer do is seriously chase prices. I find a serious online retailer (either one I know from before, or one with a good customer base and reputation), and as long as their prices aren't really out of whack (looking at 2-3 serious shops, I'm usually within 5% of those I know cut corners on stock, service and support), I buy it. Before, I'd check for various special offers, calculate whether the postage still made it preferable to buy from different suppliers, try out various semi-serious sites with attractive prices, etc.

To bring this back to Oblivion... I find it a very good game playing at half-res (960x600) on my 1920x1200 24" LCD monitor. I've tried it at 1920x1200 just to see what it looks like, and I don't feel it makes that much of a difference. That, and I like my XPC, which doesn't require a huge case and doesn't sound like an airplane taking off, which I imagine this setup would. But if I seriously felt "I need quad SLI to really enjoy this game at 1920x1200", I wouldn't really have a problem doing that.

Compared to the number of hours I've spent (and would spend with future games; presumably it would last a little while), it wouldn't be unreasonable. It's just like this LCD: if you put it as "Do you really need more than a mainstream 19" LCD?", the answer is no, it's way overkill. But otherwise I'd just have a slightly bigger number in an account statement somewhere. I don't mean the cash is burning in my pocket, but if FPS games are what you do for fun, it's not an unreasonably expensive hobby compared to many others.

I know someone who spent $3000 on a piano, someone who spent $3000 on an HD camcorder, and someone who likes to tune up his car for God knows what. All for their personal hobbies, because that's what they do in their spare time, and they want their spare time to be fun. You need some disposable income to do that. Around here, it's easy to "rent/buy yourself to death" with a too-expensive apartment/house. Then you sit there: you don't go out, you don't make any big purchases, you make the rent but live a sparse, plain and boring life. You choose what makes you happy.

I can't resist. (0)

Anonymous Coward | more than 8 years ago | (#15233863)

Around here, it's easy to "rent/buy yourself to death", with a too expensive apartment/house. Then you sit there, don't go out, don't make any big purchases, you make the rent but live a sparse, plain and boring life.

...and buying 4 $400 graphics cards makes you Hugh Hefner?

Re:I wonder (4, Interesting)

Aladrin (926209) | more than 8 years ago | (#15234180)

A lot of people don't get this concept. But it really DOES happen. I constantly have NO free time. I've even considered moving closer to work and spending hundreds of dollars a month extra to save a 1.5 hour daily drive. 7.5 hours a week doesn't sound like much, until you find yourself avoiding going to the grocery store for as long as possible because you simply don't have time.

I'm paying off my student loans at about 6x the minimum payments. Money is definitely not the issue, just time.

Likewise, where I used to play every game that came out, now I only hand-pick the very best ones and I get seriously ticked if any are crap and waste my time. It's quite a marked change from the boy who always said 'I'm bored.' to the person I am today.

Re:I wonder (1)

Surt (22457) | more than 8 years ago | (#15234549)

Pay the extra to get rid of the drive. Reverse your thinking about it: how much would someone have to pay you to drive an hour and a half every day for no reason? Are you in fact getting paid that much?

Re:I wonder (0)

Anonymous Coward | more than 8 years ago | (#15236277)

In a word: Yes.

Driving isn't the only thing that sucks up time, though: work itself, family, house, cars, yard, pets all take time. Family is probably the number 1 time sink, but to be honest, if it isn't and you have kids, you probably shouldn't have had kids.

I, like the GP, am paying off loans in an effort to simplify our lifestyle. I should be down to an easily affordable single-salary lifestyle within 2 years, from paying off everything but the house. Once that's done, I'll move jobs, even taking a pay cut, to get a better quality of life.

Re:I wonder (1)

Xabraxas (654195) | more than 8 years ago | (#15234784)

I had a machine (AMD2000+) that became unstable. Tried RAM tests, CPU burn, 3Dmark loops, disk scans & defrags, voodoo and exorcism to no use, nothing revealed an actual problem except practical use.

Sounds like a bad power supply to me.

Re:I wonder (0)

Anonymous Coward | more than 8 years ago | (#15234988)

$3000 on a piano? That's like spending $30 on an LCD, or $1.50 on a digital camera.

Re:I wonder (0)

Anonymous Coward | more than 8 years ago | (#15233508)

The funny part about this quad SLI setup is that it's a total nightmare, stability-wise, with games. You'd think that paying more would mean fewer problems, but honestly, how many games are designed with four linked graphics cards in mind? Zero.

$2000 for shoddy support is hilariously not worth the money. I'd much rather waste it all on hookers and blow; at least that way I'd have a hell of a story to tell.

Some people just have lots of money (1)

Sycraft-fu (314770) | more than 8 years ago | (#15235088)

A guy I know goes nuts like this; he has SLI'd GeForce 7800s and such, way overkill. He's a Lieutenant in the US Army (not in Iraq). The military basically covers all his expenses, so he's got money to throw around, and this is what he chooses to throw it at.

Then of course there are people who just make tons of money. Another friend makes over $200,000 per year and, while he doesn't buy things like this, he does drop the same kind of money on silly gadgets. For example, a PIX 515 to guard his home network. Necessary? No, a 501 would work just as well (all he really needs is the VPN; he's a Cisco engineer), but he likes it and can afford it.

That said, the market isn't huge, but then it doesn't have to be at these prices, especially since it's using chips that are found in cheaper (and thus more widely produced) cards.

Yeah, that all *SOUNDS* very impressive... (1, Redundant)

pla (258480) | more than 8 years ago | (#15232876)

...But good luck getting anything but the demo that ships with your prepackaged pair of identical cards to run on such a setup.

Don't worry, though - the sequel to your favorite game might support such a configuration (assuming you have the right card model, the right rev of that model, the right motherboard, the right BIOS, and the right OS) somewhere around the time single-GPU cards have 8x the power of anything available today (i.e. twice what this setup would yield, if you can get it to work).


Does this have serious geek-cred? Sure. Would anyone but a total masochist try to run such a configuration, for anything more than bragging rights? HELL no!

Re:Yeah, that all *SOUNDS* very impressive... (1)

masterpenguin (878744) | more than 8 years ago | (#15232898)

Does this have serious geek-cred? Sure. Would anyone but a total masochist try to run such a configuration, for anything more than bragging rights? HELL no!
I think this is beyond bragging rights; it's like people who buy Hummers just to have the biggest toy on the block. We call it compensation =)

Re:Yeah, that all *SOUNDS* very impressive... (1)

Deviant Q (801293) | more than 8 years ago | (#15232972)

Er. That would be a very nice high-and-mighty I'm-better-than-you-because-I'm-not-jealous-of-your-real-ultimate-power-promise speech, if it were true. As it stands, it appears that the config supports:

  • Call of Duty
  • Chronicles of Riddick
  • Doom III
  • Far Cry
  • F.E.A.R.
  • Half-Life 2
  • Quake 4
  • Serious Sam 2
  • The Elder Scrolls IV: Oblivion
  • Project Snowblind

Now granted, I didn't check every one of those pages to make sure they didn't contain a "we couldn't get it to run" blurb, but a random sampling produces successes in all cases (although sometimes the results are unexpected).

Re:Yeah, that all *SOUNDS* very impressive... (4, Insightful)

m50d (797211) | more than 8 years ago | (#15233090)

Erm, wtf? The driver shipped with the card plugs it into opengl and directx, the game outputs to those and doesn't care, and everything is happy.

Re:Yeah, that all *SOUNDS* very impressive... (1)

pla (258480) | more than 8 years ago | (#15233613)

Erm, wtf? The driver shipped with the card plugs it into opengl and directx, the game outputs to those and doesn't care, and everything is happy.

Kinda like the late great Voodoo 3? Yeah, a real breeze to use. Just pop it in and let the drivers do the rest. Riiiiiiight...

Re:Yeah, that all *SOUNDS* very impressive... (1)

Kagura (843695) | more than 8 years ago | (#15234063)

Wow, what an ignorant statement. The Voodoo line of cards didn't use OpenGL or Direct3D; rather, they used their own proprietary library called 3dfx. That's why one had to worry about whether one's graphics card would be supported by the software. Now there are very few problems unique to only one card manufacturer, and everybody works through Direct3D. The olden days of video card compatibility problems are all but gone.

Re:Yeah, that all *SOUNDS* very impressive... (1)

Rufus211 (221883) | more than 8 years ago | (#15234273)

The company was 3dfx. The library was Glide.

Re:Yeah, that all *SOUNDS* very impressive... (1)

pla (258480) | more than 8 years ago | (#15235674)

Wow, what an ignorant statement. The Voodoo line of cards didn't use OpenGL nor Direct3D [...] Now, there are very few problems unique to only one card manufacturer, and everybody is working through Direct3D. The olden days of video card compatability problems are all but gone.

Wow, what an ignorant statement.

Do you have any idea what a driver does? As someone who has made a living writing them, allow me to enlighten you a tad...

OpenGL and Direct3D provide an Application Programming Interface for games and other graphically-intense programs to use to more easily access the features available on the hardware.

You still rest at the mercy of NVidia or ATI to write their driver to export the hardware features as that API. For the basics, they do so fairly well, and as a result, we do indeed enjoy a welcome relief from the hassle of hardware-specific configuration.

For the basics.

Now tell me: why do you suppose the top of the troubleshooting list for high-end games includes "disable custom pixel and/or vertex shaders" (and with DX10, you can add geometry shaders to that as well)?

Answer: because those features still depend on the game knowing how to deal with each particular card to get them right. For example, the ATI X1000 skipped vertex texturing yet can call itself "Vertex Shader 3.0 compliant" through the loophole of disallowing the use of any filtering on any texture format. Now, technically, the correct DX9c sanity checks would notice that little quirk and the program would respond accordingly. But from the POV of a programmer trying to implement against that API, it strikes me as similar to asking the CPU if it has power and somehow magically getting the answer "no".
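The loophole described in that comment can be modeled as a toy capability check (illustrative Python; the device names, capability tables, and function are hypothetical, not a real driver or Direct3D API): a device can advertise the shader-model version flag while supporting zero filterable vertex-texture formats, so a robust engine must probe per-format support instead of trusting the version flag alone.

```python
# Toy model of the "VS 3.0 compliant but no filterable vertex textures"
# loophole. Devices and format names are hypothetical, for illustration.

DEVICES = {
    "gpu_a": {"vs_version": (3, 0), "filterable_vertex_formats": {"R32F"}},
    # The loophole: claims VS 3.0 but filters no vertex-texture format at all.
    "gpu_b": {"vs_version": (3, 0), "filterable_vertex_formats": set()},
}

def can_use_vertex_texturing(device, fmt):
    """Check both the version flag AND per-format support."""
    caps = DEVICES[device]
    # Checking caps["vs_version"] alone would wrongly accept gpu_b.
    return caps["vs_version"] >= (3, 0) and fmt in caps["filterable_vertex_formats"]
```

A naive engine that only checked the version tuple would enable vertex texturing on "gpu_b" and render garbage; the per-format probe is the extra sanity check the comment is alluding to.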

Re:Yeah, that all *SOUNDS* very impressive... (1)

bedessen (411686) | more than 8 years ago | (#15233403)

So I guess you missed the part of the article where they played it with about a dozen of today's current most popular games off the shelf. You know, that whole benchmarking part?

Re:Yeah, that all *SOUNDS* very impressive... (1)

pla (258480) | more than 8 years ago | (#15233662)

So I guess you missed the part of the article where they played it with about a dozen of today's current most popular games off the shelf. You know, that whole benchmarking part?

So I guess you didn't make it to page 10. You know, the page titled "Bang Bang: Here Come Problems"? Where they show horribly mangled screenshots and make comments such as (bolding mine):
Before we proceed with the benchmark scores, we would like to stress that Nvidia GeForce 7900 quad SLI technology does not seem to be mature enough so far. We have experienced a lot of significant and insignificant issues with nearly all the games we have tested with it, including 3DMark benchmarks
and:
During the testing we also experienced numerous crashes and freezes


But hey, kudos, your comment (and two others) did well enough to convince the mods to spank me.

Ass.

Re:Yeah, that all *SOUNDS* very impressive... (1)

bedessen (411686) | more than 8 years ago | (#15233699)

Of course I read that. That's not the point. I'm not debating that the driver support at the moment blows.

You made the assertion that this technology requires special software support from the games. That is not true, it is all handled in the driver in a way that is transparent to the app. Yes, it's buggy now, but that is beside the point.

very cool and all, but (3, Insightful)

Prophetic_Truth (822032) | more than 8 years ago | (#15232879)

This will be yesteryear's technology when the next architecture comes out; in the video card market, new architectures seem to arrive every year. My cap on video cards is now $300/year. I got a 7900 GT: you can do a voltage mod, buy a $30 cooler, and by overclocking get the same performance as a 7900 GTX. They both use the exact same GPU. Google for guides.

I'm guessing you don't drive a Ferrari either? (1)

nick_davison (217681) | more than 8 years ago | (#15233337)

As the article starts out with:

You can buy a $200,000 Italian sports car or a $30,000 Japanese car and add $20,000 in parts to get almost the same performance. But you'll likely never get the same shit-eating grin.

Now, for most people, a Ford or a Honda is plenty. They'd much rather have an OK car and the $180,000 difference that they never had anyway. But that doesn't devalue what Ferrari or Lamborghini offer those who are willing to pay for it.

Similarly, yes, a 1960s Ferrari probably can't hold a candle to a 1980s higher-end Nissan; but the driver who could afford a Ferrari in 1960 has had 20 years of awesome enthusiast's driving and has likely bought the 1980s Ferrari that Nissan's 1980 model still can't touch.

Granted, the timescales aren't the same in the gaming world, but there is still a period of time when you have a significantly faster system. Sure, it may only be 12 months, not 20 years... but then it only costs $500-1500 extra every 12 months ($10,000-$30,000 over 20 years) rather than $150,000 extra once every 20.
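As a rough sanity check, the figures in that comparison work out as follows (illustrative Python using the comment's own numbers):

```python
# Rough check of the cost comparison above, using the comment's figures.
years = 20
gpu_low, gpu_high = 500, 1500               # extra GPU spend per year, low/high
car_premium = 200_000 - (30_000 + 20_000)   # Italian sports car vs. upgraded import

gpu_total_low = gpu_low * years             # total extra GPU spend, low end
gpu_total_high = gpu_high * years           # total extra GPU spend, high end
```

Even at the high end, two decades of bleeding-edge graphics cards ($30,000) costs a fifth of the one-time $150,000 sports-car premium, which is the comment's point.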

The point is, these things aren't for everyone. But a lucky few get to live in a world where such options are possible and are quite happy to pay many times the cost if only for that exclusive 20% extra - and, as with cars, it gives those who're fans of the genre but don't have the money fun dreams to drool over.

For what it's worth, I'm guessing the Indian programmer who's about to take your job laughs at your craziness for buying a $20-30,000 car when he gets much better price/performance from a cheap motorbike. Everything's relative.

Re:I'm guessing you don't drive a Ferrari either? (1)

divide overflow (599608) | more than 8 years ago | (#15233728)

> Similarly, yes, 1960's Ferrari probably can't hold a candle to 1980's higher end Nissan - but the driver who can afford a Ferrari in 1960 has had 20 years of awesome enthusiast's driving and has likely bought 1980's Ferrari that Nissan's 1980 model still can't touch.

May I also point out that if you kept that 1960's Ferrari maintained and in good condition it would likely be worth MUCH MORE--even in inflation adjusted dollars--than you paid for it. It would be fun AND a good investment. This is almost never true for personal computer equipment.

Re:I'm guessing you don't drive a Ferrari either? (1)

CaseyB (1105) | more than 8 years ago | (#15233811)

Don't forget to add the cost of maintaining a Ferrari in collector condition for forty years. That's probably a large multiple of the original price.

Re:I'm guessing you don't drive a Ferrari either? (1)

divide overflow (599608) | more than 8 years ago | (#15234276)

> Don't forget to add the cost of maintaining a Ferrari in collector condition for forty years. That's probably a large multiple of the original price.

Ok, no problem.

Let's say I bought a brand new Ferrari 250 GT "Nembo" Spyder [sportscarmarket.com] for $12,000 back in 1964, then, like most owners, kept it garaged and drove it mostly on evenings and weekends. And let's say that over the years I'd also spent another $36,000 on repairs and servicing: three times its original purchase price.

After ten years, parts would be difficult to get, so I'd mostly keep it in the garage under a car cover and take it out for the occasional rally. Let's forget the insurance, since I'd be paying a steep rate on any fast car anyway, and since I'd mostly keep it garaged, my agent would offer me a reasonable rate. Most of my recent maintenance would be cleaning and polishing my vintage toy, plus possibly a tune-up once a year, so let's throw in another $2,000 a year for that.

So, how much is my car worth now? If you clicked on the link above, you'd already know the estimated value is between $675,000 and $900,000. Even counting all my other costs, I'd gladly take that appreciation any day of the week.

Re:I'm guessing you don't drive a Ferrari either? (1)

i.r.id10t (595143) | more than 8 years ago | (#15233972)

What'll make you sick are the cars-for-sale ads in the old car magazines. I've got R&T from the '50s to the mid-'90s, and I've seen Carrera-engined Speedsters for $2,000 with the engine blown (a core engine alone is worth $50k+ today), '65 Mustangs in good shape for under $1k, not to mention the new-on-the-showroom-floor prices on muscle cars. Heck, even original spare parts are worth $$ to some people, especially the concours crowd.

My personal recommendation for GPUs (1)

Sycraft-fu (314770) | more than 8 years ago | (#15235105)

Pick a price you are willing and able to afford once per year, and buy there. You are generally better off getting a lesser card more often than a great card once in a while. So set a range for yourself and upgrade about once per year. For gamers, I recommend shooting for the $150-200 range if possible. Get a card like that once per year, and you'll find all games run fine, even near the end of that cycle. Don't give in to the temptation to get a more expensive card thinking it'll be good for longer; it really won't. In addition to gaining a lot of speed, graphics cards get new features all the time. A more frequent upgrade cycle is the way to go.

Why? (0)

Poromenos1 (830658) | more than 8 years ago | (#15232893)

What is the reason for this? Why would you spend $1000 for high framerate? At least for casual (or even hardcore) gaming, I find this stupid.

Re:Why? (0)

Anonymous Coward | more than 8 years ago | (#15233034)

What is the reason for this? Why would you spend $1000 for high framerate? At least for casual (or even hardcore) gaming, I find this stupid.

No shit it's stupid for casual gaming. That's not exactly the market they're aiming for.

Why do people bitch and moan about excessive graphics systems? Obviously somebody out there is buying these things. How is that affecting you?!?

Re:Why? (1)

@madeus (24818) | more than 8 years ago | (#15233500)

What is the reason for this? Why would you spend $1000 for high framerate?

Playable framerates (e.g. >30, more like ~60) when playing modern games with very high quality detail (which makes the experience more immersive).

Paying top dollar for a new title then playing it at 15 FPS would be pretty stupid (I can't see how that would be 'fun' no matter how 'casual' a gamer you are - and if you are happy turning the graphics detail down to the low end then you might as well settle for a cheaper, older title).

People easily spend far more than that on their TVs (high end HD TVs cost 2-3+ times as much as a high end gaming system); presumably you think that's stupid too.

Do you honestly think there is no point in improving graphics beyond Quake ("Oh, that's true 3D, we can stop improving now, there is no point in making the environment any more immersive")?

Re:Why? (1)

42Penguins (861511) | more than 8 years ago | (#15233537)

To be leet.

People with a lot of money (obviously) like to spend it on nice things! If you have the means to afford a quad-SLI rig, your priorities change.

Me, I'll stick to my 64mb Radeon.

What resolution? (2, Interesting)

DarthChris (960471) | more than 8 years ago | (#15232933)

FTFS (emphasis added):
"The GX2 cards are benched (when possible) at resolution of 2560 by 1600 with 32X SLI AA and compared to a Crossfire x1900 XTX system on a variety of games."

Who actually has a monitor capable of such a high resolution?
Secondly, correct me if I'm wrong, but CrossFire currently is two cards side by side, and if four cards don't perform significantly better than two, I'd be very worried.

Re:What resolution? (1)

flobberchops (971724) | more than 8 years ago | (#15232969)

Me, I run a Quad SLI on my Alienware 19x10 foldout 17" widescreens notebook.

Re:What resolution? (2, Informative)

null-sRc (593143) | more than 8 years ago | (#15232979)

Who actually has a monitor capable of such a high resolution?

*raises hand*

and so does anyone who bought the 3007WFP on sale during recent Dell days... Oblivion at native res is only about 30fps... would prefer to quad it up for a decent 100+ fps

Re:What resolution? (1)

imsabbel (611519) | more than 8 years ago | (#15232980)

Hm. That's the resolution of the Dell 30" LCDs.

And maybe you are asking the wrong question: it's not that people would buy those cards and then wonder what to do with the resolution (although I am sure such buyers exist, too), but rather that the target is people who have already spent 1000s on ultra-high-res toys like the 30" Apple or Dell displays, or those IBM displays with 4xxx*3xxx that sell for half a fortune.

Re:What resolution? (1)

Firehed (942385) | more than 8 years ago | (#15233195)

Well, going by the article, it's not at all worth it. Seeing that a 7900GT can run most current games fairly comfortably at 1920x1200 (23-24" widescreen), you can save a grand on the cards and almost two grand more on the monitor. Plus, going by the pre-slashdotting benchmarks, the performance wasn't nearly what it could (should) have been - it got its arse kicked by CrossFire X1900XTs in quite a few configs. I knew the ATI flagship had a lead over that of nVidia, but it's certainly not over twice as powerful. Maybe it's just wonky drivers or something, but it seems that merely (normal) SLI'ing 7900GTs would let you up the resolution a notch - albeit a big notch - and play comfortably.

Quad SLI == Two Physical Cards. (0)

Anonymous Coward | more than 8 years ago | (#15232994)

Quad SLI is also two physical cards.

Re:What resolution? (1)

m50d (797211) | more than 8 years ago | (#15233068)

I do. I keep it at 1600 since that's all the EDID claims it will do, but it has no trouble with that res.

Re:What resolution? (2, Informative)

Surt (22457) | more than 8 years ago | (#15234560)

2560x1600 is a nice resolution to use with the cinema display, or with the dell 3007.
http://www1.us.dell.com/content/topics/topic.aspx/global/products/monitors/topics/en/monitor_3007wfp?c=us&l=en&s=gen&~section=specs [dell.com]
It's not too expensive a monitor, popular with gamers who have the kind of money to buy quad sli.

Site down (1)

AjStone (743464) | more than 8 years ago | (#15233053)

I think the site has been Slashdotted...

The real question (1, Interesting)

Anonymous Coward | more than 8 years ago | (#15233148)

How much does it take to cool these things?

Even if someone gave me $1k and told me I could only use it on a quad-sli setup, I don't think I'd take it, mainly because I suspect that the cards would fry everything in a five-mile radius without watercooling.

What's the freakin' point? (1, Interesting)

sweez (971938) | more than 8 years ago | (#15233180)

"...Nvidia GeForce 7900 quad SLI is not faster than ATI Radeon X1900 XT CrossFire (which is known for high performance in high resolutions and with FSAA) across the board and may even lose to dual GeForce 7900 GTX setup."

So... can anyone explain *what's the point* then?

Re:What's the freakin' point? (1)

TrappedByMyself (861094) | more than 8 years ago | (#15233314)

So... can anyone explain *what's the point* then?

So some lonely dude spending all his $$ on computer toys can get a few brief seconds of joy as he posts his specs on a message board somewhere.


Not dissimilar to the feeling I get when I make fun of other people. Ooooohhhhhh what a rush.

Re:What's the freakin' point? (1)

heson (915298) | more than 8 years ago | (#15236000)

More like posting his spec as a 10 line sig in every post he makes on every forum he has the time post on.

Re:What's the freakin' point? (1)

sweez (971938) | more than 8 years ago | (#15236569)

Haha, you just gotta love those kinds of people...

Re:What's the freakin' point? (1)

sweez (971938) | more than 8 years ago | (#15236467)

Yeah, but when you're making fun of people, no-one can come up, say "dude, but my SLI/Crossfire set-up is faster (and 2 times cheaper) than your quadruple-quantum-field-holographic one" and make you feel like you've been poked in the eye.

Well, if you make fun of people, they might actually come up to you and *really* poke you in the eye, but that's another matter...

Re:What's the freakin' point? (2, Funny)

pnaro (78663) | more than 8 years ago | (#15233423)

So he can run Aero when Vista ships?

Re:What's the freakin' point? (1)

masklinn (823351) | more than 8 years ago | (#15233541)

Wanking at your... uh... number of GPUs?

AFR = higher latency (1)

CreateWindowEx (630955) | more than 8 years ago | (#15233292)

This would seem like not the best solution for, e.g., first-person shooter gamers, who mainly want high frame rates to lower the total latency (time between controller movement and seeing the result of that on the screen, not to be confused with network latency). In the worst case, in a Quad AFR setup, if you are running at 30 fps, that would mean each GPU is actually rendering at 7.5 fps, or 133 milliseconds per frame, versus the 33 milliseconds per frame of 30 fps. (Not counting the additional latency incurred by input methods, deferred rendering, double buffering, etc). Obviously 30 fps Quad AFR would still look much smoother than 7.5 fps, but it wouldn't be as responsive as 30 fps non-AFR.
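For the curious, the arithmetic above is easy to check with a quick back-of-the-envelope sketch (a hypothetical helper for an idealized AFR pipeline, not anything from the article):

```python
# Back-of-the-envelope latency figures for n-way Alternate Frame Rendering.
# Idealized assumption: each GPU works on its own frame for the full n-frame
# interval, so a frame's render latency is n / display_fps.

def afr_latency_ms(display_fps: float, num_gpus: int) -> float:
    """Time (ms) from the start of rendering a frame to its completion."""
    per_gpu_fps = display_fps / num_gpus  # each GPU finishes 1 of every n frames
    return 1000.0 / per_gpu_fps

# The worst case from the comment: 30 fps on screen, 4 GPUs in pure AFR.
print(afr_latency_ms(30.0, 4))  # ~133 ms per frame, versus...
print(afr_latency_ms(30.0, 1))  # ...~33 ms for a single GPU at the same 30 fps
```

This ignores input sampling, buffering, and display delay, so real latency would be higher still; the point is just the 4x multiplier on render latency.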

For a demonstration of what true low-latency game-play is like, try playing something like Kaboom! with a paddle on a real Atari 2600... no double-buffering there!

Re:AFR = higher latency (1)

Barny (103770) | more than 8 years ago | (#15234064)

Yeah, but if you read the info (hell, even the summary) it states that it does not do quad AFR; it load-balances each frame across the pair of GPUs on one card and AFRs between the two cards. And given that I have seen (on theinquirer.net) this thing pulling about 45fps in FEAR on one of those 30" monsters, the latency would be negligible anyway.

This argument was raised when they first started with the modern SLI tech, especially when the renderer doesn't support SLI properly. I have never found any more lag than would otherwise be present (objectively) in any game that runs on it (and of course, thanks to the new nvidia drivers, you can disable SLI any time a DX window is not active).

Another thing to point out here (haven't seen anyone else mention it): these use G71M chips, effectively overclocked 7900 Go chips, obviously to keep the power usage of the total graphics setup under the 700W mark :)

Didn't one of the card companies try this before? (1)

Zantetsuken (935350) | more than 8 years ago | (#15233399)

IIRC, didn't one of the video card companies do this thing with cards alternating on rendering frames? Didn't they call it SLI back then also, except it stood for something else, and that was the only way it sped things up (alternating frames)?

Re:Didn't one of the card companies try this befor (1)

OneoFamillion (968420) | more than 8 years ago | (#15233501)

I think most consumer multi-card solutions have relied on dividing the workload for single frames... But there was ATI's Rage Fury MAXX of course, which sported two chips that did alternate frames.

Re:Didn't one of the card companies try this befor (1)

redmund (955596) | more than 8 years ago | (#15233653)

That would be the now-defunct 3dfx and their Voodoo2 cards (I still have my 12 meg sitting in a box at home somewhere). Their SLI referred to Scan-Line Interleave, which split the work within each frame rather than alternating whole frames: one card drew all the even lines on the screen, the other drew all the odd lines, and you got a pretty sweet performance bonus.

3dfx went bust a number of years ago and NVIDIA bought up all the leftovers, presumably taking 3dfx's SLI tech and developing it from there.
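The interleave scheme described above is simple enough to sketch (an illustrative toy, not real driver code):

```python
# Toy illustration of 3dfx-style Scan-Line Interleave (SLI): with two
# Voodoo2 boards, one renders the even scanlines and the other the odd
# ones. The card-numbering here is illustrative, not from any spec.

def sli_card_for_line(y: int, num_cards: int = 2) -> int:
    """Which card renders scanline y under simple interleaving."""
    return y % num_cards

# For a 480-line screen, card 0 gets lines 0, 2, 4, ... and card 1 gets
# lines 1, 3, 5, ... -- each board only has to fill half the scanlines.
lines_per_card = {c: sum(1 for y in range(480) if sli_card_for_line(y) == c)
                  for c in range(2)}
print(lines_per_card)  # {0: 240, 1: 240}
```

Unlike AFR, this splits every frame between the cards, so it speeds up fill rate without adding whole frames of latency.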

Re:Didn't one of the card companies try this befor (1)

Zantetsuken (935350) | more than 8 years ago | (#15233820)

ya, thats the one I was thinking of...

Re:Didn't one of the card companies try this befor (1)

awing0 (545366) | more than 8 years ago | (#15233922)

I have a pair of 12MB Voodoo 2s somewhere too. The Voodoo 5500 had dual processors making it SLI on one board. The 6000 was to have 4 processors but never made it to market because of an AGP bug and the fact that NVIDIA single GPU cards (GeForce 2) were getting as fast or faster than 3dfx's SLI setups.

oooooh... benchmark porn! (1)

gerbouille (663639) | more than 8 years ago | (#15233658)

i'm wetting my pants just by thinking about it

The First Quad SilLy Benchmarks (1)

ergean (582285) | more than 8 years ago | (#15233693)

The new megahurtz madness. How many CPUs/GPUs do you have? Ohhh... that is so yesterday.

Too bad they don't come back in style in 10 years.

People, people, people (3, Informative)

eebra82 (907996) | more than 8 years ago | (#15233917)

Reading the comments above made me realize that a lot of people don't understand what NVIDIA has done here. Let me point out a few things for you:

* This is not hardware for the mass market. In fact, even the dual SLI setup is overkill and mainly used as "we knew how to do it and so we did it to prove it".
* This system is not supposed to be cheap and most definitely not intended to be the most cost-effective solution per fps.
* Although only a few will buy this, it is far more valuable for NVIDIA to kill ATI's chances of de-throning them from the performance top.
* Such excessive memory bandwidth is suitable for extreme resolutions that are currently unsupported by over 95 percent of monitors, but the point is not that we should play our games at these levels; it is to prove that it is possible.
* NVIDIA gets an edge over ATI among game developers because, performance-wise, they will be able to run future games on setups comparable to single cards that are two or even three generations away.
* Yes, it's a waste of electricity, but if you're a member of Greenpeace, then wait a few more generations before you buy a cow-approved graphics card that fits into this category.
* One user was upset, claiming that it would be stupid to waste $1000 on a setup like this. I agree, but if you happen to drive a Ferrari and are debt-free with a few million bucks stored away, then why not settle for the best if you can afford it? And you can obviously get your 17-year-old Slashdot-reading neighbour to put in watercooling or whatever to make it silent, too. Point is, some people will buy this, and being able to afford something isn't being stupid.

Last but not least, we should all remember that the CPU is the new bottleneck now. It will be interesting to see what a CPU a year from now can do to this rig.

Re:People, people, people (1)

BCW2 (168187) | more than 8 years ago | (#15234450)

Just like last year when ATI came out with the first card with 512MB on it. When asked what the use was for this, an exec said "none, we did it because we could and it makes a great ad campaign".

Someone will make use of the capabilities but it will never be mainstream.

The real bottleneck on systems is still the bus; when bus speed is closer to CPU speed that will disappear. Of course, any time you access the hard drive it seems like a 486. That is still a major bottleneck for those with less than 1GB of RAM. With the RAM prices today that is not much of an excuse, though.

Re:People, people, people (1)

sweez (971938) | more than 8 years ago | (#15236511)

Well yes, that all makes sense... but how are they doing anyone any good by being *slower* than current SLI/Crossfire setups? That's not a really good show-off...

Octa-SLI (1)

DigiShaman (671371) | more than 8 years ago | (#15233942)

Can I get Octa-SLI with four of those cards on this Motherboard [hothardware.com] ?

Things that make you go hmmmm... (1)

amavida (898618) | more than 8 years ago | (#15234079)

I remember when I got suckered into buying 2 voodoo cards placed into my PC in SLI mode.

the demo disc that came with it showed off -: DAZZLING :- effects...
  and _NONE_ of the fucking games I owned or subsequently bought got any significant benefit from it...

hmmmmm...

No wonder people are buying consoles - they are tired of being milked!

And the Award Goes To... (0)

Anonymous Coward | more than 8 years ago | (#15234362)

And the Award for the Most Incomprehensible Slashdot article this week goes to...!

DX10 (1)

Rdickinson (160810) | more than 8 years ago | (#15234454)

So $1000 gets you cards to throw in the bin once Vista comes out; by that time you'll be wanting octo 8800GTX SLI 1GB cards...

This is so fast it'll NEVER be obsolete!! (1)

brxndxn (461473) | more than 8 years ago | (#15235082)

To those that say this will be obsolete in a year.. NONSENSE!! It'll never be obsolete! This is finally so fast that this will be THE LAST PIECE OF COMPUTER HARDWARE EVER TO BUY! It can only appreciate in value.

I'm investing in 10.

All those predictors judging this new hardware from previous countless years of industry/market behavior, across thousands of computer products... they're all wrong. ROFL @ them..

Air flow? (1)

CommanderEl (765634) | more than 8 years ago | (#15235283)

What's the point, really? Four separate graphics cards in one PC case? I have one 7900GT and that's loud enough; I wouldn't want to hear four of them at once! Realistically, it's all aimed at the high end user market, who are more likely to purchase high end QuadroFX cards. I hope I don't see this on the consumer market any time soon; I would just feel compelled to upgrade again. I do believe there is no need to force this technology onto the gaming scene, since game developers aren't even making use of dual core CPU benefits yet. CPUs are still the bottleneck of the gaming world at the moment. You will see an FPS limit on benchmarks purely because of CPU limitations. More 3DMark wank scores, sure, but who cares about them :P Why spend all this time and money investing in 4 single core cards in SLI, a 3 year old technology? Why not put some time into developing more robust and scalable applications for video cards, more like what CrossFire is all about? I would lean towards a dual core GPU card rather than 2 or 4 cards in SLI. Think about all the savings in power and heat.

2 too many (1)

Deth Rot (912254) | more than 8 years ago | (#15235296)

So this, for now, is a case of 2 in the hand being better than 4 in the bush. I really see a hard time for 4 graphics cards to ever become high-end geekdom computers, especially when we have barely even touched the capability of 2 graphics cards. The creators of the games are barely messing with SLI/Crossfire, let alone quad. This is way too over the edge... Look at SLI: it first came out around, what, 2000, and is just now being accepted? Put it back in storage, Nvidia, and give us the public about 5 years before we even think about wanting quad. :\