
Nvidia Reintroduces SLI with GeForce 6800 Series

CowboyNeal posted more than 10 years ago | from the two-cards-are-better-than-one dept.

Graphics

An anonymous reader writes "It's 1998 all over again, gamers: a major release from id Software and an expensive hot-rod video card, all in one year. However, rather than Quake and the Voodoo2 SLI, it's Doom 3 and Nvidia SLI. Hardware Analysis has the scoop: 'Exact performance figures are not yet available, but Nvidia's SLI concept has already been shown behind closed doors by one of the companies working with Nvidia on the SLI implementation. On early driver revisions, which only offered non-optimized dynamic load-balancing algorithms, their SLI configuration performed 77% faster than a single graphics card. However, Nvidia has told us that prospective performance numbers should show a performance increase closer to 90% over that of a single graphics card. There are a few things that need to be taken into account, however, when you're considering buying an SLI configuration. First off, you'll need a workstation motherboard featuring two PCI-E x16 slots, which will also mean using the more expensive Intel Xeon processors. Secondly, you'll need two identical (same brand and type) PCI-E GeForce 6800 graphics cards.'"


432 comments


George W. Bush (-1, Offtopic)

Anonymous Coward | more than 10 years ago | (#9549838)

can suck my metal cock!

When (0, Flamebait)

instanto (513362) | more than 10 years ago | (#9549839)

All great news... but WHEN can I find it in the stores? That would be NEWS.

Re:When (5, Insightful)

dave420 (699308) | more than 10 years ago | (#9549870)

So you don't want to hear about the cure for cancer until it's in your pharmacy? News is just that - new stuff. Just because you can't fork over some money for something doesn't mean it's not newsworthy or of interest to the /. community. Are you American? :-P

Re:When (0)

Anonymous Coward | more than 10 years ago | (#9549908)

Roughly a month or two after an NVidia card capable of outperforming the 6800 by a factor of 3 is released. Of course, the model able to equal the performance of the dual 6800 will be under $99 and produce less heat than one 6800.

So the SLI is best in the case that you'd want to spend money on a dual Xeon with two overpriced 6800s so that you can have two fancy UV-sensitive cooler mechanisms to really impress your friends with.

A week later, I'll buy the Celeron that performs better than both of these Xeons combined, paired with the NVidia 9400 model. It'll have cost me 1/10th as much as this setup, but it won't look as cool.

Re:When (0, Troll)

martingunnarsson (590268) | more than 10 years ago | (#9549955)

So until then it doesn't deserve to be mentioned or discussed on Slashdot? In that case, not many Slashdot-headlines qualify as news.

Re:When (1)

instanto (513362) | more than 10 years ago | (#9550015)

True.

BREAKING NEWS: Nvidia GeForce 6800 SLI now available in stores worldwide (all necessary cables (batteries not included) and motherboards included!).

NDA Leak. (2, Insightful)

Anonymous Coward | more than 10 years ago | (#9549998)

All great news... but WHEN can I find it in the stores? That would be NEWS.

Dude, this is an NDA leak! If you're trying to imply nVidia is peddling vaporware, well, you might be right, but in this case they're actually not the ones doing the peddling, because their SLI setup is still under NDA.

BUY THEM HERE (2, Funny)

swordboy (472941) | more than 10 years ago | (#9550017)

Buy them here [shoplifestyle.com] .

Damn (4, Funny)

MikeDX (560598) | more than 10 years ago | (#9549841)

There goes my savings again!

Hmmm... (4, Funny)

cOdEgUru (181536) | more than 10 years ago | (#9549845)

It's sad that my firstborn has to go..

But perversely exhilarating to hold an SLI configuration in my hands instead..

Re:Hmmm... (3, Funny)

7-Vodka (195504) | more than 10 years ago | (#9550084)

No daddy noo!!
Please daddy don't trade me for some extra FPS. Everyone knows the human eye can't tell the difference.

Shut up you little brat. I can tell and it ruins my game man!

For Rich Folks Only (5, Interesting)

Brain Stew (225524) | more than 10 years ago | (#9549858)

These cards are expensive enough, now they are suggesting we buy 2!?

I guess if you have a lot of money and want to play with a (marginal) advantage, an SLI setup is for you.

As for myself, I am a poor college student who can't even afford one of these cards, a situation I think a lot of other geeks/gamers are in.

Which begs the question, who is this aimed at?

Re:For Rich Folks Only (4, Insightful)

King_of_Prussia (741355) | more than 10 years ago | (#9549876)

14 year old 1337-sp33king white boys living with their rich parents. The same people who will use these computers to play counterstrike with hacks on.

Re:For Rich Folks Only (1)

TheAcousticMotrbiker (313701) | more than 10 years ago | (#9549878)

This is aimed at:
1) gamers
2) nerds
3) rich kids

and later on, with the quadro line:
4) companies/professionals who really want a lot of bang for their bug (buck I mean .. honest typo, I swear)

Re:For Rich Folks Only (-1, Troll)

Anonymous Coward | more than 10 years ago | (#9549886)


Hardcore gamers.

The guys that buy Alienware PC's that cost more than a used car. Guys that upgrade processors every couple months. Guys who live in their parents basements [phantomta.com] with $10,000 on their Visa.

The rest of us are happy with our 386/486/Pentium 90 that has been letting us use Lynx and vi since 1993, so why change.


My Voodoo 2 SLI Story (4, Interesting)

PIPBoy3000 (619296) | more than 10 years ago | (#9549979)

I picked up a Voodoo 2 card way back when for the incredibly high price of $300 (which was a ton of money close to ten years ago, on what I was making). A couple years later, I picked up my second Voodoo 2 for $30.

Think of it as a fairly cheap way to nearly double your video card's performance when others are upgrading to the new version of the card that is only 40-50% faster (unlike SLI mode, which is rumored to be 75-90% faster).

The tricky part will be that you have to have a motherboard to support it, which for now will only be the ones made for high-end workstations.

Re:For Rich Folks Only (2, Insightful)

GuyinVA (707456) | more than 10 years ago | (#9550030)

If you wait a couple of months after its release, you can probably save 50%. It's just another graphics card that will be outdated in a few months.

Never underestimate... (4, Insightful)

Kjella (173770) | more than 10 years ago | (#9550035)

...priorities. If gaming is your life (or if you're a working man with a gaming fix), two of these aren't that "extreme". People easily spend $10k+ more on a car than one that'd get them from A to B just as safely and easily, just for style and more luxury.

If gaming is what you do for a considerable number of hours of your life, why not? Even as a student, a few weekends without getting completely wasted (and maybe an hour or two of work as a weekend extra) and you'd have it.

All that being said, from what I saw with the last cards, it looked to me like GPU speed was starting to go beyond what conventional monitors and CPUs could do. And those really huge monitors are usually far more expensive than the GFX cards, even two of them.

2x GF6800 = 10,000 NOK
Sony 21" that can do 2048x1536 at 86 Hz = 14,000 NOK

...and that was the 3rd webshop I had to go to in order to actually find one of those; most now have some legacy 17" and 19" CRTs and the rest LCDs, which go no further than 1600x1200 (even at 21") and don't need an SLI solution.

Personally, I'll probably stick with my GF4600 until hell freezes over; I just don't manage to get hyped up on FPS games anymore. I'd rather go with an HDTV + HD-DVDs, should they ever appear...

Kjella

"begs the question" (2, Informative)

image (13487) | more than 10 years ago | (#9550049)

> Which begs the question, who is this aimed at?

I recently learned this here, so please don't take this as a criticism.

The phrase "begs the question" doesn't mean what you think it means. It does not mean, "this leads to the question."

Rather, it is a term used in logic to indicate a fallacy in which a statement assumes the very thing it is trying to prove. This is commonly known as circular reasoning. More here [nizkor.org].

I agree with you about wondering who the product is aimed at, though.

Re:For Rich Folks Only (0)

Anonymous Coward | more than 10 years ago | (#9550063)

I'm hardly rich, but I bought two $500 NVIDIA cards a year ago. For every college kid living off top ramen, there's another person working hard, making okay money and able to spend it.

Besides, duh, after a couple iterations of these, the price drops. You don't HAVE to have the latest and greatest. Wait until the second wave of the cards are out and the first wave are affordable.

Yeah, but... (-1, Offtopic)

Reverant (581129) | more than 10 years ago | (#9549859)

Can I have a cluster of these in each of my beowulf linux clusters that are clustered all over the world? And most importantly can I...

1) Steal a pre-production model.
2) Make 1000 copies.
3) ???
4) Profit!

nvidia is going under! (-1, Troll)

Anonymous Coward | more than 10 years ago | (#9549860)

1. Resort to idiotic 3DFX-like measures to get high performance

2. Watch company slowly die

It's been fun nvidia. Thanks.

Re:nvidia is going under! (5, Insightful)

Chas (5144) | more than 10 years ago | (#9549981)

1: Resort to idiotic 3DFX-like measures to get high performance

Note: A 77% increase in gaming performance isn't merely "high performance". Considering that the 6800 is ALREADY a massive leap forward over its predecessor, it's INSANE PERFORMANCE!

How would something like 1600x1200 with maxed FSAA and maxed AF, while never dropping below 60fps, grab you by the short and curlies?

2: Watch company slowly die.

Nobody's suggesting that everyone and their brother run out and get SLI'd GeForces on a Xeon platform. (Those already spending $4,000-5,000 on such a platform aren't necessarily going to shrink from an additional $400-500, especially if it nearly doubles video performance.)

This is going to probably be limited to those who'd normally use Quadro cards (productivity) and the elite few with more money than sense.

Not that everyone won't WANT one...

SLI? (0)

Anonymous Coward | more than 10 years ago | (#9549864)

What the hell does SLI mean? And why does anyone care? I.e., what are the real world (pc gaming) results of paying more money than I can afford to use this technology?

Re:SLI? (2, Informative)

Anonymous Coward | more than 10 years ago | (#9549874)

What the hell does SLI mean?

Scan Line Interleave. Every other line of the screen is drawn by the other graphics card.
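For illustration, here is a minimal sketch of that scan-line split in C. It is not driver code; render_scanline() is a hypothetical stand-in for whatever per-line work the card actually does, and the even/odd assignment is just the classic Voodoo2-style interleave described above.

    /* Illustrative sketch only, not Nvidia's driver code: with two GPUs in
     * scan-line interleave, even-numbered lines go to GPU 0 and odd-numbered
     * lines go to GPU 1, so each card rasterizes half the screen.
     * render_scanline() is a hypothetical stand-in for the real per-line work. */
    #include <stdio.h>

    #define SCREEN_HEIGHT 1200
    #define NUM_GPUS      2

    static void render_scanline(int gpu, int line)
    {
        printf("GPU %d renders scan line %d\n", gpu, line);
    }

    int main(void)
    {
        for (int line = 0; line < SCREEN_HEIGHT; line++)
            render_scanline(line % NUM_GPUS, line);   /* even -> GPU 0, odd -> GPU 1 */
        return 0;
    }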

Re:SLI? (0)

Anonymous Coward | more than 10 years ago | (#9549898)

Reminds me of that old-fangled TV technology, interlacing. I thought we left that behind because it sucks. What goes between the two video cards and the one display screen so that the results are combined?

Re:SLI? (0)

Anonymous Coward | more than 10 years ago | (#9549930)

This [hardwareanalysis.com] . It's all digital now.

Re:SLI? (4, Informative)

levell (538346) | more than 10 years ago | (#9549903)

SLI (Scan line Interleaving) means that if you have two graphics cards in your computer then they can each draw part of the screen. So for a lot more money you get better graphics and a higher frame-rate.

Re:SLI? (3, Informative)

LiquidCoooled (634315) | more than 10 years ago | (#9549923)

SLI means 2 different things, yes 2.

Both specified in the article. They really are confusing the issue more than required.

from the article:

in something called an SLI, Scan Line Interleave, configuration.

and then:

Both 6800 series PCI-E cards are connected by means of a SLI, Scalable Link Interface, dubbed the MIO port, a high-speed digital interconnect

removing the bumf however leaves the following definition of SLI:
"Buy 2 cards so you can do the same job as an ATI".

note: I'm only jealous, I made a booboo and bought an fx5900 :(

nVidia boys don't cry! (-1, Flamebait)

Anonymous Coward | more than 10 years ago | (#9549974)

note: I'm only jealous, I made a booboo and bought an fx5900 :(

Don't be sad, with the next nvidia driver release 3DMark will run 1-5% faster!! You've never run a benchmark as fast before!

Of course, it won't help your games, but it does sell nvidia hardware so...

Re:SLI? (1)

tokennrg (690176) | more than 10 years ago | (#9550037)

From recently reading an article on [H]ard OCP (www.hardocp.com) about the PCI-E x9800 Pro: games and stuff don't even use the full potential of the 8x AGP bus. Does using 2 video cards really help when we can't use what's already there?

Re:SLI? (-1, Troll)

LiquidCoooled (634315) | more than 10 years ago | (#9550077)

Blame it on the benchmark people.
Every time they release new benchies (things where the framerate becomes choppy, an extra 999 lights, bumpmapping Britney's tits, or whatever they do), it makes our "feeble" graphics cards look lacking.

In reality, for the things we do here, the 5900 I have is no better than the 4200 it replaced.
Both are still in daily use (my son has the 4200 in his), and when we play the same games, tbh I cannot tell the difference; the rest of the machines are similar, just mine with upgraded graphics.

I have yet to see where my investment went.

I paid the price for keeping the house AMD/Nvidia, and that loyalty cost me a great deal of money.
I won't be making the same mistake again.

Re:SLI? (1)

koody (575863) | more than 10 years ago | (#9550106)

What the hell does SLI mean? And why does anyone care? I.e., what are the real world (pc gaming) results of paying more money than I can afford to use this technology?

The Scalable Link Interface is, according to the article, what two Voodoo cards used to communicate with. nVidia calls their port MIO. I think the correct acronym is Scan-Line Interleave mode, so the article might have gotten the acronym wrong. But I remember how 3dfx used SLI to connect two Voodoos together. It's also all there in the article for anyone to read... nevermind.

Actually, someone predicted this when 3dfx was bought by nVidia. The thread is here [aliensoup.com]. An interviewer with a clue asked nVidia about this in March 2002, but nVidia declined to comment [simhq.com]. Hell, the SLI possibility was even discussed on Slashdot [slashdot.org].

As to when this might be useful, the article suggests that you could buy two "regular" 6800GT cards and get way better performance than from a single 6800 Ultra Extreme card. I'm not saying it's cheap or even practical right now, but it might be in the future when the cards are a bit cheaper and PCI-E x16 is commonplace.

ATI X800 advertisement (3, Funny)

Tsunayoshi (789351) | more than 10 years ago | (#9549867)

Am I the only one to find it hilarious that at the top of the page there was a Flash ad for an ATI Radeon X800?

Re:ATI X800 advertisement (-1)

Anonymous Coward | more than 10 years ago | (#9549905)

Well I guess so. This is /. after all, since when do we RTFA?!

Re:ATI X800 advertisement (0)

Anonymous Coward | more than 10 years ago | (#9549985)

Am I the only one to find it hilarious that at the top of the page there was a Flash ad for an ATI Radeon X800?

yes.

Re:ATI X800 advertisement (1)

Sporkinum (655143) | more than 10 years ago | (#9550038)

Yes you are. I think most slashdot readers don't see any advertisements.

Re:ATI X800 advertisement (2, Funny)

sosegumu (696957) | more than 10 years ago | (#9550051)

What is "flash?"

Re:ATI X800 advertisement (1, Informative)

squoozer (730327) | more than 10 years ago | (#9550080)

Actually I think you might be the only one not viewing the site in Mozilla / Firefox [mozilla.org] with adblock [mozdev.org] installed.

Look at all these great sites you could block:

  • http://*.falkag.net/*
  • http://*.bluestreak.com/*
  • http://*.tangozebra.com/*
  • http://*.maxserving.com/*
  • http://*.speedera.net/*
  • http://*.mediaplex.com/*
  • http://*.fastclick.net/*
  • http://*.advertising.com/*
  • http://*.pointroll.com/*
  • http://*.msads.net/*
  • http://*.atdmt.com/*
  • http://verio.co.uk/*
  • http://*.googlesyndication.com/*

I just wish adblock came with these as defaults :o)
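For the curious, a filter like http://*.falkag.net/* is just a glob: '*' matches any run of characters. Below is a toy C matcher, an illustration only and not Adblock's actual implementation; the match() helper is made up for this example.

    /* Toy glob matcher, not Adblock's real one: '*' matches any run of
     * characters, everything else must match literally. */
    #include <stdio.h>

    static int match(const char *pat, const char *str)
    {
        if (*pat == '\0')
            return *str == '\0';
        if (*pat == '*')
            return match(pat + 1, str) || (*str != '\0' && match(pat, str + 1));
        return *pat == *str && match(pat + 1, str + 1);
    }

    int main(void)
    {
        const char *filter = "http://*.falkag.net/*";
        printf("%d\n", match(filter, "http://ads.falkag.net/banner.gif")); /* 1: blocked */
        printf("%d\n", match(filter, "http://slashdot.org/article.pl"));   /* 0: allowed */
        return 0;
    }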

Just a band aid.. (5, Interesting)

eddy (18759) | more than 10 years ago | (#9549871)

... till we have multi-core and/or multi-GPU consumer cards. (they're already available [darkcrow.co.kr] at the high-end)

Questionmark.

3D Glasses (0, Funny)

Anonymous Coward | more than 10 years ago | (#9549872)

sweet... so now i can get the same performance using my 3d Glasses as all you 2D slacker for only 2 or 3 times the price... wicked!!

(now = sometime in the future)

Time to... (1, Funny)

Korgan (101803) | more than 10 years ago | (#9549875)

Break out my old Voodoo2 cards again... SLI? Sure... My voodoo2 cards in series from my Radeon x800 :-) Who says you have to have a 6800? ;-)

is nvidia seeming more and more.. (4, Interesting)

gl4ss (559668) | more than 10 years ago | (#9549883)

like the 3dfx they bought?

maybe they shouldn't have.. sure they probably had some great people and so on but ultimately "it didn't work out".

"hey, we can't keep up! let's just use brute force on increasing our cards capabilities!!! that's cheap and economical in the long run keeping our company afloat, right? right??"

Re:is nvidia seeming more and more.. (0)

Anonymous Coward | more than 10 years ago | (#9549982)

That would give them another two and a half years before they're bought by their strongest competitor. I'm wondering if ATI will be infected by their future acquisition too.

Re:is nvidia seeming more and more.. (2, Informative)

dasmegabyte (267018) | more than 10 years ago | (#9550099)

Except that NVidia is keeping up with their competitors in most other areas. Whatever performance loss or gain margin they have with ATI, it isn't enough to say, hands down, X is better than Y.

If you remember the last days of 3dfx, what they were selling was more expensive, slower, had a lower resolution and a distinctly washed-out look compared to comparable Nvidia parts. In fact, I remember convincing several people at a LAN party to dump their Voodoo 2 cards for the TNT, because although the frame rate was much lower (sometimes by half), games were still playable and the performance hit for using higher resolutions was greatly reduced and achieving a res like 1024x768 or even 1280x1024 didn't require an additional card. Which meant that I could snipe from a further distance with more precision. Not to mention the clarity of those 32 bit textures; my god, I shudder at the thought of going back to banding 16 bit hell.

No Thanks... (1)

SpermanHerman (763707) | more than 10 years ago | (#9549888)

I'll stick with my Radeon 9800 Pro... until I jump to X800 =D

Now, the question becomes... (2, Interesting)

FrO (209915) | more than 10 years ago | (#9549889)

Can you hook up 4 monitors to this badass configuration?

Re:Now, the question becomes... (-1)

Anonymous Coward | more than 10 years ago | (#9549900)

RTFA.

Yes you can.

Re:Now, the question becomes... (4, Informative)

PhrostyMcByte (589271) | more than 10 years ago | (#9550059)

"For workstation users it is also a nice extra that with a SLI configuration a total of four monitors can be driven off of the respective DVI outputs on the graphics cards, a feature we'll undoubtedly see pitched as a major feature for the Quadro version of the GeForce 6800 series SLI configuration."

Yes, it's /., and I RTFA. ph33r.

Re:Now, the question becomes... (1)

Titanium Angel (557780) | more than 10 years ago | (#9550064)

For the lazy ones who don't RTFA:

For workstation users it is also a nice extra that with a SLI configuration a total of four monitors can be driven off of the respective DVI outputs on the graphics cards, a feature we'll undoubtedly see pitched as a major feature for the Quadro version of the GeForce 6800 series SLI configuration.

Ohh I see... (2, Funny)

Anonymous Coward | more than 10 years ago | (#9549890)

So THAT'S how we can run Longhorn! It makes sense!

Oh wait...

Math experts (-1, Troll)

slycer9 (264565) | more than 10 years ago | (#9549894)

I don't get it...

200% more GPU.
200% more expense on said GPU.
77% more performance???

Re:Math experts (4, Informative)

dave420 (699308) | more than 10 years ago | (#9549914)

It's only 100% more GPUs (and therefore cost), actually. They're only adding one more to the mix; 200% more would be 3 in total.

The lower-than-100% increase reflects the fact that the cards aren't working together fully. As they said, it's still early days, and they expect to get that figure nearer to 90%.

Re:Math experts (0)

Anonymous Coward | more than 10 years ago | (#9549916)

It's easy to understand. They're getting their asses kicked routinely by ATI, so they have to resort to 3DFX-like measures to try to get some real performance happening. What a shame.

Re:Math experts (0)

greenreaper (205818) | more than 10 years ago | (#9549928)

If you're going to criticise, double-check your own figures. Adding another card is 100% more for both GPU and expense, not 200% (that would be three cards).

Re:Math experts (0)

Anonymous Coward | more than 10 years ago | (#9549935)

um... maybe you mean 100% more GPU
or 200 % the gpu, 177% the power

Re:Math experts (0)

Anonymous Coward | more than 10 years ago | (#9549940)

You mean:

100% more GPU.
100% more expense on said GPU.
77% more performance???

Right?

I don't know where you got 77; nvidia claims their target is 90%, but I'm going to guess the real numbers for general use are a bit below that.

Re:Math experts (1)

lachlan76 (770870) | more than 10 years ago | (#9549970)

I think you may mean 100% more GPU, and 100% more expense. 200% would be for 3 cards.

77% more performance because the drivers are not yet optimised, and because I'd assume that the tests were done on the same machine.

Not all of the processing is done on the video card; what makes you think that another one will take all the load off the CPU?

Re:Math experts (1)

tunah (530328) | more than 10 years ago | (#9549976)

100% more GPU and expense. 77% more performance. It's a better deal than going from the slightly-obsolete products to the first tier, anyway...

Re:Math experts (2, Informative)

caffeineboy (44704) | more than 10 years ago | (#9549980)

It's a pretty simple case of diminishing returns. If there are now 2 GPUs in charge of doing the rendering, they have to spend some of their power cooperating and communicating rather than just crunching numbers all of the time.

Of course that is a terrible oversimplification. There are cases in which 2 CPUs are actually slower than one, notably SMP P1 chips that had the L2 cache on the motherboard.
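One rough way to see those diminishing returns is an Amdahl-style model; this is an assumption for illustration, not anything Nvidia has published. If a fraction p of the per-frame work can be split across the two GPUs, the speedup is 1 / ((1 - p) + p / 2), so the quoted 77% gain corresponds to roughly p = 0.87 and the 90% target to roughly p = 0.95:

    /* Rough Amdahl-style model (an assumption, not from the article): if a
     * fraction p of the frame work splits across n GPUs, the rest stays serial. */
    #include <stdio.h>

    static double speedup(double p, int n)
    {
        return 1.0 / ((1.0 - p) + p / n);
    }

    int main(void)
    {
        printf("p=0.87, 2 GPUs: %.2fx\n", speedup(0.87, 2)); /* ~1.77x, the early-driver figure */
        printf("p=0.95, 2 GPUs: %.2fx\n", speedup(0.95, 2)); /* ~1.90x, Nvidia's stated target  */
        printf("p=1.00, 2 GPUs: %.2fx\n", speedup(1.00, 2)); /* 2.00x, the unreachable ideal    */
        return 0;
    }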

Power Requirements (5, Interesting)

Anonymous Coward | more than 10 years ago | (#9549895)

So, one card that requires a 400-watt power supply + another card that requires a 400-watt power supply = the need for an 800-watt power supply?!

Re:Power Requirements (2, Informative)

Henriok (6762) | more than 10 years ago | (#9550086)

Of course not! One card needs a COMPUTER with a 400 W power supply. There's a lot more than a graphics board that needs power in a computer.

New Motherboards (3, Interesting)

Anonymous Coward | more than 10 years ago | (#9549896)

It's a bit presumptuous to assume that when these SLI cards come out, the only motherboards supporting multiple PCI-E x16 slots will be Intel Xeon based. As far as I knew, AMD were planning on doing 939 based motherboards with multiple PCI-E.

At any rate, doesn't this sort of make the whole Alienware Video-Array seem like a bust?

Not for the meek... (1, Funny)

vashathastampedo (627544) | more than 10 years ago | (#9549897)

/richboyon Anyone who says that this kind of setup isn't necessary simply cannot afford the necessary gear. I am looking forward to the envy and hate from all the punk kids playing with their mom's computer. Truly the Ferrari concept brought to the desktop. /richboyoff Now, what is a realistic price for a system such as this?

Re:Not for the meek... (0)

Anonymous Coward | more than 10 years ago | (#9550074)

Doesn't matter; it takes one poor kid with Ski11z and a Pentium III with a GeForce 5200, simply setting all the settings really low to get a 100+ framerate, to still own your overpriced arse.

Money never EVER equals skills. Hell, I remember one kid that waxed everyone, and he even played the last tourney with only one hand to give everyone an advantage...

he still won easily.

remember richboy = poser/wannabe

This raises the question: (2, Interesting)

vi (editor) (791442) | more than 10 years ago | (#9549906)

Why don't they make a graphics card with two GPUs and double the memory size? Or wouldn't one of these buggers fit into a computer case? Yes, they exploit the dual PCI-E buses, but I doubt that they really use the whole bandwidth.

Re:This raises the question: (1)

man_ls (248470) | more than 10 years ago | (#9550083)

Each PCIe channel is, IIRC, 150MB/sec of independent bandwidth, and they come in 1-, 4-, and 16-channel slots.

a 16-channel PCIe slot is 2.4GB/sec of bandwidth...

I bet a high-resolution FX card would use most of that. But then again, they probably use PCIe-16 because PCIe-4 would be far too little.
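The arithmetic is just lanes times per-lane bandwidth. Note the parent's 150 MB/sec figure is from memory; first-generation PCI Express is usually quoted at 250 MB/sec per lane per direction, so the sketch below shows both:

    /* Per-slot bandwidth is simply lanes * per-lane rate.  150 MB/s is the
     * parent's "IIRC" figure; 250 MB/s is the usual PCIe 1.x per-lane number. */
    #include <stdio.h>

    int main(void)
    {
        const int lanes[] = { 1, 4, 16 };
        const double per_lane_mb[] = { 150.0, 250.0 };

        for (int f = 0; f < 2; f++) {
            printf("At %.0f MB/s per lane:\n", per_lane_mb[f]);
            for (int i = 0; i < 3; i++)
                printf("  x%-2d slot: %6.0f MB/s (%.1f GB/s)\n",
                       lanes[i], lanes[i] * per_lane_mb[f],
                       lanes[i] * per_lane_mb[f] / 1000.0);
        }
        return 0;
    }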

Reliability (4, Insightful)

lachlan76 (770870) | more than 10 years ago | (#9549912)

Am I the only person who thinks that holding the two together with a non-flexible medium, attached only with solder, is a bit dangerous? Not that the solder would break, but when it is removed, it could be a bit tricky. Perhaps a cable on there would be safer.

Other than that the only problem I can see is that you need about AU$2000 worth of video card, and at least AU$1000 worth of Xeon to use it. Maybe for engineers and artists, but will the average person have any use for it? I don't feel that an extra AU$3000 is worth it for the extra frame rate in games.

For the pros, though, it would be very good.

Re:Reliability (3, Funny)

Zocalo (252965) | more than 10 years ago | (#9550012)

Other than that the only problem I can see is that you need about AU$2000 worth of video card, and at least AU$1000 worth of Xeon to use it.

Look on the bright side; most Xeon systems already have the second PSU that you are going to need to power the extra card and turbofan based cooling system.

Re:Reliability (4, Interesting)

PhrostyMcByte (589271) | more than 10 years ago | (#9550020)

Never mind how they are held together. The GeForce 6 already requires a shitload of power (2 molex connectors on the rear of it) and puts out a lot of heat. So you have two very hot cards right next to each other, one of them getting really bad airflow. If one of your $500 video cards doesn't die, your PSU surely will!

Re:Reliability (2, Funny)

Diabolical (2110) | more than 10 years ago | (#9550022)

I don't feel that an extra AU$3000 is worth it for the extra frame rate in games

Get out of here you heathen...........

morons re-unbale pateNTdead eyecon0meter? (-1, Offtopic)

Anonymous Coward | more than 10 years ago | (#9549920)

just kidding about stuff that anti-matters.

consult with/trust in yOUR creators... crystal clear reception since/until forever. see you there?

ALX (2, Interesting)

paradesign (561561) | more than 10 years ago | (#9549931)

How does this stack up against Alienware's ALX dual graphics card system? I remember reading an article where the Alienware guys bashed the SLI method. With theirs, each card renders half the screen, either top or bottom, not every other line.

Re:ALX (2, Insightful)

kawaichan (527006) | more than 10 years ago | (#9550005)

dude, alienware basically is using nvidia's SLI method for their alx boxes

noticed that they were using two 6800s for their benchmarks?

Re:ALX (1)

gl4ss (559668) | more than 10 years ago | (#9550101)

Why was it said back then that they had just extra software doing the trick? Because that's what they said.

This, Nvidia's own solution, doesn't really seem like Alienware's.

Re:ALX (1)

trayl (79438) | more than 10 years ago | (#9550016)

from the article :

"In essence the screen is divided vertically in two parts; one graphics card renders the upper section and the second graphics card renders the lower section. The load balancing algorithms however allow it to distribute the load across the graphics processors. Initially they'll both start out at 50% but this ratio can change depending on the load."

Um, so it stacks up identically?
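As a toy illustration of the load balancing the article describes (this is a guess at the general idea, not Nvidia's actual algorithm), you can think of it as a feedback loop that starts at a 50/50 vertical split and nudges the boundary toward whichever GPU finished its portion sooner:

    /* Not Nvidia's load balancer, just a toy feedback loop: start at a 50/50
     * vertical split and shift the boundary toward the GPU that finished its
     * portion sooner, so both halves take about the same time next frame.
     * The per-region "cost" numbers are invented scene complexities. */
    #include <stdio.h>

    int main(void)
    {
        double split = 0.5;              /* fraction of the screen given to GPU 0 */
        const double cost_top = 1.4;     /* assumed complexity of the upper region */
        const double cost_bottom = 0.6;  /* assumed complexity of the lower region */

        for (int frame = 0; frame < 8; frame++) {
            double t0 = split * cost_top;            /* GPU 0's render time */
            double t1 = (1.0 - split) * cost_bottom; /* GPU 1's render time */

            printf("frame %d: split=%.3f  t0=%.3f  t1=%.3f\n", frame, split, t0, t1);

            split += 0.1 * (t1 - t0);  /* give more screen to whichever GPU was idle */
            if (split < 0.05) split = 0.05;
            if (split > 0.95) split = 0.95;
        }
        return 0;
    }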

Re:ALX (1)

PhrostyMcByte (589271) | more than 10 years ago | (#9550096)

Nvidia gives everybody the chance to use dual cards without buying a $5000 Alienware system. And because it's done directly by the vendor, I bet the quality/speed is better. This is going to lose AW a good deal of money; of course they are going to bash it.

4 slots (4, Insightful)

MoreDruid (584251) | more than 10 years ago | (#9549933)

OK, I'm all for performance gain and pushing the limit, but geez, 2 of these cards take up 4 slots. How are you supposed to squeeze in your Audigy card with extra connectors and still put in your extra firewire/usb?

And I'm also wondering how the heat is going to be transferred away from the cards. It looks like you need some serious cooling setup to keep those two babies running.

I NEED IT NOW! (-1)

Anonymous Coward | more than 10 years ago | (#9549936)

I've always wondered how long it would take before I could effectively render each individual hair of a monster's rear end with full physics and textures per hair in Doom 3. I need this NOW!!!!

Bah... (5, Insightful)

mikis (53466) | more than 10 years ago | (#9549947)

Call me when they put two GPUs on one card... or even better, when they put two cores on one chip. Soon enough the motherboard will be an add-on to the graphics card.

Plus, many people were upset about power and cooling requirements. This monster would occupy FOUR slots and require, what, a 600W PSU? (ok, just kidding, "only" 460W should be enough)

Re:Bah... (0)

Anonymous Coward | more than 10 years ago | (#9549997)

It's funny; I have a 460W PSU and I thought I needed it for all the drives I was running. I had to take the computer apart for testing, so I had a really old, super-small case with, I think, a 160W PSU. That thing ran every component that the 460W did. So I wonder how you can calculate the rating needed for the PSU?

Re:Bah... (4, Informative)

cK-Gunslinger (443452) | more than 10 years ago | (#9550089)


So I wonder how you can calculate the rating needed for the PSU?

Wonder no longer! Power Supply Article [firingsquad.com]

Well, the power rating... (1)

Kjella (173770) | more than 10 years ago | (#9550104)

...means you have maximum power draw from everything at once (and maxed out on "empty" slots too). Your 160W PSU might have been able to run the system at idle, but at max load it might fail.

Hard disks are not a major draw of current anyway; I checked Seagate's website and they draw 13-14W at max each. It's the 100W+ CPUs + 100W+ GPUs + cooling systems that make up most of it. Each PCI slot also adds a lot to the requirement, since they can each draw a lot (even if they fairly rarely do; it depends on the type of card).

Kjella
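In other words, size the PSU for the sum of worst-case draws plus headroom. A back-of-the-envelope sketch in that spirit follows; the wattages are placeholders roughly matching the figures in this thread, not measured values:

    /* Back-of-the-envelope PSU sizing: sum the worst-case draw of every
     * component, then add headroom.  The wattages are placeholders roughly
     * in line with the figures mentioned in this thread, not measurements. */
    #include <stdio.h>

    struct part { const char *name; double max_watts; };

    int main(void)
    {
        const struct part parts[] = {
            { "Xeon CPU #1",          110 },
            { "Xeon CPU #2",          110 },
            { "GeForce 6800 #1",      110 },
            { "GeForce 6800 #2",      110 },
            { "Hard disk #1",          14 },
            { "Hard disk #2",          14 },
            { "Motherboard + RAM",     50 },
            { "Fans, optical, misc",   40 },
        };
        double total = 0.0;

        for (unsigned i = 0; i < sizeof parts / sizeof parts[0]; i++) {
            printf("%-22s %5.0f W\n", parts[i].name, parts[i].max_watts);
            total += parts[i].max_watts;
        }
        printf("%-22s %5.0f W\n", "Worst-case total", total);
        printf("%-22s %5.0f W\n", "With ~30% headroom", total * 1.3);
        return 0;
    }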

WOW! (-1, Troll)

sitcoman (161336) | more than 10 years ago | (#9549948)

That is so great! You know I've been waiting for SLI for just years now, and thanks to Nvidia, I can finally...

...I can...

...well what the fuck is SLI anyway?

Lately I've been thinking about an idea for a great website, which I'd like to call ">/." (greater than slashdot) or something. For this site I would just rip all of /.'s articles and edit them so that visitors to my site could tell wtf the article is actually about. I would also check for dupes and other things that editors generally do when they haven't been replaced by stuffed penguins sitting on bags of money.

So, WHO'S WITH ME?! :P

Re:WOW! (1, Funny)

greenreaper (205818) | more than 10 years ago | (#9550052)

The target reader is assumed to know what SLI is already. You are therefore not meant to be here . . . so go away!

Shiney (1)

Kris_J (10111) | more than 10 years ago | (#9549949)

Who needs a new car? A rig like this looks like something worth building. Time to start looking for an interesting new case and a place to put another PC.

Move along, nothing to see. (-1, Offtopic)

Anonymous Coward | more than 10 years ago | (#9549956)

Move along, nothing to see. Just minimal requirements for Windows Longhorn!

Mr Jones... (2)

LoganTeamX (738778) | more than 10 years ago | (#9549983)

Your bank loan is ready. This is beyond pointless. At least it was affordable with the Voodoo2s.

gah (0)

Anonymous Coward | more than 10 years ago | (#9549995)

Great, nvidia; 1 kW PSUs now needed, no doubt.....

And four slot spaces? Eek. I'll wait for the ATi MAXX solution (you know they'll go one better); that way we don't need dual PSUs (the nvidia requirements are stupid: separate power rails per molex connector, not allowed to share the spare connector on each rail? 8 connectors, in a sense... jeez) or liquid nitrogen cooling..... (I'd never ever try a dual 6800 Ultra with the default sinks, eurgh).

If you ask me, 2x 6800 GTs or so are cheaper, almost as fast, with lower requirements and less space. Although a new mobo would still be needed for dual PCI-E.

Do a MAXX or Voodoo5 for crying out loud (hell, put the second card on a daughterboard using the same slot), just don't do a Volari.........

PS: Alienware can smeg off. Rip-off sods. It shouldn't be $5000 for a new mobo and 2 cards in a system that would otherwise cost $1000 or so and be kickass...

About time too (1)

StoatBringer (552938) | more than 10 years ago | (#9550003)

Anyone playing Counterstrike at anything less than 1024 frames per second is clearly a n00b!

I'm in two minds about this. On the one hand, only a tiny number of games require anything like this sort of horsepower. But on the other hand, these sorts of expensive, high-end systems will be commonplace in 2-3 years' time. I bought a GeForce FX 5900 when it was the top-end card, and now it's (kind of) obsolete!

Buy all the hardware we want (1)

Timesprout (579035) | more than 10 years ago | (#9550010)

The proliferation of aimbots and wallhackers will still mean you just look better getting pwn3d.

If I remember correctly... (3, Insightful)

mustardayonnaise (685416) | more than 10 years ago | (#9550028)

John Carmack said about a year and a half ago that Doom 3 would run 'well' on a top-end system of that time, which was a 3.06 GHz P4 equipped with a Radeon 9700 Pro. What's frightening/upsetting is that this SLI setup really isn't coming into play to satisfy the games of today like Doom 3; it's coming into play for the games of next year and the year after. It's just a little off-putting that in order to play the newest games you need a SET of graphics cards with those kinds of power and space requirements.

POWER requirements (1)

Danathar (267989) | more than 10 years ago | (#9550070)

Whoa... with one GPU's power requirement being a 500W power supply, where the hell am I going to get a power supply to run two of these beasts? Soon my PC is going to draw more than the beer fridge!

Only Nvidia? (2, Interesting)

ViceClown (39698) | more than 10 years ago | (#9550079)

This is the mobo design Alienware came up with, right? My understanding is that you can use ANY two video cards that are the same and are PCI-E. You could just as well do two ATI cards. Who submitted this? Nvidia marketing? :-)

Re:Only Nvidia? (1)

My name isn't Tim (684860) | more than 10 years ago | (#9550107)

I believe ALX draws the top and bottom halves separately, where SLI draws alternate lines. Not the same idea exactly, but the same end result I suppose. It'd be great to see a comparison. When the R420 comes out I'm sure ATI will make this seem pitiful.

Currently... (1, Funny)

Viceice (462967) | more than 10 years ago | (#9550093)

Please register or login. There are 9 registered and 3599 anonymous users currently online. Current bandwidth usage: 872.66 kbit/s

How many of you are there hitting refresh just to see the hit counter go up? :D

SLI (4, Informative)

p3d0 (42270) | more than 10 years ago | (#9550097)

Is it too much to ask to define, or at least hyperlink, the acronyms you use?

SLI stands for Scan Line Interleave [gamers.org] .

Insane Requirements? (2, Insightful)

Fizzleboink (700171) | more than 10 years ago | (#9550098)

I seem to remember that one of these cards took up 2 slots, and needed a third just for good air flow. How much space are these going to take up? Also, just one of these bad boys needed something like 400-500W of power. What kind of power supply is needed for 2???

My question... (3, Informative)

SageMadHatter (546701) | more than 10 years ago | (#9550100)

When a single 6800 card requires a 480watt power supply [anandtech.com] and two dedicated power lines, what would the power requirements be for two [hardwareanalysis.com] of these cards in the same computer system?

Xeons? (4, Insightful)

ameoba (173803) | more than 10 years ago | (#9550105)

Why would they design something like this and force it to use a Xeon?

For starters, the Xeon is still stuck at a 533MHz FSB, limiting its performance. Add in the fact that they're ridiculously overpriced & most games show little to no performance improvement when running on an SMP system. A single P4 or Athlon64 will stomp the Xeon in almost all gaming situations.

Of course, with this tech a ways away & there not really being any PCI-E motherboards on the market now that Intel's recalled them all, I guess they're betting on high-end enthusiast boards to ship with the second x16 slot by the time this thing is actually ready for market...

Really, the biggest application for this kind of power that I can foresee would be game developers who want to see how well their games scale for next-gen video hardware...