Slashdot: News for Nerds

Dual Video Cards Return

michael posted more than 9 years ago | from the double-bubble dept.

Graphics

Kez writes "I'm sure many Slashdot readers fondly remember the era of 3dfx. SLI'd Voodoo 2's were a force to be reckoned with. Sadly, that era ended a long time ago (although somebody has managed to get Doom III to play on a pair of Voodoo 2's.) However, Nvidia has revived SLI with its GeForce 6600 and 6800 cards. SLI works differently this time around, but the basic concept of using two cards to get the rendering work done is the same. Hexus.net has taken a look at how the new SLI works, how to set it up (and how not to), along with benchmarks using both of the rendering modes available in the new SLI." And reader Oh'Boy writes "VIA's latest press tour stopped by the UK, and TrustedReviews has some new information on VIA's latest chipsets for the AMD Athlon 64, the K8T890 and the K8T890 Pro, which support DualGFX. But what has emerged is that DualGFX doesn't support SLI after all, at least not for the time being, since it seems nVidia has somehow managed to lock other manufacturers' chipsets out of working properly with SLI. VIA did, on the other hand, have two ATI cards up and running, although not in SLI mode."


264 comments

F1rstest (-1, Offtopic)

Anonymous Coward | more than 9 years ago | (#10902603)

This video card is the fir$t poster I have found.

looks good (-1, Offtopic)

Ghost-in-the-shell (103736) | more than 9 years ago | (#10902609)

First post

Looks good !

TWO Cards! (-1, Redundant)

Squeebee (719115) | more than 9 years ago | (#10902629)

Double Your Pleasure, Double Your Fun!

Re:TWO Cards! (-1, Redundant)

Bradac_55 (729235) | more than 9 years ago | (#10902678)

The first time around was a fad, this time around it'll be sad. Let me guess at the prices if they ever develop a real product: $500 (US) apiece?

Re:TWO Cards! (4, Funny)

Rew190 (138940) | more than 9 years ago | (#10902773)

TWO Cards! (Score:-1, Redundant)

I think that's the first time the actual moderation of a post has made me laugh more than the post itself.

Re:TWO Cards! (1, Funny)

Anonymous Coward | more than 9 years ago | (#10902827)

I think you stumbled onto something.

From now on, every article about SMP, clustering, etc. will have troll posts for the express purpose of getting Redundant mods.

New trend ? (5, Insightful)

FiReaNGeL (312636) | more than 9 years ago | (#10902641)

Dual video cards... soon dual-core CPUs; is it a sign that we're slowly approaching the Moore's Law limit? The 'dual' strategy allows for further performance gains... but I can't see myself using more than 2 video cards (hell, I can't even see myself using more than 1), so that will be a very temporary solution.

And we're not even speaking of how much power (wattage) these 'dual solutions' consume...

Re:New trend ? (5, Funny)

Scoria (264473) | more than 9 years ago | (#10902687)

The 'dual' strategy allows for further performance gains...

Eventually, barring any further technological advances, perhaps we'll even result to modular clustering. Once again, the enthusiast's computer will be larger than a refrigerator! ;-)

Re:New trend ? (0, Troll)

krymsin01 (700838) | more than 9 years ago | (#10902740)

Once again? heh.. yeah...

Note to self (1)

Scoria (264473) | more than 9 years ago | (#10902751)

Please substitute "resort" for "result," and never program a database query manager while commenting here.

Re:New trend ? (1)

LWATCDR (28044) | more than 9 years ago | (#10902882)

Dual video cards are not the same as dual-core CPUs. It is more like, dare I say it, a Beowulf cluster. When will we see a true dual-GPU card, or maybe a dual-core GPU?

What I am more interested in is whether we will see SMP boards supporting dual-core AMD64 CPUs. It could be interesting, since from what I read the AMD64 is NUMA when using more than one CPU, but I would guess that the dual core would be more of a UMA.

Re:New trend ? (2, Insightful)

Masami Eiri (617825) | more than 9 years ago | (#10903043)

The Voodoo 4 and 5. The 4 had 2 GPUs, I believe, and the 5 had 4 (!). The 5 needed an external power brick. They were still both outgunned (at least in bang for the buck) by the GeForce.

Re:New trend ? (1)

EpsCylonB (307640) | more than 9 years ago | (#10903170)

Dual video cards are not the same as dual-core CPUs. It is more like, dare I say it, a Beowulf cluster. When will we see a true dual-GPU card, or maybe a dual-core GPU?

I'm no expert, but I was under the impression that the bottleneck in current graphics cards is the amount of memory and the speed of the bus the texture data has to travel down. Apparently the actual 3D geometry is very easy to process; it's the rendering and its associated problems that slow things down.

Re:New trend ? (1)

Surt (22457) | more than 9 years ago | (#10903258)

You will be able to buy multi-cpu multi-core setups pretty much immediately after the multi-core cpus are available.

Buy the second a year later (5, Insightful)

PIPBoy3000 (619296) | more than 9 years ago | (#10902902)

The real benefit, from my perspective, is that it's a low-cost way to upgrade your video card in between new computers. I bought my first Voodoo 2 for $300. My second cost $30.

Re:Buy the second a year later (3, Funny)

tygerstripes (832644) | more than 9 years ago | (#10903155)

...and, of course, the convenience of being able to fry your breakfast on the case!

Re:New trend ? (2, Informative)

Anonymous Coward | more than 9 years ago | (#10902903)

Multiple GPUs will be good to have, as there are lots of uses for GPUs in addition to pretty pictures.

Folding@home (http://folding.stanford.edu/ [stanford.edu]) is about to enter GPU-based folding: http://forum.folding-community.org/viewtopic.php?p=75287#75287 [folding-community.org]

Interesting times ahead...

Re:New trend ? (5, Insightful)

GreatDrok (684119) | more than 9 years ago | (#10902950)

Dual video cards... soon dual-core CPUs; is it a sign that we're slowly approaching the Moore's Law limit? The 'dual' strategy allows for further performance gains

I don't think so. Quoting from Intel's web site: "Moore observed an exponential growth in the number of transistors per integrated circuit and predicted that this trend would continue." Many people assume Moore's Law states that the speed of processors will double every 18 months, and that the fact that it is becoming difficult to increase clock speed now means Moore's Law is finished. However, increasing speed is a consequence of higher clock speeds and higher transistor counts. Dual cores mean you can keep increasing the number of transistors per IC and actually use them to do real work, rather than simply adding a huge cache (as was done with the latest Itanic). End result: more speed, a higher transistor count, and Moore's Law still fine. In fact, dual cores could mean that the transistor count increases faster than Moore's Law in the near term. Of course, some might question whether a siamesed pair of processors actually constitutes a single IC...
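A worked version of that reading of Moore's observation, as a minimal sketch: transistor count (not clock speed) doubling on a fixed period. The ~42M baseline (roughly a Pentium 4 of this era) and the two-year doubling period are illustrative assumptions, not Intel's figures.

```python
# Moore's Law as transistor-count growth, not clock speed.
# Baseline and doubling period are illustrative assumptions.
def transistors(years, baseline=42_000_000, doubling_years=2.0):
    """Projected transistor count after `years`, doubling every `doubling_years`."""
    return baseline * 2 ** (years / doubling_years)

for y in (0, 2, 4, 6):
    print(f"{y} years: ~{transistors(y) / 1e6:.0f}M transistors")
```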

Re:New trend ? (1)

Neward Rylet (634838) | more than 9 years ago | (#10902959)

what about Dual Dual video cards?

Power consumption (4, Informative)

Hoi Polloi (522990) | more than 9 years ago | (#10902992)

And we're not even speaking of how much power (wattage) these 'dual solutions' consume...

SLI power consumption can be significant! [anandtech.com]

Re:Power consumption (1)

Tim C (15259) | more than 9 years ago | (#10903125)

According to those figures, two 6800 Ultras in SLI consume 35% more power than a single 6800 Ultra, when the system is under load.

That's a lot, but still a damn sight better than double.
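The reason it comes in well under double: the second card adds to a fixed system baseline (CPU, board, drives) rather than duplicating the whole machine. A back-of-the-envelope sketch with hypothetical wattages, not Anandtech's measurements:

```python
# Hypothetical figures for illustration only.
baseline_w = 180.0  # CPU, motherboard, drives under load
card_w = 100.0      # one 6800 Ultra under load

single = baseline_w + card_w       # 280 W total
sli = baseline_w + 2 * card_w      # 380 W total
print(f"SLI draws {100 * (sli / single - 1):.0f}% more than single")  # ~36%
```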

Re:New trend ? (1)

LabRat007 (765435) | more than 9 years ago | (#10903015)

My personal opinion is that this really just boils down to marketing. They want a new product for the high end consumers to buy and the dual video card thing hasn't been used in a while. Given time we will see it fade and return again.

Re:New trend ? (1)

Surt (22457) | more than 9 years ago | (#10903212)

Dual video cards are just a poor engineering solution to rendering with two GPUs. Soon enough they'll be making multi-core and multi-GPU cards (there is some motivation to go multi-GPU before multi-core, to avoid manufacturing losses).

Looks better on SLI Voodoo2's than on my Rad 9800. (5, Funny)

Lostie (772712) | more than 9 years ago | (#10902646)

...because you can see more than 1 colour (black).

Offtopic.... (0)

Anonymous Coward | more than 9 years ago | (#10903287)

SLI stands for two things: Scan Line Interleave and Scalable Link Interface. The first was available on the Voodoo series of graphics cards. It should not be confused with the latter, which is an entirely different concept (and acronym) and is used on the Nvidia line of graphics cards.

Who to Trust (4, Insightful)

superpulpsicle (533373) | more than 9 years ago | (#10902647)

Every review I have seen has claimed SLI to be the wave of the future, giving you a ridiculous speed boost. But don't all video card reviews do that now? Last I checked on some of the older Tom's Hardware and Anandtech reviews, my hardware should be pulling 70 fps in some games. I'd be lucky to hit 35 consistently... that's reality.

Re:Who to Trust (1)

Hoi Polloi (522990) | more than 9 years ago | (#10903025)

But with SLI you'll be able to use two boards to reach the fps you were promised to reach with your single board!

See, it works out in the end.

Re:Who to Trust (1)

ArsonSmith (13997) | more than 9 years ago | (#10903082)

I don't know; they give an example of a single G256 compared to dual G256s in SLI mode, and it is almost 10x faster. With the addition of one card I get 10x the speed; that sounds great to me.

-Both logic and RTFA will hurt me

Re:Who to Trust (2)

mikael (484) | more than 9 years ago | (#10903145)

Every review I have seen has claimed SLI to be the wave of the future, giving you a ridiculous speed boost. But don't all video card reviews do that now?

If you look at the history of video cards, you will see that whenever they succeed in reaching the limit of one particular technology, they move on to something else. First it was screen resolution, then pixel depth, followed closely by 2D pixel blitting, then 3D acceleration, multi-texturing, then programmable vertex and finally fragment programs, with the hardware then supporting dual DVI output. Since vertex and fragment programs are already written in high-level languages and accelerated using multiple pipelines to the limit of clock speed and die size, SLI is the only path left.

On framerates... (1)

temojen (678985) | more than 9 years ago | (#10903276)

Why do you want more than 30 fps? At 30 fps the motion is smooth, and there is no interference with light flicker (on North American 60Hz AC, with incandescent and CF lights). Any extra GPU cycles can go to improving the quality of those 30 frames.

Intel & SLI (4, Informative)

DeadBugs (546475) | more than 9 years ago | (#10902658)

It is worth noting that NVIDIA will be bringing SLI to the Intel platform according to this press release:

http://www.nvidia.com/object/IO_17070.html [nvidia.com]

I'm looking forward to a P4 NForce board.

Re:Intel & SLI (1)

Jeff DeMaagd (2015) | more than 9 years ago | (#10903109)

There is already an Intel-based PCIe board that has dual graphics slots, I think based on the 925 chipset.

The only thing different here is that nVidia might introduce a cheaper way to get such a board.

looks like shit (-1, Troll)

Anonymous Coward | more than 9 years ago | (#10902674)

see topic title

the whole "because i can do it" angle is really lame

DO SOMETHING USEFUL FOR A CHANGE YOU NERDS!

A plus... (1)

ReeprFlame (745959) | more than 9 years ago | (#10902691)

Dual video cards are a plus. Not only will there be one GPU processing your requests, but you have the better ability to use two displays with more performance. As the future goes, boxes will be able to process more and will need to display more graphics-intense software. Also, I believe it is a good idea to have one box power several stations, and that is where several video cards come into play... One will not be acceptable several years into the future, especially for performance power users [gamers, multimedia, etc.].

AlienWare (5, Informative)

Spez (566714) | more than 9 years ago | (#10902693)

You can already buy some gaming systems featuring SLI from the Alienware luxury collection:

http://www.alienware.com/ALX_pages/choose_alx.aspx [alienware.com]

6073 (1, Informative)

Anonymous Coward | more than 9 years ago | (#10902918)

dollars....

SLI? (0)

Anonymous Coward | more than 9 years ago | (#10902701)

What is SLI?

Re:SLI? (2, Informative)

sjaskow (143707) | more than 9 years ago | (#10902724)

SLI = Scan Line Interleave, the cards alternate drawing lines on the monitor.

Re:SLI? (3, Informative)

Gates82 (706573) | more than 9 years ago | (#10902891)

Not true anymore. The new SLI doesn't stand for Scan Line Interleave. Rather than having one video card take odd lines and the other even, the cards actually break apart the frame to be rendered and determine where more detail (thus processing power) is needed, then concentrate on those areas. So card one may be calculating more pixels, and card two might be concentrating on a detailed area, but they are both doing about the same amount of computation.

--
so really, who is hotter? Alley or Alley's sister?
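A minimal sketch of that load-balancing idea: split the frame at a horizontal line and move the line each frame toward whichever card finished first, so both bands take about the same time. The step size and timings here are made up for illustration; this is not NVIDIA's actual algorithm.

```python
# Dynamic split-frame rendering, simplified: the split line migrates
# toward the faster card until both bands cost about the same.
def rebalance(split, t_top_ms, t_bottom_ms, height=1200, step=16):
    if t_top_ms > t_bottom_ms:          # top band slower: shrink it
        return max(step, split - step)
    if t_bottom_ms > t_top_ms:          # bottom band slower: grow the top
        return min(height - step, split + step)
    return split

split = 600  # start with an even split of a 1200-line frame
for t_top, t_bottom in [(16.0, 12.0), (15.2, 12.8), (14.1, 13.6)]:
    split = rebalance(split, t_top, t_bottom)
    print(f"card 1: lines 0-{split - 1}, card 2: lines {split}-1199")
```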

Re:SLI? (4, Informative)

Gates82 (706573) | more than 9 years ago | (#10902919)

Hate to reply to my own post, but here is a link: Nvidia SLI [nvidia.com]. SLI = Scalable Link Interface.

--
So really, who is hotter? Alley or Alley's sister?

Re:SLI? (1, Informative)

stratjakt (596332) | more than 9 years ago | (#10902754)

Scan Line Interleaving.

Two video cards, one draws all the even scan lines for the final display, and one draws all the odd ones.
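In code, the 3dfx-era scheme is just a parity test on the scan line; a toy sketch:

```python
# Scan-line interleave: card 0 draws even lines, card 1 draws odd lines.
def card_for_line(y):
    return y % 2

assignment = {0: [], 1: []}
for y in range(8):  # a toy 8-line frame
    assignment[card_for_line(y)].append(y)
print(assignment)   # {0: [0, 2, 4, 6], 1: [1, 3, 5, 7]}
```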

SLI != SLI (4, Informative)

Jahf (21968) | more than 9 years ago | (#10902708)

First, it is mildly interesting to note that SLI from Voodoo was "scan-line interleaving," as in every other line was alternated between the two cards. Nvidia SLI is "scalable link interface" and instead renders the top half of the image on one card and the bottom on the other.

It does make me wonder if the technology is capable of truly scaling... i.e., more than 2 cards? Could be useful for scientific simulations, or even for getting closer to the idea of "Toy Story in realtime" (and no arguments here... using the same shaders Pixar used in the movies in realtime is not feasible today; cheap tricks to get close, maybe).

However, given the cost, looking at what the 6800 can handle by itself, and comparing those to the evolution of games, it appears to me that it will be no more costly to simply upgrade to a 6900/7000/whatever when it is required. I can easily get by for the next year or two on a 6800 Ultra, especially since I would need a new computer to run SLI anyway, as I don't have PCI-E (though I do have PCI-X, but not for gaming needs). And I will be saving on electricity and mean time to failure (though that doesn't seem to be much of an issue with video cards).

Not saying I don't see the attraction, but I don't get anywhere NEAR interested enough in 3D gaming to be spending that kind of dough.

Re:SLI != SLI (4, Funny)

Linuxthess (529239) | more than 9 years ago | (#10902828)

Could be useful for scientific simulations or even getting closer to the idea of "ToyStory in realtime"...

I propose a new acronym, TSIRT, which will be the standard of rendering performance, similar to the "LOC" (Library of Congress) reference when comparing download speeds.

Re:SLI != SLI (4, Funny)

ArsonSmith (13997) | more than 9 years ago | (#10903113)

I created a top notch 3d rendering engine and all I got was this TSIRT.

Re:SLI != SLI (0)

Anonymous Coward | more than 9 years ago | (#10903278)

Throw 2 TSIRT PC's in a lake, compare benchmarks ==
Wet TSIRT Contest!
sorry.

Re:SLI != SLI (0)

Anonymous Coward | more than 9 years ago | (#10903040)

I see this as nVidia having bought the wrong assets from 3dfx. The Riva series and early GeForce cards were able to make design choices that greatly increased the capacity of the cards. With the latest cards I see the same behavior 3dfx had just before going under: don't innovate and make things better, just throw more GPUs at the problem. Pretty soon I expect to see a GeForce 7700 that is two 6900 GTs on one card, with the option of SLI for the processing power of 4 GPUs. With a little bit of ducting your computer will be able to hover.

Re:SLI != SLI (1)

Fortun L'Escrot (750434) | more than 9 years ago | (#10903051)

Think of it like this: with the option to go dual GFX core, you can still continue to upgrade your GFX card as normal.

BUT if you do not want to spend that much dough, you can always buy a cheaper card and stick it in.

UNLESS this Nvidia SLI thing requires you to have two identical cards, in which case many gamers will just get budget to mid-range cards and, when newer games come out, double up on their cards. Either way it means you can upgrade for cheaper, but at a price, i.e. power consumption.

Re:SLI != SLI (1, Interesting)

Anonymous Coward | more than 9 years ago | (#10903061)

First, it is mildly interesting to note that SLI from Voodoo was "scan-line interleaving," as in every other line was alternated between the two cards. Nvidia SLI is "scalable link interface" and instead renders the top half of the image on one card and the bottom on the other.

Why don't they render the left and right sides instead of the top and bottom? Is it because it's easier to sync with the beginning of the horizontal sweep of a CRT?

Double The Money (5, Funny)

Squeebee (719115) | more than 9 years ago | (#10902709)

So the guys at Nvidia were sitting around when in walks the PHB, who says, "Guys, we need to make more money." And flunkie one says, "Hey, let's release a new card, all the fanboys will rush out and buy it!" The PHB says, "Well, that's OK, but we do that enough already." Flunkie two says, "I know, let's convince the users that the one overkill video card they buy is not enough; let's convince them that they need to buy TWO!" And the rest, my friends, is history! Stay tuned for the new quad-card cash-vacuum, coming soon.

quad-card cash-vacuum (1)

Tumbleweed (3706) | more than 9 years ago | (#10902874)

Stay tuned for the new quad-card cash-vacuum, coming soon.

Interestingly, that might even work. According to the tests I saw (Anandtech or TechReport, can't remember), the PCIe video cards are only using about 4x of the available 16x anyway, so even with dual cards they're only using half of the available PCIe lanes. So if they can figure out how to do it, quad cards _could_ work, in theory.

Not that you'd find enough suckers with enough money to make it worthwhile, I bet. :)

I just wish my recently-purchased 5900XT wasn't so bad at DirectX 9. I only (currently) play an OpenGL game (BZFlag!) anyway, but I'd like to have the option of playing DX9 games at reasonable framerates in the future. I guess a 6600GT is in my future, somewhere.

Re:Double The Money (2, Insightful)

Thought Harvester (736600) | more than 9 years ago | (#10902955)

And I went out and bought one.

You know what? Comments like yours are worthless. Thanks for your opinion that gaming isn't worth spending money on. The fact of the matter is, I am a gaming hobbyist. I like games, and I really like games running well on my rig. Setups like this push the dollar envelope, true, but how is it any worse than spending $1000 on a new golf driver?

Come to think of it, SLI is better than a driver because the improvements are evident and more dramatic compared to more inexpensive solutions. It improves my overall gaming experience and, in my mind, is worth every penny.

Why don't you tell us what your hobbies are, so the collective group can crap all over them.

Re:Double The Money (0)

Anonymous Coward | more than 9 years ago | (#10903098)

Apparently his hobby is making jokes, something you aren't terribly familiar with. Learn to laugh at yourself; it does wonders.

Re:Double The Money (0)

ArsonSmith (13997) | more than 9 years ago | (#10903048)

Doesn't this imply almost the opposite of what you say? Isn't it more like: pay $400 for a video card, and 6 months later a new, twice-as-fast $400 card comes out. Well, the one I bought 6 months ago is only $40 now, so SLI gets me twice the speed of my old card for only $40. By the time I needed quad cards, they'd be in the bargain bin at $10 with a $10 rebate.

Re:Double The Money (1)

hooqqa (805765) | more than 9 years ago | (#10903075)

This is absolutely retarded. With the Voodoo SLI you got to buy a new piece of hardware without throwing away your old card. This is more like, "Would you like to buy /half/ a video card?" - not that it's surprising or anything...

Re:Double The Money (1)

Ann Coulter (614889) | more than 9 years ago | (#10903117)

I would gladly buy two nVidia-based PNY Quadro FX 4400 cards with a dual Opteron motherboard that supports SLI. I would use these two graphics cards in non-SLI mode most of the time, so that they can drive 4 1600x1200 (or 2 3840x2400) screens at the same time, and use SLI only to simulate very detailed environments. I would also buy Matrox QID Pro cards that handle 4 monitors per card, for a total of eight monitors. This setup would cost at least $5700 for the video cards alone ($900 for the QID Pro and about $2400 for one Quadro).

Quadros have 128-bit IEEE-compliant floating point processors. This translates to extremely precise numerical analysis on a super-parallel (or doubly super-parallel) platform. I thought I was dreaming when I heard that the GeForce 6800 GPUs had 64-bit floating point processors. Now these Quadros are godsends, as I can develop algorithms I had not even dreamed of planning, for lack of a parallel architecture that supports the kind of precision I need.

For simulation purposes, I have not worked much with 3D objects (or projections onto 3D space) because of the lack of precision of most desktop cards. With workstation cards, I can finally analyze 3D point sets and the like.

In conclusion, the newer Quadros are definitely worth the money to me.

Re:Double The Money (4, Interesting)

Raptor CK (10482) | more than 9 years ago | (#10903265)

More like, "Hey, the last generation videocard is now obsolete, and no one wants it! How do we fix this next time?"

"I know, let's make it so that if you buy a second one a year later, it'll work WITH the first one!"

No one needs to buy two right off the bat. One is usually more than enough for any modern game. But one for a few hundred now, and the other for less than $100 later? That's a bargain basement upgrade, and one that's far more sensible than getting the new mid-range card now, and the new mid-range card a year from now.

Now, if someone *wants* to buy two top of the line cards today, more power to them. They want the ultra-high-resolution games with all the effects cranked up, and they have the money. It makes their games look nicer, while my games run well enough. We both win, and Nvidia no longer sits on piles of unused chips.

this is sweet (1)

the_2nd_coming (444906) | more than 9 years ago | (#10902720)

because the current crop of high-end cards are physically incapable of rendering Doom 3 on the best settings; that requires 512 MB of VRAM. With SLI, you put two 256 MB video cards in and you can!!!

Re:this is sweet (4, Informative)

stratjakt (596332) | more than 9 years ago | (#10902786)

No, it doesn't work that way.

Each card renders half of the same image. So each card needs access to the full texture set.

So 2x256 cards still only gives you 256 megs for your textures.
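A one-liner version of the point, assuming (as the parent says) that every texture must be resident on both cards:

```python
# Textures are duplicated on each card, so capacity is min(), not sum().
def effective_texture_mem_mb(per_card_mb, duplicated=True):
    return min(per_card_mb) if duplicated else sum(per_card_mb)

print(effective_texture_mem_mb([256, 256]))  # 256, not 512
```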

There's something I don't understand here. (0)

Anonymous Coward | more than 9 years ago | (#10903006)

According to your sig, you should be smoking way too much pot to know that.

quadro (1, Informative)

Anonymous Coward | more than 9 years ago | (#10902908)

High-end Quadros [nvidia.com] have 512 MB RAM. plus, they're dirt cheap ;)

power consumption??? (2, Insightful)

Hackura (603389) | more than 9 years ago | (#10902734)

My question is: who's got the 1100-watt power supply that running two 6800s is going to "require"?

Re:power consumption??? (0)

Anonymous Coward | more than 9 years ago | (#10903005)

The same person that has 2 motherboards, 2 CPUs, 2 hard drives, etc. to go with the 2 6800s.

A single 6800 will not take up all 550 W of a power supply.

(And yes, I know the post was a joke)

SLIing other GeForces (1)

shadowzero313 (827228) | more than 9 years ago | (#10902746)

Is the SLI only compatible with the new GeForce 6x00 series or can you use an older GeForce set?

Re:SLIing other GeForces (2, Informative)

mesach (191869) | more than 9 years ago | (#10903001)

It's only compatible with like cards capable of SLI; you cannot just throw 2 cards in your box, run the latest drivers, and get SLI.

There is a bridge adapter for the cards; if you look around, they apparently come in PCB and ribbon styles, and connect to a funky new cutout on top of the card's PCB.

Re:SLIing other GeForces (1)

shadowzero313 (827228) | more than 9 years ago | (#10903023)

Damn, I can't get a new board and a second GF4 Ti. Time to get a job.

Re:SLIing other GeForces (1)

teg (97890) | more than 9 years ago | (#10903057)

Is the SLI only compatible with the new GeForce 6x00 series or can you use an older GeForce set?

Only some of the new GeForce 6x00 cards (not all) can be used for SLI. You need a special connector on the cards.... also, there are no PCI express versions of older GeForce cards anyway AFAIK.

Kkooooll!! (-1)

Anonymous Coward | more than 9 years ago | (#10902759)

II ddiidd tthhiiss oonnccee. IItt wwaass eeaassyy ttoo sseett uupp,, bbuutt iitt rreeaallllyy ddiiddnn''tt hhaavvee mmuucchh aaffffeecctttt oonn oovveerraallll ppeerrffoorrmmaannccee. IItt ddiidd rreeqquuiirree aa ssoommee ppaattcchh ttoo mmyy ssppeellcchheecckkeerr, IIIIRRCC.

Should have invested in... (5, Funny)

vasqzr (619165) | more than 9 years ago | (#10902763)



Dual webservers. Would have delayed the Slashdotting.

Re:Should have invested in... (0)

Anonymous Coward | more than 9 years ago | (#10903121)

Writing your comment in bold made it twice as funny. Next time try italics.

Ironic? (5, Insightful)

goldspider (445116) | more than 9 years ago | (#10902779)

I find it funny that some of the people who lamented the $15/mo. for WoW in the last article are probably the same people who will go out and drop $600 for a top-notch SLI video setup.

why so little support for gamers? (1)

sowdog81 (739008) | more than 9 years ago | (#10902788)

Why aren't there more quad/dual processor motherboards built with gamers in mind? It seems like an inevitable choice for the must-have-it gamer. There is a big enough market that there are $500 (US) graphics cards and $700 (US) processors. It'd be nifty if this sort of thing had become mainstream :D

Re:why so little support for gamers? (1)

dead sun (104217) | more than 9 years ago | (#10903021)

What's the last well-parallelized game you played? I had a dual Pentium 3 setup, and while it was certainly nice for other things, any gaming I did was only really helped by being able to run system processes on the second CPU.

I would absolutely love to have a dual desktop again, but mostly just because it was more responsive and handled multitasking under load so much more gracefully. There's little better in computing than having a processor running at 100% and still having a usable desktop running on the other processor.

Re:why so little support for gamers? (5, Informative)

meestaplu (786661) | more than 9 years ago | (#10903059)

Right now, the answer is pretty simple. If you want a game to use multiple processors at the same time, you need to include more than one execution thread--the programmer has to divide the work in such a way that two or more processors can do it. It's quite hard to build a multithreaded game; there was some SMP support in Quake III, but it wasn't very stable and didn't provide a huge performance boost.

With a multithreaded application, you have to guard against strange bugs that are very, very hard to fix. If your multithreaded application runs into a deadlock every hundred thousand frames or so, it will be next to impossible to isolate, and production will end up being slower than it already is. While I'm sure that writing multithreaded games will happen in the near future, I don't think it will catch on very quickly.
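A minimal sketch of the hazard being described, with hypothetical subsystem names: two threads each guard shared state with a lock, and a render thread that takes the locks in the opposite order from the physics thread can deadlock, but only when the two interleave just wrong, which is why the bug surfaces once every hundred thousand frames.

```python
import threading

physics_lock = threading.Lock()
render_lock = threading.Lock()

def physics_step():
    with physics_lock:
        with render_lock:       # lock order: physics, then render
            pass                # touch shared state

def render_step_buggy():
    with render_lock:
        with physics_lock:      # opposite order: deadlocks if it interleaves
            pass                # with physics_step at the wrong moment

def render_step_fixed():
    with physics_lock:          # fix: one global lock order for all threads
        with render_lock:
            pass
```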

Re:why so little support for gamers? (1)

tygerstripes (832644) | more than 9 years ago | (#10903089)

The answer's pretty recursive: there aren't enough gamers with dual/quad processor platforms out there to warrant coding big-name games to make use of multiple processors, so there aren't many gamer-oriented multi-processor platforms out there... Besides, there really isn't much in terms of processing power that people demand at the moment. There aren't any games that would benefit more from a twice-the-price processor than from a twice-the-price graphics card. When it comes to gaming, we don't want clever - we just want pretty. Sad, but true.

Re:why so little support for gamers? (1)

Enrique1218 (603187) | more than 9 years ago | (#10903133)

What games actually use the second processor? Quake 3 was the only game I knew of that could use two processors, and I believe the gain was minimal. This is probably why id elected to drop the feature from Doom 3. Moreover, the GPU is more important than the processor when it comes to performance. A lot of decent gaming systems will use a mid-range processor and put the extra money into a high-end video card. One thing I would like to know is whether games would benefit from being designed for 64-bit processors.

most famous acronym? (0)

Anonymous Coward | more than 9 years ago | (#10902812)

Come on, how useful can this article be, considering that the author opens with the statement that SLI is "that most famous of three letter acronyms"? Huh? How about ABC, CPU, etc.? Hell, even GPU is a better-known acronym, and that's in the 3D world itself! The article was interesting, but the author needs to learn to moderate his or her prose.

That giant sucking sound... (3, Funny)

Aggrazel (13616) | more than 9 years ago | (#10902815)

LOL a link off the front page to a page filled with hundreds of screenshots?

I weep for that man's router.

Ouch on Costs! (3, Informative)

Evil W1zard (832703) | more than 9 years ago | (#10902818)

I can't imagine shelling out another couple hundred bucks for another XT Pro and then shelling out even more money for a more robust power supply and better cooling as well. It's prolly great for those who can afford it, but I know I won't be doubling up anytime soon.

What I'd like to see.. (3, Insightful)

Smidge204 (605297) | more than 9 years ago | (#10902821)

I'd like to see something set up so onboard video hardware can take advantage of this. It's difficult to get a motherboard that doesn't have onboard video anyway, and if you buy the right video card (i.e., same manufacturer) they can both run to give an added performance boost. (You should, of course, be able to install any graphics card, but you won't get anything extra for it.)

=Smidge=

Re:What I'd like to see.. (1)

macklin01 (760841) | more than 9 years ago | (#10902968)

Nice idea.

On a related note, I'd like to be able to use SLI with two cheap video cards. Perhaps two cheap vid cards could work together to give performance comparable to that of a much more expensive single card. -- Paul

Re:What I'd like to see.. (1)

The Other White Boy (626206) | more than 9 years ago | (#10903253)

The 6600GTs are in the $150-200 range. Two of them in SLI, according to HardOCP, keep up with and sometimes surpass a single 6800 Ultra, which would run you $500-600 last I checked.

See also the UK "PC Pro" magazine (3, Informative)

Pete (big-pete) (253496) | more than 9 years ago | (#10902851)


This month the UK "PC Pro" magazine has a review [pcpro.co.uk] of the Scan White Cobra [scan.co.uk] gaming machine.

This is a fine example of SLI running with jaw-dropping performance... a quote from the review puts Doom 3 running at 98 fps!

Now I know what I want for Christmas, just not a snowball's chance in hell of getting one! :)

-- Pete.

Re:See also the UK "PC Pro" magazine (1)

Elwood P Dowd (16933) | more than 9 years ago | (#10903105)

Now I know what I want for Christmas, just not a snowball's chance in hell of getting one!

A snowball would fare worse in a Scan White Cobra.

Old article on human eyes (0)

Anonymous Coward | more than 9 years ago | (#10903252)

For some reason what you said made me look up this article; it's 2-3 years old, but a good read about the misconceptions about human eye fps:
http://amo.net/NT/02-21-01FPS.html

Ummm....huh? (0)

Anonymous Coward | more than 9 years ago | (#10902871)

Would it be difficult to tell the unwashed masses (me included) what an 'SLI' is??

Yes, I could eventually figure it out, but if it's the point of the article....

Re:Ummm....huh? (0, Redundant)

affliction (242524) | more than 9 years ago | (#10903225)

Scan Line Interleave. Each card renders half of the screen for a given frame.

Hercules? (1, Offtopic)

michaelmalak (91262) | more than 9 years ago | (#10902878)

And here I thought the story was about once again running a debugger on a Hercules Monographics card while the app being debugged runs on the color card.

Tom's Hardware also has a test (1, Informative)

Anonymous Coward | more than 9 years ago | (#10902921)

Tom's Hardware also did [tomshardware.com] review the SLI setup.

Already Slashdotted... (1)

PrintedChickenQuack (807803) | more than 9 years ago | (#10902927)

Good work team!

32x (2, Interesting)

Ann Coulter (614889) | more than 9 years ago | (#10902939)

The PCI Express standard allows for x32 lanes. The nVidia SLI uses two x8 lanes. Wouldn't it be nice if a motherboard supported two (or more) x32 lanes and x32 graphics cards working in parallel? Think ray tracing: at those bandwidths, and given that there is an ergonomic limit on how small a pixel on a display can be, one can have the average size of a triangle be smaller than a pixel. This isn't true ray tracing, but the effect is there.

On a similar note, are GPUs a good platform for genuine ray tracing?
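For scale, the worked bandwidth numbers, assuming first-generation PCI Express: 2.5 GT/s per lane with 8b/10b encoding, i.e. 250 MB/s per lane in each direction.

```python
MB_PER_LANE = 250  # first-gen PCIe, per direction
for lanes in (8, 16, 32):
    print(f"x{lanes}: {lanes * MB_PER_LANE / 1000:.0f} GB/s each way")
# x8: 2 GB/s (each SLI card today), x16: 4 GB/s, x32: 8 GB/s
```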

Re:32x (4, Insightful)

harrkev (623093) | more than 9 years ago | (#10903200)

Ummmm... ray tracing does NOT depend on the video card. If all you are doing is ray tracing, get an old Voodoo 2 or something for $10 from eBay.

Ray tracing uses the CPU to do all of the work. Video chips are optimized to do a lot of "shortcuts" and "tricks" to render a scene, and the math is completely different. Trying to make them do something else is like trying to strap fins on a donkey and turn it into a fish.

A dual-core CPU, on the other hand, would work wonders on ray tracing.
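The "completely different math" is per-ray geometry rather than triangle rasterization. The core primitive test in any ray tracer, ray-sphere intersection, as a minimal sketch:

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Distance along a unit-length ray to the nearest sphere hit, or None."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * e for d, e in zip(direction, oc))
    c = sum(e * e for e in oc) - radius * radius
    disc = b * b - 4.0 * c          # quadratic discriminant (a == 1)
    if disc < 0:
        return None                 # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# A ray looking down -z hits a unit sphere centered 4 units away at t = 3.
print(ray_sphere((0, 0, 0), (0, 0, -1), (0, 0, -4), 1.0))  # 3.0
```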

GPGPU (3, Interesting)

ryanmfw (774163) | more than 9 years ago | (#10902993)

This is actually a very interesting possibility for general-purpose GPU programming, which aims to offload as many easily parallelizable operations as possible to the video card. If you can have two, running off of PCIe, you could get a big return in speed, allowing some very cool stuff to be done much quicker.

Check out http://www.gpgpu.org/ [gpgpu.org] for cool stuff. And if I'm not mistaken, it is already possible to use SLI.

Cheers,
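"Easily parallelizable" here means the same small function applied independently to every element of a big array, the shape of work a fragment shader is built for. A CPU-side sketch of that pattern, with numpy standing in for the data-parallel hardware:

```python
import numpy as np

# One operation, no dependencies between elements: the pattern GPGPU
# work maps onto fragment shaders running over a texture.
data = np.random.rand(1024, 1024).astype(np.float32)  # ~1M "pixels"
result = np.sqrt(data) * 0.5 + 0.25                   # per-element math
print(result.shape, result.dtype)                     # (1024, 1024) float32
```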

What about the nforce4? (2, Informative)

lrwx (800141) | more than 9 years ago | (#10903013)

Other than the fact that this is old news, I would have figured that the focus would be more on the new nForce4 chipset family: http://www.nvidia.com/page/nforce4_family.html [nvidia.com]. There are three board types in this family: the nForce4 Standard, the nForce4 Ultra, and the nForce4 SLI. As a matter of fact, Asus is releasing an SLI board based on this right now, called the A8N-SLI, with a slew of added features you could expect out of an Asus board, including dual gigabit Ethernet ports! Why the VIA board is even being covered is beyond me; the nForce is a much better chipset. Here is a [H]ardOCP benchmark page: http://www.hardocp.com/article.html?art=Njk2 [hardocp.com]. Enjoy. ;)

May the GeForce be with you! (3, Funny)

asliarun (636603) | more than 9 years ago | (#10903024)

nVidia has somehow managed to lock other manufacturers' chipsets out of working properly with SLI
A case of nVidia acting on the SLI?

VIA SLI pictures Houston AMD Tech Tour October (1)

ruiner5000 (241452) | more than 9 years ago | (#10903111)

We posted pictures here [amdzone.com] and here [amdzone.com] of the VIA SLI last month, from AMD's tech tour in Houston. More interesting are our pics of the Tyan dual nForce 4 chipset board: two nForce 4 chipsets, two full x16 PCI Express slots, and two CPU sockets for Opteron.

SLI is a rip off. (3, Interesting)

Anonymous Coward | more than 9 years ago | (#10903163)

I'll wait for the dual-GPU-on-a-single-card solution. You gain nothing from having 2 cards; the dual PCI Express boards still have the same bandwidth, the lanes are just split between the two.

This simply forces you to get a new motherboard. Which I guess is a win for Intel and nVidia, eh?

Let's see: get dual cards, which requires a new motherboard, or wait and get a new video card with dual GPUs, which takes about 10 minutes to install at most.

I bet you ATI will do the dual-GPU solution first, and nVidia will go "fuck, we should have learned from 3dfx's Voodoo 5500."

I had a 5000-series card; dual GPUs on the SAME card, an amazing concept!

The dual Voodoo cards made sense in a day when you had a lot of spare PCI slots. But ever since we've gone to the methodology of a single graphics slot, it's not simply a matter of slapping in a new video card and connecting an SLI connector; you have to get a whole new motherboard.

I DO agree with a previous statement about going up to 4 cards and 4 CPUs in a system; that kind of flexibility would be awesome.

SLI confuses me. (1)

Wescotte (732385) | more than 9 years ago | (#10903189)

Doesn't SLI work by rendering basically every other line (or half the screen) on each card and then combining them? Is there any work being done with more complex arrangements?

Like: OK, card 1, you render this building; card 2, you render this tree. Done, card 1? Now go ahead and render that scary monster! Card 2, get your ass in gear and finish rendering that tree! Combining all these completed elements would be much more complex than just merging pixel data, but I think for significantly more complex geometry it would be the way to go.

We are getting better 3D environments with a lot more detail, but it seems like we're just hacking our way through. For example (and I forget what the actual term is), I saw a tech demo for the Unreal 2 engine where they take original high-poly models and create a special texture that allows the lighting to produce accurate depth. Sure, you get amazing visuals, but other things suffer, like accurate collision detection, or just a ball bouncing off a wall. It's simply not mathematically possible to accurately simulate how the ball would react in these situations.
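Merging objects rendered on different cards is essentially depth compositing: per pixel, keep whichever card's fragment is nearer. A minimal numpy sketch, assuming each card hands back a color buffer and a z-buffer:

```python
import numpy as np

def depth_composite(color_a, z_a, color_b, z_b):
    """Per-pixel merge of two rendered layers: the nearer fragment wins."""
    a_nearer = (z_a < z_b)[..., np.newaxis]   # broadcast the mask over RGB
    return np.where(a_nearer, color_a, color_b)

h, w = 4, 4
building = np.full((h, w, 3), 0.8)            # card 1's output
tree = np.full((h, w, 3), 0.2)                # card 2's output
z_building = np.full((h, w), 5.0)
z_tree = np.full((h, w), 3.0)                 # the tree is closer everywhere
print(depth_composite(building, z_building, tree, z_tree)[0, 0])  # [0.2 0.2 0.2]
```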

6 months ago.... (1)

Tongue In A Box (664849) | more than 9 years ago | (#10903198)

Did I hit the Slashdot archeological nostalgia site? This was news 6 months ago.

Modern CPU's cannot handle this... (2, Insightful)

Omniscientist (806841) | more than 9 years ago | (#10903222)

One NVIDIA 6800 GT is bottlenecked by most CPUs out today. The GPU has 222 million transistors, more than most CPUs available; in fact, I'm not aware of any CPU that exceeds that (I'm talking about home-use processors).

If you get two 6800 GTs working together... well, if one GT is bottlenecked by most CPUs (the GPU has to wait a little for the CPU to catch up), how can that CPU possibly keep up with two?

I say we should wait to buy SLI technology until better CPUs come out, or until dual-core CPUs come out, unless you have a dual-CPU setup.
Well, that sounds expensive to me; better start saving...
