Dual Video Cards Return

Kez writes "I'm sure many Slashdot readers fondly remember the era of 3dfx. SLI'd Voodoo 2's were a force to be reckoned with. Sadly, that era ended a long time ago (although somebody has managed to get Doom III running on a pair of Voodoo 2's). However, Nvidia has revived SLI with its GeForce 6600 and 6800 cards. SLI works differently this time around, but the basic concept of using two cards to get the rendering work done is the same. Hexus.net has taken a look at how the new SLI works, how to set it up (and how not to), along with benchmarks using both of the rendering modes available in the new SLI." And reader Oh'Boy writes "VIA's latest press tour stopped by the UK, and TrustedReviews has some new information on VIA's latest chipsets for the AMD Athlon 64, the K8T890 and the K8T890 Pro, which supports DualGFX. But it has emerged that DualGFX doesn't support SLI after all, at least not for the time being, since it seems nVidia has somehow managed to lock out other manufacturers' chipsets from working properly with SLI. VIA did, on the other hand, have two ATI cards up and running, although not in SLI mode."
This discussion has been archived. No new comments can be posted.

  • New trend ? (Score:5, Insightful)

    by FiReaNGeL ( 312636 ) <`moc.liamtoh' `ta' `l3gnaerif'> on Tuesday November 23, 2004 @04:40PM (#10902641) Homepage
    Dual video cards... soon dual-core CPUs. Is it a sign that we're slowly approaching the Moore's Law limit? The 'dual' strategy allows for further performance gains... but I can't see myself using more than 2 video cards (hell, I can't even see myself using more than 1), so that will be a very temporary solution.

    And we're not even speaking of how much power (wattage) these 'dual solutions' consume...
    • by Scoria ( 264473 ) <{slashmail} {at} {initialized.org}> on Tuesday November 23, 2004 @04:43PM (#10902687) Homepage
      The 'dual' strategy allows for further performance gains...

      Eventually, barring any further technological advances, perhaps we'll even resort to modular clustering. Once again, the enthusiast's computer will be larger than a refrigerator! ;-)
    • Duel video cards are not the same as duel-core CPUs. It is more like, dare I say it, a Beowulf cluster. When will we see a true duel-GPU card, or maybe a duel-core GPU?
      What I am more interested in is whether we will see SMP boards supporting duel-core AMD64 CPUs. It could be interesting, since from what I read the AMD64 is NUMA when using more than one CPU, but I would guess that the duel-core chips would be more of a UMA design.
      • Re:New trend ? (Score:3, Insightful)

        by Masami Eiri ( 617825 )
        Voodoo 4 and 5. The 4 had 2 chips, I believe, and the 5 had 4 (!). The 5 needed an external power brick. They were still both outgunned (at least in bang for the buck) by the GeForce.
        • Re:New trend ? (Score:2, Informative)

          by adler187 ( 448837 )
          No, the Voodoo 6 6000 needed the external power brick because it had 4 chips. The 5500 had only 2, and the 4500 had 1.

          From wikipedia [wikipedia.org]

          My memory differs from Wikipedia's; I seem to remember there being a Voodoo 4 4000, and I believe the 3dfx site listed it as the Voodoo 6 6000 and not a Voodoo 5 6000. Most of my information came from looking around the 3dfx site back in the day, though, so it may be that they listed cards that weren't actually released (like the 5000, which I remember seeing there too). you can f
      • Duel video cards are not the same as duel-core CPUs. It is more like, dare I say it, a Beowulf cluster. When will we see a true duel-GPU card, or maybe a duel-core GPU?

        I'm no expert, but I was under the impression that the bottleneck in current graphics cards is the amount of memory and the speed of the bus that the texture data has to travel down. Apparently the actual 3D geometry is very easy to process; it's the rendering and its associated problems that slow things down.
      • You will be able to buy multi-CPU, multi-core setups pretty much immediately after the multi-core CPUs are available.
      • Duel video cards are not the same as duel-core CPUs. It is more like, dare I say it, a Beowulf cluster. When will we see a true duel-GPU card, or maybe a duel-core GPU?

        We've seen dual-core GPUs already. What do you think a multi-pipeline GPU is?
      • "Duel video cards"

        Isn't that what Zell Miller [about.com] runs on his PC?
      • I was just at AMD's 64-bit technote presentation (or whatever they called it). They've got dual-core CPUs in the works right now... I want to say it was like Q2 or Q3 of 2005. I might be wrong; I can't quite remember. At that point I was getting a little antsy to go look at all the cool displays.

        Given BIOS support, you'll be able to drop the dual-core CPUs into existing boards (assuming the board itself supports it - which, from the sounds of things, for some will just require a BIOS update)
    • by PIPBoy3000 ( 619296 ) on Tuesday November 23, 2004 @04:57PM (#10902902)
      The real benefit, from my perspective, is that it's a low-cost way to upgrade your video card in between new computers. I bought my first Voodoo 2 for $300. My second cost $30.
      • ...and, of course, the convenience of being able to fry your breakfast on the case!
      • The real benefit, from my perspective, is that it's a low-cost way to upgrade your video card in between new computers. I bought my first Voodoo 2 for $300. My second cost $30.

        A friend of mine did the same with CPUs back in the day. He bought a dual Pentium 2 board with one processor - together it ran about $400. Then a year or so later he bought two matched processors for less than $80 (combined) and gave the old one to his brother. We're still using the dual P2 system as a file server.
      • This only works if the feature sets/APIs don't change. But video cards today always tout whatever NEW technology they are implementing, like vertex shaders and whatnot.

        So 1 year later you can add more brute force, but chances are that you will be behind the technology curve.

        They didn't work before. Nothing has changed since. They won't work now.

        Dual cards are just a way for the card sellers to make more money. We see them now ONLY because it's easy to do on the PCI Express bus.

        What I would like for
    • Re:New trend ? (Score:2, Informative)

      by Anonymous Coward
      Multiple GPUs will be good to have, as there are lots of uses for GPUs beyond pretty pictures.

      Folding@home (http://folding.stanford.edu/ [stanford.edu]) is about to start GPU-based folding:
      http://forum.folding-community.org/viewtopic.php?p=75287#75287 [folding-community.org]

      Interesting times ahead...
    • Re:New trend ? (Score:5, Insightful)

      by GreatDrok ( 684119 ) on Tuesday November 23, 2004 @05:02PM (#10902950) Journal
      Dual video cards... soon dual-core CPUs. Is it a sign that we're slowly approaching the Moore's Law limit? The 'dual' strategy allows for further performance gains

      I don't think so. Quoting from Intel's web site: "Moore observed an exponential growth in the number of transistors per integrated circuit and predicted that this trend would continue." Many people assume Moore's Law states that the speed of processors will double every 18 months, and that the fact that it is becoming difficult to increase clock speed means Moore's Law is finished. However, increasing speed is a consequence of higher clock speeds and higher transistor counts. Dual cores mean you can keep increasing the number of transistors per IC and actually use them to do real work, rather than simply adding a huge cache (as was done with the latest Itanic). End result: more speed, higher transistor count, and Moore's Law still fine. In fact, dual cores could mean that the transistor count increases faster than Moore's Law predicts in the near term. Of course, some might question whether a siamesed pair of processors actually constitutes a single IC.....
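      A quick back-of-the-envelope sketch of the transistor-count reading of Moore's Law (the baseline figure and doubling period below are illustrative assumptions, not numbers from the comment):

        # Project transistor counts under "doubling every N months".
        def transistors(baseline, months, doubling_period=18):
            return baseline * 2 ** (months / doubling_period)

        baseline = 106e6  # assumed ~106 million transistors for a 2004-era CPU die
        for years in (1.5, 3.0, 4.5):
            print(f"{years} yr: {transistors(baseline, years * 12):.2e} transistors")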

      • Moore's Law really only regards transistor density. CPU speed or clock speed does not follow Moore's Law.
      • Re:New trend ? (Score:3, Interesting)

        by owlstead ( 636356 )
        Of course some might question whether a siamesed pair of processors actually constitutes a single IC.....

        As long as the software developers do (with their crazy per-processor schemes), it doesn't matter. Microsoft got that right in one go (I still don't like them, but they seem to do more right lately). Others will probably follow suit, at least for the PC/small server market.

        And the rest is academic. Call it what you like, as long as it speeds up my PC and gives me better response time? Since the proc
      • Of course some might question whether a siamesed pair of processors actually constitutes a single IC.....

        That's the rub right there. Moore's law under its broad interpretation - "computers get exponentially faster" - was great because new processors could run the same old programs with exponentially increasing speed. (Moore's law under its narrow interpretation - transistor count - is quite useless, since nobody cares about transistor counts per se).

        N parallel processors are never as good as a single

      • Moore's law says double the number of transistors every 18 months. So a year and a half after Opteron, a dual-core would be par, so AMD is actually a little late. But a quad-core a year and a half after the dual would also be par.
    • Power consumption (Score:5, Informative)

      by Hoi Polloi ( 522990 ) on Tuesday November 23, 2004 @05:05PM (#10902992) Journal
      And we're not even speaking of how much power (wattage) these 'dual solutions' consume...

      SLI power consumption can be significant! [anandtech.com]

      • According to those figures, two 6800 Ultras in SLI consume 35% more power than a single 6800 Ultra, when the system is under load.

        That's a lot, but still a damn sight better than double.
        • Re:Power consumption (Score:3, Informative)

          by gordyf ( 23004 )
          That's comparing the entire system's loads, though, and not just the video cards themselves. That means that adding another video card and changing nothing else raised their entire computer's power usage by 35%. That's a fair amount.
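            To make that concrete, here is the arithmetic with assumed wattages (only the 35% system-level delta comes from the thread; every other number is illustrative): a 35% jump in whole-system draw implies roughly a doubling for the cards alone.

              system_one_card = 220.0              # assumed full-system watts under load, single card
              system_sli = system_one_card * 1.35  # 35% higher with the second card
              card_one = 75.0                      # assumed draw of one card under load

              rest_of_system = system_one_card - card_one
              cards_sli = system_sli - rest_of_system
              print(f"Extra draw from the second card: {system_sli - system_one_card:.0f} W")
              print(f"Two cards together: {cards_sli:.0f} W vs {card_one:.0f} W for one "
                    f"({cards_sli / card_one:.0%})")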
    • My personal opinion is that this really just boils down to marketing. They want a new product for high-end consumers to buy, and the dual video card thing hasn't been used in a while. Given time, we will see it fade and return again.
    • Dual video cards are just a poor engineering solution to rendering with two GPUs. Soon enough they'll be making multi-core and multi-GPU cards (there is some motivation to go multi-GPU before multi-core to avoid manufacturing losses).
    • Re:New trend ? (Score:2, Interesting)

      by cplusplus ( 782679 )

      And we're not even speaking of how much power (wattage) these 'dual solutions' consume...

      A long time ago I had an Obsidian X-24 graphics card, which was basically an SLI Voodoo2 on one card that drew its power from a single PCI slot. It used so much power that my computer would just power off without warning quite frequently.
      A 350-watt power supply fixed the problem (I had a 250-watt), and that was a LOT of power back then. Now I have a 400-watt Antec power supply, which was the recommended solution f

    • by Charcharodon ( 611187 ) on Tuesday November 23, 2004 @06:13PM (#10903729)
      You obviously don't have much imagination if you can't think of a use for more than two video cards/monitors.

      As a lover of flight sims, I'll be first in line to buy a motherboard that can support 10 video cards. Along with an array of cheap monitors, I will finally have a wrap-around view of the sim world. This can apply easily to any game.

      First-person shooters could finally have peripheral vision (one center and two on the sides) along with an inventory and map screen. That brings the grand total to five.

      Driving games could finally have a true perspective instead of the stupid third-person or one-third-screen in-car view. So at least three monitors.

      RTS resource monitors, sat view, and ground maps. Well, that could become quite the array, depending on how much you wanted covered. Say anywhere from 3 to 12 monitors.

      Same for massively multiplayer online games. Without trying hard, I could see a use that would require at least six monitors.

      You could double, triple or even quadruple up on the number of cards for any one monitor that needs higher-end graphics. There are always those twisted monkeys who come up with graphics that won't run on any one GPU these days; for example, those lovely to-the-horizon maps that show up in various games and add about 100 meters of high detail every year. I see another scenario where people boost their system's performance by picking up cheaper versions of cards they already own, to keep their graphics improving without breaking the bank. (We can all remember when GeForce 2 cards cost $400 each; that'll buy you 50 of them these days.)

      Who could afford all this, you ask? Well, just about anyone these days. I've got a stack of 17-inch CRT monitors in the garage that I picked up for $5 apiece, just begging to be used. With sub-$100 video cards and cheap CRT monitors, not every output would have to be super high-res: peripheral views, 2D maps, and inventory lists would be just fine on something equivalent to a GeForce 4 MX ($32 new). You could seriously enhance your gaming machine for the price of one top-of-the-line, latest-and-greatest video card from ATI/Nvidia.

      So you keep your two-monitor display; as for me, I'm going to check whether the wiring in my computer room can handle the extra 10 monitors I plan on adding.

    • but I can't see myself using more than 2 video cards.
      That's because you need to use Dual Webcams!
  • by Lostie ( 772712 ) * on Tuesday November 23, 2004 @04:40PM (#10902646)
    ...because you can see more than 1 colour (black).
  • Who to Trust (Score:5, Insightful)

    by superpulpsicle ( 533373 ) on Tuesday November 23, 2004 @04:40PM (#10902647)
    Every review I have seen has claimed SLI to be the wave of the future, giving you a ridiculous speed boost. But don't all video card reviews do that now? Last I checked on some of the older Tom's Hardware and AnandTech reviews, my hardware should be pulling 70 fps in some games. I'd be lucky to hit 35 consistently... that's reality.
    • But with SLI you'll be able to use two boards to reach the fps you were promised to reach with your single board!

      See, it works out in the end.
    • I don't know; they give an example of a single G256 compared to dual G256s in SLI mode, and it is almost 10x faster. With the addition of one card I get 10x the speed? That sounds great to me.

      -Both logic and RTFA will hurt me
    • Every review I have seen has claimed SLI to be the wave of the future, giving you a ridiculous speed boost. But don't all video card reviews do that now?

      If you look at the history of video cards, you will see that whenever they reach the limit of one particular technology, they move on to something else. First it was screen resolution, then pixel depth, followed closely by 2D pixel blitting, then 3D acceleration, multi-texturing, then programmable vertex and finally fragment programs,
    • Why do you want more than 30 fps? At 30 fps the motion is smooth, and there is no interference with light flicker (on North American 60 Hz AC, incandescent and CF lights). Any extra GPU cycles can go to improving the quality of those 30 frames.
      • As always, 30 fps isn't the key. You might see a smooth movie at 30 or even 25 fps, but for interactive 3D environments 60 is better, and for some really, really anal-retentive visual processing people, no number is too high.
        • Also keep in mind that 30 fps walking down a hallway in Half-Life 2 could easily turn into 5 fps the minute you step outdoors and 8 marines start lobbing grenades at you, if the game wasn't programmed to compensate...

      • by blaine ( 16929 ) on Tuesday November 23, 2004 @06:39PM (#10903991)
        There are a couple of good reasons you need more than 30 fps. For one thing, you never want the framerate to drop below smooth. So even if you can average 30 fps, that doesn't mean you aren't sometimes going to drop to 15 fps or lower.

        However, there's a much more important factor at work here that confounds the film-vs-video-card comparison: video game frames are not the same as film frames. The biggest problem in this regard is motion blur. Here's a little exercise. Try it out in real life if you have the equipment, or just think it through:

        Let's say you were to use a video camera and capture 30 frames in 1 second. The subject is your own hand, waving up and down quickly.

        Now let's say you rendered a 1 second video using the 3D engine du jour, also 30 frames, of a hand waving up and down quickly.

        If you were to look at the 30 film frames, they would not be crisp. Each one of them would likely exhibit motion blur. However, when played at a rate of 30fps, to the human eye, that motion blur looks smooth.

        If you were to look at the 30 rendered frames, there is no motion blur. Each frame is rendered crisply. The problem with this is, when played at 30fps, instead of smoothly moving from one frame to the next, the hand appears to jump between frames. There is no intermediate data to allow a smooth flow from frame to frame.

        There are two ways around this: first, you could simulate motion blur in the engine. Second, you can pump the FPS up high enough that there is intermediate data for your eye to take in and do the motion blur on its own. The former of these options seems much more likely.
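        A minimal sketch of the first option, accumulating several sub-frame samples into one displayed frame (render_at is a made-up stand-in for an engine's frame renderer; nothing here comes from a real engine):

          import numpy as np

          def render_at(t, width=64, height=48):
              # Hypothetical renderer: a one-pixel-wide bar whose x position follows a waving hand.
              frame = np.zeros((height, width))
              x = int((np.sin(t * 6.0) * 0.5 + 0.5) * (width - 1))
              frame[:, x] = 1.0
              return frame

          def motion_blurred(t0, t1, samples=8):
              # Average several renders spread across the frame's exposure interval.
              return np.mean([render_at(t) for t in np.linspace(t0, t1, samples)], axis=0)

          frame = motion_blurred(0.0, 1.0 / 30.0)   # one 30 fps frame's worth of motion
          print("columns touched by the smear:", int((frame.sum(axis=0) > 0).sum()))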
    • Possible reason... (Score:2, Interesting)

      by apharov ( 598871 )
      Would you happen to be using a motherboard with a VIA chipset? My old MB used a KT400 chipset. I didn't notice anything strange when using a Radeon 7200 on it, but when I upgraded to a 9800 Pro the speed I got was way slower than it should have been. A couple of nights of tweaking and googling, and I came to the conclusion that the KT400's AGP support was s**t, especially with ATI video cards.

      One more night of examining other motherboards and I decided to buy a board based on the nForce2 Ultra chipset. After install
  • Intel & SLI (Score:5, Informative)

    by DeadBugs ( 546475 ) on Tuesday November 23, 2004 @04:41PM (#10902658) Homepage
    It is worth noting that NVIDIA will be bringing SLI to the Intel platform according to this press release:

    http://www.nvidia.com/object/IO_17070.html [nvidia.com]

    I'm looking forward to a P4 NForce board.
    • There is already an Intel-based PCIe board that has dual graphics slots, I think based on the 925 chipset.

      The only thing different here is that nVidia might introduce a cheaper way to get such a board.
    • I'm really not trying to be a troll or anything, but why would you go Intel right now when the AMD64 chips have so many advantages (price, performance, power, heat)?
  • AlienWare (Score:5, Informative)

    by Spez ( 566714 ) on Tuesday November 23, 2004 @04:43PM (#10902693)

    You can already buy gaming systems featuring SLI from the Alienware luxury collection:

    http://www.alienware.com/ALX_pages/choose_alx.aspx [alienware.com]
  • SLI != SLI (Score:4, Informative)

    by Jahf ( 21968 ) on Tuesday November 23, 2004 @04:45PM (#10902708) Journal
    First, it is mildly interesting to note that 3dfx's SLI was "scan-line interleave," as in every other line was alternated between the two cards. Nvidia's SLI is "scalable link interface" and instead renders the top half of the image on one card and the bottom on the other.

    It does make me wonder if the technology is capable of truly scaling... i.e., more than 2 cards? Could be useful for scientific simulations, or even getting closer to the idea of "Toy Story in realtime" (and no arguments here... using the same shaders as Pixar used in the movies, in realtime, is not feasible today... cheap tricks to get close, maybe).

    However, given the cost, and looking at what the 6800 can handle by itself compared to the evolution of games, it appears to me that it will be no more costly to simply upgrade to a 6900/7000/whatever when it is required. I can easily get by for the next year or two on a 6800 Ultra, especially considering that I would need a new computer to run SLI on, since I don't have PCI-E (though I do have PCI-X, but not for gaming needs). And I will be saving on electricity and mean time to failure (though that doesn't seem to be much of an issue with video cards).

    Not saying I don't see the attraction, but I don't get anywhere NEAR interested enough in 3D gaming to be spending that kind of dough.
    • by Linuxthess ( 529239 ) on Tuesday November 23, 2004 @04:52PM (#10902828) Journal
      Could be useful for scientific simulations or even getting closer to the idea of "ToyStory in realtime"...

      I propose a new acronym, TSIRT, which will be the standard unit of rendering performance, similar to the "LOC" (Library of Congress) reference when comparing download speeds.

    • Re:SLI != SLI (Score:3, Informative)

      by JoeNiner ( 758431 )

      First, it is mildly interesting to note that 3dfx's SLI was "scan-line interleave," as in every other line was alternated between the two cards. Nvidia's SLI is "scalable link interface" and instead renders the top half of the image on one card and the bottom on the other.

      Actually, Nvidia's solution does either [anandtech.com], based on their own testing of which performs better for a given game. The drivers include profiles for the 100 most popular 3D titles that state which technique to use.
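      A toy sketch of that profile-driven choice between the two modes (commonly described as split-frame and alternate-frame rendering); the profile entries and load-split logic below are made up for illustration and are not Nvidia's actual driver behavior:

        # Hypothetical per-game profile table standing in for the driver's built-in list.
        GAME_PROFILES = {"doom3.exe": "AFR", "farcry.exe": "SFR"}

        def plan_frame(game, frame_number, screen_height=1200, split=0.5):
            mode = GAME_PROFILES.get(game, "SFR")
            if mode == "AFR":
                # Alternate Frame Rendering: whole frames alternate between the two GPUs.
                return {"mode": mode, "gpu": frame_number % 2}
            # Split Frame Rendering: one GPU renders the top slice, the other the bottom;
            # `split` would be rebalanced frame-to-frame based on relative GPU load.
            boundary = int(screen_height * split)
            return {"mode": mode, "gpu0_rows": (0, boundary), "gpu1_rows": (boundary, screen_height)}

        print(plan_frame("doom3.exe", frame_number=7))
        print(plan_frame("farcry.exe", frame_number=7))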

    • 3D gaming? Wouldn't you like 2-4 monitors for a desktop, or for video or photo editing? (Drooling at the thought.)

      Everyone keeps getting caught up in the idea that you would have to use this with just one monitor. I see its potential much like a RAID controller: sure, you can have two drives run together twice as fast, but you can also use it to control the drives individually and increase the number you have. I can run either 2-3 RAIDs with 4 IDE devices on my computer, or control 10 IDE/SATA devices wit


    • AnandTech has a nice article [anandtech.com] about Nvidia SLI.
      It includes an explanation of how it works, power consumption figures, and benchmarks of several games.
  • by Squeebee ( 719115 ) <squeebee AT gmail DOT com> on Tuesday November 23, 2004 @04:45PM (#10902709)
    So the guys at Nvidia were sitting around when in walks the PHB and says, "Guys, we need to make more money." And flunkie one says, "Hey, let's release a new card, all the fanboys will rush out and buy it!" PHB says, "Well, that's OK, but we do that enough already." Flunkie two says, "I know, let's convince the users that the one overkill video card they buy is not enough; let's convince them that they need to buy TWO!" And the rest, my friends, is history! Stay tuned for the new quad-card cash-vacuum, coming soon.
    • Stay tuned for the new quad-card cash-vacuum, coming soon.

      Interestingly, that might even work. According to the tests I saw (Anandtech or TechReport, can't remember), the PCIe videocards are only using about 4x of the available 16x anyway, so even with dual cards, they're only using half of the available PCIe lanes, so if they can figure out how to do it, quad cards _could_ work, in theory.

      Not that you'd find enough suckers with enough money to make it worthwhile, I bet. :)

      I just wish my recently-purchas
      • I haven't seen any boards with more than two x4+ slots yet; are there any? If only there were some way to split the x16 slot, since according to this [gamepc.com] article not even an ATI X800 really stresses an x4 slot.
        • The problem is that there aren't going to be any chipsets with any more than 20 PCIe lanes, so once you split an x16 off from that, that leaves only 4 left. If you have one x4 slot and 1 x16, that's it, you're done.

          I don't know if the x16 graphics slot is handled differently than other PCIe slots. If so, you'd have to manage that somehow, as the upcoming SLI mobos have done. Really, I'd rather see chipsets with more than 20 PCIe lanes. I'd like a mobo with an x16 for graphics, say 2 x4 slots, and 2 1x slot
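            The lane-budget arithmetic in question, spelled out (the 20-lane total is the figure assumed in the comment above; the slot layouts are just examples):

              TOTAL_LANES = 20                     # assumed chipset budget from the comment

              def lanes_left(slots):
                  return TOTAL_LANES - sum(slots)

              print(lanes_left([16]))              # single x16 graphics slot -> 4 lanes spare
              print(lanes_left([8, 8]))            # SLI boards split the x16 into two x8 -> 4 spare
              print(lanes_left([16, 4, 4, 1, 1]))  # wished-for x16 + 2 x4 + 2 x1 -> 6 lanes short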
    • And I went out and bought one.

      You know what? Comments like yours are worthless. Thanks for your opinion that you think gaming isn't worth spending money on. The fact of the matter is, I am a gaming hobbyist. I like games, and I really like games running well on my rig. Setups like this push the dollar envelope, true, but how is it any worse than spending $1000 on a new golf driver?

      Come to think of it, SLI is better than a driver because the improvements are evident and more dramatic compared to
    • I would gladly buy two nVidia based PNY Quadro FX 4400 cards with a dual Opteron motherboard that supports SLI. I would use these two graphics cards in non-SLI mode most of the time so that they can drive 4 1600x1200 (or 2 3840x2400) screens at the same time, and use SLI only to simulate very detailed environments. I would also buy Matrox QID Pro cards that handle 4 monitors per card, a total of eight monitors. This setup would cost at least $5700 for the video cards alone ($900 for the QID Pro and about $2

    • Re:Double The Money (Score:5, Interesting)

      by Raptor CK ( 10482 ) on Tuesday November 23, 2004 @05:27PM (#10903265) Journal
      More like, "Hey, the last generation videocard is now obsolete, and no one wants it! How do we fix this next time?"

      "I know, let's make it so that if you buy a second one a year later, it'll work WITH the first one!"

      No one needs to buy two right off the bat. One is usually more than enough for any modern game. But one for a few hundred now, and the other for less than $100 later? That's a bargain basement upgrade, and one that's far more sensible than getting the new mid-range card now, and the new mid-range card a year from now.

      Now, if someone *wants* to buy two top of the line cards today, more power to them. They want the ultra-high-resolution games with all the effects cranked up, and they have the money. It makes their games look nicer, while my games run well enough. We both win, and Nvidia no longer sits on piles of unused chips.
      • Why not buy a $120 card now which will play any game at decent resolution and framerate and another newer card in 18 months or so when the next round of demanding games come out? You're still spending way less than one top of the line card and the only thing you can't do is turn on all of the eye candy or play at insane resolutions. (the one counterargument that I can think of is needing good framerates for a large lcd's native resolution, but I play turn based strategy games so what do I know =)
  • by Hackura ( 603389 )
    My question is, who's got the 1,100-watt power supply that running two 6800s is going to "require"?
  • by vasqzr ( 619165 ) <`vasqzr' `at' `netscape.net'> on Tuesday November 23, 2004 @04:49PM (#10902763)


    Dual webservers. Would have delayed the Slashdotting.

  • Ironic? (Score:5, Insightful)

    by goldspider ( 445116 ) on Tuesday November 23, 2004 @04:50PM (#10902779) Homepage
    I find it funny that some of the people who lamented the $15/mo. for WoW in the last article are probably the same people who will go out and drop $600 for a top-notch SLI video setup.
  • by Aggrazel ( 13616 ) <aggrazel@gmail.com> on Tuesday November 23, 2004 @04:52PM (#10902815) Journal
    LOL a link off the front page to a page filled with hundreds of screenshots?

    I weep for that man's router.
  • Ouch on Costs! (Score:3, Informative)

    by Evil W1zard ( 832703 ) on Tuesday November 23, 2004 @04:52PM (#10902818) Journal
    I can't imagine shelling out another couple hundred bucks for another XT Pro and then shelling out even more money for a more robust power supply and better cooling as well. It's probably great for those who can afford it, but I know I won't be doubling up anytime soon.
  • by Smidge204 ( 605297 ) on Tuesday November 23, 2004 @04:52PM (#10902821) Journal
    I'd like to see something set up so onboard video hardware can take advantage of this. It's difficult to get a motherboard that doesn't have onboard video anyway, and if you buy the right video card (i.e., same manufacturer), the two could run together for an added performance boost. (You should, of course, be able to install any graphics card, but you won't get anything extra for it.)

    =Smidge=
  • by Pete (big-pete) ( 253496 ) * <peter_endean@hotmail.com> on Tuesday November 23, 2004 @04:54PM (#10902851)

    This month the UK "PC Pro" magazine has a review [pcpro.co.uk] of the Scan White Cobra [scan.co.uk] gaming machine.

    This is a fine example of SLI running with jaw-dropping performance... a quote from the review has Doom 3 running at 98 fps!

    Now I know what I want for Christmas, just not a snowball's chance in hell of getting one! :)

    -- Pete.

  • Hercules? (Score:2, Offtopic)

    by michaelmalak ( 91262 )
    And here I thought the story was about once again running a debugger on a Hercules Monographics card while the app being debugged runs on the color card.
  • 32x (Score:3, Interesting)

    by Ann Coulter ( 614889 ) on Tuesday November 23, 2004 @05:01PM (#10902939)

    The PCI Express standard allows for 32x links. Nvidia's SLI uses two 8x links. Wouldn't it be nice if a motherboard supported two (or more) 32x slots and 32x graphics cards working in parallel? Think ray tracing: at those bandwidths, and given that there is an ergonomic limit on how small a pixel on a display can be, one can have the average size of a triangle be smaller than a pixel. This isn't true ray tracing, but the effect is there.

    On a similar note, are GPUs a good platform for genuine ray tracing?
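    Rough arithmetic behind the "triangle smaller than a pixel" idea (the resolution, frame rate, and bytes per vertex below are illustrative assumptions, not figures from the comment):

      width, height, fps = 1600, 1200, 60
      pixels = width * height
      triangles_per_frame = pixels            # roughly one triangle per pixel on average
      bytes_per_triangle = 3 * 32             # 3 vertices, assumed 32 bytes each
      bandwidth = triangles_per_frame * bytes_per_triangle * fps
      print(f"{triangles_per_frame:,} triangles per frame")
      print(f"~{bandwidth / 2**30:.1f} GiB/s of raw vertex traffic before any caching or indexing")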

    • Re:32x (Score:5, Insightful)

      by harrkev ( 623093 ) <kevin.harrelson@ ... om minus painter> on Tuesday November 23, 2004 @05:23PM (#10903200) Homepage
      Ummmm... Ray tracing does NOT depend on the video card. If all you are doing is ray tracing, get an old Voodoo 2 or something for $10 from eBay.

      Ray Tracing uses the CPU to do all of the work. Video chips are optimized to do a lot of "shortcuts" and "tricks" to render a scene, and the math is completely different. Trying to make them do something else is like trying to strap fins on a donkey and turn it into a fish.

      A dual-core CPU, on the other hand, would work wonders on ray tracing.
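      For a sense of the per-ray math a CPU ray tracer grinds through, here is a textbook ray-sphere intersection test (generic geometry, not any particular renderer's code):

        import math

        def ray_hits_sphere(origin, direction, center, radius):
            # Solve |o + t*d - c|^2 = r^2 for the nearest positive t, or return None.
            ox, oy, oz = (origin[i] - center[i] for i in range(3))
            dx, dy, dz = direction
            a = dx * dx + dy * dy + dz * dz
            b = 2.0 * (ox * dx + oy * dy + oz * dz)
            c = ox * ox + oy * oy + oz * oz - radius * radius
            disc = b * b - 4.0 * a * c
            if disc < 0:
                return None
            t = (-b - math.sqrt(disc)) / (2.0 * a)
            return t if t > 0 else None

        print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # hits at t = 4.0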
      • Wait and see. Sometime soon, I bet, a graphics card with "64-bit support" or something will be announced, and it will turn out to be able to work with double-precision floats. High-end GPUs are generic vector FPU engines hidden behind fancy drivers anyway; they just tend to be stuck in fixed-point or 32-bit floating point.
        • Uhhhhh, the Quadro 4000 and 4400 boards have "128-bit support". The Geforce 6800 has "64-bit support". With the exception of the Quadro 4400 boards, they are already available. Read the specifications on them.
        • As another poster pointed out, the current generation of consumer cards is 64-bit already. The Quadro line is 128-bit, which is what you'd really want for ray tracing. BTW, with the C-like shader languages already around, it should be relatively easy to write a ray tracer for modern (DX9-level) cards.
  • GPGPU (Score:4, Interesting)

    by ryanmfw ( 774163 ) on Tuesday November 23, 2004 @05:05PM (#10902993)
    This is actually a very interesting possibility for general-purpose GPU programming, which aims to offload as many easily parallelizable operations as possible to the video card. If you can have two running off of PCIe, you could get a big return in speed, allowing some very cool stuff to be done much more quickly.

    Check out http://www.gpgpu.org/ [gpgpu.org] for cool stuff. And if I'm not mistaken, it is already possible to use SLI.
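    A sketch of the kind of data-parallel work GPGPU targets, using NumPy on the CPU as a stand-in for what would run as a fragment program on the card (no real GPU API is called here):

      import numpy as np

      def saxpy(a, x, y):
          # One independent multiply-add per element: exactly the shape of work a GPU eats up.
          return a * x + y

      x = np.random.rand(1_000_000).astype(np.float32)
      y = np.random.rand(1_000_000).astype(np.float32)
      print(saxpy(2.0, x, y)[:4])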


    Cheers,

  • by asliarun ( 636603 ) on Tuesday November 23, 2004 @05:09PM (#10903024)
    nVidia has somehow managed to lock out other manufacturers' chipsets from working properly with SLI
    A case of nVidia acting on the SLI?
  • We posted pictures here [amdzone.com] and here [amdzone.com] of the VIA SLI last month from AMD's tech tour in Houston. More interesting are our pics of the Tyan dual nForce 4 chipset board. That is two nForce 4 chipsets, two full 16X PCI Express slots, and two CPU sockets for Opteron.
  • SLI is a rip off. (Score:3, Interesting)

    by Anonymous Coward on Tuesday November 23, 2004 @05:20PM (#10903163)
    I'll wait for the dual-GPU-on-a-single-card solution. You gain nothing from having two cards; the dual PCI Express boards still have the same bandwidth, the lanes are just split between the two.

    This simply forces you to get a new motherboard. Which I guess is a win for Intel and Nvidia, eh?

    Let's see: get dual cards, which requires a new motherboard, or wait and get a new video card that has dual GPUs, which takes about 10 minutes to install at most.

    I bet ATI will do the dual-GPU solution first, and Nvidia will go "fuck, we should have learned from 3dfx's Voodoo 5500."

    I had a 5000-series card; dual GPUs on the SAME card, amazing concept!

    The dual Voodoo cards made sense in a day when you had a lot of spare PCI slots. But ever since we've gone to the model of a single graphics slot, it's not simply a matter of slapping in a new video card and connecting an SLI connector; you have to get a whole new motherboard.

    I DO agree with a previous statement: if we could go up to 4 cards and 4 CPUs in a system, that kind of flexibility would be awesome.
  • One NVIDIA 6800 GT is bottlenecked by most CPUs out today. The GPU has 222 million transistors, more than most CPUs available. In fact, I'm not even aware of any CPU that exceeds that (I'm talking about home-use processors).

    If you get two 6800 GTs working together, well, if one GT is bottlenecked by most CPUs (the GPU actually has to wait a bit for the CPU to catch up), how can that CPU possibly keep up with two?

    I say that we should wait to buy SLI technology until better CPUs come out, or if

  • by BollocksToThis ( 595411 ) on Tuesday November 23, 2004 @05:38PM (#10903389) Journal
    I was sitting around yesterday wondering to myself "How can I make the inside of my computer hotter than the fires of hell?".
  • by UnknowingFool ( 672806 ) on Tuesday November 23, 2004 @06:38PM (#10903987)
    Does this mean that if I spend $600, my pr0n will be twice as detailed?
  • Comment removed (Score:4, Insightful)

    by account_deleted ( 4530225 ) on Tuesday November 23, 2004 @08:18PM (#10904972)
    Comment removed based on user account deletion
