
Running Video Cards in Parallel

michael posted more than 10 years ago | from the seeing-double dept.

Graphics 263

G.A. Wells writes "Ars Technica has the scoop on a new, Alienware-developed graphics subsystem called Video Array that will let users run two PCI-Express graphics cards in parallel on special motherboards. The motherboard component was apparently developed in cooperation with Intel. Now if I could only win the lottery."


In other news... (5, Funny)

Unloaded (716598) | more than 10 years ago | (#9138789)

...Microsoft announced that Clippy had broken the previously unheard-of 2,000 fps barrier.

MD tag LLY 347 (-1, Flamebait)

Anonymous Coward | more than 10 years ago | (#9138811)


You stupid, stupid bitch... you nearly killed my wife and me on the beltway this morning. If I ever get a chance to see you or your car again when you're not running from me... you'll wish you were a better driver.

Re:In other news... (5, Funny)

david.given (6740) | more than 10 years ago | (#9139174)

Microsoft announced that Clippy had broken the previously unheard-of 2,000 fps barrier.

However, they went on to say that Clippy was still intact. They're going to try again using a bigger catapult and with a concrete-reinforced barrier.

what will this do? (-1, Redundant)

Anonymous Coward | more than 10 years ago | (#9138799)

what will this do exactly?

Re:what will this do? (3, Funny)

m4vrick (535695) | more than 10 years ago | (#9138831)

Now you can play Solitaire and Minesweeper at the same time on a maximized screen. :)

Rats (-1, Offtopic)

Anonymous Coward | more than 10 years ago | (#9138801)

first post and I don't have a slashdot account :-(

Man am I out of the loop. (4, Funny)

Randolpho (628485) | more than 10 years ago | (#9138802)

PCI-Express? What happened to AGP?

Seriously, I've been out of the PC market for too long. Alas, poor wallet. I had cash flow, Horatio.

Re:Man am I out of the loop. (4, Informative)

Laebshade (643478) | more than 10 years ago | (#9138855)

PCI-Express is meant to replace AGP. From what little I've read about it, it will require lower voltages than AGP and offer a wider bus.

Re:Man am I out of the loop. (4, Informative)

Plutor (2994) | more than 10 years ago | (#9138924)

You've been out of the PC market for about a decade then, if you've never heard of PCI-Express. It's been proposed and talked about and raved about for years, but it's just now finally coming to market. The best thing is that it's not limited to a single slot per board! That's why this parallel thing is even possible.

Re:Man am I out of the loop. (1, Interesting)

Anonymous Coward | more than 10 years ago | (#9138959)

You can have several AGP slots on board. It's just that none of the manufacturers were interested in making such motherboards.

Re:Man am I out of the loop. (1)

zelphior (668354) | more than 10 years ago | (#9139298)

Are there any motherboards on the market that have dual AGP slots? I did a bit of googling and couldn't find any, and some sites state that no one makes them.

Re:Man am I out of the loop. (5, Interesting)

The_K4 (627653) | more than 10 years ago | (#9139162)

Since the PCI-Express spec defines switches (these are like PCI-to-PCI bridges, only with two sub-buses), a motherboard manufacturer could add two or three of these and get four PCI-Express graphics ports (or seven and get eight ports). The problem is that every time you do this you have to share the total bandwidth at the highest level. PCI-Express has more bandwidth than AGP 8x, with half of that bandwidth dedicated upstream and the other half dedicated downstream, so the downstream path (where video cards use most of their bandwidth) is greater than AGP 8x's TOTAL bandwidth. So this data-path bottleneck shouldn't be bad if you have two cards (and it might work well for four if they use the bus right).
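To put rough numbers on the bandwidth-sharing point above, here is a back-of-the-envelope sketch in Python. It uses commonly quoted nominal figures for first-generation PCI Express and AGP 8x; the scenario of several cards sharing one upstream x16 link through a switch follows the parent comment, and the figures are approximations, not measurements:

    # Back-of-the-envelope bandwidth comparison using nominal figures.
    # PCI Express 1.x: ~250 MB/s per lane, per direction (x16 => ~4 GB/s each way).
    # AGP 8x: ~2.1 GB/s total, shared between directions.

    PCIE_LANE_MBPS_PER_DIR = 250     # PCI Express 1.x, per lane, per direction
    AGP8X_TOTAL_MBPS = 2100          # AGP 8x, total

    def pcie_per_card(lanes: int, cards_behind_switch: int) -> float:
        """Downstream bandwidth per card when several cards share one
        upstream link of `lanes` lanes through a hypothetical switch."""
        upstream = lanes * PCIE_LANE_MBPS_PER_DIR
        return upstream / cards_behind_switch

    for n in (1, 2, 4):
        share = pcie_per_card(16, n)
        print(f"{n} card(s) behind an x16 link: ~{share:.0f} MB/s downstream each "
              f"({share / AGP8X_TOTAL_MBPS:.1f}x AGP 8x total)")

With these numbers, a single card gets roughly twice AGP 8x's total bandwidth downstream, two cards get about parity, and four cards drop below it, which matches the parent's point about sharing at the top of the tree.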

Re:Man am I out of the loop. (1)

dabug911 (714069) | more than 10 years ago | (#9138965)

PCI Express is gigabits faster than AGP, so it's a good thing to have video cards moving in that direction.

Re:Man am I out of the loop. (5, Informative)

Auntie Virus (772950) | more than 10 years ago | (#9138972)

There's a White Paper on PCI Express from Dell: Here [dell.com]

More human-like (0)

kernelfoobar (569784) | more than 10 years ago | (#9138804)

This makes computers more human-like: since we have two eyes (most of us), soon every desktop will have two monitors!

Mouhahahah!!!

Re:More human-like (0)

Anonymous Coward | more than 10 years ago | (#9138832)

Not the probable use. SGI has always made computers with multiple graphics cards and graphics busses just to crunch more numbers, not for direct display. This could be used for the same purpose as well.

Re:More human-like (1)

kernelfoobar (569784) | more than 10 years ago | (#9138865)

On a more serious note: wouldn't this be great to compare GFX cards and settle the 'nVidia Vs. ATI' wars?

RTFA (0)

Anonymous Coward | more than 10 years ago | (#9139019)

RTFA.

FP (-1, Offtopic)

Anonymous Coward | more than 10 years ago | (#9138806)

Sorry couldn't resist. AC powaaah!
This reminds me of the 3dfx Voodoo stuff! Cool.

Quad-screen? (5, Interesting)

Vrallis (33290) | more than 10 years ago | (#9138808)

Hell, I couldn't care less about parallel processing for the video cards.

I want tri-head or quad-head video, but with at least AGP speeds. You can do it now, but only with PCI cards getting involved.

Re:Quad-screen? (5, Informative)

houghi (78078) | more than 10 years ago | (#9138909)

I want tri-head or quad-head video, but with at least AGP speeds

So order one now. They are available here [matrox.com] at Matrox.

Re:Quad-screen? (0)

Anonymous Coward | more than 10 years ago | (#9139353)

I'll speak for him.

What he means is: "I want to play the latest FPS games on multiple monitors..."

Sure, there are some ways to do it, but they're all hacks.

Re:Quad-screen? (2, Interesting)

gr8_phk (621180) | more than 10 years ago | (#9139101)

I'll second that. Flight simulation begs for 3 screens, as do some driving and other games.

On another note, I suspect the only way it will really accelerate single images is in cases where render-to-texture is used, i.e. per-frame generation of shadow or environment maps. The completed maps could then be passed to the card that actually has the active frame buffer, to be used in regular rendering. Two cards could at BEST double performance, and nothing ever scales optimally.
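To make that render-to-texture division of labour concrete, here is a minimal sketch in Python; the Gpu class and its methods are hypothetical stand-ins for illustration, not any real driver API:

    # Conceptual sketch: one card generates per-frame auxiliary maps (shadow /
    # environment maps), while the card owning the framebuffer does the final
    # render using those maps.  Nothing here calls a real graphics API.

    class Gpu:
        def __init__(self, name: str):
            self.name = name

        def render_to_texture(self, what: str) -> str:
            # Pretend we rendered an offscreen map and got a texture handle back.
            return f"{what}@{self.name}"

        def render_frame(self, textures: list) -> str:
            # Final pass on the card that drives the display.
            return f"frame composed on {self.name} using {textures}"

    helper, primary = Gpu("gpu0"), Gpu("gpu1")

    for frame in range(3):
        # Step 1: the helper card builds this frame's maps (in a pipelined
        # implementation this could overlap with the primary's previous frame).
        shadow = helper.render_to_texture("shadow-map")
        env = helper.render_to_texture("env-map")
        # Step 2: completed maps are handed to the card with the active framebuffer.
        print(primary.render_frame([shadow, env]))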

Re:Quad-screen? (0)

Anonymous Coward | more than 10 years ago | (#9139183)

dunno what AGP speed is, but SGI have a 128-head monster (16 pipes [1.6 GB/s each], eight channels per pipe), and that's been available for some years.

their latest Ultimate Vision has up to 32 pipes. does it use PCI-X? dunno; maybe just PCI.

http://www.sgi.com/visualization/onyx4/configs.html

Re:Quad-screen? (0)

Anonymous Coward | more than 10 years ago | (#9139401)

SGI uses XIO in their big machines.

Essentially it's a precursor to PCI-Express.

Next comes dual AGP graphics. (1)

Trigun (685027) | more than 10 years ago | (#9138814)

dual Nvidia 5950's under Linux!
Just in time for DNF!

Re:Next comes dual AGP graphics. (4, Informative)

DaHat (247651) | more than 10 years ago | (#9139083)

Nay, the AGP standard is built around a single slot and a single graphics card. Permitting two AGP cards to run natively (via the AGP bus) in a single system would be quite difficult, if not impossible; it's far easier to look to the future and a new technology that makes it work better than any sort of hack job that could be done today.

Re:Next comes dual AGP graphics. (1)

Trigun (685027) | more than 10 years ago | (#9139264)

Yes, I know. I was going for the Trifecta of things not going to happen.

Although I've never tried the dual 5950s. I guess I assumed that subtle, abstract humor, probably grossly misplaced, is understood on Slashdot.

Re:Next comes dual AGP graphics. (4, Informative)

Jeremy Erwin (2054) | more than 10 years ago | (#9139435)

It's in the AGP 3.0 spec [intel.com] .

AGP3.0 allows a core-logic implementation to provide multiple AGP3.0 Ports. Each AGP3.0 Port is a bridge device with multiple AGP3.0 devices hanging off the secondary bus. Each Port has a separate Graphics AGP aperture and GART that is independent and not shared with another AGP3.0 Port; however, these are shared across the devices within a single AGP3.0 Port.

Re:Next comes dual AGP graphics. (0, Offtopic)

thryllkill (52874) | more than 10 years ago | (#9139124)

I hate to sound like I am trolling, but can we stop with the DNF jokes? They are not funny any more.

Every time I log on to /.

1. Whiz bang new technology
2. Not in production yet
3. It'll be out in time for DNF

and just to keep the cliches running...

4. ???
5. Profit.

(I actually got heckled for not using the profit sequence in a numbered list here)

Press Release (5, Informative)

Anonymous Coward | more than 10 years ago | (#9138817)

over here: clicky [alienware.com]

3dfx has done it again ... (2, Informative)

GNUALMAFUERTE (697061) | more than 10 years ago | (#9139234)

The second-best graphics-related company ever (behind SGI) had this technology back in '97.
Actually, the whole Voodoo line, the best 3D cards ever, had this tech called SLI that let you use two cards in parallel. All you needed was two Voodoos of the same kind and a flat ribbon cable. You could buy the cable, but the pinout was exactly the same as the one used for 3.5" floppy drives, so if you cut off the part that went to the second drive (the one with a few pins twisted), you could make one really cheaply, besides the cost of the cards.
All Voodoos had two units, one for textures and the other to do all the math; this way, you could use one Voodoo to do the math and the other to process textures.

Nowadays, if you have two Voodoo 2s and a 500 MHz CPU, you can easily get 70 FPS in Quake 3.

Voodoo (5, Interesting)

Eu4ria (110578) | more than 10 years ago | (#9138818)

Didn't the early Voodoo cards allow something similar to this? I know they had a pass-through from your 'normal' video card, but I seem to remember being able to run more than one, with each doing alternating scan lines.

Re:Voodoo (4, Informative)

scum-e-bag (211846) | more than 10 years ago | (#9138851)

The company was 3dfx, and it was their Voodoo II cards that allowed the use of two cards a few years back, sometime around 1998 IIRC.

Re:Voodoo (2, Interesting)

Trigun (685027) | more than 10 years ago | (#9138869)

Yes, they did. Unfortunately, at the time they were too expensive and took up all of your extra slots on your mobo. Now, with integrated everything, it's not so bad.

Good idea implemented too early. Such is life.

Re:Voodoo (3, Interesting)

naoiseo (313146) | more than 10 years ago | (#9138879)

without a special motherboard, yes.

I think you could string something like 4 voodoo rush cards together or something (who knows if you got 4x performance, but I'm sure it went up not down)

Problem was, by the time they put this out there, the tech it was running was months behind cutting edge. 4x something old is easily forgotten.

Re:Voodoo (0)

naoiseo (313146) | more than 10 years ago | (#9138914)

or something

Re:Voodoo (4, Informative)

UnderScan (470605) | more than 10 years ago | (#9138886)

SLI (scan-line interleave) was available for 3dfx Voodoo IIs (maybe even the Voodoo 1); the first card would process all the odd lines and the second card would process all the even lines.
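As a toy illustration of that odd/even split, here is a short Python sketch; the "renderers" are just stand-ins, and the point is only how the scanlines get divided between two cards and merged back:

    # Toy illustration of scan-line interleaving: card A takes the even lines,
    # card B takes the odd lines, and the results are merged back in order.

    HEIGHT = 8  # tiny "screen" for the example

    def render_line(card: str, y: int) -> str:
        # Stand-in for a card rasterizing one scanline.
        return f"scanline {y} rendered by {card}"

    card_a = {y: render_line("card A", y) for y in range(0, HEIGHT, 2)}  # even lines
    card_b = {y: render_line("card B", y) for y in range(1, HEIGHT, 2)}  # odd lines

    # Merge back into a single frame, top to bottom.
    frame = [(card_a if y % 2 == 0 else card_b)[y] for y in range(HEIGHT)]
    for line in frame:
        print(line)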

Re:Voodoo (5, Interesting)

kamelkev (114875) | more than 10 years ago | (#9138980)

Voodoo was basically the beginning of the performance PC market, with tons of weird options and card types.

Benchmarks for the old 3dfx V2 SLI can be seen here:

http://www4.tomshardware.com/graphic/19980204/

I was (and still am, although it's in the junk pile) a 3dfx V2 owner; the performance of that card was just amazing at the time. The Voodoo and the Voodoo2 definitely changed the world of 3D gaming.

Also of interest is an API that came out much later for the 3dfx chipsets that actually let you use your 3dfx chipset (they didn't call it a GPU back in the day) as another system processor. If you were an efficient coder you could actually offload geometric and linear calculations to the card for things other than rendering. I can't seem to find the link for that though, it may be gone forever.

Re:Voodoo (1)

alex_tibbles (754541) | more than 10 years ago | (#9139266)

Well, I'm still running my twin Voodoo2s SLIed. It doesn't run any games more recent than Deus Ex (1), but Rollcage runs smoothly. When Doom 3 comes out, it'll be retired...

Re:Voodoo (1)

Halthar (669785) | more than 10 years ago | (#9139251)

If I recall correctly, there were Voodoo1 SLI type systems, but they were primarily used in arcades, and were produced by Quantum3D. They may not have been actual SLI systems, but they did use multiple 3Dfx chips, at any rate.

Quantum3D also produced the Obsidian3D cards, which were single-slot Voodoo2 SLI cards. Unfortunately, because of the weight of the cards, they had a problem with bending unless you had a way to brace the end farthest from the PCI slot inside your case. Very long cards, very heavy cards.

They seem to have been built like tanks though. I still have one at home that I keep around for playing some of the GLIDE only games that came out when 3Dfx was doing well.

While not SLI related, there was a card, I think the Voodoo 5 6000, that never got released (at least to the best of my knowledge); it was basically a quad-chip Voodoo 5, I think. It looked somewhat similar to the Nvidia 6800, including needing two slots' worth of space on the mobo. It also had its own external power supply that would connect to the back of the card next to the VGA connector.

It makes me wonder how many of those 3dfx people are still plodding away at Nvidia after the buyout. The GeForce FX cards and the GeForce 6800 cards really do look a lot like that beast 3dfx was trying to release before they got purchased by Nvidia.

Light on Info (2, Interesting)

the morgawr (670303) | more than 10 years ago | (#9138826)

The PR mess is light on information and I don't have flash to view their site. Can someone give some technical information? e.g. How does this work? What does it really do? What can a typical gamer actually expect (surely it doesn't just double your power by sending every other frame to each card)?

interesting technology (5, Interesting)

cheese_wallet (88279) | more than 10 years ago | (#9138835)

I think it is great that a company has the will to do something like this, even if it doesn't catch on. It's cool to try something new, instead of just hanging back and doing the tried and true.

I'll admit I haven't yet read the whole article, but even though it says that it isn't tied to any one video card, that doesn't say to me that it can have multiple disparate cards. If it is doing something along the lines of SLI, I would guess that the speeds would need to be matched between the two cards. And that would imply having two of the same card, whatever card the user chooses.

But maybe not... maybe it's the advent of asymmetric multi-video processing.

Re:interesting technology (4, Interesting)

jonsmirl (114798) | more than 10 years ago | (#9139067)

You can do this today with Chromium [sourceforge.net] .

Chromium replaces your OpenGL library with one that farms the OpenGL drawing out to multiple machines. It's how display walls [psu.edu] are built.

You can use the same technique for multiple cards in the same box.
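Chromium's actual configuration and dispatch machinery is considerably more involved, but the basic shape of the approach (interpose on the drawing calls, then farm each one out to whichever back end owns the affected part of the screen) can be sketched roughly like this; the class names and the band-splitting policy are illustrative assumptions, not Chromium's API:

    # Rough sketch of the "interpose and farm out" idea behind display walls.
    # The application draws through one front-end object, which forwards each
    # call to the back-end renderer responsible for the screen region it hits.

    class BandRenderer:
        """Stand-in for one card/machine owning a horizontal band of the screen."""
        def __init__(self, name: str, y0: int, y1: int):
            self.name, self.y0, self.y1 = name, y0, y1

        def covers(self, y: int) -> bool:
            return self.y0 <= y < self.y1

        def draw(self, primitive: str, y: int) -> None:
            print(f"{self.name}: draw {primitive} at y={y}")

    class SplitDispatcher:
        """Front end that the application treats as 'the' renderer."""
        def __init__(self, backends):
            self.backends = backends

        def draw(self, primitive: str, y: int) -> None:
            for backend in self.backends:
                if backend.covers(y):
                    backend.draw(primitive, y)

    screen = SplitDispatcher([BandRenderer("node0", 0, 540),
                              BandRenderer("node1", 540, 1080)])
    screen.draw("triangle", 100)   # handled by node0
    screen.draw("triangle", 900)   # handled by node1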

Re:interesting technology (1)

cheese_wallet (88279) | more than 10 years ago | (#9139087)

"You can use the same technique for multiple card in the same box."

Yes, but wouldn't you still need extra hardware to merge the display of these multiple cards to a single monitor?

Re:interesting technology (1)

Soul-Burn666 (574119) | more than 10 years ago | (#9139188)

How about this:
Each card would receive a number of scanlines to process according to its strength, making the rendering speeds similar, and then you sync to the slower one.

There are still problems with features that might or might not be available on one of the cards, such as pixel shaders. Anti-aliasing could also get weird.
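A minimal Python sketch of the proportional split suggested above, with made-up relative speeds for the two cards:

    # Divide scanlines between two cards in proportion to their (made-up) speeds,
    # so that both finish a frame at roughly the same time.

    HEIGHT = 1080
    speeds = {"fast card": 3.0, "slower card": 2.0}   # relative throughput (assumed)

    total = sum(speeds.values())
    allocation = {}
    start = 0
    for name, speed in speeds.items():
        count = round(HEIGHT * speed / total)
        allocation[name] = range(start, min(start + count, HEIGHT))
        start += count
    # Hand any rounding leftovers to the last card.
    last = list(speeds)[-1]
    allocation[last] = range(allocation[last].start, HEIGHT)

    for name, lines in allocation.items():
        print(f"{name}: scanlines {lines.start}-{lines.stop - 1} ({len(lines)} lines)")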

I think that's the sound of a pin dropping (1, Offtopic)

Phidoux (705500) | more than 10 years ago | (#9138852)

So they run sound cards in parallel too?

Old School (-1, Redundant)

Anonymous Coward | more than 10 years ago | (#9138856)

Voodoo two SLI [amazon.com]

I don't think the author really got it there... (0)

Anonymous Coward | more than 10 years ago | (#9138871)

The author of the Ars article seems to be under the impression that this would allow one to use two graphics cards from different manufacturers in the same machine. The most likely meaning of the material he provided is that people could use this with any manufacturer's cards, as long as the cards were identical. I doubt that kind of cooperative task-splitting could be implemented at the driver level given drivers from two different (competing!) manufacturers.

Re:I don't think the author really got it there... (2, Insightful)

Enrique1218 (603187) | more than 10 years ago | (#9139084)

Maybe they don't have to cooperate. Graphics cards generally support the same standards (VGA/DirectX/OpenGL). Perhaps the Video Array will have its own driver/software component to receive the game data and then parcel it out to each card.

Doom III (0)

Anonymous Coward | more than 10 years ago | (#9138875)

Well, there go my savings. Who needs food anyway!

this isn't new (5, Informative)

f13nd (555737) | more than 10 years ago | (#9138877)

Alienware didn't invent this.
The PCI and PCI Express specs have had this written into them.
AGP does too, but when was the last time you saw dual AGP slots on a mobo? (They do exist.)

Re:this isn't new (1)

241comp (535228) | more than 10 years ago | (#9139007)

Do any non-Mac dual AGP motherboards exist? If so, could you list some or all of them so that I can do some research? Thanks!

Re:this isn't new (1)

SlamMan (221834) | more than 10 years ago | (#9139126)

No Apple motherboards do dual-AGP either.

I can dream...

Re:this isn't new (4, Informative)

BenBenBen (249969) | more than 10 years ago | (#9139129)

The AGP port spec lays it out; AGP is a preferred slot on the PCI bus, with four main enhancements (pipeline depth, etc.) designed to... Accelerate Graphics. Therefore, if you had more than one PCI bus, you could technically have more than one AGP port. However, I cannot find a single motherboard that offers two AGP slots, including looking at numerous AV/editing specialists, where I'd expect this sort of thing to turn up.

Re:this isn't new (1)

LoudMusic (199347) | more than 10 years ago | (#9139260)

AGP does too, but when was the last time you saw dual AGP slots on a mobo? (they do exist)

Would you mind enlightening us as to where we might find such a board?

Big Deal - PCI Express. Any one can add two video (3, Insightful)

liquidzero4 (566264) | more than 10 years ago | (#9138893)

So what technology did Alienware create here? None.

So they have one of the first motherboards with two PCI-Express slots. Big deal; soon motherboards will contain many PCI-Express slots. Hopefully a lot more than two.

Re:Big Deal - PCI Express. Any one can add two vid (1)

jcostantino (585892) | more than 10 years ago | (#9139028)

Like this? [apple.com]

Re:Big Deal - PCI Express. Any one can add two vid (1)

strictnein (318940) | more than 10 years ago | (#9139149)

That's PCI-X. PCI-X is different from PCI-Express.

too many standards (2, Funny)

jcostantino (585892) | more than 10 years ago | (#9139459)

I'm going back to ISA...

Re:Big Deal - PCI Express. Any one can add two vid (1)

strictnein (318940) | more than 10 years ago | (#9139096)

Can anyone show me an upcoming motherboard with two 16x PCI Express slots? From what I've read, most (all, except for this one?) will only have one 16x PCI-Express slot for the video card and then a couple of 1x, 2x, and 4x PCI-Express slots for everything else. The reason I wanted to upgrade to a new computer this winter was the hope that I could have two high-end video cards in the same system (instead of one high-end AGP and one mid-range PCI).

Oh, come on! (4, Insightful)

Short Circuit (52384) | more than 10 years ago | (#9138933)

All you really need is some way to copy the data in memory from one card to another.

Easy solution? Several high-speed serial connections in parallel between the two cards. With a little bit of circuitry on the card dedicated to keeping the data identical.

Or, with a little bit of a performance hit, you could keep each section of RAM separate, and route misses over the cables.

one of two things (1)

Darthmalt (775250) | more than 10 years ago | (#9138935)

The way I'm reading it, either it makes cards from two different manufacturers work together to display video from one card twice as fast (i.e. Beowulf clustering),
or it makes two different cards support dual screens? But I've been doing that on my computer for years, so that couldn't be it.

Nice, A complete Vapor-article. (4, Informative)

Gr8Apes (679165) | more than 10 years ago | (#9138940)

From the article: "The answers may have to wait until Q3/Q4". There are no performance numbers, no real statements of how it works, nothing much at all. Just wow, gee whiz, dual graphics cards in parallel. What exactly does "in parallel" mean? That's not even addressed.

Some things I thought of immediately reading this, great - two displays each driven by a separate card, or, better yet, quad displays driven by two cards. Nope, not a word about either possibility. The implication of the PR/article is that 3D graphics will be processed faster. How? Do they have some nifty way of combining two standard off the shelf graphics card signals into a single monitor? (Hint, it's hard enough getting the monitor to properly synch up with a single high performance graphics card!)

Since when does ArsTechnica merely regurgitate PRs? This was 99.999% vacuum.

Re:Nice, A complete Vapor-article. (1)

ProfBooty (172603) | more than 10 years ago | (#9139217)

A couple of years back you could link two Voodoo2 cards together to run games at a higher resolution. It did require a special cable, as I recall.

Perhaps something like that?

Re:Nice, A complete Vapor-article. (1)

archen (447353) | more than 10 years ago | (#9139422)

Some things I thought of immediately reading this, great - two displays each driven by a separate card

Am I missing something here? Because I just set up two video cards on my machine the other day. Now, it's relatively easy to set up a dual display on a dual-headed card, but using two cards takes a bit more work, especially if you have two completely different monitors that run at different resolutions and refresh rates. Now I use an AGP and a PCI card, but I imagine it's the same difference with PCI Express.

So I'm assuming that they're talking about using the power of another GPU to help processing, because there isn't anything really new about using two graphics cards.

Re:Nice, A complete Vapor-article. (1)

rokzy (687636) | more than 10 years ago | (#9139510)

>What exactly does "in parallel" mean?

well this is Alienware so I think they mean that one card is on top of the other and both are perpendicular to the MB, hence they are running "in parallel" with each other.

Are we going to need this... (5, Funny)

CodeMonkey4Hire (773870) | more than 10 years ago | (#9138945)

for Longhorn [slashdot.org] ?

nigritude ultramarine (-1, Offtopic)

Anonymous Coward | more than 10 years ago | (#9138948)

Nigritude ULtramarine [mindsay.com]

Not reading the linked article... (-1, Offtopic)

Chris_Jefferson (581445) | more than 10 years ago | (#9138952)

Normally I don't read the linked article because it's slashdotted.

In this case I'm not reading it because their horribly malformed HTML doesn't render in Konqueror (and yes, it is their HTML. It's horrible. I imagine Mozilla will render it because it's absorbed enough IE-specific hacks over the years).

While I realise Slashdot isn't a website to be talking about valid HTML, perhaps it could at least be checked that pages render properly and readably? (I'm guessing Safari fails too. Anyone got a Mac to check?)

(OK, OK, I know I'm horribly off-topic)

Re:Not reading the linked article... (-1, Offtopic)

Anonymous Coward | more than 10 years ago | (#9138999)

Was readable in Safari 1.2...

Intel's Chipset only supports One x16 PCIe (2, Informative)

hattig (47930) | more than 10 years ago | (#9138953)

In fact, all the first-generation PCI-Express chipsets only support one x16 PCIe port for the graphics controller.

I doubt that Intel is going to make a two-port one especially for Alienware.

So I expect it means that the second graphics card is plugged into an x4 or x1 PCIe connector.

Anyway, this is nothing special; it is all part of the specification. Hell, you could have two AGP v3 slots in a machine working at the same time - how do you think ATI's integrated graphics can work at the same time as an inserted AGP card?

Re:Intel's Chipset only supports One x16 PCIe (1)

DaHat (247651) | more than 10 years ago | (#9139136)

Are you quite sure that ATI's integrated graphics are AGP based and not on the PCI bus? Last I checked, there was not a publicly available chipset that could handle more than one AGP video device in a system at a time. If it were possible, I'd throw out my nVidia card today and worship at the altar of ATI... I fear, though, that you must be mistaken.

Re:Intel's Chipset only supports One x16 PCIe (1)

hattig (47930) | more than 10 years ago | (#9139393)

I'm talking about ATI's integrated graphics chipsets, the ones where you can run the integrated graphics (with AGP functionality) at the same time as a graphics card in an AGP slot. Now whilst ATI could have done some clever stuff with a single AGP controller, it would make more sense to simply make use of the AGP v3 functionality that allows more than one AGP device in a system.

But... sensible ideas are all too often eschewed in favour of some hare-brained system that kinda works...

Re:Intel's Chipset only supports One x16 PCIe (1)

strictnein (318940) | more than 10 years ago | (#9139216)

In fact all the first generation PCI-Express chipsets only support one x16 PCIe for graphics controller.

Have you seen anything talking about second-generation chipsets that support two 16x PCI-express connectors?

This is what I want and I'm not getting a new computer until it happens.

Finally! (-1, Redundant)

Anonymous Coward | more than 10 years ago | (#9138957)

Graphics performance that is adequate enough to run Longhorn!

Re:Finally! (NOT!!) (1)

IAmAMacOSXAddict (718470) | more than 10 years ago | (#9139448)

Re-read the article you're thinking of. M$ Long(wait)horn will require 3-4 times the GPU speed...

That would mean 3-4 GPU cards, not to mention the overhead of running them.

No thanks, make mine a Mac. I already have everything I've seen them planning for Longhorn. Feels like I'm back in 1984 waiting to see what Micro$oft copies from the Mac... AGAIN...

Everything old is new again? (1, Informative)

Fortunato_NC (736786) | more than 10 years ago | (#9138961)

When Windows 98 came out, there was a new feature (that before had pretty much been limited to Matrox cards with a special driver) that would let you use multiple PCI and AGP video cards in the same motherboard with multiple monitors. At first glance, this seems like pretty much the same idea.

The article seems to claim that the cards will be able to split processing duties, even if they're not from the same manufacturer. That particular claim seems very dubious to me for some reason. Other than integrating two PCI-Express slots on a motherboard, I'm not sure Alienware has achieved anything here. Of course, should Alienware want to send me one of these to try out, I'll be happy to post my review on Slashdot.

Re:Everything old is new again? (2, Insightful)

skiflyer (716312) | more than 10 years ago | (#9139134)

I'm sure this will be said a million times during this thread, but this has nothing to do with multiple heads... it has to do with multiple cards serving 1 head.

Alienware did this before. (1)

thecombatwombat (571826) | more than 10 years ago | (#9138962)

I'm pretty sure it was Alienware. When the Voodoo 3 was really, really new, I saw a little blurb in what I think was Computer Gaming World about a system that would run up to four Voodoo 3 cards. It didn't use SLI like the old Voodoo 2s, but would split the screen up among the cards (a 2x2 grid with four cards); the blurb went on about crazy Quake 2 framerates, I think.

Is that new? (-1, Offtopic)

e-Trolley (771869) | more than 10 years ago | (#9138976)

I've been using a 3-head system with Xinerama for a few years. It's damn slow, but I love those 1600x1200x3 :)

Re:Is that new? (3, Insightful)

lucas teh geek (714343) | more than 10 years ago | (#9139046)

I believe you need to RTFA; this isn't about dual-head setups. One head, two cards.

Power hog... (0)

Anonymous Coward | more than 10 years ago | (#9139008)

The next gen card from nvidia is supposed to require two (unchained/direct) power connectors and a 300w supply.

You'll probably not only need the lottery to get this, you'll need to win another to pay your power bill every month...

Metabyte PGC (2, Informative)

Erwos (553607) | more than 10 years ago | (#9139051)

It looks like the same thing as Metabyte PGC - and Alienware was supposed to be the roll-out partner for that.

Nothing wrong with it, though - PGC actually did work, and was previewed independently by several people (I think Sharky?).

-Erwos

Great News (0)

Anonymous Coward | more than 10 years ago | (#9139099)

Now that we have infinite frame rate, someone should write a game that I want to play. The other day I spent two hours watching GamingTV (which shows 5-minute clips from various games 24/7) and found that all games look almost the same. You are standing in the center of the screen (with a sword, magic, or a skateboard) and move around killing people or casting "magic" on them (to take "health units" out of them, of course). The last interesting thing I saw was "Bridge Builder" four years ago. It was a ZX-like game. I think that if there had been nVidia cards back in the '80s and early '90s, Lemmings would be a game with the main lemming in the center of the screen, walking around rooms and halls to find the "magic bottle" and destroy the evil enemy. Thank god I was a kid before nVidia.

How is this different than what I've been doing fo (-1)

shoppa (464619) | more than 10 years ago | (#9139109)

For many years I've been supporting Linux/XFree86 configurations of up to 8 CRTs, usually via multi-head PCI cards. Often it's a mix-and-match of hardware consisting of multiple PCI video cards and whatever video is on the motherboard.

How is this different from the vaporware configuration that the article is about? I read the article and I see nothing useful in terms of details.

Re:How is this different than what I've been doing (3, Informative)

IAmAMacOSXAddict (718470) | more than 10 years ago | (#9139361)

If you read the short article, they are talking about pushing all the data for a SINGLE monitor through two PARALLEL cards, essentially allowing twice the GPU power to crunch the graphics for that monitor.

You are running a bunch of video cards INDEPENDENT of each other. Clearly NOT THE SAME THING...

But do you need multiple monitors? (2, Interesting)

MtViewGuy (197597) | more than 10 years ago | (#9139111)

I think the big question we need to ask is do we really need multiple monitor setups?

Besides the obvious issue of hardware cost of multiple graphics cards and multiple monitors, you also have to consider desktop space issues. Even with today's flat-panel LCD's, two monitors will hog a lot of desktop space, something that might not be desirable in many cases.

I think there is a far better case for a single widescreen display instead of multiple displays. Besides hogging a lot less desktop space, widescreen displays let you see videos in their original aspect ratio more clearly, and also allow for things like seeing more of a spreadsheet, a clearer preview of work you do with a desktop publishing program, and (in the case of a pivotable display) easier reading of web pages and/or single-page work with a DTP program. Is it any wonder people liked the Apple Cinema Display, with its roughly 1.85:1 aspect ratio, so much?

Re:But do you need multiple monitors? (2, Interesting)

Jeff DeMaagd (2015) | more than 10 years ago | (#9139254)

Two 4:3 displays can be bought at a lower cost than one widescreen display.

Re:But do you need multiple monitors? (1)

MtViewGuy (197597) | more than 10 years ago | (#9139483)

Two 4:3 displays can be bought at a lower cost than one widescreen display.

I agree with that, but the desktop space hogged by two 17" LCD monitors is surprisingly large, far more than what you get with the Apple Cinema Display.

Besides, with large-scale manufacturing of widescreen LCD's the cost would come down very quickly. Remember, most of today's latest graphics cards can easily add display drivers that can support something akin in aspect ratio to the Apple Cinema Display (they're already part way there with the 1280x768 display driver used on some smaller widescreen LCD's).

Re:But do you need multiple monitors? (1)

aquabat (724032) | more than 10 years ago | (#9139389)

I thought the point was that multiple cards would split up the 3D number crunching calculations between them, and send the results to a master card to be displayed on a single monitor.

Re:But do you need multiple monitors? (1)

addaon (41825) | more than 10 years ago | (#9139480)

So, uh, why not multiple widescreen displays?

The real question (4, Interesting)

241comp (535228) | more than 10 years ago | (#9139144)

Is this compatible with Brook [stanford.edu] and other general-purpose GPU [gpgpu.org] programming techniques? The use I see for it is this:

Imagine an openmosix cluster of dual-processor machines that run bioinformatic calculations and simulations. Lots of matrix math and such - pretty fast (and definitely a lot faster than a single researcher's machine).

Now imagine the same cluster, but each machine has 2 or 4 dual-head graphics cards, and every algorithm that can be implemented in Brook or similar is. That gives each machine up to 2 CPUs and maybe 8 GPUs that can be used for processing. The machines are clustered, so a group of ~12 commodity machines (1 rack) could have 24 CPUs and 96 GPUs. Now that would be some serious computing power - and relatively cheap too (since one-generation-old dual-head cards are ~$100-$150).

By the way, does anyone know if there is any work going on to create toolkits for Octave and/or MatLab which would utilize the processing power of a GPU for matrix math or other common calculations?
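Spelling out the arithmetic in the comment above (under its own assumptions of 2 CPUs, roughly 8 GPUs, and 4 cards per machine), as a quick Python sketch:

    # Aggregate counts from the comment, under its own per-machine assumptions.
    machines = 12            # one rack of commodity boxes
    cpus_per_machine = 2
    gpus_per_machine = 8     # "maybe 8 GPUs" per the comment
    cards_per_machine = 4    # dual-head cards, as assumed above

    print(f"CPUs per rack: {machines * cpus_per_machine}")   # 24
    print(f"GPUs per rack: {machines * gpus_per_machine}")   # 96

    # Rough cost of the graphics side at the quoted ~$100-$150 per card.
    for price in (100, 150):
        print(f"Cards at ${price} each: ${machines * cards_per_machine * price} per rack")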

The Return of Voodoo 2 SLI (3, Informative)

MoZ-RedShirt (192423) | more than 10 years ago | (#9139243)

History repeating: Who can (or can't) remember [tomshardware.com]

Power to them if they can pull it off! (3, Insightful)

l33t-gu3lph1t3 (567059) | more than 10 years ago | (#9139259)

Discrete parallel graphics processing has been around for a while; the most notable example is probably 3dfx and their Voodoo 2 cards. However, there's a problem with this tactic, namely in the "diminishing gains" department.

So here's the question:

-How is pixel processing going to work? For a given frame there is vertex and texture information, as well as the interesting little shader routines that work their magic on those pixels. How are you going to split that workload between the two GPUs? You can't just split a frame up between the GPUs; that would break texture operations, and there would be considerable overhead from the GPUs swapping data over the PCI Express bus. *MAYBE* having each GPU handle a frame in sequence would do the trick, but again, it's a dicey issue.
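The "each GPU handles a frame in sequence" option mentioned above (often called alternate-frame rendering) can be sketched as a simple dispatch loop in Python; the render function here is a hypothetical stand-in for a card drawing a whole frame on its own:

    # Sketch of alternate-frame rendering: even frames go to GPU 0, odd frames
    # to GPU 1, and completed frames are presented in submission order.

    from collections import deque

    def render(gpu: int, frame: int) -> str:
        # Stand-in for a whole-frame render on one card.
        return f"frame {frame} rendered by GPU {gpu}"

    def alternate_frame_rendering(num_frames: int, num_gpus: int = 2) -> None:
        in_flight = deque()
        for frame in range(num_frames):
            gpu = frame % num_gpus          # round-robin frames across the cards
            in_flight.append(render(gpu, frame))
            # Present strictly in order; real implementations also have to deal
            # with the latency and frame-pacing issues this scheme introduces.
            if len(in_flight) >= num_gpus:
                print("present:", in_flight.popleft())
        while in_flight:
            print("present:", in_flight.popleft())

    alternate_frame_rendering(6)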

It would appear to me that this dual-card graphics rendering is quite similar to dual-GPU graphics cards. Except, where on a single graphics card you can handle cache/memory coherency and arbitration logic easily thanks to the proximity of the GPUs, with this discrete solution you run into the problem of having to use the PCI Express bus, which, as nice as it is, is certainly not that much faster than AGP.

So I say, power to you, Alienware. If you can pull it off with Nvidia, ATI, et al., great. It's too bad the cynical side of me thinks this idea reeks of those blue crystals marketing departments love :)

heh (0)

Anonymous Coward | more than 10 years ago | (#9139307)

..."would it be nice if you could plop in a GeForce 6800 Ultra and a Radeon X800 XT and get the best of both worlds?"...

no more fan boys! yay.

Re:heh (1)

m1chael (636773) | more than 10 years ago | (#9139343)

Next them be runnin' Lynux and da Windoze on ze zame cohmputor.

More interested in real dual PCI-Express GFX slots (1)

Performer Guy (69820) | more than 10 years ago | (#9139367)

They might be able to make this work for games, but I'm personally more interested in the simple fact that Intel chipsets will support dual PCI Express graphics buses. Hopefully this will be possible on a reasonably priced mobo.

Having 3 slots would be ideal but I won't say no to 2 GFX cards so I can drive two monitors from two independent graphics cards at last.

It doesn't say how this technology will combine the two cards and whether it will need software support from the games. Hopefully it won't but the devil is in the details. I'm pretty skeptical about this at the moment. We need more details on the implementation.

how is this different from my voodoo2 cards? (1)

kerp11 (410921) | more than 10 years ago | (#9139497)

I had dual Voodoo 2 cards that were linked together with a special cable to make them run in SLI (scan-line interleave) mode. Quake certainly ran a lot faster once I got the second one installed.