
Gigabyte's Dual-GPU Graphics Card

CmdrTaco posted more than 9 years ago | from the two-is-one-more-than-one dept.

Graphics 252

kamerononfire writes "Tom's Hardware has an article on a new dual-GPU graphics card from Giga-byte, to be released Friday: "According to sources, the SLI card will lift current 3DMark2003 record revels by a significant margin while being priced lower than ATI's and Nvidia's single-GPU high-end cards.""


Drivers? (5, Informative)

BWJones (18351) | more than 9 years ago | (#11108691)


So, the question will be: Can we get drivers for this card that will work in Linux or OS X? It is based on Nvidia technology, so presumably one could write drivers for this card unless Gigabyte is keeping their stuff proprietary...

It looks interesting and I would certainly be more than interested in plugging one into my dual G5, but I don't have time (or the interest) to write my own drivers.

Re:Drivers? (5, Insightful)

Ianoo (711633) | more than 9 years ago | (#11108755)

It's almost certain that what Gigabyte have done is this:
  • Take the basic single GPU nVidia 6600 PCB
  • Lay down two on the same PCB with two GPUs
  • Link them together with a PCI Express switch
  • Reverse engineer the card bridge that nVidia is selling for SLI and connect whatever control signals are required as traces on the PCB.
It seems they can do this for a significantly lower price than building two single cards.

The point is that if nVidia SLI is working under Linux, then this should too.

Re:Drivers? (3, Interesting)

hattig (47930) | more than 9 years ago | (#11109044)

I agree, but not about a PCI-Express switch. Most likely 8 PCIe channels go to one GPU, and the other 8 to the other.

What I want on the card is TWO DVI outputs though. And possibly another two available on the other GPU via a cable when not in SLI mode.

Re:Drivers? (0)

Anonymous Coward | more than 9 years ago | (#11109069)

The SLI connector is just a straight cable. No reverse engineering required.

Re:Drivers? (1)

SilentChris (452960) | more than 9 years ago | (#11109058)

"So, the question will be: Can we get drivers for this card that will work in Linux or OS X?"

Actually, to the vast majority of hardcore gamers (which this card is targeting) that won't matter. I have my Mac for desktop use, my Linux box for file serving and my Windows box for gaming. No need to get special drivers.

Re:Drivers? (1)

BWJones (18351) | more than 9 years ago | (#11109116)

Well, I would not be considered a hardcore gamer per se and I am absolutely not going to purchase another computer just for games, but I did help with the beta test development of Halo on OS X and a fast graphics card was nice to have for that process. It also helps push all the pixels on my huge Cinema Displays and kept me from getting fragged by all the 12 year old twitch meisters out there. Damn some of those kids are monsters.

Will this work with BSD? (-1, Troll)

Anonymous Coward | more than 9 years ago | (#11108694)

IMPORTANT UPDATE: Please show your support [calcgames.org] for Ceren in this poll of Geek Babes!

Is it any wonder people think Linux [debian.org] users are a bunch of flaming homosexuals [lemonparty.org] when its fronted by obviously gay losers [nylug.org] like these?! BSD [dragonflybsd.org] has a mascot [freebsd.org] who leaves us in no doubt that this is the OS for real men! If Linux had more hot chicks [hope-2000.org] and gorgeous babes [hope-2000.org] then maybe it would be able to compete with BSD [openbsd.org] ! Hell this girl [electricrain.com] should be a model!

Linux [gentoo.org] is a joke as long as it continues to lack sexy girls like her [dis.org] ! I mean just look at this girl [dis.org] ! Doesn't she [dis.org] excite you? I know this little hottie [dis.org] puts me in need of a cold shower! This guy looks like he is about to cream his pants standing next to such a fox [spilth.org] . As you can see, no man can resist this sexy [spilth.org] little minx [dis.org] . Don't you wish the guy in this [wigen.net] pic was you? Are you telling me you wouldn't like to get your hands on this ass [dis.org] ?! Wouldn't this [electricrain.com] just make your Christmas?! Yes doctor, this uber babe [electricrain.com] definitely gets my pulse racing! Oh how I envy the lucky girl in this [electricrain.com] shot! Linux [suse.com] has nothing that can possibly compete. Come on, you must admit she [imagewhore.com] is better than an overweight penguin [tamu.edu] or a gay looking goat [gnu.org] ! Wouldn't this [electricrain.com] be more liklely to influence your choice of OS?

With sexy chicks [minions.com] like the lovely Ceren [dis.org] you could have people queuing up to buy open source products. Could you really refuse to buy a copy of BSD [netbsd.org] if she [dis.org] told you to? Personally I know I would give my right arm to get this close [dis.org] to such a divine beauty [czarina.org] !

Don't be a fag [gay-sex-access.com] ! Join the campaign [slashdot.org] for more cute [wigen.net] open source babes [wigen.net] today!

$Id: ceren.html,v 9.0 2004/08/01 16:01:34 ceren_rocks Exp $

Re:Will this work with BSD? (-1, Offtopic)

Anonymous Coward | more than 9 years ago | (#11109001)

Bah. They look like they hit the Krispy Kreme a bit too hard. This is a nice woman [mikeschicks.com] .

Re:Will this work with BSD? (-1, Offtopic)

Anonymous Coward | more than 9 years ago | (#11109110)

Ceren is hot for a goth type girl, and of course, she excites the BSD crowd which is already deep into the cult of death.

But we, in the Linux community, prefer her [bakla.net] . So stick that red fork in your ass and take your BSD and Windows bullshit elsewhere, douchebag. Here we believe in Lunux.

Uh oh... (3, Funny)

koreth (409849) | more than 9 years ago | (#11108720)

record revels
I guess now we know where Kim Jong Il's roach went.

Re:Uh oh... (0)

Anonymous Coward | more than 9 years ago | (#11109232)

Dakalaka jaka laka jihad. Mat Damon.

Next year (2, Funny)

Anonymous Coward | more than 9 years ago | (#11108729)

They are coming out with a card that includes a GPU, CPU, hard drive, RAM, motherboard, ethernet, sound AND it's nuclear powered, plus it will fit in your back pocket and transmit the monitor images straight to your visual cortex, all the while making your breakfast and cleaning your basement.

Re:Next year (4, Funny)

tanguyr (468371) | more than 9 years ago | (#11108816)

really? will it run linux?

Re:Next year (2)

zx75 (304335) | more than 9 years ago | (#11108955)

ooh... beowulf!

Never Satisfied

Re:Next year (1)

_Sprocket_ (42527) | more than 9 years ago | (#11109126)

...the downside being how many back pockets you'll need to support the cluster.

Re:Next year (4, Funny)

LurkerXXX (667952) | more than 9 years ago | (#11108985)

No. God won't open-source the drivers.

Don't worry, we are working on reverse engineering them.

Re:Next year (1)

BlkSprk (840773) | more than 9 years ago | (#11109086)

Wait... your brain is open source... the command to get cortex drivers in Debian: apt-get install cortexdrivers

Re:Next year (1)

lordofthechia (598872) | more than 9 years ago | (#11108890)

More importantly, how big are the batteries?

Re:Next year (1)

Tuxedo Jack (648130) | more than 9 years ago | (#11108991)

And in Korea, only young people will use it. Old people will use standard PCs and graphics cards and complain about "young whippersnappers and their newfangled mini-PCs."

Bad Eggs? (0)

Anonymous Coward | more than 9 years ago | (#11109117)

...all the while making your breakfast and cleaning your basement.

What kind of breakfast is it making that will clean out my basement?

Re:Next year (0)

Anonymous Coward | more than 9 years ago | (#11109132)

How does it know that I live in the Basement? :)

Re:Next year (1)

MoeMoe (659154) | more than 9 years ago | (#11109160)

They are coming out with a card that includes a GPU, CPU, hard drive, RAM, motherboard, ethernet, sound AND it's nuclear powered, plus it will fit in your back pocket and transmit the monitor images straight to your visual cortex, all the while making your breakfast and cleaning your basement.

If it's nuclear powered, I wouldn't wanna keep that thing in my pocket... I like my equipment functioning and prefer to keep it exactly where it is...

Great idea (4, Interesting)

Anonymous Coward | more than 9 years ago | (#11108731)

That makes a lot more sense: store the textures once in shared memory instead of storing them twice, as you would have to do in a two-card solution.

Makes me wonder if Nvidia will have dual-core GPUs in the future.

Re:Great idea (0)

Anonymous Coward | more than 9 years ago | (#11108887)

It's probably just two 6600GT cards on one PCB, meaning 128MB for each 'card'.

Re:Great idea (0)

Anonymous Coward | more than 9 years ago | (#11108917)

GPUs are already highly parallel processors, so what would be the point of combining separate cores on a single die when they could just as well increase the number of functional units in a single core? The advantage of dual GPUs is probably more related to heat dissipation and production yield (smaller die at identical structural size = higher yield) and those would both vanish if they combined two cores on one chip.
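To put a rough number on the yield point, here is a back-of-the-envelope sketch using the standard first-order Poisson yield model; the defect density, die sizes and dies-per-wafer figures below are made-up assumptions for illustration, not anything from nVidia or Gigabyte:

    import math

    def poisson_yield(defects_per_cm2, die_area_cm2):
        # First-order Poisson model: fraction of dies that come out defect-free.
        return math.exp(-defects_per_cm2 * die_area_cm2)

    D0 = 0.5                      # assumed defect density, defects per cm^2
    small_die = 1.5               # assumed area of one mid-range GPU, cm^2
    big_die = 2 * small_die       # a hypothetical single die with twice the logic

    dies_small = 200              # assumed small dies per wafer
    dies_big = 100                # the double-size die fits half as many

    good_small = dies_small * poisson_yield(D0, small_die)   # ~94 good dies
    good_big = dies_big * poisson_yield(D0, big_die)         # ~22 good dies

    print(f"cards per wafer, two small dies each: {good_small / 2:.0f}")  # ~47
    print(f"cards per wafer, one big die each:    {good_big:.0f}")        # ~22

A defect only kills the small die it lands on, so pairing up good small dies wastes less silicon than betting everything on one large die - which is the yield advantage described above.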

What about 4 gpu 3dfx V5 6000? (3, Interesting)

Anonymous Coward | more than 9 years ago | (#11108743)

eom

Re:What about 4 gpu 3dfx V5 6000? (3, Interesting)

orthancstone (665890) | more than 9 years ago | (#11108800)

Yeah, apparently the author hasn't kept up with the graphics card industry. I would say perhaps he is only considering graphics cards that are realistically retail, but this one isn't on the market yet so I hardly feel that's applicable.

Re:What about 4 gpu 3dfx V5 6000? (2)

cnettel (836611) | more than 9 years ago | (#11108940)

Actually, the V5 did no T&L, AFAIK. The first "GPU" was the GeForce 256, and Nvidia justified that by the fact that it had a (locked) T&L pipeline, not just triangle setup and texturing. (Hey, the Voodoo 2, fully normal, even had 3 chips: two texture units and one triangle setup.) And to all of you talking about dual-core chips: forget it. The current chips are parallel in every relevant way already, and putting two of these highly parallel chips together on the same die wouldn't be a benefit compared to "just" adding more units. Heat and lower yields with increasing die sizes are reasons not to do that beyond some limit. Therefore, it's no surprise that separate chips are actually able to perform better at a lower price point.

Re:What about 4 gpu 3dfx V5 6000? (1)

Ianoo (711633) | more than 9 years ago | (#11108969)

Considering that "GPU" is an invented marketing term (by nVidia themselves), it seems rather silly to call cards with T&L "GPUs" whilst those without T&L are not.

Re:What about 4 gpu 3dfx V5 6000? (2, Insightful)

supabeast! (84658) | more than 9 years ago | (#11109046)

GPU is not a marketing term, it's a technical term. Just because Nvidia came up with a term that ATI doesn't use doesn't make it any less technical a label than using RAM to describe random-access memory or ROM to describe read-only memory.

The V2 did not have triangle setup. (2, Informative)

i41Overlord (829913) | more than 9 years ago | (#11109255)

I had one. It had no triangle setup. Nvidia was the first to come out with on-board triangle setup.

Re:What about 4 gpu 3dfx V5 6000? (1)

DaHat (247651) | more than 9 years ago | (#11109317)

Or the V5 5500, which had just two GPUs.

Granted, the thing was a POS (and is now in my junk box), but it sure beat out this new one.

While not the same, I do recall an old Voodoo 2 card that was nothing more than two cards stuck to each other in SLI mode.

Hmmmm (-1, Troll)

Anonymous Coward | more than 9 years ago | (#11108752)

Did someone port apache to this thing and use it to serve Slashdot?

Doom for Gigabyte! (5, Insightful)

millisa (151093) | more than 9 years ago | (#11108767)

I bought my dual GPU 3DFx Voodoo5 around this time 4 years ago. . . and then the company was bought, support disappeared, and my fancy video card became worthless even quicker than it should have . . . I don't recollect seeing another 'dual gpu video card that will slay the market' announcement since . . .

Re:Doom for Gigabyte! (0)

Anonymous Coward | more than 9 years ago | (#11108782)

Nvidia bought them, so you are seeing the return of the same technology.

Re:Doom for Gigabyte! (1)

Malc (1751) | more than 9 years ago | (#11108808)

3DFx was already moribund at that point. Nothing could really save them.

Re:Doom for Gigabyte! (1)

UWC (664779) | more than 9 years ago | (#11108983)

I thought "GPU" wasn't widely used until nVidia introduced their GeForce cards, which, with the inclusion of transform and lighting processing on the graphics processor, they claimed to be the "first true GPU" or something like that. Apparently those bits had previously been handled by the CPU. Did 3Dfx ever make cards with T&L handled on-card? Seriously asking, as I don't recall. I remember 3Dfx in their final generation or two (wasn't Voodoo5 released before Voodoo4?) boasting about their new cinematic features like motion blurring but recall no mention of T&L on their cards.

Re:Doom for Gigabyte! (1)

Hast (24833) | more than 9 years ago | (#11109206)

Of course they didn't call their chips "GPUs", since nVidia's marketing team hadn't invented the name yet. OTOH I think it is a pretty good name, at least today. (I'm not quite convinced that the GeForce 1 should be called a GPU if you want a more technical term.)

And since it's "Graphics Processing Unit", pretty much anything that uses special chips to do graphics calculations has one. IMHO, if it can't be used as a simple processor then it doesn't deserve the term GPU.

Deja Voodoo (4, Informative)

PurpleFloyd (149812) | more than 9 years ago | (#11108770)

As I recall, 3dfx used multi-GPU cards for its Voodoo 4 and 5 lines, and didn't do so well. Is there anything to indicate that this card will do better? After all, sticking with SLI and multicore technology after its prime was what killed 3dfx and allowed Nvidia to take its place; it'd be rather ironic to see Nvidia go down the same path.

Re:Deja Voodoo (1)

Scrybe (95209) | more than 9 years ago | (#11108925)

Don't forget the Radeon MAXX. This dude just mangled an already bad press release. And what's up with the "256Mbit memory bandwidth"? That's like 4MHz with a 64-bit bus. I think he meant "256-bit memory interface", which I bet is a marketing way of saying they've got two GPUs with 128-bit interfaces, just like the 256MB of RAM is really 128MB per GPU...
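For what it's worth, the unit sanity check is easy to spell out (the 4MHz figure is just what the quoted numbers imply; the 128-bit/1GHz line is an illustrative assumption, not a spec from the article):

    def bandwidth_bytes_per_s(bus_width_bits, transfers_per_s):
        # bandwidth = bus width in bytes * transfers per second
        return (bus_width_bits / 8) * transfers_per_s

    # "256Mbit" of bandwidth on a 64-bit bus really would mean a ~4MHz clock:
    print(bandwidth_bytes_per_s(64, 4e6) * 8 / 1e6, "Mbit/s")   # 256.0

    # A 128-bit interface at an assumed 1GHz effective rate is ~500x more:
    print(bandwidth_bytes_per_s(128, 1e9) / 1e9, "GB/s")        # 16.0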

I hate marketing types, and wish Tom's was still as cool as it was in 1998...

Re:Deja Voodoo (4, Interesting)

RealErmine (621439) | more than 9 years ago | (#11109008)

Is there anything to indicate that this card will do better?

The Voodoo 4/5 were the most expensive cards on the market. This card is cheaper than a *SINGLE* Nv 6800 and outperforms it by a good margin.

Why buy a 6800?

Re:Deja Voodoo (2, Informative)

supabeast! (84658) | more than 9 years ago | (#11109021)

3DFX died not because of SLI, but because they put all the R&D funding toward anti-aliasing low resolution (640x480, 800x600) graphics. By the time they had it working well, Nvidia was producing chips that ran the same games just fine at 1024x768 and up with better texture filtering, which looked much better than anti-aliased low-res graphics.

The idea of slapping multiple chips on a card, or using multiple cards is still a good one, as long as the cards come out before someone else does something better with one chip.

Re:Deja Voodoo (4, Interesting)

Ianoo (711633) | more than 9 years ago | (#11109023)

The point was, I think, that the Voodoo 4 and Voodoo 5 were last ditch efforts for survival by 3DFX when faced with more competition from a fast-growing 3D acceleration industry. IIRC, the performance of those cards was nearly matched by a single GPU from nVidia, so they weren't an attractive deal (being large, expensive, power hungry beasts). This card, however, doesn't have any obvious competition, yet, and by the time it does, I'm sure nVidia will have added SLI to their latest and greatest too. Additionally, PC buyers and makers more readily accept large coolers, whereas in the days of the Voodoo 4, the cooling required for the heat generated by all the chips just seemed silly.

Re:Deja Voodoo (1)

Ignignot (782335) | more than 9 years ago | (#11109042)

I assume you mean it would be ironic if Nvidia were killed because they didn't incorporate multicore GPUs... or that Nvidia was killed because they did incorporate multicore. Either way it isn't irony unless your last name is Morissette - here is a definition of irony:

Irony involves the perception that things are not what they are said to be or what they seem.
That's from our sacred cow, Wikipedia.

What you're looking for is poetic justice, which is defined as:
Poetic justice refers to a person receiving punishment intimately related to their crime. For example, "poetic justice" for a rapist would be becoming the rape victim; for an adulteress, having her spouse be an adulterer; and so on.

Re:Deja Voodoo (0)

Anonymous Coward | more than 9 years ago | (#11109085)

Wasn't there a company named Obsidian or something making multi-processor cards back in the day using the voodoo 1?

Re:Deja Voodoo (1)

BWJones (18351) | more than 9 years ago | (#11109203)

There were a whole lot of reasons why 3dfx went belly up, but SLI and multicore were not it. What I have always wondered and never heard was: What happened with the big Army contract that 3dfx got for running the new displays in helicopters? Did this project just go away or did another company step in? Nvidia?

Article title misleading (4, Informative)

caerwyn (38056) | more than 9 years ago | (#11108777)

The article title at Tom's Hardware is a little misleading. This is certainly *not* the first graphics card with two chips on it- back in the days of the ATI Rage chips, ATI had a Rage Fury MAXX that used two chips to render alternate frames.

Re:Article title misleading (1)

Trepalium (109107) | more than 9 years ago | (#11109239)

I think I've seen a Rage Fury MAXX working on a machine ONCE. The other times, it was always, "install driver, reboot windows, scream at black screen when Windows doesn't finish booting." Unfortunately, when the thing did work, playing the game was a little odd because you were always delayed by at least one frame (and that could be anywhere from less than 16ms [60fps] to more than 100ms [10fps] depending on the framerate at that exact moment). Not enough to seriously affect your ability to play the game, but usually enough to make the disconnect between your control and display obvious.
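The delay figures above are just one frame time at the given framerate; spelled out, with nothing assumed beyond that arithmetic:

    def afr_extra_latency_ms(fps, frames_of_delay=1):
        # Alternate-frame rendering adds at least one whole frame of lag,
        # and one frame takes 1000/fps milliseconds.
        return frames_of_delay * 1000.0 / fps

    for fps in (60, 30, 10):
        print(f"{fps:3d} fps -> ~{afr_extra_latency_ms(fps):.0f} ms of extra lag")
    # 60 fps -> ~17 ms, 30 fps -> ~33 ms, 10 fps -> ~100 ms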

It was more common than that. (1)

Anonymous Coward | more than 9 years ago | (#11109259)

Oxygen cards come to mind for instance, with up to four GPUs per card.

Dubious Information (5, Interesting)

webword (82711) | more than 9 years ago | (#11108778)

Not based on actual data. Tom's Hardware has NOT run any tests yet. Take what you read with a grain of salt.

"Sources told Tom's Hardware Guide..."

"Tom's Hardware Guide's test lab staff will run the 3D1 through its benchmark track, as soon as the card becomes available."

IMHO, this is a PR coup by Gigabyte to get something into Tom's Hardware. But more importantly, why post this on Slashdot now? Let's see some data first. Let's see the results of the tests.

Re:Dubious Information (2, Insightful)

tanguyr (468371) | more than 9 years ago | (#11108849)

lies, damn lies, benchmarks,... and press releases.

Re:Dubious Information (1)

Hast (24833) | more than 9 years ago | (#11109270)

Well it's Tom's Hardware guide, so what did you expect? Relevant conclusions?

How does PS3/Xbox2/GC Compare? (0, Flamebait)

Anonymous Coward | more than 9 years ago | (#11108787)

PS3 and Xbox2 will have clusters of multiple chips inside. How do they compare to this graphics card?

Re:How does PS3/Xbox2/GC Compare? (0)

Anonymous Coward | more than 9 years ago | (#11109283)

The PS3 chips will be baked and therefore lower in trans fats. The Xbox 2 ones will be fried in sunflower oil and, while trans fat free, will contain higher amounts of saturated fat.

Doom4 (2, Funny)

kompiluj (677438) | more than 9 years ago | (#11108796)

You will need two such cards to play Doom4 in 640x480 at 25 fps :)

Re:Doom4 (1, Funny)

Anonymous Coward | more than 9 years ago | (#11109073)

If you turn off shadows, it'll run at 28 fps but only 98% of the screen will be black.

like siamese weightlifters... (2, Informative)

infiniter (745494) | more than 9 years ago | (#11108809)

this reminds me of the voodoo2 cards. clearly we have hit another speedbump in video technology development, and if history serves as a good model we'll have to see a real revolution in architecture rather than speed before we can start moving away from brute-force improvement again.

Re:like siamese weightlifters... (1)

UWC (664779) | more than 9 years ago | (#11109088)

Keep in mind that Gigabyte is just a company licensed to use nVidia's chipsets in their cards. From what I can tell, it's Gigabyte taking advantage of the SLI capabilities and putting two cards' worth of hardware on a single PCI-Express board. This isn't nVidia trying to squeeze the last bit of power for lack of something better. I'm sure they're hard at work on the GeForce 7x00 chipsets.

No US or Europe release. (4, Funny)

Lethyos (408045) | more than 9 years ago | (#11108810)

...the SLI card will lift current 3DMark2003
record revels by a significant margin...

Unfortunately, it's only available in Asia.

Re:No US or Europe release. (2, Funny)

entrager (567758) | more than 9 years ago | (#11109014)

Old people in Korea like them... or whatever. I'm not good at cliche posts.

hmm (1)

compro01 (777531) | more than 9 years ago | (#11108818)

would you be able to run 2 of these cards for quad-GPU?

*hopes*

Re:hmm (1)

hattig (47930) | more than 9 years ago | (#11109141)

There's no inter-card SLI connector however, as that is used up between the two GPUs.

Whilst you could have two of these cards in your computer, they wouldn't operate together like SLI. Well, not until nVidia puts TWO SLI ports on each GPU, so you could connect them in a ring or something. You're currently stuck with having two cards just to increase the number of monitors you can attach to your system.

Please address... (5, Funny)

Anonymous Coward | more than 9 years ago | (#11108820)

... the following Slashdot community concerns:

1) Does it run under Linux?
2) Even better, can I install Linux on it?
3) Does it increase Firefox's market share?
4) Does it make Bill Gates look bad?
5) Is it in any way related to Star Wars?
6) Will it make my porn look better?

Prompt responses will be greatly appreciated.

-Slashdot

Re:Please address... (1)

WIAKywbfatw (307557) | more than 9 years ago | (#11109017)

You forgot the first one:

0) does it play Ogg Vorbis files?

I'll take two, and dual SLI for quad power! (2, Interesting)

CYDVicious (834329) | more than 9 years ago | (#11108822)

And what are the chances of a dual-GPU PCI-Express card coming out after this, with the ability to be run in dual SLI mode with a second dual-GPU card? ~CYD

Re:I'll take two, and dual SLI for quad power! (0)

Anonymous Coward | more than 9 years ago | (#11109096)

Sure to be the love and joy of your local power company.

Watch that meter spin!

Very nice...but Gigabyte? (1)

victorhooi (830021) | more than 9 years ago | (#11108830)

Looks pretty sweet... Sorry if this is silly, I've been out of the loop, but SLI is Scan Line Interleaving, right?

Anyway, from what I recall, Gigabyte was always known for the budget solutions - does this still hold true? I wonder how good their driver development will be - after all, ATI suffered early on from buggy drivers and the bad publicity which resulted, whilst NVidia managed to attain significant performance gains simply via releasing updated drivers.

Also, anybody know if the Render/DRI/Xorg people have got wind of this? Would be nice if XDamage and Composite could use it...

Bye, Victor

Re:Very nice...but Gigabyte? (1)

Sandbox Conspiracy (836255) | more than 9 years ago | (#11109159)

In this instance, I believe the acronym stands for Scalable Link Interface. Scan Line Interleaving was applicable to the Voodoo2 cards.

Re:Very nice...but Gigabyte? (1)

hattig (47930) | more than 9 years ago | (#11109213)

SLI is not Scan Line Interleaving in the modern nVidia sense. Instead, each card renders half the scene (split by processing load, not a fixed half of the screen; the split is calculated by the drivers), and the slave card sends its rendered half to the other card to merge and output.

SLI should be fully supported by nVidia's drivers; Gigabyte won't have to do a thing, since SLI is an nVidia feature. Even the Linux drivers should support it, as nVidia has a common core for the drivers.

Gigabyte? They're one of the big four motherboard manufacturers these days.
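As a toy illustration of that load-based split (the actual balancing heuristic in nVidia's driver isn't public, so the proportional nudge below is purely an assumption for the sketch):

    def rebalance_split(split, top_ms, bottom_ms, step=0.05):
        # 'split' is the fraction of the frame given to the top GPU.
        # If the top GPU took longer last frame, give it less next frame,
        # and vice versa; clamp so both GPUs always do some work.
        if top_ms > bottom_ms:
            split -= step
        elif bottom_ms > top_ms:
            split += step
        return min(0.9, max(0.1, split))

    # Example: the split drifts until both halves take about the same time.
    split = 0.5
    for top_ms, bottom_ms in [(22.0, 14.0), (20.0, 15.5), (18.5, 17.0)]:
        split = rebalance_split(split, top_ms, bottom_ms)
        print(f"top GPU now renders {split:.0%} of the frame")
    # 45%, 40%, 35%

In practice the driver also has to account for the merge and transfer overhead, which this toy loop ignores.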

what goes around comes around apparently (1)

j14ast (258285) | more than 9 years ago | (#11108861)

*cough* Voodoo 4 and 5 *cough* (there was one by 3dfx with 4 GPUs, though it may have been pre-release)
*cough* some crappy third-party card also hyped on Tom's (the Xabre 800 by XGI, a SiS spin-off) *cough*

Methinks I'll wait and see.

(On a side note, how did SiS not go the way of ALi?)

Re:what goes around comes around aprently (0)

Anonymous Coward | more than 9 years ago | (#11109105)

They got their act together, that's how.

I used to own a PC with SiS graphics that was really terrible. After that experience I wouldn't have touched SiS with a barge pole. Now I know people with Athlon XP systems on SiS motherboards that work Just Fine (tm). In fact this board in question was very cheap, no frills, but is stable and seems quite fast. Better than the *shudder* VIA board I currently own (KT133 I think? Buggy as hell...).

Need Dual AGPs.... (2, Interesting)

muntumbomoklik (806936) | more than 9 years ago | (#11108863)

It really burns my butt to see all these fancy-pants cards being released every few months but since no motherboard manufacturer makes dual-AGP motherboards you can't use your 'old' card as a secondary display and the new one in tandem; you just gotta throw it out, or stick to PCI cards which sucks. Am I the only person who is surrounded by at least 4 monitors at one time and wants more AGP power to the other two?

Re:Need Dual AGPs.... (0, Redundant)

Loligo (12021) | more than 9 years ago | (#11108912)


AGP is old and busted.

PCI Express is the way to go now, and yes, you can have multiple slots per board.

-l

Re:Need Dual AGPs.... (4, Funny)

WIAKywbfatw (307557) | more than 9 years ago | (#11109043)

Slashdot UIDs less than six digits are old and busted.

Six digits plus is the way to go now, and yes, I am taking the piss out of your comment.

Re:Need Dual AGPs.... (4, Funny)

Cheeze (12756) | more than 9 years ago | (#11109285)

Watch your mouth, son....

darn younguns, with their crazy slashdot comments. Back in my day......blah blah blah...gosh durnit.

Re:Need Dual AGPs.... (0)

Anonymous Coward | more than 9 years ago | (#11108935)

It's called PCI Express, look into it.

Re:Need Dual AGPs.... (0)

Anonymous Coward | more than 9 years ago | (#11109138)

Um, how many PCI express boards can you buy right now with two slots?

I know some are being tested and will be released, but I haven't seen any for sale yet. Then again maybe it's because I live in the Arse End of the World...

I don't see the point of PCI Express boards with only one 8x/16x slot personally... it negates the major advantage of the new system.

Re:Need Dual AGPs.... (1)

Milican (58140) | more than 9 years ago | (#11109254)

AGP [computerhope.com] is a point-to-point bus. That is, you cannot have more than one device on the bus, which is why you don't see multiple-AGP-slot solutions. While I'm sure someone could find a way hardware-wise, you would need software to do its part as well. PCI Express will change all of that.

John

Re:You speak the truth... (0)

Anonymous Coward | more than 9 years ago | (#11109272)

AGP is not a bus; it is a one-way connection between system RAM and a video card that allows up to 2.1GB/s of throughput in that direction. It was the answer to the aging PCI bus's inability to satisfy memory throughput requirements. Now AGP is obsolete, and its inability to act like a bus (and allow things like SLI) is one reason it is being replaced.

Since AGP is not a bus, it does not do any of the switching or signaling required to put more than one device on the connection. It would be like trying to put two modems on the same phone line and using them at the same time: they would interfere. AGP is just a way to plug a video card into the system memory controller and RAM.

PCI Express is a serial bus architecture that is full duplex and capable of combining lanes or channels to produce much higher throughput than AGP. It allows low-throughput devices like ethernet cards (200MB/s) to use small connectors, while cards with high throughput demands (3.2GB/s+), like video cards, use a larger 8x or 16x slot.

There are no dual-AGP motherboards because AGP is not a bus; you cannot have more than one device on it. If you want dual cards, get a PCI Express motherboard with SLI support and two 16x connectors.
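To put rough numbers on that (first-generation PCI Express moves about 250MB/s per lane in each direction; the rest below is just multiplication):

    PCIE_LANE_MB_S = 250              # first-gen PCI Express, per lane, per direction

    def pcie_bandwidth_mb_s(lanes):
        # Aggregate one-way bandwidth scales linearly with the lane count.
        return lanes * PCIE_LANE_MB_S

    print("PCIe x1: ", pcie_bandwidth_mb_s(1), "MB/s")    # ~ethernet-card territory
    print("PCIe x16:", pcie_bandwidth_mb_s(16), "MB/s")   # 4000 MB/s each direction
    print("AGP 8x:  ", 2100, "MB/s, one direction only")  # the 2.1GB/s figure above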

3DMark Scores (0)

Anonymous Coward | more than 9 years ago | (#11108875)

sarcasm on: Of course the only reason I'd buy this card is *just* so I can up my 3DMark Score. sarcasm off

No I didn't read the article, just felt it was silly that they pointed that out.

The Obvious.. (1, Interesting)

mavi_yelken (801565) | more than 9 years ago | (#11108891)

When will it be available?

Seems good, but not the first (1)

adler187 (448837) | more than 9 years ago | (#11108906)

FTA: "Gigabyte creates first dual-GPU graphics card"

In the infamous words of Bill Lumbergh: "Riiiight," I think they forgot about this [google.com] card.

Maybe they meant to say, Gigabyte creates first dual-GPU nVidia card, or some such. Or, maybe since only nVidia uses the term GPU (ATi uses VPU) it is implied that it is an nVidia card and thus the first dual-GPU nVidia card.

Blue light special on slot 9... (2, Insightful)

doorbot.com (184378) | more than 9 years ago | (#11108913)

Oh I see that these latest cards are finally taking the modder's advice and adding integrated blue LEDs, for that extra burst of raw rendering power.

I know that people are cutting holes in their cases so people can admire their wiring, but I'd like to pay a bit less and save the R&D costs on the appearance-enhancing design. Plus, if this is a budget card, will appearance matter as much? It's like putting nice rims on a Yugo: I see the point, but you're not fooling anyone.

Re:Blue light special on slot 9... (1)

stienman (51024) | more than 9 years ago | (#11109091)

If they put the bling on their card, fewer modders will try doing the work themselves, and there will be fewer 'warranty' returns that wouldn't have failed without that extra little bit of help.

Plus, consider your market. Many (if not most) gamers who will pay for high-end equipment want it to be easily distinguishable from low-end equipment so they can show off their stuff. How many gamers paying $600 for a video card can recite exactly what's in their system at any given time? A very high percentage. Making that visible sells the card.

-Adam

Sucker Gamers. (-1, Flamebait)

Anonymous Coward | more than 9 years ago | (#11108922)

and will be sold as a "luxury solution" for gamers by mid-January.

Gamers are the new suckers of this industry, compelled by corporations to sink more money every month into hardware and games.

Bored of seeing graphics card benchmarks... (1, Insightful)

Anonymous Coward | more than 9 years ago | (#11108948)

Bored of seeing graphics card benchmarks...

Tell me when CRT and LCD makers start including this graphics stuff in their displays.

All those windowing kits... themes... game engines... font engines... fonts... need to be with the display.

What I am saying is: just move my graphics card out of my PC and bring in some standard, so that I can connect my computer to any 'standard' display - be it a cell phone... a gel phone... a huge projector... a TV or a car dashboard display.

Re:Bored of seeing graphics card benchmarks... (1)

Anonymous Coward | more than 9 years ago | (#11109219)

What you are suggesting (connecting to ANY display) would be about a million times easier by using the analog or digital VGA signal that is output by the graphics card exactly as it is designed today. In fact, today you can usually do that with any computer monitor, any modern projector and even a lot of new TVs!

The possibilities are mind boggling... (1)

William_Lee (834197) | more than 9 years ago | (#11108964)

Just imagine a beowulf cluster of these! /.ers everywhere rejoice, the pr0n holodeck is one step closer to reality...

I'm still waiting... (2, Funny)

Anonymous Coward | more than 9 years ago | (#11108995)

For the Bitboys card I pre-ordered.

Yes, but will there be drivers for Linux (0)

RichiP (18379) | more than 9 years ago | (#11108999)

My main gaming rigs all run on Linux. Will they support that platform? If they do, I'll have one on order this Christmas even if I have to ship it from Taiwan.

Re:Yes, but will there be drivers for Linux (0)

Anonymous Coward | more than 9 years ago | (#11109214)

You've gone to the effort to build multiple gaming rigs and then put Linux on them? What kind of gamer are you? I think you're full of shit.

Why do they have to reinvent the wheel .. (1)

sundru (709023) | more than 9 years ago | (#11109031)

I couldn't help noticing that Gigabyte had "patent pending" on it, the SOBs. 3DFX's Voodoo 5 had this technology 4 years ago; it doesn't make sense - it would have been a mature technology by now. Apparently Nvidia decided it was not to be undercut and is slowly reinventing the wheel now. Bring back Voodoo. It's like digesting the mother of all graphics cards and spitting out yellow-looking blobs... SOBs, aargghhh.

Re:Why do they have to reinvent the wheel .. (0)

Anonymous Coward | more than 9 years ago | (#11109295)

Are you just trying to boast that you've got enough experience to remember 3dfx? If you do, then you'll remember that what they were doing 4 years ago was a desperate attempt to eke more performance out of an architecture that couldn't scale any further. They needed a breakthrough with a new architectural approach, but they couldn't beat the pressure from nVidia.

3dfx failed to be creative and failed to keep up the pace. Why would you want to bring them back? They got bought by nVidia for a reason. RIP.

Ummm... noise? (1)

Leomania (137289) | more than 9 years ago | (#11109067)

I just spent a few buckazoids buying an Arctic Cooler for my 9800 Pro to quiet it down, and it had just one medium-speed fan. I can't imagine what this beast will sound like.

Okay, well, I guess I can...

- Leo

Performance (0)

Anonymous Coward | more than 9 years ago | (#11109161)

Who cares about Linux? What a waste. It's not like there are any decent games for Linux. Can it run EQ2 in high-quality mode fluidly - the only reason I use Windows?

Two GPUs? (1)

Slime-dogg (120473) | more than 9 years ago | (#11109166)

How the hell am I supposed to water-cool that? My box got uber-hot once I stuck a GeForce 5900XT in, combined with my HDs and CPU. I got a water-cooling system to combat this problem. Do they even make graphics card water blocks that support multiple chips?

Re:Two GPUs? (0)

Anonymous Coward | more than 9 years ago | (#11109318)

Not yet...?

Just Dual? (1)

coopaq (601975) | more than 9 years ago | (#11109190)

Well if they put another SLI connector on this card and I could buy two of these cards and run a total of 4 GPUs I would be very impressed!

Hey Gigabyte, Are you listening to this?

Not Excessive Enough (1)

yeremein (678037) | more than 9 years ago | (#11109233)

So this is basically two 6600GT cards glued together on one PCB. That's all well and good, but it's barely faster than one 6800 Ultra. It would probably be slower when gobs of video memory are required, because the quoted 256MB on a 256-bit bus is really split half and half between the two GPUs, and texture data will have to be duplicated in both, so there's less usable video memory on one of these contraptions than on a single-GPU 256MB card.

Two 6800s on a single card would be nice, though, as it would obviate the need to get a special SLI motherboard. Of course, if you could afford to buy such a monstrosity (and the gargantuan power supply and tornado of case fans you'd need too), you could probably afford a new motherboard too.
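Roughly, under the assumption above that textures must be duplicated on both GPUs (the 32MB set aside for framebuffer/render targets is just an illustrative number):

    def usable_texture_mb(total_mb, gpus, framebuffer_mb_per_gpu=32):
        # Each GPU only sees its own slice of the memory, and every texture
        # has to live in every slice, so one slice bounds usable texture space.
        per_gpu = total_mb / gpus
        return per_gpu - framebuffer_mb_per_gpu

    print(usable_texture_mb(256, 2), "MB of textures on the dual-GPU card")        # 96.0
    print(usable_texture_mb(256, 1), "MB of textures on a single-GPU 256MB card")  # 224.0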

Re:Not Excessive Enough (1)

KarmaMB84 (743001) | more than 9 years ago | (#11109335)

The idea is that this is faster than a single 6800 and supposedly cheaper. That alone makes it an attractive alternative to the 6800. It depends on how much cheaper though.