
New Multi-GPU Technology With No Strings Attached

kdawson posted more than 6 years ago | from the here-process-this dept.

Graphics 179

Vigile writes "Multi-GPU technology from both NVIDIA and ATI has long depended on many factors, including specific motherboard chipsets, and has forced gamers to buy similar GPUs within a single generation. A new company called Lucid Logix is showing off a product that could potentially allow vastly different GPUs to work in tandem while still promising near-linear scaling on up to four chips. The HYDRA Engine is dedicated silicon that dissects DirectX and OpenGL calls and modifies them directly to be distributed among the available graphics processors. That means the aging GeForce 6800 GT card in your closet might be useful once again, and the future of one motherboard supporting both AMD and NVIDIA multi-GPU configurations could be very near."
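To make the summary concrete: the basic idea is to sit between the game and the driver, catch each frame's draw calls, hand each call to whichever GPU is least backed up, and composite the partial results at the end. The sketch below is a purely illustrative C++ toy (the DrawCall/GpuQueue types and the least-backlog policy are invented here); it is not Lucid's actual HYDRA design, which the article does not disclose.

// Illustrative toy only: split a frame's draw calls across GPUs of unequal
// speed by always dispatching to the card with the smallest pending backlog.
// Types, names and numbers are invented; this is not Lucid's algorithm.
#include <cstdio>
#include <vector>

struct DrawCall { int id; int cost; };   // cost: abstract units of GPU work

struct GpuQueue {
    const char* name;
    int speed;                // relative throughput (bigger = faster card)
    long long pending;        // work queued but not yet finished, in cost units

    // Rough time this card needs to drain everything already queued on it.
    double backlog() const { return static_cast<double>(pending) / speed; }
};

int main() {
    std::vector<GpuQueue> gpus = { {"old GeForce 6800 GT", 1, 0},
                                   {"newer GeForce 8800 GT", 3, 0} };
    std::vector<DrawCall> frame;
    for (int i = 0; i < 12; ++i) frame.push_back({i, 10 + (i % 4) * 5});

    // Dispatch each intercepted call to the GPU whose queue drains soonest,
    // so the faster card naturally ends up with more of the frame.
    for (const DrawCall& dc : frame) {
        GpuQueue* best = &gpus[0];
        for (GpuQueue& g : gpus)
            if (g.backlog() < best->backlog()) best = &g;
        best->pending += dc.cost;
        std::printf("draw %2d (cost %2d) -> %s\n", dc.id, dc.cost, best->name);
    }

    // In a real system the partial framebuffers would be composited here.
    for (const GpuQueue& g : gpus)
        std::printf("%s received %lld work units\n", g.name, g.pending);
}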


179 comments


No strings? (5, Funny)

Plantain (1207762) | more than 6 years ago | (#24666643)

If there's no strings, how are they connected?

Re:No strings? (2, Funny)

Anonymous Coward | more than 6 years ago | (#24666659)

It's a GPU orgy. They'll find a way to connect.

Re:No strings? (5, Funny)

x2A (858210) | more than 6 years ago | (#24666699)

The theory will fit, there will be strings, we'll add more dimensions if we need to.

Re:No strings? (1, Funny)

Anonymous Coward | more than 6 years ago | (#24666711)

Tubes. Definitely a series of tubes.

Re:No strings? (0)

Anonymous Coward | more than 6 years ago | (#24666747)

Duct Tape.

Re:No strings? (1)

Curtman (556920) | more than 6 years ago | (#24666751)

Tubes.. A whole series of them.

Re:No strings? (4, Funny)

philspear (1142299) | more than 6 years ago | (#24666833)

I'm sure there's a superstring theory joke in here somewhere. Unfortunately I don't understand string theory. I guess it's okay, since apparently no one else does either.

I guess I'll just reference XKCD

http://xkcd.com/171/ [xkcd.com]

Q: OK, what would that imply? (3, Funny)

smitty_one_each (243267) | more than 6 years ago | (#24667211)

A: Satriani is a messenger from God.

Re:No strings? (0)

Anonymous Coward | more than 6 years ago | (#24668039)

Well, there is that Michio Kaku guy http://www.mkaku.org/ [mkaku.org]; I think he understands string theory.
My question is 'How long is a piece of string?'

AMD and NVIDIA?? (2, Interesting)

iduno (834351) | more than 6 years ago | (#24666645)

is that suppose to be ATI and NVIDIA

No (4, Informative)

x2A (858210) | more than 6 years ago | (#24666675)

ATI were bought out by AMD, so future ATI GPUs will be released by AMD.

Re:No (1)

iduno (834351) | more than 6 years ago | (#24667691)

Guess that shows how long it's been since I've bought a graphics card (or even played games)

Re:AMD and NVIDIA?? (1)

Deltaspectre (796409) | more than 6 years ago | (#24666677)

Re:AMD and NVIDIA?? (2, Funny)

Spatial (1235392) | more than 6 years ago | (#24666827)

Shit, it's last year already?

Re:AMD and NVIDIA?? (1)

maxume (22995) | more than 6 years ago | (#24666951)

For some people, it is already tomorrow.

Re:AMD and NVIDIA?? (1)

Ant P. (974313) | more than 6 years ago | (#24666843)

Is that supposed to be "supposed"?

Re:AMD and NVIDIA?? (0, Redundant)

gEvil (beta) (945888) | more than 6 years ago | (#24667011)

Is that supposed to be "supposed"?

I suppose so....

Re:AMD and NVIDIA?? (0, Redundant)

Surt (22457) | more than 6 years ago | (#24667171)

I suppose you meant is that "suppose" supposed to be "supposed"

Interesting (4, Informative)

dreamchaser (49529) | more than 6 years ago | (#24666649)

I gave TFA a quick perusal and it looks like some sort of profiling is done. I was about to ask how it handles load balancing when using GPUs of disparate power, but perhaps that has something to do with it. It may even run some type of micro-benchmarks to determine which card has more power and then distribute the load accordingly.

I'll reserve judgement until I see reviews of it really working. From TFA it looks like it has some interesting potential capabilities, especially for multi-monitor use.

Re:Interesting (3, Insightful)

argent (18001) | more than 6 years ago | (#24666731)

It seems to be using feedback from the rendering itself. If one GPU falls behind, it sends more work to the other GPU. It may have some kind of database of cards to prime the algorithm, but there's no reason it has to run extra benchmarking jobs.

Re:Interesting (1)

dreamchaser (49529) | more than 6 years ago | (#24666805)

True, but I was thinking that the software could profile/benchmark both cards at startup. What you say makes sense though, and like I said I only gave TFA a cursory viewing.

Re:Interesting (1)

MadnessASAP (1052274) | more than 6 years ago | (#24668351)

Especially since different cards will perform different operations faster than others. One I specifically ran into was the nVidia 5(6?)600, which, although a fast card at the time, had a PS 2.0 implementation so slow as to be practically unusable; many games overrode it and forced it back to PS 1.1.

Re:Interesting (4, Informative)

x2A (858210) | more than 6 years ago | (#24667093)

"It seems to be using feedback from the rendering itself"

Yep, it does look like it's worked out dynamically; the article states that you can start watching a movie on one monitor while a scene is rendering on another, and it will compensate by sending fewer tasks to the busy card. The simplest way I'd assume to do this would be to keep feeding tasks into each card's pipeline until the scene is rendered. If one completes tasks quicker than the other, it will get more tasks fed in. I guess you'd either need to load the textures into all cards, or the rendering of sections of the scene would have to be decided in part by which card already has the textures it needs in its texture memory.

I guess we're not gonna know a huge amount as these are areas they're understandably keeping close to their chests.
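Here is a tiny sketch of that feedback loop, with made-up numbers: time how long each card took on the previous frame, infer each card's effective throughput from those timings, and nudge the next frame's split toward whichever card finished sooner. When one GPU gets busy (say, decoding a movie) its measured times rise and its share shrinks. Purely illustrative; not Lucid's implementation.

// Toy feedback balancer: split each frame between two GPUs based on how fast
// each one handled the previous frame. All numbers are invented.
#include <cstdio>

int main() {
    double share0 = 0.5;                 // fraction of the frame given to GPU 0
    double speed0 = 1.0, speed1 = 3.0;   // hypothetical relative throughputs

    for (int frame = 0; frame < 8; ++frame) {
        if (frame == 4) speed1 = 1.5;    // GPU 1 gets busy (e.g. movie playback)

        // "Measured" render times: work assigned divided by current throughput.
        double t0 = share0 / speed0;
        double t1 = (1.0 - share0) / speed1;
        std::printf("frame %d: GPU0 share %.2f, times %.3f / %.3f\n",
                    frame, share0, t0, t1);

        // Infer throughput from the measurements, then move the split toward
        // the ratio that would make both cards finish at the same time.
        double est0 = share0 / t0;
        double est1 = (1.0 - share0) / t1;
        double target = est0 / (est0 + est1);
        share0 = 0.7 * share0 + 0.3 * target;   // smooth to avoid oscillation
    }
}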

Re:Interesting (1)

The Ancients (626689) | more than 6 years ago | (#24666803)

Perhaps it keeps shovelling work to each GPU until it notices an increase in time getting the results back, and then surmises that GPU is at its maximum load?

Re:Interesting (2, Interesting)

TerranFury (726743) | more than 6 years ago | (#24668119)

I gave TFA a quick perusal

FYI, this is a very common mistake in English, and loath though I am to be a vocabulary Nazi, I think pointing it out here might benefit other Slashdot readers:

The verb "to peruse" means to read thoroughly and slowly. It does not mean "to skim" -- quite the opposite!

(Unfortunately, it seems that even the OED is giving up this fight, so maybe I should too.)

That's it for this post. Cheers!

quick (0)

Sir_Real (179104) | more than 6 years ago | (#24666665)

Someone port java to opengl.

Seriously. That would rock.

Re:quick (2, Insightful)

jfim (1167051) | more than 6 years ago | (#24666743)

Someone port java to opengl.

Seriously. That would rock.

You can use OpenGL from Java with JOGL [java.net] . Or were you thinking of running a Java or J2EE stack on your GPU? (That would be a really bad idea, in case there were any doubts)

Re:quick (1)

lee1026 (876806) | more than 6 years ago | (#24667141)

For extremely parallel code, why not?

Re:quick (1)

x2A (858210) | more than 6 years ago | (#24667239)

GPUs are very good at running a series of highly parallel maths... but what about the comparisons and conditional jumps required for general computing erm... stuff?

I dunno, think that used to be the case, not sure whether it still is but I'd guess so.

Re:quick (1)

JuzzFunky (796384) | more than 6 years ago | (#24667971)

Have a look at NVIDIA's CUDA [nvidia.com] for an example of General Purpose GPU processing.

Re:quick (1)

MadnessASAP (1052274) | more than 6 years ago | (#24668379)

Sort of. CUDA requires that every process complete in the same time. So something like a ray tracing algorithm that will complete in the same amount of time regardless of inputs works great, whereas some other systems that use branching and looping won't work.

Re:quick (2, Informative)

x2A (858210) | more than 6 years ago | (#24668413)

A quick glance does look like this is the case. It's like having a load of extra ALUs which can speed up number crunching in apps where the same or similar actions need to be performed against a series of values, such as working with matrices, FFTs, signal encoding/decoding. But GP computing also needs flow control: conditional branching, which still needs the main CPU. (Memory management also, but GPUs do have at least basic memory management, as they have increasingly large chunks of memory for caching textures etc.) Setting up the GPU to do stuff for ya takes overhead, which pays off if you're using it enough, so yeah, being able to write functions that take advantage of it from languages like Java could be beneficial, but you couldn't port the actual Java VM over to the GPUs.

(but still with the disclaimer "from a quick glance" - deeper inspection may prove otherwise)
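The "setup overhead only pays off if you use it enough" point fits a back-of-envelope cost model: offload only when the per-element saving outweighs the fixed cost of uploading data, launching the work and reading results back. Every constant below is invented for illustration; real numbers depend entirely on the hardware and workload.

// Toy cost model for deciding whether GPU offload is worth it.
// All timing constants are hypothetical.
#include <cstdio>

int main() {
    const double setup_us   = 200.0;   // fixed cost per offload (upload/launch/readback)
    const double cpu_per_el = 0.050;   // microseconds per element on the CPU
    const double gpu_per_el = 0.002;   // microseconds per element on the GPU

    for (long n : {1000L, 10000L, 100000L, 1000000L}) {
        double cpu = n * cpu_per_el;
        double gpu = setup_us + n * gpu_per_el;
        std::printf("n=%7ld  cpu=%9.1f us  gpu=%9.1f us  -> %s\n",
                    n, cpu, gpu, gpu < cpu ? "offload" : "stay on CPU");
    }
}

With these made-up constants the break-even point is a few thousand elements; below that the fixed setup cost dominates and the work is better left on the CPU.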

Re:quick (0)

Anonymous Coward | more than 6 years ago | (#24667293)

No, the other way around, so you don't just end up with slow OpenGL.

Re:quick (2, Funny)

billcopc (196330) | more than 6 years ago | (#24668245)

No no, you're thinking "port OpenGL to Java". I want to see a Java VM written in OpenGL shader language.

Maybe having 384 lame little stream processors will make Java fast enough to compete with... um... Borland Pascal on a Cyrix 386.

Non-Windows drivers? (4, Insightful)

argent (18001) | more than 6 years ago | (#24666693)

Can it work with Linux or OS X?

Re:Non-Windows drivers? (0, Troll)

Anonymous Coward | more than 6 years ago | (#24666855)

Yeah cause so many Mac users customize their Mac.. and the Linux driver ABI is so stable it's worth doubling the development time to support it.

Re:Non-Windows drivers? (0)

Anonymous Coward | more than 6 years ago | (#24668309)

Mark this as a troll if you wish, but it wasn't an attack on the OP, only on the Linux community's attitude towards driver developers (and application developers in general). While Linux remains fragmented into a dozen distributions, each running different kernels with different patches, and there is no stable ABI for driver developers or application developers, you won't see any decent drivers or closed-source apps from anything but the largest companies.

It makes absolutely no business sense to develop a driver for a piece of hardware on Linux if a) you'll have a constant stream of people complaining that it doesn't work with their distribution, because there are so many configurations, and b) your driver will be broken in the near future when Linus or someone decides to randomly tweak the ABI. Nvidia did it with graphics drivers... and they still break all the time, resulting in fun times at the console whenever I update my Ubuntu. Adobe did it with Flash... and it worked until people decided to have 64-bit distros and include 64-bit web browsers (because yeah, maybe I want more than 4 GB of RAM for my browser), when all the browsers on Windows and Mac OS are 32-bit and porting Flash to x64 would require rewriting all of the just-in-time compilation code in there to compile user scripts to 64-bit code rather than 32-bit.

And what do these companies get in response? Just calls from the community to open-source their code. Apart from giving away their intellectual property to competitors, this solution also involves hoping that some hobbyist programmers do a decent job tweaking your code to work in their particular configurations, and then dealing with all the support issues that come in when the modified code breaks.

Re:Non-Windows drivers? (1)

Ant P. (974313) | more than 6 years ago | (#24666873)

They'll have no choice if they want to stay in business. PhysX made windows-only hardware, and look what happened to them (eaten by nVidia).

Re:Non-Windows drivers? (1, Insightful)

Anonymous Coward | more than 6 years ago | (#24667645)

Ugh... the fact Ageia didn't do so hot and was bought by nVidia had NOTHING to do with the fact they didn't make drivers for OS X or Linux (i.e. 10% or less of the market); it's because there was no need for dedicated hardware, since it could be incorporated into the GPU, as has been done in nVidia's latest GTX 2x0 series of cards.

Re:Non-Windows drivers? (-1, Troll)

Anonymous Coward | more than 6 years ago | (#24668335)

No shit, idiot. Way to miss the joke.

I WOULD LIKE TO SAY THANKS TO A SLASHOLE (-1, Offtopic)

Anonymous Coward | more than 6 years ago | (#24666701)

To whomever told me that I could "skip the hotel bill" and just camp out at a KOA in Florida while I was in town to see John McCain's speech, I'd like to say Thank You. And Fuck You Very Much.

The amount of rain they get in Florida is unbelievable. I don't think I can even put it in words. WTF. This is the most hostile camping location I've ever been to and that says a LOT. How do people in Florida put up with this shit? It's been doing this all day!! Worst location ever!

Not to mention my wife is with me. She is pretty fucking pleased. I'm not sure if I'll die from exposure to this fucking weather and wind, or from exposure to her caustic glare. WTF Florida!?

So to that helpful slashdot poster, rest assured, let me just say GFYS and good day. I said good day sir.

Re:I WOULD LIKE TO SAY THANKS TO A SLASHOLE (0, Offtopic)

mr_mischief (456295) | more than 6 years ago | (#24667841)

Watch the forecasts. Tropical storms that they're talking about as potentially becoming hurricanes tend to dump a bunch of rain.

Math Coprocessor (1, Funny)

Anonymous Coward | more than 6 years ago | (#24666749)

Ack! Will this outdate my math coprocessor?

Re:Math Coprocessor (2, Funny)

9Nails (634052) | more than 6 years ago | (#24667379)

Your math co would still be good, but your turbo switch will need to be set to off.

Re:Math Coprocessor (1)

mr_mischief (456295) | more than 6 years ago | (#24667859)

What about my clock potentiometer?

Latency. (0, Troll)

Anonymous Coward | more than 6 years ago | (#24666759)

Latency will be a problem. All that extra message passing and emulation layers.

Already, most Windows 3d games leave me feeling a little disconnected compared to DOS games.
The sound effects and graphics always lag behind the input a little.

Try playing doom in DOS with a soundblaster, then try a modern windows game. With doom you hear and see the gun go off when you hit the fire button. In a modern 3d game, you don't.

I've experienced the same thing over a number of different computers.

Re:Latency. (3, Insightful)

Bill, Shooter of Bul (629286) | more than 6 years ago | (#24666901)

What is this, 1996? That was true of Doom vs WinDoom before DirectX. I don't think I've had a problem since then.

Re:Latency. (0)

Anonymous Coward | more than 6 years ago | (#24668409)

It's just how trolling is nowadays.

People are so web weary that to troll effectively you often end up being so subtle as to be either 'interesting' or 'informative'.

Re:Latency. (4, Informative)

Mprx (82435) | more than 6 years ago | (#24666975)

I agree this is a common problem in modern games; see http://www.gamasutra.com/view/feature/1942/programming_responsiveness.php [gamasutra.com]

Don't confuse control latency with reaction time. Reaction time will be at least 150ms for even the best players, but humans can notice time delays much smaller than best reaction time. A good rhythm game player can hit frame exact timing at 60fps -- a 17ms time window. With low enough latency the game character feels like a part of your own body, rather than something you are indirectly influencing.

The same thing applies to GUIs, and only a very short delay will destroy that feeling of transparency of action. I never actually used BeOS myself, but I read that it was designed with low interface latency as a priority, which was why it got such good reviews for user experience.
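For reference, the arithmetic behind those figures: a 60 fps frame lasts about 16.7 ms, so "frame exact" really is roughly a 17 ms window, and every extra frame of queueing or display lag stacks another ~17 ms onto the input path. The individual latency numbers in the sketch below are illustrative guesses, not measurements.

// Frame-budget arithmetic only; the latency figures are hypothetical examples.
#include <cstdio>

int main() {
    const double frame_ms = 1000.0 / 60.0;
    std::printf("one 60 fps frame = %.1f ms\n", frame_ms);

    const double input_poll = 8.0;            // e.g. USB polling / OS input
    const double prerender  = 2 * frame_ms;   // e.g. two queued frames in the driver
    const double display    = 40.0;           // e.g. a slow LCD's input lag
    const double total      = input_poll + prerender + display;
    std::printf("example input-to-photon latency: %.1f ms (about %.1f frames)\n",
                total, total / frame_ms);
}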

Re:Latency. (1, Interesting)

Anonymous Coward | more than 6 years ago | (#24667637)

So reaction times are about 150ms, but if you know in advance, you can hit 17ms?

That's some interesting stuff.
As far as rhythm games go, I make my living as a musician. Most instruments.

I wonder if I have got into the habit of intending physical actions in advance, so reaction time is less important than latency.

In that article they mention frame jitter as well, which I think is the thing that bothers me most. My usual instruments don't do that.

Re:Latency. (2, Interesting)

Mprx (82435) | more than 6 years ago | (#24667771)

Absolutely, and I'm sure musicians will be most sensitive to these effects. Performance isn't just playing the notes as written, it's about phrasing/groove/swing/etc., and this requires very subtle and precise timing. And just like frame timing jitter in a game, it contributes to "feeling" of quality in a way untrained listeners will likely notice without being able to explain.

Re:Latency. (1)

Bill, Shooter of Bul (629286) | more than 6 years ago | (#24667707)

The article was interesting, but it doesn't explain why Doom would feel like it had less latency. Wouldn't the main loop be the same thing?

Re:Latency. (1)

RulerOf (975607) | more than 6 years ago | (#24667883)

The same thing applies to GUIs,

And that's the only feature the iPhone has that I would readily sacrifice a hundred virgins to obtain on my Blackberry.

Re:Latency. (4, Informative)

Colonel Korn (1258968) | more than 6 years ago | (#24666979)

Latency will be a problem. All that extra message passing and emulation layers.

Already, most Windows 3d games leave me feeling a little disconnected compared to DOS games.
The sound effects and graphics always lag behind the input a little.

Try playing doom in DOS with a soundblaster, then try a modern windows game. With doom you hear and see the gun go off when you hit the fire button. In a modern 3d game, you don't.

I've experienced the same thing over a number of different computers.

Most monitors have about a 30-50 ms input lag, meaning the image is always a frame or two behind in most modern games. You can get a 0-5 ms input lag monitor, though; the DS-263n is a good example. I felt like everything was lagged ever since I switched to LCDs, but once I picked up the 263, that feeling was gone. The feeling of sound lagging input could be a different issue or it could be psychological.

Re:Latency. (2, Funny)

Ironchew (1069966) | more than 6 years ago | (#24667433)

The feeling of sound lagging input could be a different issue or it could be psychological.

Or in 30-50 milliseconds it could be, y'know, the speed of sound.

Re:Latency. (1)

porl (932021) | more than 6 years ago | (#24667721)

....if you are sitting about 10-15m from your speakers......

you must have a pretty big gaming setup... :)

Re:Latency. (1)

Jimbob The Mighty (1282418) | more than 6 years ago | (#24667803)

Apologies for my ignorance, but is this an issue with LCD monitors only? Is being guttertech and still using a CRT helping me to remain an LPB?

Re:Latency. (3, Informative)

MaineCoon (12585) | more than 6 years ago | (#24667949)

Yes, CRTs have something like 1-2 ms latency + refresh rate.

Re:Latency. (2, Informative)

Tanman (90298) | more than 6 years ago | (#24667111)

This is most likely due to a feature on newer cards and drivers whereby the video card actually renders ahead of time. They have algorithms to predict what the next 3-5 frames will be, and the GPU uses extra cycles to go ahead and render those. That way, if something happens, like a high-geometry object popping out and getting loaded into memory, the video card has a buffer before the user notices a drop in their framerate.

The drawback? You get control lag. Newer drivers let you adjust this or even set it to 0 -- but you will notice that your overall FPS will decrease if you set it to 0 because the card cannot optimize ahead-of-time that way.
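A toy model of that render-ahead queue, with invented timings: letting the CPU run several frames ahead of the GPU hides CPU work and raises the frame rate, but input sampled when a frame was built is only shown when that frame finally reaches the screen, so control lag grows with the queue depth. This is a simplified sketch, not how any particular driver implements its render-ahead setting.

// Toy render-ahead model. 'ahead' = how many frames the CPU may be working
// ahead of the GPU; ahead=1 means the CPU only starts a frame once the
// previous one has been drawn. All timings are invented.
#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    const int    frames = 60;
    const double cpu_ms = 5.0, gpu_ms = 16.0;   // a GPU-bound scenario

    for (int ahead : {1, 3}) {
        std::vector<double> cpu_done(frames), gpu_done(frames);
        double worst_lag = 0;
        for (int i = 0; i < frames; ++i) {
            // The CPU may not start frame i until frame i-ahead has left the GPU.
            double gate  = (i >= ahead) ? gpu_done[i - ahead] : 0.0;
            double start = std::max(i ? cpu_done[i - 1] : 0.0, gate);
            cpu_done[i]  = start + cpu_ms;      // input was sampled at 'start'
            gpu_done[i]  = std::max(cpu_done[i],
                                    i ? gpu_done[i - 1] : 0.0) + gpu_ms;
            worst_lag    = std::max(worst_lag, gpu_done[i] - start);
        }
        std::printf("run-ahead=%d: %.1f fps, worst input-to-display lag %.0f ms\n",
                    ahead, frames * 1000.0 / gpu_done.back(), worst_lag);
    }
}

With these made-up numbers the deeper queue gains roughly 15 fps but more than doubles the worst-case lag, which is exactly the trade-off the parent describes.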

obligatory ... (1, Funny)

bwthomas (796211) | more than 6 years ago | (#24666773)

...

All your GPU are belong to Lucid!

(sorry guys)

no Strings attached (4, Funny)

Daimanta (1140543) | more than 6 years ago | (#24666837)

what is attached though:

ints
booleans
longs
short
bytes

Re:no Strings attached (1)

Michael Hunt (585391) | more than 6 years ago | (#24667255)

float4x4s for the most part, I'd be guessing

Re:no Strings attached (1)

Annymouse Cowherd (1037080) | more than 6 years ago | (#24667327)

char*s

Re:no Strings attached (3, Funny)

lolwhat (1282234) | more than 6 years ago | (#24667831)

what is attached though:

ints booleans longs short bytes

What about lists, you insensitive clod! I call Lisp discrimination.

Time to make them incompatible! (3, Insightful)

TibbonZero (571809) | more than 6 years ago | (#24666839)

So it's obvious that these cards could have been working together now for some time. They aren't as incompatible as AMD and NVidia would like us to think. Of course this leaves only one course of action: they must immediately do something "weird" in their next releases to make them no longer compatible.

Re:Time to make them incompatible! (1)

zappepcs (820751) | more than 6 years ago | (#24666893)

I thought that too; hopefully NVidia turns around and says "OK, here is a cheaper version of model xyz, stripped down a bit, so you can afford to buy 4 from us for this little trick"

I'm not holding my breath but it would be worth the money methinks, even if they only sold them in 4 packs.

Re:Time to make them incompatible! (4, Interesting)

im_thatoneguy (819432) | more than 6 years ago | (#24667135)

Don't you mean weird(er)?

The reason NVidia requires such symmetrical cards isn't just because of speed and frame buffer synchronization, but also because different cards render the same scene slightly differently. This is the reason why OpenGL rendering isn't used very often in post production. You can't have two frames come back to you with slightly different gammas, whitepoints, blending algorithms etc etc.

I'm actually very, very curious how they intend to resolve every potential source of image inconsistency between frame buffers. It seems like it would have to almost use the 3D cards abstractly as a sort of CPU acceleration unit, not an actual framebuffer generator.

Re:Time to make them incompatible! (3, Informative)

LarsG (31008) | more than 6 years ago | (#24667371)

From the screen shots and the description, it sounds like this thing takes the d3d (or ogl) instruction stream, finds tasks that can be done in parallel and partitions them up across several cards. Then it sends each stream to a card, using the regular d3d/ogl driver for the card. At the back end it merges the resulting framebuffers.

What I'd like to know is how they intend to handle a situation where the GPUs have different capabilities. If you have a dx9 and a dx10 card, will it fall back to the lowest common denominator?

Also, what about cards that produce different results? Say, two cards that do anti-aliasing slightly differently. The article says that Hydra will often change the work passed off to each card (or even the strategy for dividing work amongst the cards) on a frame-by-frame basis. If they produce different results you'd end up with flicker and strange artefacts.

Sounds like interesting technology but unless they get all those edge cases right...
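The lowest-common-denominator question can be made concrete with a small sketch: if any attached card might end up rendering any part of the frame, the capabilities exposed to the game can only be the intersection of what every card supports. The struct and example values below are invented for illustration; the article does not say how (or whether) Hydra actually negotiates this.

// Toy capability intersection across mismatched cards. Invented struct and data.
#include <algorithm>
#include <cstdio>
#include <vector>

struct GpuCaps {
    const char* name;
    int shader_model;       // e.g. 3 for a DX9-class card, 4 for DX10-class
    int max_texture_size;
};

int main() {
    std::vector<GpuCaps> cards = {
        {"DX10-class card", 4, 8192},
        {"DX9-class card",  3, 4096},
    };

    GpuCaps exposed = cards[0];
    exposed.name = "capabilities exposed to the game";
    for (const GpuCaps& c : cards) {
        exposed.shader_model     = std::min(exposed.shader_model, c.shader_model);
        exposed.max_texture_size = std::min(exposed.max_texture_size, c.max_texture_size);
    }
    std::printf("%s: shader model %d, max texture %d\n",
                exposed.name, exposed.shader_model, exposed.max_texture_size);
}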

Re:Time to make them incompatible! (1)

x2A (858210) | more than 6 years ago | (#24667193)

"aren't as incompatible as AMD and NVidia would like us to think"

Nope, only incompatible enough to need extra hardware, translation layers, drivers, and all the R&D required to produce 'em...

Re:Time to make them incompatible! (1)

Spatial (1235392) | more than 6 years ago | (#24667273)

You can already pair up unmatched AMD cards as long as they both have crossfire capability. I can't recall the exact details offhand, but I remember reading benchmarks of it on Anandtech some time ago.

Re:Time to make them incompatible! (0)

Anonymous Coward | more than 6 years ago | (#24667961)

If the approach is straightforward enough to basically appear like an application just trying to render data, which I assume it is, then there is no way for them to do that.

My god... (3, Interesting)

bigtallmofo (695287) | more than 6 years ago | (#24666865)

Next you'll need a 1,000 watt power supply just to run your computer. How long until my home computer is hooked up to a 50 amp 240 volt line?

I mean, if one GPU is good and two GPUs are better, does that mean 5 are fantastic?

I used to have a Radeon 1950 Pro in my current system, which is nowhere near the top of the scale in video cards (in fact, it's probably even below average). It was so loud, and it literally doubled the number of watts my system took while running (measured by Kill-a-Watt). I took it out and now just use the integrated Intel graphics adapter. Man, that was fast enough for me, but I don't play games very often.

Re:My god... (1)

Kjella (173770) | more than 6 years ago | (#24667221)

Power saving is as usual reserved for the higher margin laptop market, much like advanced power states on the CPU. There's no good reason why a card should have to draw that much all the time when idle.

Re:My god... (3, Insightful)

x2A (858210) | more than 6 years ago | (#24667311)

"How long until my home computer is hooked up to a 50 amp 240 volt line?"

Mine already is... how do you usually power your computer?

Re:My god... (4, Informative)

Bureaucromancer (1303477) | more than 6 years ago | (#24667719)

Presumably he's North American; we (among a few other places) use 120V lines, and 240V is reserved for special high-power circuits, generally only used for dryers, stoves and refrigerators (and only some of the first two).

Re:My god... (1)

mr_mischief (456295) | more than 6 years ago | (#24667957)

It might be 240 volt, but if it's 50 amps at the wall socket you might just be doing it wrong. I very seriously doubt your computer needs 12,000 VA of power.

Re:My god... (1)

ascendant (1116807) | more than 6 years ago | (#24668025)

Don't forget that the amount of energy a card uses doesn't change much; there's only so much energy you can dissipate with heatsinks in that small an area.
Google will tell you that personal computer power usage hasn't gone much up or down ever.
High-end cards (the 1950 was one when it came out) will always use a lot of power, and minimalist ones (just like your integrated chip) will always use only a little.
... and people like you will always shout on about how they expect to get a 50-amp 240-volt line just for some gadget they don't even need to buy

Re:My god... (0)

Anonymous Coward | more than 6 years ago | (#24668295)

Fuck everything, we're doing 5 GPUs!

Re:My god... (1, Informative)

Anonymous Coward | more than 6 years ago | (#24668415)

Both nVidia and ATI are working on "hybrid" tech so that you can use a low-power integrated graphics chip for normal desktop/laptop use, and only use your graphics card(s) when you're playing games. Unfortunately, the nVidia version only works for nForce chipsets, and the ATI version only works with a specific AMD chipset series. You're probably out of luck if you didn't build your system with the right motherboard.

GeForce 6800 GT (4, Insightful)

Brad1138 (590148) | more than 6 years ago | (#24666913)

How many people feel this is an old card that should be in a closet? If you're not a hardcore gamer, that is a very good video card. My fastest card (out of 4 comps) is a 256MB 7600GS (comparable to a 6800GT) on an Athlon 2500+ w/1 gig mem. Plays all the games I want without a prob and is more than fast enough to run any non-game app.

Re:GeForce 6800 GT (1)

ACMENEWSLLC (940904) | more than 6 years ago | (#24667183)

I have a 6600GTS with a stock overclock, which plays games well on my old AMD 4000+ PC. While I paid a lot of $$ back in the day, the card is worth $40 now.

I can't help but think this chip would cost more than it's worth. I like the idea, but I also liked the idea that I could purchase another 6600 GTS and not lose the investment in my original. That didn't happen for me. I am much better off purchasing a current-generation card than buying two 6600 GTS cards.

So the question is how much will it cost to be able to keep using my now-$40 card? (Yes, I understand it is for newer PCs, not my old one, but the concept still applies.)

Re:GeForce 6800 GT (4, Insightful)

Pulzar (81031) | more than 6 years ago | (#24667223)

Plays all the games I want without a prob and is more than fast enough to run any non-game app.

*Any* app? Try getting an HD camcorder and editing some video of your kid/dog/girlfriend/fish and see how well your PC does. It's easy to make generalizations about other people based on personal experience. Resist the urge to do it.

Re:GeForce 6800 GT (1)

Fumus (1258966) | more than 6 years ago | (#24667241)

Then again, if you are buying a new PC now, an 8800GT, a C2D and 2GB of RAM are cheap enough to buy and powerful enough to run anything you throw at them without much of a problem.

Re:GeForce 6800 GT (0)

Anonymous Coward | more than 6 years ago | (#24667575)

8800GT? Get with the times: the HD4850 runs faster, draws less power, and is quieter. It even challenges the GTX280.

Re:GeForce 6800 GT (0)

Anonymous Coward | more than 6 years ago | (#24667989)

It also costs about $100 USD more.

I'll stick with the 8800GT thank you very much. It runs everything and is inexpensive.

Re:GeForce 6800 GT (1)

Spatial (1235392) | more than 6 years ago | (#24667373)

It seems pretty ridiculous to characterise anyone who wants to play a new game at a playable framerate as a 'hardcore gamer'. Mass Effect, Bioshock, Supreme Commander, Rainbow Six Vegas and so on are all good games for both casual and more frequent players, but would be completely unplayable on that hardware. Anyone wanting to play those games would probably feel that way.

Re:GeForce 6800 GT (0)

Anonymous Coward | more than 6 years ago | (#24667591)

I don't know about the others, but Supreme Commander runs perfectly fine on my 7600, even with dual screens.

Re:GeForce 6800 GT (1)

mr_mischief (456295) | more than 6 years ago | (#24667973)

You're probably not running an Athlon 2500+ for your CPU. Supreme Commander is more CPU heavy than GPU heavy. It's seriously limited by pretty much any single-core CPU out there.

Re:GeForce 6800 GT (1)

Brad1138 (590148) | more than 6 years ago | (#24667903)

Here [evilavatar.com] are the system specs for BioShock; my system meets their minimum system requirements. And here [direct2drive.com] are the requirements for R6 Vegas; my video card meets their minimum and my CPU is barely under. I didn't check the other games. There may be a couple of games I can't play, but there are thousands I can.

Re:GeForce 6800 GT (0)

Anonymous Coward | more than 6 years ago | (#24668117)

Well it's a terrific card if all you play is Warcraft and 3 year old games, I suppose.

Re:GeForce 6800 GT (1)

PaintyThePirate (682047) | more than 6 years ago | (#24668185)

I actually do have a 6800 GT just sitting in my closet. Normally I rotate video cards down through my older computers when I buy a new one (since video cards tend to become obsolete a lot faster than the rest of the computer), but I only have one computer with PCI Express.

The 6800 GT is/was a great card, but I wanted to be able to play Oblivion, Mass Effect, and yes, Crysis, at full monitor resolution. Good video cards are not as expensive as they once were; I ended up getting an 8800 GT in the spring for around $180. This brought my 3 year old desktop up to speed with the latest games.

No strings attached? (1)

sleeponthemic (1253494) | more than 6 years ago | (#24666921)

Sounds like Head In The Cloud Computing (TM)

Eek. Potentially energy inefficient. (3, Informative)

RyanFenton (230700) | more than 6 years ago | (#24666927)

Power supply units only supply so much power, and pushed beyond that they cause interesting system instability.

Also, given the ever-growing cost of energy, it might be worth buying a newer-generation card just for the sake of saving the energy that would be used by multiple older generations of graphics cards. Not that newer cards use less energy in general - but multiple older cards being used to approximate a newer card would use more energy.

I guess power supplies are still the underlying limit.

As an additional aside, I'm still kind of surprised that there haven't been any lego-style system component designs. Need more power supply? Add another lego that has a power input. Need another graphics card? Add another GPU lego. I imagine the same challenges that went into making this Hydra GPU thing would be faced in making a generalist system layout like that.

Ryan Fenton

Re:Eek. Potentially energy inefficient. (0)

Anonymous Coward | more than 6 years ago | (#24667055)

I remember seeing something similar to what you're describing: a power supply which fit in a front panel slot and was designed to add extra power for something like a new graphics card your old power supply couldn't handle, without having to replace the entire PSU. At the time, though, it was more effective/efficient for me to just replace the old PSU, and it did look like it could be cumbersome having the adjunct PSU in a front panel slot, from the standpoint of cabling.

Re:Eek. Potentially energy inefficient. (1)

compro01 (777531) | more than 6 years ago | (#24667287)

That's why there are bigger power supplies. I can find 1200W ones without any trouble, or even a 2000W one at a major online store, though that one would be tricky to fit into most cases as it's double-high, not to mention the thing will run you $650.

Two Links, One Article? (2, Insightful)

Anonymous Coward | more than 6 years ago | (#24667019)

Why does the summary include two links to the same article? If there are two links, shouldn't there be two articles?

And why does the summary link the phrase "allow vastly different GPUs to work in tandem?" Not only isn't it a quote from the article, it actually contradicts the article. The article says "To accompany this ability to intelligently divide up the graphics workload, Lucid is offering up scaling between GPUs of any KIND within a brand (only ATI with ATI, NVIDIA with NVIDIA)." How did anyone get "vastly different GPUs" from this?

Re:Two Links, One Article? (3, Informative)

Michael Hunt (585391) | more than 6 years ago | (#24667329)

> How did anyone get "vastly different GPUs" from this?

Presumably because (for example) a G70-based 7800 and a G92-based 8800GT are vastly different GPUs.

G70, for example, had two sets of fixed-purpose pipeline units (one of which ran your vertex programs, and one of which ran your fragment programs,) a bunch of fixed-function logic, some rasterisation logic, etc.

On the other hand, G80 and above have general purpose 'shader processors' any of which can run any pipeline programs (and, afaik, runs the traditional graphics fixed-function pipeline in software on said SPs), and a minimal amount of glue to hang it together.

About the only thing that current-generation GPUs and previous-generation GPUs have in common is the logo on the box. (This applies equally to AMD parts, although the X19xx AMD cards, I'm told, are more similar to a G80-style architecture than a G70-style architecture, which is how the F@H folks managed to get it running on the X19xx before G80 was released.)

No strings. (4, Funny)

jellomizer (103300) | more than 6 years ago | (#24667027)

But it looks like it will need plenty of threads to work though.

Cross Platform? (1)

kcbanner (929309) | more than 6 years ago | (#24667263)

Well, is it?

Save your pennies now. It will take a new mobo. (1)

Catalina588 (1151475) | more than 6 years ago | (#24667597)

This product will appeal to hard-core gamers who can afford a new motherboard when the product comes to market next year. It only makes economic sense to harness a couple of recent graphics cards, since a new mobo costs more than a decent midrange graphics card. So, if you have, say, a last-generation 8800 GTX and you want to add a 260GT, the Lucid solution would work.

I saw the product tonight at Intel Developer Forum. Looks like it actually works, and the execs in the booth said production silicon arrived yesterday.
