
Promised Platform-Independent GPU Tech Is Getting Real

kdawson posted more than 4 years ago | from the near-linear dept.

Graphics 102

Vigile writes "Last year a small company called Lucid promised us GPU scaling across multiple GPU generations with near-linear performance gains, without the restrictions of SLI or CrossFire. The company has been silent for some time, but now it is not only ready to demonstrate its 2nd-generation hardware, but also to show the first retail product that will be available with HYDRA technology. The article takes a quick look at the MSI 'Big Bang' motherboard, which sports the P55 chipset and HYDRA chip, and shows demos of AMD HD 4890 and NVIDIA GTX 260 graphics cards working together for game rendering. Truly platform-independent GPU scaling is nearly here, and the flexibility it will offer gamers could be impressive."


First post? (0, Offtopic)

FunkyRider (1128099) | more than 4 years ago | (#29512151)

Yes? No? Yes? No!?

Why only on DirectX ? (1)

Taco Cowboy (5327) | more than 4 years ago | (#29512783)

The articles I've read about Hydra say that it comes in both HW and SW parts, with the SW sitting between DirectX and the OS.

My point is why only DirectX?

Wouldn't this be cutting itself short --- and be dependent on Microsoft?

I know that in the world we live in now DirectX pwns, but if we don't offer the others a chance, how are they gonna be popular enough to rival DirectX?

Re:Why only on DirectX ? (1)

Dudeman_Jones (1589225) | more than 4 years ago | (#29514043)

Because everything is woefully behind DirectX, and DirectX is the best supported PC Graphics layer for the majority of PC gamers. Besides, this was just announced to even be this far along. At least let them get it to retail with the most widespread graphics layer on the PC gaming market before demanding that they support everything else.

Re:First post? (1)

fractoid (1076465) | more than 4 years ago | (#29515267)

Yes? No? Yes? No!?

Dude it's about parallel processing, not quantum computers.

can it be done in software? (2, Insightful)

aerton (748473) | more than 4 years ago | (#29512201)

If it is essentially just a load-balancer, why can't it be done in software?

The article only mentions DirectX, no word about OpenGL, so it must not be a pure hardware solution. If all it does is re-route D3D calls, why can't the CPU do it?

Re:can it be done in software? (3, Insightful)

shentino (1139071) | more than 4 years ago | (#29512239)

Crippleware is a common method of rent seeking, and copyrights, patents, and plain old obfuscation may obstruct genuine improvements.

Case in point: old mainframes were deliberately given a "cripple-me" switch that only an expensive vendor-provided technician is authorized to switch off.

Re:can it be done in software? (1)

Datamonstar (845886) | more than 4 years ago | (#29512423)

It ain't just the old ones. They do it remotely now.

Re:can it be done in software? (1)

zefrer (729860) | more than 4 years ago | (#29514081)

Uh not quite. Old mainframes were deliberately given a 'block remote access' switch that gives full control to the console to the person physically at the mainframe. That's a feature, not a cripple-me switch.

Re:can it be done in software? (1)

shentino (1139071) | more than 4 years ago | (#29523877)

The switch I speak of either capped the CPU speed or disabled part of the memory; I'm not sure which. But it definitely counted as crippleware.

Re:can it be done in software? (1)

zemkai (568023) | more than 4 years ago | (#29518015)

Ah... memories. Back in the day when I worked at Amdahl, this was called the "Mid Life Kicker"... components / performance designed and built in from the get go, but not available or activated until later and at a fee.

Re:can it be done in software? (2, Interesting)

Nyall (646782) | more than 4 years ago | (#29512253)

That would require CPU time. Rendering a game at 60Hz not only requires a GPU that can render your imagery within 16.7ms, but also requires the software running on the CPU to issue its DirectX/OpenGL commands in 16.7ms.
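
As a rough illustration of that frame budget (a back-of-the-envelope sketch with made-up per-frame timings, not measurements from any real game):

    #include <stdio.h>

    int main(void)
    {
        const double target_fps = 60.0;
        const double budget_ms  = 1000.0 / target_fps;   /* ~16.67 ms per frame */

        /* Hypothetical per-frame timings, purely for illustration. */
        double cpu_submit_ms = 9.0;    /* CPU time spent issuing draw calls   */
        double gpu_render_ms = 14.0;   /* GPU time spent rendering the frame  */

        printf("Frame budget at %.0f Hz: %.2f ms\n", target_fps, budget_ms);
        printf("CPU submit %.1f ms: %s\n", cpu_submit_ms,
               cpu_submit_ms <= budget_ms ? "fits" : "CPU-bound");
        printf("GPU render %.1f ms: %s\n", gpu_render_ms,
               gpu_render_ms <= budget_ms ? "fits" : "GPU-bound");
        return 0;
    }

Whichever of the two blows its budget first caps the frame rate, which is why offloading the balancing work to a dedicated chip rather than the CPU is attractive.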

Re:can it be done in software? (0)

Anonymous Coward | more than 4 years ago | (#29512393)

I can't see why not.

It's not like DirectX / OpenGL calls are passed directly to the hardware - there's still some processing done. If a solution can be found to quickly farm this out to multiple GPUs ...... well, the sky's the limit, and we could see any cards playing nicely together.

There's just one problem: Windows device manufacturers usually produce hardware "drivers" that encompass every part of the processing, from the back end of the software using it all the way down to the bus drivers. There's not really any GPU abstraction layer to hook into that's more general than the darker side of DirectX / OpenGL, and that's assuming the drivers don't conflict in the first place.

That said, I can see the potential for Gallium3D supporting this out of the box given its level of hardware abstraction. Gallium3D on Windows FTW!!!

Re:can it be done in software? (1)

dark_requiem (806308) | more than 4 years ago | (#29512575)

The idea here is to improve performance. There are many complex calculations that need to be performed each frame to properly load balance the cards to achieve significant performance gains. If all that work was being performed by the CPU, it's quite possible that the rendering process could become CPU limited while the graphics cards sit waiting for the CPU to decide which card does what.

Re:can it be done in software? (2, Informative)

gedw99 (1597337) | more than 4 years ago | (#29512963)

Yes, VirtualGL can do this easily. It can attach to any window and then use GPUs on multiple machines at once. It's also very stable and easy to use.

Re:can it be done in software? (0)

Anonymous Coward | more than 4 years ago | (#29513387)

>>why CPU can't do it?
how is babby formed?

Re:can it be done in software? (1)

jdb2 (800046) | more than 4 years ago | (#29519077)

If it is essentially just a load-balancer, why can't it be done in software? The article only mentions DirectX, no word about OpenGL, so it must not be a pure hardware solution. If all it does is re-route D3D calls, why can't the CPU do it?

See my post concerning Chromium. [slashdot.org]

jdb2

only ATI with ATI, NVIDIA with NVIDIA... (1)

nethenson (1093205) | more than 4 years ago | (#29512211)

Lucid is offering up scaling between GPUs of any kind within a brand (only ATI with ATI, NVIDIA with NVIDIA)

Strange... Is the difference between a 10-year-old NVIDIA card and a current-year NVIDIA card really smaller than the difference between a current ATI card and a current nVidia card?

Re:only ATI with ATI, NVIDIA with NVIDIA... (2, Insightful)

Anonymous Coward | more than 4 years ago | (#29512261)

The cards of a brand share drivers across a few generations. So if this solution communicates with the drivers, you get the picture.

Re:only ATI with ATI, NVIDIA with NVIDIA... (1)

dark_requiem (806308) | more than 4 years ago | (#29512525)

From the Anandtech article [anandtech.com] on the subject, it appears that multi-vendor GPU scaling has been implemented, as it was demoed with a GTX 260 and an HD 4890. The mixed-vendor implementation apparently requires Windows 7 to get the card's drivers to work properly (the article was light on details on this point), but it does work. And spare me the "M$ is teh suxorz" garbage. This is aimed at gamers, and like it or not, new games come out on Windows, not Linux, so that's where Lucid's priorities will be for the product launch.

Re:only ATI with ATI, NVIDIA with NVIDIA... (1)

nxtw (866177) | more than 4 years ago | (#29514403)

The mixed-vendor implementation apparently requires Windows 7 to get the card's drivers to work properly (the article was light on details on this point)

Windows 7 includes a new version of the display driver model [wikipedia.org] . One of the new features: "Support multiple drivers in a multi-adapter, multi-monitor setup".

In Vista, multiple adapters can only be used with Aero and the new graphics features if they all use the same driver. (The Windows XP drivers still support this, and you can still use them in Vista.)

Re:only ATI with ATI, NVIDIA with NVIDIA... (0)

Anonymous Coward | more than 4 years ago | (#29512763)

If you RTFA (or RTFS) you'd see their rig has an ATi HD4800 and an nVidia GeForce GTX 260 working in tandem.

Re:only ATI with ATI, NVIDIA with NVIDIA... (0)

Anonymous Coward | more than 4 years ago | (#29513229)

Getting the drivers to co-exist is probably trouble.

-scary- firmware is the captcha

Is it useful? (4, Funny)

MarkRose (820682) | more than 4 years ago | (#29512215)

Let me be the first to say awesome! Maybe I can finally get decent full screen flash performance on linux now!

Re:Is it useful? (1)

Luke Wilson (1626541) | more than 4 years ago | (#29512317)

FTA:

The HYDRA technology also includes a unique software driver that rests between the DirectX architecture and the GPU vendor driver.

The distribution engine as it is called is responsible for reading the information passed from the game or application to DirectX before it gets to the NVIDIA or AMD drivers.

Looks like we'll have to keep waiting
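
For a rough picture of what a layer that "rests between DirectX and the vendor driver" could do, here's a minimal, purely hypothetical sketch in C. The types, the vendor_submit stub and the least-busy-GPU rule are invented for illustration; this is not Lucid's driver and not the real Direct3D API.

    #include <stdio.h>
    #include <stddef.h>

    /* Hypothetical handles -- a real shim would wrap actual D3D interfaces. */
    typedef struct { int id; double busy_ms; } gpu_t;
    typedef struct { size_t vertex_count; int texture_id; } draw_call_t;

    /* Stand-in for the vendor driver entry point. */
    static void vendor_submit(gpu_t *gpu, const draw_call_t *call)
    {
        printf("GPU %d gets draw call (%zu vertices)\n",
               gpu->id, call->vertex_count);
    }

    /* The "distribution engine" idea: intercept each draw call on its way
     * to DirectX and route it to whichever GPU has the smallest backlog.
     * Real balancing would have to consider far more: render targets,
     * texture residency, inter-frame dependencies, and so on. */
    static void distribute(gpu_t *gpus, size_t n, const draw_call_t *call)
    {
        size_t best = 0;
        for (size_t i = 1; i < n; i++)
            if (gpus[i].busy_ms < gpus[best].busy_ms)
                best = i;
        gpus[best].busy_ms += (double)call->vertex_count * 1e-4; /* crude cost model */
        vendor_submit(&gpus[best], call);
    }

    int main(void)
    {
        gpu_t gpus[2] = { {0, 0.0}, {1, 0.0} };   /* e.g. a GTX 260 and an HD 4890 */
        draw_call_t calls[3] = { {30000, 1}, {5000, 2}, {12000, 1} };
        for (size_t i = 0; i < 3; i++)
            distribute(gpus, 2, &calls[i]);
        return 0;
    }

Doing this split well, while keeping ordering and shared resources intact, is presumably where the dedicated HYDRA chip earns its keep.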

Re:Is it useful? (1)

AmiMoJo (196126) | more than 4 years ago | (#29512905)

Forget about performance, what about power saving? For a couple of years now we have been promised the ability to shut down a graphics card and rely just on the on-board chip for desktop use, with the card kicking in when a game is launched. No-one seems to have actually implemented it yet though.

Re:Is it useful? (1)

Hal_Porter (817932) | more than 4 years ago | (#29513093)

Sony have done that. It required a reboot unfortunately.

http://www.cnet.com.au/sony-vaio-vgn-sz483n-339284061.htm [cnet.com.au]

Actually I think a better solution would be to put a PCI Express slot in a docking station and integrated graphics in the laptop. Then you could disable the integrated GFX when you dock and use discrete instead. Even better you could use a relatively cheap desktop card.

Mind you Asus have tried that and it didn't exactly catch on

http://www.techspot.com/news/24044-asus-introduces-xg-modular-laptop-docking-station.html [techspot.com]

Probably the reason is that only a vanishingly small percentage of people care about gaming performance on a laptop. Oh, and at the moment docking stations use a proprietary connector, so the market for things like this is even more limited.

Re:Is it useful? (1)

tlhIngan (30335) | more than 4 years ago | (#29515653)

Forget about performance, what about power saving? For a couple of years now we have been promised the ability to shut down a graphics card and rely just on the on-board chip for desktop use, with the card kicking in when a game is launched. No-one seems to have actually implemented it yet though.

I know nVidia does, and I believe ATI has similar technology. There's a weak GPU onboard the chipset, but it can then switch to the faster offboard GPU when you want the grunt (at the expense of battery life).

Heck, even the dual-GPU Macbooks support it - there's an option to use the powerful offboard GPU or the chipset one, and it's been around since at least this current generation of Apple laptops. I believe the big criticism was that on Windows, you can enable/disable the extra GPU without a reboot, while on MacOS X, you have to reboot to go from single/weak to dual mode (you can't really "turn off" the chipset GPU, so you might as well use it). I'm not sure if this is still true for Snow Leopard.

Re:Is it useful? (1)

AmiMoJo (196126) | more than 4 years ago | (#29515855)

You are correct, it does exist but only seems to be used on laptops so far. I don't know of any desktop mobos that support it.

Re:Is it useful? (1)

randomencounter (653994) | more than 4 years ago | (#29516533)

Of course it's only supported on laptops. Laptops are where there is a solid business case for the feature. I'm sure it will show up on desktop systems when it becomes more expensive to not do it.

Re:Is it useful? (1)

L4t3r4lu5 (1216702) | more than 4 years ago | (#29513087)

Poor full screen Flash performance on Linux is not the fault of your hardware, and hardware is what this solution is about. I have quite a mediocre PC running Windows, and Flash works full screen just fine.

I also have a gaming PC with a more than capable second card not being used, but which would probably allow me the small performance boost I need to keep me from upgrading just yet. I think that I'm more the target market than you are.

Re:Is it useful? (1)

icebraining (1313345) | more than 4 years ago | (#29514011)

I have an AMD Neo CPU (1.6GHz) and an on-board x1250, and fullscreen Flash works well in Debian Sid.

Re:Is it useful? (1)

L4t3r4lu5 (1216702) | more than 4 years ago | (#29514711)

So what was the point of the OP?

Re:Is it useful? (1)

AliasMarlowe (1042386) | more than 4 years ago | (#29515087)

Fullscreen flash on Linux is just one of those /. memes that gets recycled past its "best by" date. It probably still applies to some combinations of graphics hardware and driver, especially older stuff.

Re:Is it useful? (1)

MarkRose (820682) | more than 4 years ago | (#29515941)

I have a GeForce 9500 GT, and a dual core Athlon 5050e, yet Flash fails abysmally at playing hidef YouTube on even a single 1680x1050 screen.

Re:Is it useful? (1)

Khyber (864651) | more than 4 years ago | (#29517857)

Well, for one, Youtube HD isn't even HD at all. It's a 640x272 resolution video that's been heavily upsampled. I've fullscreened a Youtube HD video and paused it, and I've counted the blocks that represent a 'pixel.' I've already tested this on Youtube and Vimeo with my DXG 720HD camcorder. Vimeo keeps the true HD, Youtube shrinks the resolution for file size then upsamples the entire thing. Pretty easy to spot when you're using a nice 32" LCD that is only 4 feet from your face.

That's why I just renewed my Vimeo Plus Account.

Re:Is it useful? (1)

PitaBred (632671) | more than 4 years ago | (#29516465)

Just FYI, I picked up a pair of Radeon 4670's for ~$50 each open-box from Newegg and a bridge for $7, and in Crossfire they perform about like a 4850 from my benchmarking and basic testing. And they don't need a secondary power cable, either. It's a cheap, easy upgrade if you have two x8/x16 PCIe slots.

Re:Is it useful? (1)

JAlexoi (1085785) | more than 4 years ago | (#29513643)

Hm... You have issues with that? Flash 10 works rather well on Linux in fullscreen mode. I have not seen any sluggishness for a long time now.

Re:Is it useful? (0)

Anonymous Coward | more than 4 years ago | (#29514239)

It's appalling for me with the latest Flash player on Ubuntu 8.10. The issue is that Adobe basically fails at coding for the platform: I have half a dozen media players which are capable of playing fullscreen video very smoothly at 1680x1050, and they can even just about keep up with 720p on my old hardware, but Adobe's Flash plugin can barely cope with fullscreen even below 800x600.

Suddenly, to be released to market in 30 days (5, Interesting)

prisma (1038806) | more than 4 years ago | (#29512221)

Finally, we can have asynchronous GPU pairing? And cross-brand to boot? What's incredible is that, having heard nothing about this for so long, TFA now says a product may hit the market in the next 30 days. I take it that by sidestepping Crossfire and SLI technology, they won't have to pay any licensing fees to either AMD or NVIDIA. Hopefully the patent trolls won't be able to find any fodder that would prevent or delay commercial release.

Re:Suddenly, to be released to market in 30 days (2, Informative)

ShakaUVM (157947) | more than 4 years ago | (#29512307)

>>Finally, we can have asynchronous GPU pairing?

I think NVIDIA has some sort of asymmetrical SLI mode available on its mobos with built-in video cards. It allows the weak built in card to help a little bit with the big video card installed in the main PCI-E slot.

IIRC, it gives a 10% boost or so to performance.

Ah, here it is...
http://www.nvidia.com/object/hybrid_sli.html [nvidia.com]

Re:Suddenly, to be released to market in 30 days (1)

Genocidicbunny (1212400) | more than 4 years ago | (#29512577)

Ah, here it is... http://www.nvidia.com/object/hybrid_sli.html [nvidia.com]

I'm pretty sure that also got discontinued with the 9xxx generation of nVidia GPUs.

Re:Suddenly, to be released to market in 30 days (1)

Hal_Porter (817932) | more than 4 years ago | (#29513103)

A 10% boost is not really worth bothering about, to be honest. Discrete graphics is so much faster than integrated that you might as well turn off the integrated graphics completely.

Re:Suddenly, to be released to market in 30 days (2, Insightful)

Trahloc (842734) | more than 4 years ago | (#29513575)

10% isn't a big deal? There are people who go to crazy extremes just to tweak out an extra 1-3% with entire sub markets dedicated to them, so yeah 10% is worth it.

Cross brand or not? I'm confused (1)

gozu (541069) | more than 4 years ago | (#29513615)

From TFA (emphasis mine):

To accompany this ability to intelligently divide up the graphics workload, Lucid is offering up scaling between GPUs of any KIND within a brand (only ATI with ATI, NVIDIA with NVIDIA) and the ability to load balance GPUs based on performance and other criteria.

So what is the deal? Is it cross-brand or not?

Also, they are only planning to launch their chipset on one motherboard with one manufacturer. It all sounds like a short-lived gimmick to me.

fuck you all (-1, Troll)

Anonymous Coward | more than 4 years ago | (#29512267)

fuck niggers, jews, republicans, communists, nerds and fags

fuck all

Re:fuck you all (4, Funny)

Anonymous Coward | more than 4 years ago | (#29512333)

I know I speak anonymously when I say that this Anonymous Coward does not speak for the rest of us Anonymous Cowards.

So, parent poster, gargle our collective balls.

Re:fuck you all (0)

Anonymous Coward | more than 4 years ago | (#29513337)

Really? I thought we were supposed to be a collective hive mind.

Re:fuck you all (0)

Anonymous Coward | more than 4 years ago | (#29521841)

We anonymously approve and add our numerous collective balls to the group.

J. Doe, president of the Society of Anonymous Cowards Knitters Supermachos (S.a.c.k.s.)

Re:fuck you all (0)

Anonymous Coward | more than 4 years ago | (#29512449)

Can we ban humans from the internet/electronics by making it so that if they touch something 5 volts or greater they get a VERY hard shock?

Great, can't wait until there's a Linux driver (1)

xiando (770382) | more than 4 years ago | (#29512341)

From the article, "The HYDRA technology, as it is called, is a combination of hardware (in the form of a dedicated chip) and software (a driver that sits between the OS and DirectX)", I can't wait for this software technology to be available for GNU/Linux. But... something tells me it will take a while, as newer ATI and NVidia chips cannot even do 3D using free software as of today, and support seems to be years away. And yes, I know, there is some unstable proprietary binary blob available for my ATI card which can do 3D, but it is immoral to use that and it is actually so slow on 2D (which to me is more important) compared to the free "radeon" driver that it's ridiculous.

Re:Great, can't wait until there's a Linux driver (0)

Anonymous Coward | more than 4 years ago | (#29512411)

This may be offtopic, but I completely agree. My family was using ubuntu on a pretty old computer, until I got a new one for the family. It was a rude awakening to the state of ati drivers for linux. Had to switch them to xp just because of this. But then again, they're completely oblivious, it's just a computer to them.

Re:Great, can't wait until there's a Linux driver (1, Insightful)

Slarty (11126) | more than 4 years ago | (#29512469)

We live in a world where thousands of children starve to death every day, people are killed or imprisoned for expressing their beliefs, women/minorities/everybody are oppressed, and few people really care about any of it, because it's all someone else's problem. I find it kind of funny (and more than a little sad) that the use of a driver can be blithely written off as "immoral" just because you can't download the source.

Re:Great, can't wait until there's a Linux driver (1, Insightful)

Anonymous Coward | more than 4 years ago | (#29512501)

... women(by women and some men)/men(by women)/minorities/everybody are oppressed, ...

Fixed that for you

Re:Great, can't wait until there's a Linux driver (0)

Anonymous Coward | more than 4 years ago | (#29512509)

Just because it could be worse, doesn't mean it couldn't be better. There are different degrees of immoral acts.

Re:Great, can't wait until there's a Linux driver (2, Insightful)

Anonymous Coward | more than 4 years ago | (#29512613)

Certainly. One is important, the other is trivial nerd rage bullshit.

Re:Great, can't wait until there's a Linux driver (3, Insightful)

Tom (822) | more than 4 years ago | (#29512777)

I hope you die young. Seriously. If we get world hunger solved, and peace eternal, people will start to complain about even less important stuff. People complain about things, it's part of human nature. Just because 500 people died in Africa today before I got out of bed doesn't mean I don't feel that particular idiot at work is a friggin' [censored].

You can't deny people's feelings with a rational appeal to global standards.

Re:Great, can't wait until there's a Linux driver (5, Insightful)

bcmm (768152) | more than 4 years ago | (#29513251)

We live in a world where thousands of children starve to death every day, people are killed or imprisoned for expressing their beliefs, women/minorities/everybody are oppressed, and few people really care about any of it, because it's all someone else's problem. I find it kind of funny (and more than a little sad) that the use of a driver can be blithely written off as "immoral" just because you can't download the source.

Some people rape children. How can you possibly think shoplifting is immoral?

/me steals some stuff.

Re:Great, can't wait until there's a Linux driver (0)

Anonymous Coward | more than 4 years ago | (#29513253)

There aren't grades of immoral. Either something is or it isn't (subjectively, of course).

Re:Great, can't wait until there's a Linux driver (1)

Seth Kriticos (1227934) | more than 4 years ago | (#29513291)

Cute, you are trying to pull a common didactic trick by sidestepping the issue and overstating another. I know grade schoolers like to do it:

Kid: I don't want to do my homework, it's so stupid.
Parent: For the last time kid, do your homework!
Kid: You know that hurricane Jane killed 5000 people yesterday on the west coast? And you are upset about some measly homework? How can you, when there are so many worse problems in the world?
Parent: [...]

Now who the hell modded parent up?

Re:Great, can't wait until there's a Linux driver (1)

whatajoke (1625715) | more than 4 years ago | (#29513335)

> Now who the hell modded parent up?

/. mods suck bigtime these days. To get modded up, just write an emo post.

Re:Great, can't wait until there's a Linux driver (1)

machine321 (458769) | more than 4 years ago | (#29513535)

Either that, or write "This will get modded down because of [...], but" at the beginning of your post.

Re:Great, can't wait until there's a Linux driver (1)

RightSaidFred99 (874576) | more than 4 years ago | (#29512479)

So you can't wait until approximately...20never? You won't see drivers for anything like this in Linux. You'll be lucky to get decent bog standard 3D drivers.

Re:Great, can't wait until there's a Linux driver (5, Insightful)

dark_requiem (806308) | more than 4 years ago | (#29512553)

"Immoral"? What, because it's proprietary? Are you serious? Get ready to throw out your whole computer, because the whole damn thing is proprietary. You don't have circuit diagrams for the cpu or gpu, you don't have firmware code, nothing. Before you start taking the "moral" high ground about proprietary components, look at what you're typing on. There's plenty of room in the world for proprietary and open source to coexist, RMS' rantings not withstanding.

Re:Great, can't wait until there's a Linux driver (2, Funny)

Hal_Porter (817932) | more than 4 years ago | (#29513123)

It's a slippery slope though

1) You can't read the source code to your graphics driver.
2) ???
3) You are being herded into a gas chamber.

Re:Great, can't wait until there's a Linux driver (1)

MadKeithV (102058) | more than 4 years ago | (#29513361)

Imagine a beowul...
On second thoughts, I'm not going to go there.

Re:Great, can't wait until there's a Linux driver (1)

Ruin666 (1027352) | more than 4 years ago | (#29515015)

of gas chambers? :)

Re:Great, can't wait until there's a Linux driver (1)

maxume (22995) | more than 4 years ago | (#29513753)

I was on a slippery slope once. I got mud all over my pants.

Re:Great, can't wait until there's a Linux driver (0)

Anonymous Coward | more than 4 years ago | (#29515671)

You don't have circuit diagrams for the cpu or gpu, you don't have firmware code, nothing.

I log into /. from my C64, with circuit diagrams of the entire system, you insensitive clod!

Re:Great, can't wait until there's a Linux driver (1)

Fred_A (10934) | more than 4 years ago | (#29513195)

And yes, I know, there is some unstable proprietary binary blob available for my ATI card which can do 3D, but it is immoral to use that

Now that you have confessed you shall say 2 our RMS and 3 hail Linus and all will be forgiven for the GNU is merciful. Go in peace, user.
(duh)

and it is actually so slow on 2D (which to me is more important) compared to the free "radeon" driver that it's ridiculous.

Or you could get supported hardware from "the other company", or stop being anal about trivial issues nobody in his right mind cares about.

Or just get a real SVGA card which is perfectly supported with completely open drivers. I hear Tseng ET3000 are a steal these days.

Re:Great, can't wait until there's a Linux driver (1)

PitaBred (632671) | more than 4 years ago | (#29516541)

Keep an eye on the radeon development. They just pushed OpenGL 1.4 acceleration to the Radeon driver for all ATI cards (including the current r600/700 cards, the 2xxx/3xxx/4xxx series), and it's just getting better. It'll really fly once Gallium3D drops and allows GLSL and other improvements. Most distros should be including it when they get the 2.6.32 kernels shipping with them. So, Fedora 12 alphas have it running, Ubuntu 9.10 should have 3D without the KMS for the new radeons, and things in general are just moving along smoothly. ATI has really committed to open-source drivers.

BTW, I'm just saying this as just a fan of their products and politics. I'm not an employee or paid by them in any way. If you want to see how things are going for yourself (and try out the bleeding-edge code), check out the Phoronix open-source ATI forums [phoronix.com] . The actual ATI devs post there pretty much daily.

It won't work... (0)

Anonymous Coward | more than 4 years ago | (#29512383)

There are very good reasons to assume that this story sounds too good to be true. The main claim to fame of the Hydra solution is that separate objects are rendered on separate GPUs and then merged together later. That's really beautiful, but it requires one GPU to access the Z buffer of the other. That's typically not possible or, if it is, only at very low performance. It also doesn't explain how rendered textures will work, which has always been known as one of the main issues for SFR-type multi-GPU rendering (which the Hydra also is).

Hydra really doesn't intrinsically do anything that couldn't be done by ATI or Nvidia. In fact they have less information at their disposal. E.g. they don't know the post-transform vertex data, which is necessary to know the extent of new objects.

I'm very sceptical that this will be a universal solution that's going to put existing GPU vendors to shame, and that's assuming those vendors don't put active roadblocks in Lucid's way.

Re:It won't work... (1)

marcansoft (727665) | more than 4 years ago | (#29512427)

The Z buffer shouldn't be an issue. Just render objects on each GPU (pretend they're different scenes), and then merge the images using their Z buffers as a key. You only need to exchange the Z buffer once for the final merge. It doesn't matter that objects will be drawn that would normally be hidden under objects drawn by the other GPU, because the right pixels will be chosen during the final merge based on the Z buffers.

There are certainly questions to be answered, but the fundamental idea of rendering different objects on different GPUs shouldn't be too problematic. Of course, things get "slightly" harder when you factor in render-to-texture, effects, etc. But they can always take the cheap way out and just bail to single GPU when "incompatible"/unimplemented features are used.
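
A minimal sketch of that depth-keyed merge (a toy illustration under the assumptions above, not Lucid's actual compositing pass): each GPU hands back a color buffer plus its Z buffer, and a final pass keeps, per pixel, whichever sample is closest to the camera.

    #include <stdio.h>
    #include <stdint.h>
    #include <stddef.h>

    /* One GPU's output for the frame: a color buffer plus its Z buffer. */
    typedef struct {
        const uint32_t *color;  /* packed RGBA, one value per pixel      */
        const float    *depth;  /* smaller value = closer to the camera  */
    } render_output_t;

    /* Merge N partial renders by picking, for every pixel, the sample with
     * the smallest depth. This is why each GPU can happily draw objects the
     * other GPU's objects would normally occlude: the wrong pixels simply
     * lose the depth comparison here. */
    static void composite(const render_output_t *parts, size_t n,
                          size_t pixel_count, uint32_t *out_color)
    {
        for (size_t p = 0; p < pixel_count; p++) {
            size_t winner = 0;
            for (size_t i = 1; i < n; i++)
                if (parts[i].depth[p] < parts[winner].depth[p])
                    winner = i;
            out_color[p] = parts[winner].color[p];
        }
    }

    int main(void)
    {
        /* Two GPUs, four pixels; made-up values just to exercise the merge. */
        const uint32_t c0[4] = {0xAA, 0xAB, 0xAC, 0xAD};
        const float    d0[4] = {0.2f, 1.0f, 1.0f, 0.3f};
        const uint32_t c1[4] = {0xBA, 0xBB, 0xBC, 0xBD};
        const float    d1[4] = {0.9f, 0.5f, 1.0f, 0.1f};
        render_output_t parts[2] = { {c0, d0}, {c1, d1} };
        uint32_t out[4];

        composite(parts, 2, 4, out);
        for (int p = 0; p < 4; p++)
            printf("pixel %d -> 0x%02X\n", p, (unsigned)out[p]); /* AA BB AC BD */
        return 0;
    }

The only cross-GPU traffic in this scheme is the one-time exchange of the finished color and Z buffers, which is what keeps the idea plausible despite GPUs not sharing Z buffers mid-frame.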

I doubt this is vaporware. It may perform better or worse, but I think it's a perfectly real product.

Re:It won't work... (1)

Hal_Porter (817932) | more than 4 years ago | (#29513153)

I think you could do tile based rendering. If you look at the Larrabee paper by Intel they managed to get very impressive scaling by doing this.

http://en.wikipedia.org/wiki/Larrabee_(GPU)#Preliminary_performance_data [wikipedia.org]

Of course you have to wonder about Larrabee. If it's as good as this, why haven't Intel launched it as a hybrid CPU/GPU? I think it's one of those ideas like Itanium which are great at the academic paper level but seriously flawed in terms of real-world performance - i.e. there are a couple of use cases that it fails miserably at.
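
For what it's worth, the tile-splitting idea above can be sketched in a few lines of C. This is just a toy round-robin assignment, nothing to do with Intel's or Lucid's actual schedulers:

    #include <stdio.h>
    #include <stddef.h>

    typedef struct { int x, y, w, h; int gpu; } tile_t;

    /* Carve a width x height framebuffer into tile_size x tile_size tiles
     * and deal them out round-robin across gpu_count GPUs. Returns the
     * number of tiles written; the caller provides enough space. A real
     * scheduler would weight tiles by expected rendering cost rather than
     * assume every tile costs the same. */
    static size_t split_into_tiles(int width, int height, int tile_size,
                                   int gpu_count, tile_t *tiles)
    {
        size_t n = 0;
        for (int y = 0; y < height; y += tile_size)
            for (int x = 0; x < width; x += tile_size) {
                tile_t t;
                t.x = x;
                t.y = y;
                t.w = (x + tile_size > width)  ? width  - x : tile_size;
                t.h = (y + tile_size > height) ? height - y : tile_size;
                t.gpu = (int)(n % (size_t)gpu_count);
                tiles[n++] = t;
            }
        return n;
    }

    int main(void)
    {
        tile_t tiles[256];
        size_t n = split_into_tiles(1680, 1050, 256, 2, tiles);
        printf("%zu tiles; tile 0 -> GPU %d, tile 1 -> GPU %d\n",
               n, tiles[0].gpu, tiles[1].gpu);
        return 0;
    }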

Add-On GPU Daughterboard Hardware... (1)

Xin Jing (1587107) | more than 4 years ago | (#29512461)

With their proprietary CUDA and Firestream technologies, I would think Nvidia and AMD/ATI respectively would be able to make a daughter card that could add or increase GPU capability on their existing hardware, or open up 3rd-party licensing to build this market segment.

My ATI X1300 handles far more BOINC than it does games, and I have no real reason to upgrade right now. But if there was an add-on that ATI or an approved 3rd party manufacturer developed that was reasonably priced, I wouldn't hesitate to add functionality.

I seem to remember a similar concept years ago with the Intel 386 SX architecture where you could purchase an optional math co-processor to plug in next to it.

 

Re:Add-On GPU Daughterboard Hardware... (1)

dark_requiem (806308) | more than 4 years ago | (#29512497)

There has been some noise from the big two GPU manufacturers for something similar. I don't have time atm to search for it, but I believe the article I read was on anandtech. Basically, the idea was to create a graphics card with basic functionality like 2D processing built in, but have the actual GPU chipset be user-replaceable by using a socket instead of hard soldering it to the board, so you could just plug in a new chip, and bam! instant 3D processing upgrade, without the unnecessary expense of replacing the whole board. Obviously, such an implementation has its limits (different chips have different bandwidths for the memory controller, for instance), but it would still give you the option of upgrading within a limited subset of the available GPUs. I haven't heard anything about this since, so I guess nothing's come of it yet, but the idea is being toyed with.

Re:Add-On GPU Daughterboard Hardware... (1)

RMingin (985478) | more than 4 years ago | (#29515127)

What you're not seeing is that the PCIe card **IS** the user-replaceable GPU tech! WHY would you want/need to swap out the socketed GPU? All you're keeping by your method is the ram (which is probably slow and out of date by the time you swap GPUs) and the physical connectors, which cost roughly NOTHING. In exchange you've added a ton of connections to be loose or misconnected.

Re:Add-On GPU Daughterboard Hardware... (1)

PitaBred (632671) | more than 4 years ago | (#29516703)

If you want a compatible upgrade, just check Newegg. An X1650pro [newegg.com] would do a lot more for BOINC than your current card, is supported by the exact same drivers, and only runs $54 if you do the free shipping option. A "daughterboard" or even just a new chip would require a heatsink, more power, and so on... a replacement just makes more sense, especially since the newest generations of cards are multiple times more powerful than your current one.

Re:Add-On GPU Daughterboard Hardware... (1)

Xin Jing (1587107) | more than 4 years ago | (#29519793)

Thanks for the product recommendation. As you can tell, I'm not exactly operating on the bleeding edge of technology and that price range fits in nicely with my budget.

Re:Add-On GPU Daughterboard Hardware... (1)

PitaBred (632671) | more than 4 years ago | (#29520105)

If you are using Windows and have a PCIe slot, you can pick up a 3xxx or 4xxx series ATI card for around the same price that will blow the 1650 out of the water. An open-box 4650 [newegg.com] is smoking fast for only $40. If you're limited by AGP, a 3450 [newegg.com] is still probably faster than the x1650, and definitely faster than your x1300. Lots of options available on a budget.

NOT Platform-independent (1)

tagno25 (1518033) | more than 4 years ago | (#29512463)

Truly platform-independent GPU scaling is nearly here and the flexibility it will offer gamers could be impressive.

But this is not anywhere close enough.

Performance issue (2, Insightful)

dark_requiem (806308) | more than 4 years ago | (#29512471)

There are going to be some performance hits compared to native CrossFire/SLI implementations. There are three models of the Hydra 200 part, and they each differ in their PCIe lanes. The high-end model, which is going on the MSI motherboard, sports two x16 PCIe lanes from the chip to the graphics cards (configurable as 2x16, 1x16 + 2x8, or 4x8), but only a single x16 lane from the chip to the PCIe controller. So, where a good high-end CrossFire or SLI board will have two x16 PCIe lanes from the controller to the slots for the GPUs, this solution will be limited to one x16, limiting the bandwidth available to each graphics card. Exactly how much of a performance hit this would incur remains to be seen, and it probably depends on the cards being used (an older 8000-series GeForce doesn't need/won't use as much bandwidth as a GTX 295, for example), but I would expect that as GPUs grow more powerful and require more bandwidth to keep them fed and working, we will start to see performance deterioration compared to the native CrossFire and SLI implementations (although Lucid can always modify their design to keep pace).

Incidentally, the two lower-end Hydra chips will sport, respectively, an x8 connection to the controller with two x8 connections to the cards, and an x16 connection to the controller with two x16 connections to the cards (strictly 2x16, not configurable in any other arrangement).

Re:Performance issue (1)

Tynin (634655) | more than 4 years ago | (#29512569)

I don't think there will be a performance issue for some time, as video cards aren't even using enough bandwidth to saturate the now older PCIe 1.1, which can do up to 4GB/s, and are obviously nowhere near touching PCIe 2.0's 8GB/s. There is only a ~2% performance difference between running modern cards on 1.1 and on 2.0, which is within the margin of error. So if you want to use this HYDRA setup with 2 cards, regardless of their speed, the x16 PCIe 2.0 link should be enough bandwidth to do the job right. In the end, down the line, there will be some performance issues if you keep using PCIe 2.0, but by then we'll all have PCIe 3.0, so it should be a moot point.
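
Those bandwidth figures follow from the usual per-lane numbers (roughly 250 MB/s per lane per direction for PCIe 1.x and 500 MB/s for PCIe 2.0, assumed here as ballpark values). A quick sanity check:

    #include <stdio.h>

    /* Per-direction PCIe bandwidth: lanes * per-lane throughput.
     * Assumed per-lane figures: PCIe 1.x ~250 MB/s, PCIe 2.0 ~500 MB/s. */
    static double pcie_gb_per_s(int lanes, double mb_per_lane)
    {
        return lanes * mb_per_lane / 1000.0;
    }

    int main(void)
    {
        printf("PCIe 1.1 x16: %.1f GB/s\n", pcie_gb_per_s(16, 250.0)); /* 4.0 */
        printf("PCIe 2.0 x16: %.1f GB/s\n", pcie_gb_per_s(16, 500.0)); /* 8.0 */
        printf("PCIe 2.0 x8:  %.1f GB/s\n", pcie_gb_per_s(8,  500.0)); /* 4.0 */
        return 0;
    }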

DirectX/Windows only (1)

Michael Woodhams (112247) | more than 4 years ago | (#29512561)

The distribution engine as it is called is responsible for reading the information passed from the game or application to DirectX before it gets to the NVIDIA or AMD drivers.

So presumably it will work only in Windows, and only with DirectX games (e.g. not with OpenGL.) I'm guessing that supporting OpenGL would require a big programming effort so we won't see it soon if at all. I suspect there aren't many OpenGL games out there anyway, but I don't follow such things.

Unless the OS market changes drastically, I expect we'll never see non-Windows drivers.

(Aside: ...combine the performance of an AMD and NVIDIA card in the same system. This is truly the killer feature that will make you want, no NEED, a HYDRA-based motherboard very soon. My GPU selection strategy is to buy the most powerful GPU available ... that needs no power connector and is passively cooled. I don't think I'm in the market segment they are addressing here.)

Latency (1)

brucmack (572780) | more than 4 years ago | (#29512607)

It'll be interesting to see how much extra latency the chip adds to the rendering process. I don't imagine the hardcore gamers would be too happy about it if they sacrifice an extra 50 ms to gain some FPS.

Re:Latency (1, Funny)

Anonymous Coward | more than 4 years ago | (#29513141)

Nonsense. Graphics beat gameplay, remember? It would follow that throughput beats latency.

2 display drivers??? Come on... (2, Insightful)

cbope (130292) | more than 4 years ago | (#29512823)

Great, now I can have 2 buggy display drivers installed at the same time, each with their own quirks. And who helps me out when I have graphical problems in a game? Do you really think ATI or NVIDIA will give end-user support for this? What about game developer support? It is a support nightmare for all involved. No thanks. Sorry, this idea is brain-dead long before it hits the shelf.

Re:2 display drivers??? Come on... (1)

Dr. Spork (142693) | more than 4 years ago | (#29513529)

Hey, if I could just plug a second card into my system that didn't have to match the first one, I'd be at newegg right after hitting "submit". If this gets big it will definitely sell more hardware because it will lead to more frequent upgrading and just more GPU buying. AMD and NVidia would be crazy to kill this goose.

Re:2 display drivers??? Come on... (1)

Nemyst (1383049) | more than 4 years ago | (#29514361)

Like they give support for their cards right now. Sure, if it breaks and it's still under warranty, they'll replace it, but they rarely fix problems with game compatibility unless a majority of their users are experiencing the issue.

If vendor support is so important for you, get a console.

Useless tech (1)

FithisUX (855293) | more than 4 years ago | (#29512825)

It would make sense if they developed a spec with a common access API to the HW instead of using wrappers like OpenGL/DX on proprietary drivers. HW should expose a platform-independent API so drivers could be written by Microsoft or Apple or whoever. And by the way, why should I put two different vendors' cards in the same machine instead of using the native single-vendor solution? It is only useful when OpenGraphics comes out.

Re:Useless tech (2, Insightful)

Hal_Porter (817932) | more than 4 years ago | (#29513263)

Back in the DOS days, video hardware was originally a register-level standard. Then the accelerator companies all invented their own solutions to line drawing, BitBlts and so on. In DOS each programmer used VESA BIOS calls to get into high-res modes, but they had to write a driver themselves for anything more complex. Windows came along and acted like a software motherboard - application programmers wrote to a user-mode API and the graphics card manufacturers wrote drivers to a kernel-level API.

At this point WinG was a better platform to port games to than Dos because you didn't need to write your own graphics driver. As 3D became more popular things became even more clear cut. Each 3D company invented their own standard, and none were keen to document it. 3DFX had a driver layer for Dos and quite a lot of games supported it. Still it was not an abstraction layer - it would only work with 3DFX cards.

Now Microsoft spotted an opportunity and launched DirectX. This was Windows only of course but it was graphics card independent. Now all the other graphics card manufacturers could implement a DirectX driver and all the games could use that API. And from what I've read DX was actually designed to be as thin a wrapper as possible to hide the differences between graphics cards. NVidia got started at this point - they had no hope of competing with 3DFX in terms of getting people to write code to their own API or hardware. In Windows they didn't have to.

I don't really see any chance of getting ATI/AMD, NVidia and Intel to agree on a common register spec for graphics at this point. Well, not one that would support high performance games at any rate. And Microsoft obviously have no interest in doing anything that would commoditize the OS. Even Apple seems to do OK with the current setup - they can just ask NVidia and ATI/AMD to both supply drivers if they want their hardware to be used.

SLI/crossfire is a niche market (1)

johncandale (1430587) | more than 4 years ago | (#29512859)

Eh. SLI/Crossfire has always been a niche market. Buying one top-of-the-line NVIDIA or ATI card is always a stronger solution than buying two mid-level cards. So this would only make sense if you are buying two top-of-the-line cards, and honestly, while the charts make it look impressive, it's just that: a bragging right. You don't see a human-detectable improvement in performance in most games/apps. It's a very small market. I believe that's why NVIDIA and ATI haven't done any real development of their own to allow an older card to work together with a newer card. It seems like a good market idea to increase brand loyalty: if they had such a product, you'd know you can add a new card a year later and keep the one you already have, the trick being you'd have to buy the same brand again. But no GPU maker has any mainstream product like this. Why? Because it's for ricer rigs. GPUs gain performance so fast.

Look out for Taiwan ! (1)

Taco Cowboy (5327) | more than 4 years ago | (#29512913)

First it was Nvidia, then Nvidia's control over Ageia (of PhysX chip fame)

Now it's MSI's turn in their control over Hydra

What this means is that ATI and Intel are the only ones out there seriously dabbling in graphics hardware who are not based in Taiwan!

Re:Look out for Taiwan ! (1)

djnforce9 (1481137) | more than 4 years ago | (#29515735)

Speaking of Phys-X,

"also shows some demos of AMD HD 4890 and NVIDIA GTX 260 graphics cards working together for game rendering"

That might cause a problem. Remember, nVidia disabled Phys-X in their latest drivers when ATI video hardware is present (to prevent people from using cheap nVidia GPUs as a glorified Phys-X PPU), so I hope these guys made their own "custom drivers" that work with both cards (and not just a software bridge between the two). That would eliminate that restriction as well as the need to have TWO separate graphics drivers installed at the same time (causing who knows what problems when certain games work well with one card and are buggy with the other).

VirtualGL does this NOW - http://www.virtualgl.org (1)

gedw99 (1597337) | more than 4 years ago | (#29512993)

http://www.virtualgl.org/ [virtualgl.org] will scale across many GPUs on the same board or across the world.

wait and see (1)

Atreide (16473) | more than 4 years ago | (#29513449)

last time i heard of
such magical product
it was april fool

Applause (0)

Anonymous Coward | more than 4 years ago | (#29515217)

Wow, an actual Haiku! It referenced a season (I'm willing to count "joking season in spring")!
Thanks for your post, it brought a smile to my morning.

Hardware based ripoff of Chromium (1)

jdb2 (800046) | more than 4 years ago | (#29513665)

A software-based solution to the problem of aggregating a heterogeneous collection of parallel OpenGL command streams into one, compositing the output of several graphics cards into one image, or both, has been available for years: it's called Chromium [sourceforge.net].

Although originally designed for a networked cluster with one gpu per machine, it can conceivably be adapted to one machine with multiple GPUs. Because Chromium's software based compositing would bog down a single processor system, a natural extension would be to build a PCI-E card, running some sort of embedded Unix with a dedicated high speed processor, which would handle compositing the output of multiple GPUs in parallel.

I'm surprised no one has done this yet for Linux.

jdb2

Hybrid Crossfire? (1)

Aggrajag (716041) | more than 4 years ago | (#29515939)

Would this technology enable me to use the onboard IGP (Radeon HD3300), which is now doing absolutely nothing, as I am using a separate Radeon HD3850?

Interesting. (1)

Ragingguppy (464321) | more than 4 years ago | (#29518079)

Interesting. But let me guess: it's only compatible with Windows.

Re:Interesting. (0)

Anonymous Coward | more than 4 years ago | (#29518917)

Which is why I'm still trying to figure out the whole "platform-independent" title... Since when does anyone call Radeon and GeForce "different platforms"?
