
NVIDIA To Buy AGEIA

kdawson posted more than 6 years ago | from the it's-all-physics dept.

Graphics

The two companies announced today that NVIDIA will acquire PhysX maker AGEIA; terms were not disclosed. The Daily Tech is one of the few covering the news to go much beyond the press release, mentioning that AMD considered buying AGEIA last November but passed, and that the combination positions NVIDIA to compete with Intel on a second front, beyond the GPU — as Intel purchased AGEIA competitor Havok last September. While NVIDIA talked about supporting the PhysX engine on their GPUs, it's not clear whether AGEIA's hardware-based physics accelerator will play any part in that. AMD declared GPU physics dead last year, but NVIDIA at least presumably begs to differ. The coverage over at PC Perspectives goes into more depth on what the acquisition portends for the future of physics, on the GPU or elsewhere.


off on a tangent (3, Funny)

User 956 (568564) | more than 6 years ago | (#22300690)

The Daily Tech is one of the few covering the news to go much beyond the press release, mentioning that AMD considered buying AGEIA last November but passed

Well, that's because they were pondering a strategy similar to Microsoft's, and were going to buy Yahoo.

Re:off on a tangent (4, Funny)

ma1wrbu5tr (1066262) | more than 6 years ago | (#22301538)

Here's a strange tangent... or two

It's almost like some bizarre comic.

Let's imagine that AMD and ATI teamed up to be the Super Friends.

And Intel and nVidia are the Legion of Doom.

Now, let the battle for the universe begin.


At least that's how I feel when I read /.ers' comments sometimes.
We geeks tend to take ourselves entirely too seriously.

Grammatical and spelling errors are bonuses.

Re:off on a tangent (4, Informative)

mabhatter654 (561290) | more than 6 years ago | (#22302720)

Except Intel doesn't think they need Nvidia... that's why they've got nearly all the notebook vendors pumping out crappy built-in graphics that just barely run Windows Vista. ATI saw the writing on the wall and got themselves bought by AMD. Now AMD battles on CPUs, integrated graphics, and high-end graphics... Intel can never buy Nvidia because they'd be instantly sued. Nvidia priced themselves out of reach; even with all the work they did for AMD, and the matching logos, their stock was just too rich for AMD.

This makes Nvidia the "odd man out" because they don't make processors. Both Intel and AMD have integrated solutions and obviously want physics processing on the CPU so that they can sell seven-core 3.21GHz processors. NVidia has to break the mold if they want sales... they got shunned in the last round of consoles in favor of IBM and ATI, and Microsoft pretty much let ATI write the book for DX10 this round. NVidia + Ageia only makes sense if they'll make an open-source console that runs either AMD or Intel CPUs. Games would need to run flawlessly, without "installing", just like a console. There's a hole for PC gaming right now... Apple's not filling it (they think it's stupid), Wintel isn't helping (Microsoft only wants Vista gaming, and Intel wants to sell integrated graphics), so a well-done Linux console could help... but there's too much IP in the way to make it happen.

Re:off on a tangent (2, Informative)

greazer (165571) | more than 6 years ago | (#22303648)

they got shunned the last round of consoles for IBM and ATI, and Microsoft pretty much let ATI write the book for DX10 this round.
Last I checked, the graphics chip in the PS3, a.k.a. the RSX, was NVIDIA-designed.

Re:off on a tangent (1)

Calinous (985536) | more than 6 years ago | (#22304214)

What about nVidia and Sun? Sun has plenty of intellectual property in the processor area - too bad its current processors are optimised for running many lower-performance threads.

Must bundle with GPU (4, Insightful)

Macfox (50100) | more than 6 years ago | (#22300710)

This won't float unless they bundle it with the next generation GPU. AGEIA haven't been able to get traction with a dedicated card and neither will nVidia, unless a heap of games support it overnight.

Re:Must bundle with GPU (5, Insightful)

Kyrubas (991784) | more than 6 years ago | (#22300816)

It might be that nVidia doesn't intend to use the overall PhysX product at all, but instead wants to tear it apart for the patents on specific designs to further optimize their GPUs.

Re:Must bundle with GPU (5, Interesting)

644bd346996 (1012333) | more than 6 years ago | (#22301380)

Don't forget that PhysX has software out there, too. It hasn't been doing well against Havok, but it's obviously in NVidia's best interests to promote the use of physics engines in games, seeing as they could provide the hardware acceleration for them. I expect the PhysX engine will soon have the ability to use NVidia GPUs, and it will be pushed as a more viable competitor to Havok, especially since Intel cancelled Havok FX.

Re:Must bundle with GPU (2, Interesting)

mrxak (727974) | more than 6 years ago | (#22300822)

That's just it, really. Games need to support it in large enough numbers, and need to do it well enough to make a difference between those without the cards and those that have them. Most people seem to think this is a joke, and the way CPUs are going anyway with extra cores, I think we'd be better off seeing multithreaded games instead of physics cards.

Re:Must bundle with GPU (4, Interesting)

Rival (14861) | more than 6 years ago | (#22301190)

Games are great at motivating the development of better video cards, and to some extent bus speeds, processors and other non-gaming-specific components. This is a good thing, though I have some old-man opinions on how Moore's Law is spoiling many developers.

That being said, I don't believe games drive the adoption of hardware as much as you might be thinking. As a case in point, look at Vista. Ugly and bloated, yes, but perforce nearly everywhere. And the minimum requirements for Aero (which is the one feature your average user is going to jump on -- ooh, it's pretty!) are going to do more to push the next large jump in base video card standards than any given game.

Retailers don't have enough fiscal incentives to stop pushing Vista, even if they do try to gain positive PR by selling Ubuntu or XP on a few low-end models. And if they're pushing Vista, they want to support the pretty interface the public expects. By making hardware-accelerated rendering a practical requirement of the OS, Microsoft has raised the bar of the "minimum acceptable" video card.

Right now we see physics cards as a niche product, barely supported. It has been the same with all technical developments. But if we're heading toward 3D interfaces (which I believe we are), then physics can only play an increasing role in such an environment. If that should become the case, then a dedicated processor will be much more valuable than assigning a generic CPU core to try to handle the calculations.

Re:Must bundle with GPU (1)

mabhatter654 (561290) | more than 6 years ago | (#22302756)

Intel has already killed that train... they made the GMA950 and X3100 to be "just enough" that Microsoft would certify them for "full" Vista effects. Once that happened, gaming on any store-bought PC under $1,000 was pretty much dead. Microsoft, Intel, and the OEMs all want to milk the market and charge twice the profit for "gaming" PCs, even though the low-end PC now is twice as fast as three years ago... except for the five-year-old graphics chip!

Re:Must bundle with GPU (3, Interesting)

Wesley Felter (138342) | more than 6 years ago | (#22303658)

There's nothing stopping you from buying a low-end PC and installing a real GPU. AFAIK, most systems with integrated graphics still have a PCI Express slot so you can upgrade.

I also don't see any gouging going on in gaming PCs. I recently built a $1000 gaming PC, and prebuilt models with similar specs were selling for $1100-1200, which is not much of a markup.

Re:Must bundle with GPU (1)

Clanked (1156473) | more than 6 years ago | (#22300842)

Game makers won't require something people don't have, and the vast majority won't buy something they don't need. NVIDIA, however, could throw this on their cards and make it a big part of "The Way It's Meant to Be Played." This would really give them something over ATI (other than faster cards :p)

Re:Must bundle with GPU (2, Interesting)

Rival (14861) | more than 6 years ago | (#22300910)

I see your floating point.

The way I picture things, a Physics Processing unit (PPU?) will end up like FPUs: at first an optional, narrow-use add-on, then integrated on premium products, then more widespread as software vendors feel comfortable relying on it, and finally ubiquitous and practically indispensable.

And then Slashdotters will be able to say, "You kids with your integrated PPUs nowadays -- when I was your age, we had to calculate trajectories and drag coefficients by hand, and we liked it that way!"

Re:Must bundle with GPU (2, Funny)

Grave (8234) | more than 6 years ago | (#22302584)

And then Slashdotters will be able to say, "You kids with your integrated PPUs nowadays -- when I was your age, we had to calculate trajectories and drag coefficients by hand, and we liked it that way!"
But I already say that...

Re:Must bundle with GPU (5, Informative)

RelliK (4466) | more than 6 years ago | (#22300970)

I always thought that GPU + physics engine would be a perfect combination. Ultimately, the AGEIA card is just a DSP + software driver for calculating physics. A GPU is... also a DSP + software driver for calculating graphics. It wouldn't be too hard to write a driver that does both: some of the pipelines could be allocated to graphics, and some to physics. It might even be made software-configurable, to dedicate more or fewer units to physics.
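For what it's worth, here's a minimal sketch of that idea in CUDA terms. The kernels are hypothetical placeholders, and whether the two launches actually share the GPU's units concurrently is entirely up to the hardware and driver; CUDA exposes no explicit "N pipelines for physics" knob:

```cpp
// Sketch: share one GPU between a physics pass and a graphics-side pass
// by launching them on separate CUDA streams. Kernels are toy placeholders.
#include <cuda_runtime.h>

__global__ void physicsStep(float3 *pos, float3 *vel, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {                      // simple per-particle Euler integration
        pos[i].x += vel[i].x * dt;
        pos[i].y += vel[i].y * dt;
        pos[i].z += vel[i].z * dt;
    }
}

__global__ void prepVertices(float3 *verts, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) verts[i].y += 0.0f;    // stand-in for real graphics-side work
}

int main() {
    const int n = 1 << 20;
    float3 *pos, *vel, *verts;
    cudaMalloc(&pos,   n * sizeof(float3));
    cudaMalloc(&vel,   n * sizeof(float3));
    cudaMalloc(&verts, n * sizeof(float3));

    cudaStream_t physics, graphics;
    cudaStreamCreate(&physics);
    cudaStreamCreate(&graphics);

    dim3 block(256), grid((n + 255) / 256);
    physicsStep<<<grid, block, 0, physics>>>(pos, vel, n, 1.0f / 60.0f);
    prepVertices<<<grid, block, 0, graphics>>>(verts, n);

    cudaDeviceSynchronize();          // wait for both passes to finish
    cudaStreamDestroy(physics);
    cudaStreamDestroy(graphics);
    cudaFree(pos); cudaFree(vel); cudaFree(verts);
    return 0;
}
```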

Re:Must bundle with GPU (2, Informative)

milsoRgen (1016505) | more than 6 years ago | (#22301082)

The interesting thing about the processor on an AGEIA card is that it's similar in design to an IBM Cell processor [wikipedia.org], just with fewer SPEs...

I can't seem to find the link to the paper that discussed it in detail; if I find it I'll post it later...

Re:Must bundle with GPU (3, Insightful)

RelliK (4466) | more than 6 years ago | (#22301164)

Every GPU is similar in design to IBM Cell. It's just a simple but massively parallel DSP with very fast local memory.

Re:Must bundle with GPU (5, Informative)

milsoRgen (1016505) | more than 6 years ago | (#22301292)

http://en.wikipedia.org/wiki/Physics_processing_unit#Cell_Processor_vs_PPUs [wikipedia.org]
http://en.wikipedia.org/wiki/Physics_processing_unit#GPUs_vs_PPUs [wikipedia.org]

There are differences. Otherwise Sony wouldn't have wet themselves when they announced Cell technology in the PS3, or Microsoft could have countered that their ATI GPU was pretty much the same thing, or more powerful, or however the marketing types would have spun it if that were the case.

Re:Must bundle with GPU (1)

cheier (790875) | more than 6 years ago | (#22302964)

The means already exist to use the GPU as a general-purpose physics engine. For some reason, this deal doesn't surprise me. NVIDIA gets a team of hardware engineers out of the deal, and IP for PhysX that they can now convert into CUDA. The unfortunate side is that CUDA is an NVIDIA-only API for their GeForce 8 series GPUs, thus eliminating any chance of it being supported on any of ATI's latest GPUs.

It wouldn't surprise me to see an SLI setup used, but with the PCIe bus dedicated almost exclusively to offloading physics math to the second GPU, since through the bus and CUDA they can also access all of the GPU memory on board the second card.
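As an illustration (not NVIDIA's actual plan), a hedged sketch of the "second card as a dedicated physics processor" idea: pick the second CUDA device, keep the physics state resident in its memory, and pull only the results back over PCIe each frame. The kernel and sizes here are hypothetical, and error checking is omitted.

```cpp
// Sketch: dedicate a second GPU to physics and read results back over PCIe.
#include <cuda_runtime.h>
#include <vector>

__global__ void physicsStep(float4 *bodies, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) bodies[i].y -= 9.81f * dt * dt;   // toy gravity-only update
}

int main() {
    int deviceCount = 0;
    cudaGetDeviceCount(&deviceCount);
    cudaSetDevice(deviceCount > 1 ? 1 : 0);      // use the second card if present

    const int n = 65536;
    const float4 start = {0.0f, 100.0f, 0.0f, 1.0f};
    std::vector<float4> host(n, start);

    float4 *bodies;
    cudaMalloc(&bodies, n * sizeof(float4));
    cudaMemcpy(bodies, host.data(), n * sizeof(float4), cudaMemcpyHostToDevice);

    for (int frame = 0; frame < 600; ++frame) {
        physicsStep<<<(n + 255) / 256, 256>>>(bodies, n, 1.0f / 60.0f);
        // Only the updated state crosses the bus; everything else stays put.
        cudaMemcpy(host.data(), bodies, n * sizeof(float4),
                   cudaMemcpyDeviceToHost);
    }

    cudaFree(bodies);
    return 0;
}
```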

The two architectures are subtly different... (1)

Joce640k (829181) | more than 6 years ago | (#22303668)

Graphics and physics are subtly different tasks. GPUs aren't good at physics and vice versa. A chip which can do both will be a jack-of-all-trades, master of neither. They need to be separate/parallel processors, even if they're on the same chip.

AGEIA's problem is that they're kinda obscure and don't make custom chips.

What NVIDIA brings to the table is a strong brand name and large-scale manufacturing. If they can get the price of the PPU down to half of what it is now (by integrating it into the graphics card and improving the process), then they can use their brand name to sell a bucketload.

The trick is to not make it NVIDIA-only because game developers wouldn't buy into that. The trick is to make it run about 25% better on NVIDIA. 25% is enough to swing the buying decision of a hardcore gamer but not enough to scare a game developer off using their SDK.

Re:Must bundle with GPU (3, Interesting)

mikael (484) | more than 6 years ago | (#22301340)

With the current PC architecture, the CPU has to send data to the physics card, read the data back, then finally send it down to the GPU. This would have to be done for things like character animation (ragdoll motion) and particle systems for visual effects (bouncing off the scenery/characters). Ideally, you would want the physics processor to have a direct path to the GPU. Then you could avoid two of these steps.

And if nothing else, Nvidia also get a team of engineers who have worked together and have both DSP and current game industry technology experience.

Re:Must bundle with GPU (1)

OnanTheBarbarian (245959) | more than 6 years ago | (#22302824)

This is true as far as it goes, but not all the results of physics computations can simply be left on the GPU. This is OK for visual effects and improving animation, but if the outcome of a physics computation has an effect on the game world as a whole, then it needs to be sent to the CPU anyhow (as a rough example, suppose a defeated enemy drops his sword, which bounces down a cliff - this is more than just eye candy).

'Jiggle physics' and particle systems, of course, can stay on-GPU.

Re:Must bundle with GPU (1)

StargateSteve (1054492) | more than 6 years ago | (#22303314)

What about a hybrid card? One half physics, one half GPU? It would make the whole thing less expensive for consumers, and would deliver some killer physics that ATI couldn't touch.

Re:Must bundle with GPU (1)

vision864 (712184) | more than 6 years ago | (#22303740)

Most gamers who have the coin to drop have been down this road before, years back: hmm, a dedicated PCI game accelerator. Where have I seen that before???

Nobody wants to go through the 3dfx era again, as great as it may have been during its existence. And yes, I know it's not a graphics card, it's a physics card. Whatever you want to call it, it's a damn strap-on game accelerator. Next thing you know they'll be putting a cute little "optimized for" logo on games... oh wait, I'll shut up now.

Here comes the bandwagon... (1, Insightful)

Tpl2000 (1174767) | more than 6 years ago | (#22300750)

I, for one, welcome our new fairy overlords. I also welcome whoever gets rid of this joke.

The Future of Physics (5, Funny)

calebt3 (1098475) | more than 6 years ago | (#22300792)

the future of physics
I am personally hoping that the future of physics leads to warp engines.

Re:The Future of Physics (5, Funny)

Anonymous Coward | more than 6 years ago | (#22301494)

I'd be satisfied with socks that stay up by themselves.

Socks that stay up (1)

enoz (1181117) | more than 6 years ago | (#22302660)

Well go and buy some Computer Socks [holeproof.com.au] .

Silly AC.

Re:Socks that stay up (0)

Anonymous Coward | more than 6 years ago | (#22303486)

But Computer Socks fall up. That sounds uncomfortable.

Re:The Future of Physics (1)

Antarius (542615) | more than 6 years ago | (#22302958)

I'd be satisfied with socks that stay up by themselves.
Or underwear that doesn't crawl up your arse...

Or a bridge from Melbourne to Tasmania...

Or a version of Vista that doesn't SUCK!
(To which the Genie replied: "Is that bridge 4 lanes or 6?")

Re:The Future of Physics (1)

MrShaggy (683273) | more than 6 years ago | (#22303386)

Or a sock that can get it up.. http://www.edthesock.com/ [edthesock.com]

I don't know about you, (0)

Anonymous Coward | more than 6 years ago | (#22302102)

but I'd prefer the ability to bend the fabric of space and find myself at my destination. Travelling without Moving, as they say.

I doubt it (0)

Anonymous Coward | more than 6 years ago | (#22302420)

The speed of light seems pretty tough to beat.

Re:The Future of Physics (2, Funny)

bky1701 (979071) | more than 6 years ago | (#22302598)

I'll be happy when someone starts to enforce the laws of physics. I mean, do you ever hear about someone in court because they violated the second law of thermodynamics? Didn't think so. What are we going to do when people stop following them?? It will be lunacy, I tell you!

More Fuel For The Nvidia CPU Fire. (0, Flamebait)

milsoRgen (1016505) | more than 6 years ago | (#22300818)

It's long been speculated that nVidia has been developing an x86 processor. I think the great work they did on the 8xxx-series cards and the fantastic chipsets they have produced really lend credence to this theory. They could make a very strong processor, especially as it appears we're heading toward CPUs with heavy GPU integration. AMD Fusion... it's not so much about putting a graphics card in a chip; it's about making the CPU do what GPUs excel at. Very exciting times for hardware enthusiasts indeed!

Re:More Fuel For The Nvidia CPU Fire. (0)

Anonymous Coward | more than 6 years ago | (#22300872)

Yeah, an nVidia CPU would be great! If you needed to heat a whole house, that is. Going by their GPUs, it'd consume 3x the wattage of a comparable Intel/AMD and put out 4x the heat.

Re:More Fuel For The Nvidia CPU Fire. (2, Interesting)

milsoRgen (1016505) | more than 6 years ago | (#22301022)

Just look at Intel's rather quick turnaround from the P4 to the Core architecture. They were headed down the same road GPU makers are going, yet reversed course. Sure, it's mostly thanks to the Israeli development team that produced the Pentium M, which was in turn based on the Pentium 3. The fact of the matter is that nVidia has shown time and time again they can make a killer product. I believe they could make a highly efficient CPU with performance-per-watt ratios well in line with current products, if not better.

But on another note... the heat issue with GPUs really does need to be resolved. I'm using an X1800 XT ATI card, and I've come pretty close to 100C at times... I'm not quite sure how current-gen cards are doing in this area, but I doubt it's been anything like the P4-to-Core turnaround.

Fab capability... (3, Insightful)

Junta (36770) | more than 6 years ago | (#22301560)

I would disagree with your characterization of the migration from the P4 to Core as 'quick'. I also wouldn't say Intel had turned out a product that was competitive across the board with AMD until Core 2, when they finally pulled together good instructions-per-clock and the 64-bit instruction set. It took years for Intel to develop something that *almost* completely dominates the AMD equivalents (one could still make a case for the AMD memory architecture at scale, which Intel will counter with QPI this year). And the clock didn't start ticking until AMD forced their hand.

If it takes a company like Intel years to crank out something like that, a company with arguably the best fabrication capabilities in the world, what are nVidia's chances, given that only now are they feasibly able to leverage 65 nm fabrication processes to manufacture their chips? Fabrication processes aren't everything, but they're a decent indicator of how the cards would be stacked against nVidia going into that market.

I personally would love to see nVidia enter the market with a viable offering, if only because I fear AMD is blowing the situation and the market desperately needs comparable vendors to compete, but I'm not optimistic about nVidia's capabilities.

Re:More Fuel For The Nvidia CPU Fire. (1)

MadnessASAP (1052274) | more than 6 years ago | (#22301594)

I don't want to insult you by pointing out something that may seem obvious, but the reason the GPU is getting up to 100C is probably that the fan or heatsink has stopped working. Try checking the thermal paste and blowing out any dust that has built up. The only time I ever saw a chip get up to 100C was when an "A+" certified tech (who is also my uncle) unplugged the CPU fan while installing a stick of RAM in my cousin's computer.

Re:More Fuel For The Nvidia CPU Fire. (1)

mikael (484) | more than 6 years ago | (#22302506)

I had the same problem with a laptop - it turns out the intake vent for the cooling fan (on the underside of the laptop!) was clogged with dust. Using the narrow tip of a vacuum cleaner cleaned out all that junk. No more "CPU is running in modulated mode" messages.

Re:More Fuel For The Nvidia CPU Fire. (1)

Man Eating Duck (534479) | more than 6 years ago | (#22303710)

...probably because the fan or heatsink has stopped working.

That's true in my experience as well. I bought a second-hand GPU some time ago, and experienced severe instability. I fired up a hardware monitor, and sure enough, it was chugging along at 100-108 degrees Celsius under load. I was quite astounded that it didn't burn out.

I opened the case, and the fan and air ducts on the card were completely clogged. The fan spun, but moved no air at all. I removed a solid block of lint from the ducts after disassembling the card; problem solved. The temperature sensors weren't broken either, as I now get around 40-50 degrees on the GPU under load.

Incidentally, even if you don't have such serious problems, cleaning the dust and lint out of the various fans in your case will make your computer a lot quieter, both because the fans can run slower when they work effectively and because there's less friction between the fan blades and the gunk. Recommended at the price :)

Re:More Fuel For The Nvidia CPU Fire. (0)

Anonymous Coward | more than 6 years ago | (#22301132)

it'd consume 3x the wattage of a comparable intel/amd and put out 4x the heat.

In this house, we obey the laws of thermodynamics!

Re:More Fuel For The Nvidia CPU Fire. (2, Insightful)

webmaster404 (1148909) | more than 6 years ago | (#22301008)

I don't really think all this will be better in the long run. Faster GPUs and better cards mean faster games, but with all the DRM Vista has, they end up more expensive and performing worse. Linux lacks the games to really put the cards through their paces, and getting ATI/Nvidia drivers working is a pain to say the least; OS X really doesn't support non-Apple internal hardware very well, so that's no test either. Hardware-wise we are making leaps and bounds every day, but without a decent OS to test the new cards on, their true potential will be lost in DRM/Vista/driver issues.

Re:More Fuel For The Nvidia CPU Fire. (1)

milsoRgen (1016505) | more than 6 years ago | (#22301158)

Yeah, I agree... Unfortunately.

I don't like being tied to Vista for the latest hardware support when it comes to gaming. Microsoft developers have gone on record as saying there is no reason DirectX 10/10.1 couldn't be used with XP; they just wanted it to serve as a 'dividing point' at which to ditch all the old tech... something like that. It was in Maximum PC a while back... Makes ya wonder what happened to OpenGL in the PC gaming market.

But to veer a little, I will say I found the driver issues a bit overblown, at least for me, using the RTM edition of Vista a while back. Once I disabled things like UAC and reverted the GUI to 'classic', I really wasn't disappointed in the performance, nor was I impressed. I think the real problem with Vista is that, yes, it does consume more resources than it needs to, but also that perception is reality, and the perception is that Vista is a pile... I agree to an extent, but not as much as some.

Re:More Fuel For The Nvidia CPU Fire. (1)

The Analog Kid (565327) | more than 6 years ago | (#22302386)

Makes ya wonder what happened to OpenGL in the PC gaming market.

Maybe if OpenGL development were run as a dictatorship rather than an oligarchy, it would catch on. Right now it's lagging behind DirectX feature-wise, because development by committee is slow.

Re:More Fuel For The Nvidia CPU Fire. (1)

DAldredge (2353) | more than 6 years ago | (#22301604)

What issues does Vista have with DRM when playing non-DRMed content?

Re:More Fuel For The Nvidia CPU Fire. (1)

khellendros1984 (792761) | more than 6 years ago | (#22302186)

The drivers still do a hardware tampering check... the "tilt bit". If the driver detects anything screwy with the hardware (voltage fluctuations, etc.), it resets the machine. Even when running non-protected content this is being checked, which reduces performance and makes it possible for the machine to reboot at arbitrary times. http://talkback.zdnet.com/5208-12558-0.html?forumID=1&threadID=28793&messageID=537882&start=26 [zdnet.com]

Re:More Fuel For The Nvidia CPU Fire. (1)

westlake (615356) | more than 6 years ago | (#22302858)

While faster GPUs and better cards mean faster games, with all the DRM that Vista has it makes them more expensive and have poorer performance

You do not have to max out your credit card to get good performance in a DX10 card:

The Radeon 3850 brings us something we've been begging for ever since the DirectX 10 cards were introduced: a sub-$200 card with performance comparable to the high-end products. The Radeon 3850 delivers Geforce 8800 GTS 320mb performance for $100+ less. If you're looking to get the best possible performance for the dollar, this card hits the sweet spot. Best Gaming Graphics Cards: February 2008 [tomshardware.com]

Vista can play protected HD content at full HD resolution. The PS3 can do this. OSX can do this. The set top box with an embedded Linux OS can do this. No HD means no mass market sales.

Re:More Fuel For The Nvidia CPU Fire. (1)

RiotingPacifist (1228016) | more than 6 years ago | (#22303062)

Innovation died :( There's no real innovation in games; they all trundle along in the same direction: better graphics, better physics, better AI. Never anything more than better, though. It would be nice if they put some effort into new areas and saw what happened. As a result there's no point in selling a game to everybody; they just trundle it out to the masses.

It's a real shame that game development doesn't lend itself to OSS; there are so many areas where a bit of cross-genre thinking would produce groundbreaking games (like the multiplayer game where you manipulate the levels; it was around before Garry's Mod and allows a lot more). I really think AI is the way games need to go; as neural nets mature (say 3-5 years) the AI could generate content and try to make every play-through different. For a standard game developer this would be a lot of work, but for OSS it would be half done already.

It's a shame about OpenGL. I think the standards -> code/hardware way of working is inherently flawed; it would be much better if it were code(a) -> standards -> hardware -> code(f), where code(a) would be developed with new features and code(f) would have whatever the standards endorse. The development model would be quite nice too, as code(a) would include people working on a whole host of different functions (designer stuff, game stuff, embedded stuff, etc.) which, even if rejected by the standards body, could be implemented in specific hardware.

I wonder how this will affect AMD's GPU offerings (2, Interesting)

Scareduck (177470) | more than 6 years ago | (#22300838)

I don't pay close attention to the GPU market in general, though lately I've been interested in a few numerical modeling projects that could benefit from high-performance computing. The AMD Firestream 9170 [amd.com] is supposed to be released in the first quarter of this year, with a peak speed of 500 GFLOPS, most likely single-precision, but the beauty part is that it should also support double-precision, the numeric standard for most computational modeling. NVidia's option in this space is the Tesla C870 [nvidia.com] ; I wonder whether this move to purchase another GPU line will divert resources away from their number-crunching-first GPUs.

Not Good (1)

sexconker (1179573) | more than 6 years ago | (#22300860)

Intel has Havok, Nvidia has Ageia, AMD/ATI (DAAMIT) has nothing.

So developers will have to make 3 versions of the game, then?

Can't wait for DirectX 11(tm) Now with Fizziks Power (tm).

Re:Not Good (1)

Loadmaster (720754) | more than 6 years ago | (#22301018)

Oh noes! There's no hope for Duke Nukem Forever if this happens! They can't complete one game let alone three.

At least Satan can put away his parka for another year, and we can rest assured the sun is not going red giant anytime soon.

Re:Not Good (1)

mrchaotica (681592) | more than 6 years ago | (#22302174)

Can't wait for DirectX 11(tm) Now with Fizziks Power (tm).

You say that sarcastically, but I actually would like to see something like "OpenPL" (Open Physics Language).

Re:Not Good (2, Interesting)

DeKO (671377) | more than 6 years ago | (#22302956)

That's pretty much unfeasible. Every game needs a different physics simulation. Rigid bodies, ropes, soft bodies, particles, cloth, and so on; each requires a very different strategy. And there are many special cases where you can customize the algorithms for your specific simulation; using a more general algorithm when a specialized one is possible is less efficient.

And this doesn't even get into the details of strategy: continuous vs. fixed time steps, different orders of integration, collision detection, and so on. Each has its own quirks, and Nintendo keeps proving that you can create superb games using almost no physics.

Let's get physical. (-1, Redundant)

Anonymous Coward | more than 6 years ago | (#22300864)

So now the Linux desktop can get physical.

Whither Nvidia/PhysX? (4, Insightful)

Crypto Gnome (651401) | more than 6 years ago | (#22300932)

  1. Purchase Ageia
  2. Continue selling dedicated Physics addon-cards
  3. Integrate PPU onto Graphics Cards
  4. (somewhere along the line, get full Microsoft Direct-Something endorsement/support of dedicated physics processing)
    • possibly by licensing "PPU included on Graphics Card" rights to AMD, thus invoking the power of the Least Common Denominator
  5. Integrate PPU circuitry/logics into GPU (making it faster/more efficient/cheaper than equivalent solution licensed to AMD)
  6. ?? Profit ??
In the end, for this to *really* succeed, it needs to be a "Least Common Denominator" factor. So it *requires* full support from Microsoft and DirectX (them being The Big Factors in the games industry). And in order to get full support from The Windows Monopolist, you'll probably need to enable AMD/ATI to leverage this technology, to some degree (not absolutely necessary, but it'd make it much easier to convince Microsoft).

Remember, folks: Nvidia don't need to *kill* AMD/ATI, they only need to stay one or two generations ahead of them in technology. So they *could* license them "last year's tech" for use on their cards, to make "Least Common Denominator" not a factor that excludes their latest-gen tech implementations.

Re:Whither Nvidia/PhysX? (2, Insightful)

Antarius (542615) | more than 6 years ago | (#22304204)

Nvidia don't need to *kill* AMD/ATI, they only need to stay one or two generations ahead of them in technology. So they *could* license them "last year's tech" for use on their cards, to make "Least Common Denominator" not a factor that excludes their latest-gen tech implementations.
I wish I still had Mod-Points, 'cos that deserves a +1, Insightful!

Yes, people seem to forget that business doesn't have to be ruthless. Sure, you can take that path and it has been proven to be effective by people in many industries, including IT. Punctuating your sentences with chairs can also help emphasise a point.

Many successful large companies quickly learn that the "Us vs. Them" mentality isn't always necessary - and licensing IP or standards in this fashion can be quite lucrative! (Oh no... I made a positive reference on Slashdot to valid IP & standards being all right to license for profit... there goes my karma!*)

Intel's licensing of its SSE extensions to its competitors is a good example of how a standard can be strengthened and made more effective by 'working with' the competition, as was AMD's licensing of x86-64 to Transmeta.

Of course, this is NVIDIA we're talking about. The likelihood of them licensing it, even for profit, is about as high as Microsoft donating millions (of dollars, not bugs) to the WINE project...

*For the FRZs, I am against Patent Trolls, but for a company/individual's right to profit from a defined standard if another company wants to benefit from their R&D rather than re-invent the wheel! This is, of course, completely different to Joe Scumbag getting a Patent for some-general-nose-picking-device (idea only, no intention to develop) and then extorting any companies that then try to develop a real nose-picking-device. That would be "Just Plain Wrong(tm)"

So you see, I'm a good sycophAnt... I hate Darl McBride too! Don't take it out on my posts, please!

Interesting news. (3, Insightful)

Besna (1175279) | more than 6 years ago | (#22300982)

The computing industry is seeing a dramatic shift towards single-package parallelism. Yet again, the x86 architecture largely holds the CPU back from becoming more all-purpose and taking on GPU and PPU workloads. There are actual engineering reasons you can't have a truly general-purpose ASIC (you can with an FPGA, but that would be too slow for the purpose). The GPU and PPU are where the interesting stuff is. They can actually write new macroarchitecture! They can design on-chip parallelism with far greater complexity, without the need for a backwards-compatible architecture.

The exciting aspect to this acquisition is the stronger fusion of two companies that have the ability to harness processing power without historical limitations. ATI/AMD really didn't have this, with AMD stuck with x86. Something like Cell is interesting in this space. However, it lacks flexibility in matching up the main core with the secondary cores. Why bring in PowerPC, for that matter?

This will lead to great things. It is fun again to follow computer architecture.

Why do we need physics cards? (2, Interesting)

Cathoderoytube (1088737) | more than 6 years ago | (#22301006)

So, I'm assuming I'm not getting all the physics simulation quality I can get out of my games? The whole deal with the bridges collapsing in real time and all sorts of junk bouncing around isn't the ultimate physics experience? Is there... another level of ragdoll I'm not experiencing? Is there some dynamic to a flaming barrel rolling down a hill my computer can't handle?! Or... or... is it Nvidia making one of its patented cash grabs?! Considering all the physics simulation in games to date has been done on the processor with no performance hit (have you played the last level in Half-Life Episode 2?), I'm finding the notion of a dedicated physics card fairly stupid. But that's just me.

Re:Why do we need physics cards? (1)

dbIII (701233) | more than 6 years ago | (#22301268)

Consider something like WoW - lots of pretty pictures, but you can run through people or monsters. Collision detection requires a bit of effort when there are a lot of objects.

Re:Why do we need physics cards? (1)

maglor_83 (856254) | more than 6 years ago | (#22301856)

That's not why there is no collision detection in WoW. You already know what objects are nearby - the server isn't sending you location data for every person currently logged on. There is no collision detection because otherwise a whole bunch of jerks would line up in a row to stop people from getting to where they want to go.

Re:Why do we need physics cards? (1)

Grave (8234) | more than 6 years ago | (#22302682)

That and the fact that to have that sort of real time physical accuracy, you must also have enormous amounts of data going back and forth in real time, which would make the game laggier than it already is in high-pop areas.

Re:Why do we need physics cards? (4, Insightful)

NeMon'ess (160583) | more than 6 years ago | (#22303642)

With physics acceleration, the little things that don't feel real could be done.

Running through grass could cause it to deform and brush the character, with some of it getting stepped on and staying bent down. Or in sports games, each limb could have a better-defined clipping box and rules for how it can flex.

Then when two players collide going for a ball, they hit more realistically and don't clip through each other. Especially on the slow motion replays it would look nice.

Or in a racing game, when cars crash, they could really crash. Imagine bodywork deforming and "real" parts going flying, instead of only a flash of sparks.

Also, it would be cool for grenades and other explosives to properly damage the room and buildings in games that want realism. Walls that crumble into rubble. Tables that break into chunks and splinters. Ceilings that collapse when the supports are destroyed or weakened too much.

Then outside, no more indestructible walls. When I ram a truck or tank into an unreinforced building, something actually happens. As in the vehicle crashes through the wall, or continues through the building with momentum.

Re:Why do we need physics cards? (2, Interesting)

Charcharodon (611187) | more than 6 years ago | (#22303670)

Actually, the things you are missing are the bridge pieces bending before the bridge collapses, the barrel being dented as it rolls down the hill, or the ragdoll limbs breaking or being ripped off with the proper application of force. Those things cannot be done in real time on a CPU.

Unfortunately most of those things are only available in demos at the moment. UT3 has a couple of special maps that do some neat stuff, but then you start running into problems with the video card trying to keep up with the 100 or so bricks that just came crashing down from the wall you demolished.

In the main game the only PhysX I noticed was the cloth simulation on the flags and the main character's outfit; of course, you don't exactly have a whole lot of time to take this all in since everyone is trying to kill you.

Ageia is not just hardware physics; the software they make does a pretty good job. The vehicles in UT3 are some of the best I've ever seen. I started laughing during one game because I managed to get a small vehicle wedged under my tank and kept going, dragging it along, versus instantly getting stuck while the CPU sits there trying to figure out the clipping and collision detection.

The best idea in my book would be for Nvidia to integrate the function into their video cards but keep it dormant, so the card is only used for video at first; then when you upgrade, the new card takes over the video and the old one moves over a slot and becomes a dedicated PPU in the second SLI slot.

A physics card is just dual-core for the idiot (2, Interesting)

idonthack (883680) | more than 6 years ago | (#22301328)

With dual-core now coming standard on all new PCs, and multi-core rapidly approaching, physics cards are done for. Graphics cards are still a good idea because the kind of calculations they do can be heavily hardware-optimized in a way that general-purpose CPUs are not, but physics cards don't do anything a second (or fourth) full-speed CPU core isn't capable of doing better and faster.

Re:A physics card is just dual-core for the idiot (1)

gbelteshazzar (1214658) | more than 6 years ago | (#22301434)

This is the key: quad-core CPUs are around the corner; use one core for physics. Sure, a CPU core may not be great at physics (well, not as good as a dedicated chip), but consumers won't see much difference. Maybe use one of the cores of a quad-core GPU. Anyway, the dedicated physics card is dead in the water. Pull the patents out and plug them into the GPU.

Re:A physics card is just dual-core for the idiot (1)

XaXXon (202882) | more than 6 years ago | (#22301472)

Shoot. With all those available cores, let's move everything back to the CPU. Get rid of graphics cards.

Oh wait. General purpose CPUs aren't very good at certain types of workloads.

Re:A physics card is just dual-core for the idiot (0)

Anonymous Coward | more than 6 years ago | (#22301680)

So you integrate a GPU and PPU into the CPU much like your FPU. Don't you remember the 386/387 and earlier? When the 486 was released, the biggest change was it had an integrated FPU.

Re:A physics card is just dual-core for the idiot (0)

Anonymous Coward | more than 6 years ago | (#22303396)

The 486DX had an integrated FPU; the SX was FPU-less and required either an 80487 chip or a replacement 486DX chip in a socket provided on the motherboard.

Thankfully I skipped that whole debacle and had a DX from the get-go, but there were mobos set up for one or the other.

Re:A physics card is just dual-core for the idiot (0)

Anonymous Coward | more than 6 years ago | (#22301830)

Did you even read the post you replied to?

Re:A physics card is just dual-core for the idiot (1)

Tolkien (664315) | more than 6 years ago | (#22301586)

You're missing the point.

The whole point of a physics card is to move the calculations away from the CPU (which is so generalized it can't be optimized as well as dedicated hardware). Having a card dedicated to processing physics simulations means the work gets 100% of the PPU's attention instead of 10% of CPU1's attention, 11% of CPU2's, 5% of CPU3's, and 13% of CPU4's (this is, after all, a PPU, not a CPU). Not only that, the PPU and the hardware it sits on are optimized for pure physics calculations: not graphics, not playing music or mailing spam, physics. A CPU is not. This leaves more CPU time available for those irritating background processes that kick in at all the worst possible times during multiplayer games, so they impact your game's performance that much less.

It's the exact same principle as a graphics card. Yes, current non-physics hardware can simulate half-decent physics, though in practice ragdoll physics et al. get repetitive very quickly (still better than prefab animations, though). Good luck successfully reproducing REAL physics without specialized dedicated hardware.

Re:A physics card is just dual-core for the idiot (2, Insightful)

Tolkien (664315) | more than 6 years ago | (#22301646)

I didn't think of this until after I posted, but how do you think graphics cards came about? They started off integrated with motherboards too, and then it was discovered that dedicated hardware can perform MUCH better relatively cheaply. Same deal with sound cards, ditto network cards, what with the KillerNIC now. It's pure logic, really: specialization leads to better performance.

Re:A physics card is just dual-core for the idiot (1)

Webmonger (24302) | more than 6 years ago | (#22301816)

I didn't think of this until after I posted, but how do you think graphics cards came about?
How did graphics cards come about? Wow, you must be young. Let me tell you.
Sometimes, when a mainframe and a television set love each other very much...

Naw, man. At least in terms of the PC architecture, graphics hardware has been available in expansion card form since the original IBM PC offered your choice of MDA (text only) and CGA (limited four-color) cards.

Re:A physics card is just dual-core for the idiot (1)

Tolkien (664315) | more than 6 years ago | (#22303362)

I'm 25. The first computer I toyed with was an Apple IIe. :) At the time I was 8-ish, though, so I got to enjoy the games on those nice big 5¼" floppies, heh. That's where my knowledge begins. :) Programming came in grades 5-6 and after.

My point remains (and this is where I rephrase it to what I figure is correct): the reason components get integrated (aside from early releases and prototypes and such) is that the technology in question has gotten mature enough and/or plateaued in advancement, at which point it takes another breakthrough before it becomes popular to split that component off into its own piece of hardware again. Then there's the fact that it's good for business, etc., etc.

I'm probably mildly (or wildly) inaccurate in some respect because there are 1001 interpretations of every aspect and issue I touched on, so feel free to correct me or add to what I've said. :)

Re:A physics card is just dual-core for the idiot (1)

UncleFluffy (164860) | more than 6 years ago | (#22303566)

The classic description of this process is the Wheel Of Reincarnation [google.com] .

Re:A physics card is just dual-core for the idiot (0)

Anonymous Coward | more than 6 years ago | (#22303796)

The real reason that things became integrated was cost. It was simply cheaper for the manufacturer to slap a cut-down display chip on a motherboard (usually sharing video memory with system memory) than it was to provide a full card. But you will also note that discrete cards NEVER disappeared during this time. Integrated became an easy and cheap solution to sell PCs to the masses and businesses, where performance and upgradability don't matter much. This is not to say that integrated components cannot be as fast or faster (look at the gaming laptops) than discrete components, but if you look at the motivation for it (cost), it wouldn't make much sense to integrate a high end part with supporting logic circuits.

One product comes to mind in all of this: the original Sony PlayStation. By all accounts it should have been far weaker than the PCs of the era, yet it consistently beat them in terms of graphics ability. Why is that? Because its lowly 33MHz MIPS CPU had integrated components such as the GTE and MDEC, which were capable of providing services to the CPU much faster than any external bus-driven part could.

My hope is that AMD or Intel begin to replace the internal FPUs of processors with GPUs.

Re:A physics card is just dual-core for the idiot (0)

Anonymous Coward | more than 6 years ago | (#22301822)

Umm, no. Discrete graphics cards were always required until relatively recent years, when they started integrating them onto motherboards. The same goes for sound cards and network adapters.

I remember the full length CGA for my solid steel cased Kaypro 8086, the Paradise EGA in my 286, the Trident VGA adapter in my 386, the Cirrus Logic SVGA in my 486, the Matrox Millennium II in my Pentium, the dual Voodoo 2's in my K6-2, the Voodoo 3 in my Pentium 3 and the Geforce 256 in my Athlon. It wasn't until after that when integrated video become standard fare for motherboards.

You must be either incredibly young or incredibly naive to not know this.

Re:A physics card is just dual-core for the idiot (1)

idonthack (883680) | more than 6 years ago | (#22301910)

I get the point of a PPU. But at this point brute force is cheaper and easier. CPUs do well enough at processing physics, and they are fast enough and cheap enough that it's smarter to run respectable physics code there instead of spending an extra $100 to $200 on an expansion card that provides almost zero gameplay enhancement. PPUs may have been a good idea ten or fifteen years ago when CPUs were slower, but game physics engines haven't improved significantly since 2003. Faster processors are available cheaper every day, and they only have to run physics code that ran well on even mediocre CPUs five years ago.

Improving game physics is now simply a matter of turning everything in the game world from a static object into a physics object, which will not even provide a significant gameplay improvement. Games like Crysis have nearly done it and they weren't even designed to run with a PPU.

Re:A physics card is just dual-core for the idiot (1)

RiotingPacifist (1228016) | more than 6 years ago | (#22302740)

Sure, for a first build $100-$200 may seem like a waste, but for my 5-year-old computer a physics card is going to save me from buying a whole new processor + mobo + gfx card, and then when I do upgrade I can keep the advantage of the physics card. It would also make integrating physics into everyday computing a lot nicer; Compiz isn't so nice on a busy integrated chip since it steals so much CPU, but with a GPU it barely affects what you're doing.

Re:A physics card is just dual-core for the idiot (3, Interesting)

sssssss27 (1117705) | more than 6 years ago | (#22302788)

I don't think you are thinking grand enough. I remember the days when you didn't need a dedicated graphics card to play games, and I'm only 21. You really didn't get improved gameplay per se, but it did look a heck of a lot better. A dedicated physics processor, though, has the potential to vastly improve gameplay and realism.

Imagine that instead of designers creating models of buildings, they actually built them. That is, a brick building would have individual bricks all stacked on each other. Whenever you hit it with an explosive it would actually crumble like a real building, or burn like a real building. That is a lot of calculations, which a general CPU isn't the best at.

The thing is, not enough people have PPUs in their computers, so you can't build them into core gameplay yet. Hopefully nVidia acquiring Ageia will let them start bundling it with their GPUs, or better yet offer it embedded on their motherboards. While graphics are easily scaled, gameplay elements are not. I wouldn't be surprised to see PPUs become crucial to games on consoles before PCs.

I guess I can stop waiting for Linux support (1)

Eric Smith (4379) | more than 6 years ago | (#22301360)

They've had a Linux version of their SDK for a long time, but it was a software-only version and didn't support their hardware. Given NVidia's lack of enthusiasm for Linux, I suppose if there was any chance that Ageia might have listened to those of us that wanted hardware support on Linux, it's gone now.

Re:I guess I can stop waiting for Linux support (3, Informative)

moosesocks (264553) | more than 6 years ago | (#22301428)

Unless something's changed in the year or two since I stopped using Nvidia, their drivers always tended to be quite good.

They were binary-only, but they were good in that they were fast, stable, and supported all the major functions of their cards. Hardly half-assed, if you ask me.

3D physical desktops (2, Informative)

RiotingPacifist (1228016) | more than 6 years ago | (#22302644)

I completely disagree. So nvidia don't open source their driver, but at the end of the day they release good binaries. I see no advantage to open source drivers for video cards:
*a community isn't going to develop video card drivers as well as the people who make the cards
*a community is much more likely to stall and slow down
*in most cases the fact that software is open source doesn't mean much, as one company or another has complete control over the product (look at OO)
The only arguments for it are that:
*more people will find the bugs (a moot point; look at FF: plenty of eyes on the code but still plenty of bugs)
*some genius could improve it (look at OO: it needs serious work in some areas but nobody bothers)
*there could be spyware in a binary (stop being paranoid)

The end effect of open sourcing the drivers would be similar to open sourcing Second Life: it just makes it easier to cheat with. No major work has been done on Second Life, but a few people have figured out how to gain unfair advantages (in the end it will either have to be closed (impossible) or they will waste CPU making sure you're not cheating (a real pain)).

Nvidia are fully committed to Linux; they release public betas that are usable on Linux. Sure, they might come a bit later than the Windows ones, but they do come. Why are ATI open sourcing their drivers? I'm guessing because Linux users were switching to nvidia, whose drivers worked; either that or they can't be arsed to support Linux anymore.

P.S. Titanic's special effects were done on Linux-Nvidia clusters.

ANYWAY... my point was that this is great news because it means Linux will get fully supported physics cards, meaning some graphics effects can become physical, or we can do some 3D physics on the desktop (not sure what we could do; maybe throw windows around inside the cube? Meh, I don't even have Compiz :'( )
All we need after that are a few open source apps to take full advantage of it.

*Hell, even for non-Linux users this is good news: if nvidia release separate cards, then Linux servers can start taking advantage of server-side physics, and allow even users without physics cards to benefit!

Havok (1)

Paul_Hindt (1129979) | more than 6 years ago | (#22301516)

Considering that I have seen far more games that use Havok than PhysX, I think Intel is at least somewhat in the better position as far as adoption goes. However, Nvidia could come up with some cool integrated hardware and really push that API to developers in order to gain some ground. On the other hand, consumers would have to bite, and it doesn't seem many have yet caught the physics fever. I have seen Havok used in numerous console games as well, but AFAIK that's only an API... will Nvidia try to push their own brand of physics hardware along with an API to that sector too? Looks interesting.

Side note: I wouldn't mind seeing more work being done on AI though, possibly leading to AI accelerators in the future.

Re:Havok (1)

RiotingPacifist (1228016) | more than 6 years ago | (#22302886)

I have great respect for Nvidia; they really seem to know what they're doing when it comes to drivers, so I can see them getting far if there's a market. Although the same can be said for Intel.

AIs in games are rule-based, so not much specialization can be done; all you could do is produce a simple logic-based CPU with a high clock, but as AI progressed the AIPU would eventually turn into a slightly simpler CPU (it probably wouldn't need to do floating-point calculations, but otherwise it would be pretty similar).
The one place where I can see an AIPU being useful is if game developers got to the point where they wanted to introduce sub-symbolic AI (neural-net-based AI), but there are enough problems with that idea anyway, and for the most useful usages of sub-symbolic AI (designing maps*, strategies*, or learning** to beat you), a CPU implementation would be fine as long as it was done between rounds.

* They probably wouldn't want to do this, as it would mean game devs could be gotten rid of; the game AI could make the levels & enemy strategies, and you'd get a lot more gameplay from a single game as it kept making levels.
** This is very distant future, as even in the lab teaching an AI is very hard and normally hit-and-miss.

I'm trying to think of other game parts that could use a *PU, but other than AI, looks, and physics there isn't much to a game.

Maybe an anti-cheat PU that keeps tabs on you without interfering with gameplay, but that does sound a bit extreme!


GPGPU and Nvidia (1)

volsung (378) | more than 6 years ago | (#22301726)

I'm surprised no one has mentioned CUDA [nvidia.com] yet, which is Nvidia's existing entry into the world of general purpose GPU computing. So far their target market is mostly dedicated calculations with limited interoperability with OpenGL/DirectX, but I expect we'll see future cards that can partition their compute resources between multiple tasks, like rendering and physics. Hopefully, porting over the PhysX SDK will help grow the GPGPU toolset, and make it easier to use.

(CUDA already transforms the 8800 GTX into quite an impressive array processor. With 128 floating point units and 768 MB of fast, fast memory, this card is chewing up the data-parallel compute tasks I'm throwing at it.)
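
To make the data-parallel model concrete, here is a minimal sketch of the kind of CUDA program being described: one thread per array element, with the data copied to the card and back. The kernel name, array size, and launch configuration are invented for illustration; this is generic CUDA, not code from Nvidia's or Ageia's SDKs.

// saxpy.cu - hypothetical minimal CUDA sketch; build with "nvcc saxpy.cu -o saxpy"
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Each thread computes one element; no thread touches another thread's data.
__global__ void saxpy(int n, float a, const float* x, float* y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;                 // one million elements (arbitrary)
    const size_t bytes = n * sizeof(float);

    float* hx = (float*)malloc(bytes);
    float* hy = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // 256 threads per block is a common but arbitrary choice.
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    saxpy<<<blocks, threads>>>(n, 3.0f, dx, dy);

    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);
    printf("y[0] = %f\n", hy[0]);          // expect 3*1 + 2 = 5

    cudaFree(dx); cudaFree(dy);
    free(hx); free(hy);
    return 0;
}

The same pattern, scaled up, is what makes a 128-unit card behave like an array processor: the work is split into many independent threads, and the host only pays for two copies across the bus.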

Where we are headed. (0)

Anonymous Coward | more than 6 years ago | (#22301728)

Where we are headed is a multi-core CPU with dozens of cores, about half general-purpose and half specialized for physics, graphics, and other purposes. The fabric of the chip will allow the cores to be rewired on the fly into "assembly lines" so that you can rapid-fire buffers of data from one core to the next.

There may even be Field Programmable Gate Arrays or programmable ASICs inside the same chip: compute cores that can be rewired on the fly into, say, a hardware-based protocol encoder or decoder.

Want to be prepared for the future as a programmer? Get a PS3, load Linux on it, and start learning Cell programming.

Nvidia and AMD were already working on Physics (2, Interesting)

BagOBones (574735) | more than 6 years ago | (#22301884)

A year ago, Nvidia and ATI/AMD both showed off their GPUs doing Havok acceleration equal to or better than AGEIA's, with ATI claiming to have a seven-month lead... Could this be a catch-up move or a patent grab by NVIDIA?

http://www.reghardware.co.uk/2006/06/06/ati_gpu_physics_pitch/ [reghardware.co.uk]

Re:Nvidia and AMD were already working on Physics (1)

LingNoi (1066278) | more than 6 years ago | (#22302066)

How great is the pipeline?

If I have a server with an example game running and it's calculating physics, can I then grab back that batch of models I sent off for physics and then send the updated coordinates to the players' computers? Or is it just part of the graphics rendering pipeline?

Re:Nvidia and AMD were already working on Physics (1)

Animats (122034) | more than 6 years ago | (#22302622)

Can I then grab back that batch of models I sent off for physics and then send the updated coordinates to the players' computers? Or is it just part of the graphics rendering pipeline?

It's mostly part of the graphics rendering pipeline. Ageia's "physics engine" is mostly used for particle effects (smoke, fire, rain, snowflakes, etc.), which don't affect the gameplay at all. There were attempts to use it for actual game physics, but the performance was no better than doing that on the main CPU. For particle physics, massive uncoordinated parallelism (the easy case) is possible; the particles don't affect other objects or even each other.
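
To make the "uncoordinated parallelism" point concrete, and to address the grandparent's question about grabbing results back: a particle step like this maps to one GPU thread per particle, and nothing stops the host from copying the updated positions back and shipping them to clients. This is a rough sketch only; the Particle struct, kernel, and time step are invented here, and it's generic CUDA-style GPGPU code, not Ageia's actual pipeline.

// particles.cu - hypothetical independent-particle step; build with nvcc
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

struct Particle { float x, y, z, vx, vy, vz; };

// One thread per particle; no particle reads or writes any other particle.
__global__ void step(Particle* p, int n, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    p[i].vy -= 9.81f * dt;   // simple gravity
    p[i].x  += p[i].vx * dt;
    p[i].y  += p[i].vy * dt;
    p[i].z  += p[i].vz * dt;
}

int main()
{
    const int n = 10000;                             // arbitrary particle count
    const size_t bytes = n * sizeof(Particle);

    Particle* host = (Particle*)calloc(n, sizeof(Particle));
    for (int i = 0; i < n; ++i) host[i].vy = 5.0f;   // toss everything upward

    Particle* dev;
    cudaMalloc(&dev, bytes);
    cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);

    step<<<(n + 255) / 256, 256>>>(dev, n, 0.016f);  // one ~60 Hz frame

    // The "grab it back" part: results return to the CPU, where they could be
    // sent over the network just as easily as handed to the renderer.
    cudaMemcpy(host, dev, bytes, cudaMemcpyDeviceToHost);
    printf("particle 0: y = %f, vy = %f\n", host[0].y, host[0].vy);

    cudaFree(dev);
    free(host);
    return 0;
}

Real game physics (rigid bodies, joints, collisions between interacting objects) is much harder to split up this way, which is the parent's point: the independent case is the one the hardware wins at.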

Geographical Ramifications (1)

BeeBeard (999187) | more than 6 years ago | (#22302050)

Does this mean that all boats on the Aegean Sea will now have to use proprietary rudders?

Talk about a niche market... (1)

WoTG (610710) | more than 6 years ago | (#22302168)

When you can count the number of games that support hardware physics on one hand (actually I made that up, please correct me if I'm wrong), you can be pretty sure that there isn't much volume in the PPU market.

Heck, fewer and fewer PCs come with dedicated GPUs. Integrated video can now handle dual monitor output and HDTV decoding. It's only gamers and graphics designers who need them now.

Re:Talk about a niche market... (1)

sssssss27 (1117705) | more than 6 years ago | (#22302910)

Heck, fewer and fewer PCs come with dedicated GPUs. Integrated video can now handle dual monitor output and HDTV decoding. It's only gamers and graphics designers who need them now.

Correct me if I'm wrong, but doesn't integrated video use an embedded GPU to do the bulk of the work? You can't run Vista with Aero turned on without a GPU, so all the Vista PCs shipping with Aero enabled have GPUs in them. I think you meant that fewer and fewer PCs come with dedicated graphics cards or high-end GPUs, which is the same reason fewer and fewer PCs require separate NICs or audio cards. It's getting to the point where embedded solutions are good enough for most people.

Re:Talk about a niche market... (1)

WoTG (610710) | more than 6 years ago | (#22302940)

You're right, dedicated graphics cards are what I meant.
Though integrating mainstream GPU functionality into the CPU core is only a few years away, IMHO. AMD has said about as much with regard to their "Fusion" core plans. Time will tell.

Re:Talk about a niche market... (1)

sssssss27 (1117705) | more than 6 years ago | (#22303070)

That's what I figured; I just wanted to make sure.

Yeah, Fusion should do wonders for the integrated GPU market, but we'll always have the dedicated graphics card segment for the enthusiast. The more things we can get onto a single chip, the better for applications where size is important.

Re:Talk about a niche market... (1)

NeMon'ess (160583) | more than 6 years ago | (#22303756)

Ever since 3D hardware acceleration took off, leaving software rendering behind, the gaming market has been in its own world of accelerator cards.

I think looking back at 3dfx and Glide-only games shows some important similarities. 3dfx managed to capture enough of the gamer market that games were made that would only work on their cards. Maybe only a dozen, but it was still notable that only part of the total market could run those games. Many of the other games could run in either Glide or OpenGL, and the Glide version usually looked and ran the best.

These days games cost possibly ten times as much to make, and I don't expect a new incarnation of Glide. What I do think is likely is that nVidia will add PhysX acceleration to its GeForce 10xxx line of cards, meaning perhaps half of the video cards bought during that time will have physics acceleration. Around the time the GeForce 11xxx line comes out, games that really take advantage of PhysX will appear. Hardcore gamers are willing and able to pay $300-600 for a new card, so when those games arrive and deliver neat effects that only nVidia cards can show off, they'll buy the 11xxx cards or be happy they already have 10xxx cards.

In fact, think back to the original GeForce 256. Do you know what made it different from a Riva TNT2 Ultra? One had hardware Transform and Lighting (T&L), the other didn't. That's right. When it came to games back then, the last-generation card (the Riva) ran them just as well as the GeForce, and PC Gamer told people they ought to get the older card if they were on a budget. Then the GeForce 2 came out along with games supporting hardware T&L, and of course the GeForce line made games look much better than the Riva, at better framerates.

So I think this is a really good move on nVidia's part. They're positioning themselves to devour the high end of the market from ATI when those cards and the games are available. Meanwhile it will probably take ATI at least a generation longer to catch up.

this is good to hear (1)

sentientbrendan (316150) | more than 6 years ago | (#22303946)

Physics engines are still relatively simplistic due to the computational difficulty involved. I'd love to see what a good game designer could do with physics capabilities comparable to modern graphics capabilities.
