
Game Devs Only Use PhysX For the Money, Says AMD

Soulskill posted more than 4 years ago | from the dem's-fightin'-woids dept.


arcticstoat writes "AMD has just aimed a shot at Nvidia's PhysX technology, saying that most game developers only implement GPU-accelerated PhysX for the money. AMD's Richard Huddy explained that 'Nvidia creates a marketing deal with a title, and then as part of that marketing deal, they have the right to go in and implement PhysX in the game.' However, he adds that 'the problem with that is obviously that the game developer doesn't actually want it. They're not doing it because they want it; they're doing it because they're paid to do it. So we have a rather artificial situation at the moment where you see PhysX in games, but it isn't because the game developer wants it in there.' AMD is pushing open standards such as OpenCL and DirectCompute as alternatives to PhysX, as these APIs can run on both AMD and Nvidia GPUs. AMD also announced today that it will be giving away free versions of Pixelux's DMM2 physics engine, which now includes Bullet Physics, to some game developers."


They wish they'd thought of it first (4, Informative)

EvolutionsPeak (913411) | more than 4 years ago | (#31403034)

Sounds to me like AMD just wishes they'd thought of it first. There's no reason AMD couldn't offer similar deals.

Re:They wish they'd thought of it first (3, Insightful)

hedwards (940851) | more than 4 years ago | (#31403112)

But, should they? If a developer doesn't want to use PhysX, they shouldn't. If they're doing it purely for money, then chances are that it's damaging to the industry. Sure physics acceleration is cool for certain types of games, racing games and FPS, but the problem is that developers shouldn't be paid to use technology that isn't helpful for creating quality games.

Especially if it causes games to be less enjoyable on other hardware platforms. I could see a real problem with this in terms of anti-trust actions.

Re:They wish they'd thought of it first (2, Interesting)

NeutronCowboy (896098) | more than 4 years ago | (#31403438)

Look at the alternative: instead of adding useless physics to a game that doesn't need it, they could be adding advertisements. Advertising dollars are dollars nonetheless, and I would much prefer a quick splash screen of "powered by PhysX" and some mindless physics interactions to an in-game billboard (possibly even updated over the Internet, shudder).

Re:They wish they'd thought of it first (1)

liquiddark (719647) | more than 4 years ago | (#31403560)

I don't think PhysX is doing anything to slow that particular vector of suffering. Those studios that would sacrifice goodwill for additional funds are almost certainly just waiting on the appropriate framework.

Re:They wish they'd thought of it first (1)

Vanderhoth (1582661) | more than 4 years ago | (#31403564)

I would much prefer a quick splash screen of "powered by PhysX" and some mindless physics interactions to an in-game billboard (possibly even updated over the Internet, shudder).

Isn't flashing "Powered by PhysX" at the beginning of the game more annoying than driving past a billboard or seeing a commercial on a TV as you run past it? I know I get annoyed when I sit down to play a game and have to sit through five minutes of logos: "From studio x, Powered by Chip-set-something, in association with company z". I'm not saying the other kind of advertising is necessary or useful, just that it's annoying to a lesser extent.

Re:They wish they'd thought of it first (0)

Dishevel (1105119) | more than 4 years ago | (#31403922)

Actually, I prefer the billboards. Imagine: a game no longer becomes monetarily successful based solely on the number of units sold. With in-game advertising (sold on a per-impression basis), a good portion of the revenue for the game would depend on how often and for how long people play it. That means the developers have an incentive to make me play the game over and over: free add-ons and DLC, all for the price of a billboard in the game that I can look at or not. If the advertising becomes too intrusive, it will take away my enjoyment of the game, making me play it less, the ads less effective, and the game devs less money.

I think it will be great for gaming and gamers.

Re:They wish they'd thought of it first (1)

icebraining (1313345) | more than 4 years ago | (#31403926)

Games like GTA (at least San Andreas) already have billboards; would it be so bad if they had real companies instead of fake ones? I wouldn't care.

In fact, I modded some of those to look like real ones, using The Sopranos wallpapers :)

Re:They wish they'd thought of it first (3, Insightful)

DragonWriter (970822) | more than 4 years ago | (#31403736)

But, should they? If a developer doesn't want to use PhysX, they shouldn't. If they're doing it purely for money, then chances are that it's damaging to the industry. Sure physics acceleration is cool for certain types of games, racing games and FPS, but the problem is that developers shouldn't be paid to use technology that isn't helpful for creating quality games.

The payment could just mitigate the risk associated with bearing the extra cost of adding PhysX to a game when not all of the market can utilize it and there is limited experience with it in the developer community. That doesn't mean it's bad for the industry, or bad for the quality of the game.

Especially if it causes games to be less enjoyable on other hardware platforms. I could see a real problem with this in terms of anti-trust actions.

Really? Can you point to any provision of anti-trust law that this would violate?

Re:They wish they'd thought of it first (1)

Totenglocke (1291680) | more than 4 years ago | (#31404176)

Especially if it causes games to be less enjoyable on other hardware platforms. I could see a real problem with this in terms of anti-trust actions.

Really? Can you point to any provision of anti-trust law that this would violate?

Exactly - that's like saying that there's an anti-trust suit just because Modern Warfare 2 looks better on PS3 than on the 360 or because it looks better on a system running a 295 GTX than on a system running a 9800 GT.

It's all Hearsay (5, Insightful)

KharmaWidow (1504025) | more than 4 years ago | (#31404202)

We don't have any proof that developers don't want PhysX. What we have is a spokesperson from company A saying that no one wants company B's technology. There are no scientifically obtained statistics, only one guy's opinion, and that guy works for a competitor.

Nor did the article state *why* it may be unwanted, or give any specific reasons not to use PhysX.

Re:They wish they'd thought of it first (1, Interesting)

Anonymous Coward | more than 4 years ago | (#31403172)

PhysX adds nothing to the gameplay. It's just stupid clutter on the ground... at least in any games I've played that use it. As an Nvidia user, I no longer see PhysX as a reason to stick with the brand... at least not now that I have used it.

That said, it's no surprise to me that game developers wouldn't support it without an incentive.

Re:They wish they'd thought of it first (1)

AndrewNeo (979708) | more than 4 years ago | (#31403424)

There's nothing PhysX actually adds to the experience of a game that Havok (physics done on the CPU) doesn't. Sure, PhysX can handle more intense simulations, but I don't see how it could improve, say, Half-Life 2's implementation of Havok.

Re:They wish they'd thought of it first (3, Interesting)

Z34107 (925136) | more than 4 years ago | (#31403882)

It's a difference in scale over Havok. I haven't had much time to play video games lately, but I saw a particularly nifty shot from Arkham Asylum. Shoot a bookshelf without PhysX and it falls over. Shoot it with PhysX and suddenly every individual page from every book flies through the air, each tracing its own path down from the sky.

So, you can do physics in Havok. But not on that scale.

I'd suspect that the reason it's not being used for anything other than "ground clutter" is that you can't design your game around PhysX - not everyone has an NVIDIA card. So PhysX has to be optional and can't change gameplay - which pretty much relegates it to ground clutter.

Re:They wish they'd thought of it first (2, Interesting)

afidel (530433) | more than 4 years ago | (#31404370)

In my experience, if you try to push PhysX that hard on a midlevel GPU or lower, it trashes the framerate to the point of being unplayable. In fact, I disabled GPU-based PhysX and had the driver fall back to the CPU, because in most games I have spare cores while the GPU is already being pushed to the max. With a quad-core CPU costing almost nothing extra over a dual-core (on desktops at least) and decent GPUs starting at $100 and quickly going up from there, I think that's probably the norm for the vast majority of gamers.

Re:They wish they'd thought of it first (2, Interesting)

BatGnat (1568391) | more than 4 years ago | (#31403502)

Just because some companies use PhysX for pretty effects only does not mean that someone else won't come along and use it for something cool that will add something to gameplay...

Re:They wish they'd thought of it first (1)

InsaneProcessor (869563) | more than 4 years ago | (#31403176)

It reads like sour grapes to me. If there wasn't enough money in doing it (either promotional or customer desires), then they wouldn't be doing it.

Re:They wish they'd thought of it first (0)

Anonymous Coward | more than 4 years ago | (#31403178)

Great, so we'll have two graphics card manufacturers bribing game developers with money to implement visual features that only work on their graphics cards. Although the cynical part of me thinks "they might not be able to afford it", the more benevolent part thinks "because the concept is absolutely not in the best interest of customers". In an economic analysis, at the very least, the costs spent by NVIDIA and ATi will be absorbed into their graphics card prices (so consumers pay in any case), with a net productivity loss from developers implementing graphics features twice when they could have been based on a single model.

Re:They wish they'd thought of it first (2, Interesting)

Fwipp (1473271) | more than 4 years ago | (#31403408)

Did you miss the part where OpenCL and DirectCompute run on both NVIDIA and AMD graphics cards, and that AMD is promoting an open industry standard instead of a proprietary vendor-specific API?

Because I know it was really obvious, and sort of the entire point of the article, but it really sounds like you did.

Re:They wish they'd thought of it first (3, Informative)

jpmorgan (517966) | more than 4 years ago | (#31403562)

Suggesting that OpenCL and DirectCompute are alternatives to PhysX is analogous to saying that OpenGL is an alternative to Unreal Engine.

The basic reality here is that four years ago NVIDIA decided to invest a lot of money in making GPUs more general purpose, to apply them to more problems than just 3D rendering. ATI didn't care and just focused on making the fastest 3D card possible. Today there are alternatives to NVIDIA's technology, most notably OpenCL... but it's worth remembering that OpenCL is very strongly derived from CUDA. In fact, most of the OpenCL spec looks like they ripped it out of the CUDA spec and changed the function calls from cudaSomething to clSomething.

So yes, open standards are good. But in this case it does smell strongly of sour grapes. ATI made several bad business decisions and have been left playing catchup.

Re:They wish they'd thought of it first (1)

jpmorgan (517966) | more than 4 years ago | (#31403584)

Actually, cuSomething to clSomething. OpenCL is basically a copy of the CUDA driver API, which prefixes its functions with cu. The runtime API (a higher-level API) is what uses the cuda prefix.
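
To make the resemblance concrete, here is a minimal OpenCL host-side sketch (error handling omitted; the kernel name "step" and the buffer size are made up for the example), with comments noting the roughly corresponding CUDA driver-API calls. Treat it as an illustration of the naming parallel, not production code:

    #include <CL/cl.h>
    #include <stddef.h>

    /* Minimal OpenCL setup; comments note rough CUDA driver-API analogues. */
    void run_step_kernel(const char* src, size_t n) {
        cl_int err;
        cl_platform_id plat;
        clGetPlatformIDs(1, &plat, NULL);                           /* no direct CUDA analogue */
        cl_device_id dev;
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);    /* ~ cuInit + cuDeviceGet */
        cl_context ctx = clCreateContext(NULL, 1, &dev,
                                         NULL, NULL, &err);         /* ~ cuCtxCreate */
        cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, &err);
        cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE,
                                    n * sizeof(float), NULL, &err); /* ~ cuMemAlloc */
        cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, &err);
        clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);            /* ~ cuModuleLoad (from PTX) */
        cl_kernel k = clCreateKernel(prog, "step", &err);           /* ~ cuModuleGetFunction */
        clSetKernelArg(k, 0, sizeof(cl_mem), &buf);                 /* ~ cuParamSetv */
        size_t global = n;
        clEnqueueNDRangeKernel(q, k, 1, NULL, &global,
                               NULL, 0, NULL, NULL);                /* ~ cuLaunchGrid */
        clFinish(q);                                                /* ~ cuCtxSynchronize */
    }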

Re:They wish they'd thought of it first (0)

Anonymous Coward | more than 4 years ago | (#31403712)

I did not miss it; my response was specifically to OP's "There's no reason AMD couldn't offer similar deals."
I assumed he was referring to AMD adopting NVIDIA's strategy, and I gave two reasons not to: they can't afford it, and it would be bad for customers in the long run.

Re:They wish they'd thought of it first (1)

Peter Nikolic (1093513) | more than 4 years ago | (#31404356)

In reality, all this boils down to the fact that AMD fucked up big time in buying into that ATI craphouse; they should have gotten in with Nvidia from the start. Which is a shame, as I would far rather have AMD CPUs than Intel, but I can't stand that freakin' ATI crap; it never works.

Re:They wish they'd thought of it first (1)

DIplomatic (1759914) | more than 4 years ago | (#31403480)

I heard game devs are only incorporating the "third dimension" into games because of the money, not because they want to.

Re:They wish they'd thought of it first (1)

UnknownSoldier (67820) | more than 4 years ago | (#31403868)

Spoken like someone who has never had to

a) animate + draw 2D sprites doing:

- neutral pose
- run pose
- neutral with weapon
- neutral with shield
- running with weapon
- running with shield

for ALL THE FRAMES. I haven't even mentioned the other million permutations of the avatar+enemies in states such as poisoned, etc.

OR

b) programmed a graphics engine that has had to light said 2d sprites.

3D "won" because of it scaled up content creation. i.e. The convenience of animating, texturing, lighting and shading blows away the sheer amount of worked needed in 2D content creation. It never was about the money, or quality. It was about betting on a technology that would eventually be "good enough." 2D still looks better because it is easier to make something look good from one angle with fixed lights. /rant off...

Re:They wish they'd thought of it first (1)

chronosan (1109639) | more than 4 years ago | (#31404122)

I heard game devs are only incorporating Video Graphics Array output into games because of the money, not because they want to.

STOP IT! (1, Funny)

Anonymous Coward | more than 4 years ago | (#31404232)

in Korea, only old people use PhysX for the money, because Netcraft confirms that Apple is dying

Re:STOP IT! (2, Funny)

chronosan (1109639) | more than 4 years ago | (#31404382)

Oh crap, did I just cross the line into trolldom?

Re:They wish they'd thought of it first (0)

Anonymous Coward | more than 4 years ago | (#31403962)

Yeah, but in truth, Nvidia didn't either. They bought Ageia to get this, but it was a smart buy.

FYI, you had to pay developers to use 3D at first, too.

Oh, and I'll give people another piece to chew on: AMD (and ATI before them) has the worst developer relations group. They utterly suck at anything useful other than throwing money around. How many millions of dollars did ATI spend to gain a couple of FPS (less than 10) in the Source engine, again?

Hypocrites shouldn't throw stones. Then again, it's never stopped Nvidia either...

Maybe (3, Interesting)

Hatta (162192) | more than 4 years ago | (#31403036)

I wouldn't be surprised if most game devs wouldn't implement PhysX if not for a subsidy. Only half the market is going to be able to take advantage of it after all. It may not be that they don't want it, just that it's not an economical use of their time otherwise.

Re:Maybe (2, Interesting)

Vorknkx (929512) | more than 4 years ago | (#31403070)

Exactly. Havok and in-house physics engines are perfectly fine for physics simulations in games. I don't see why we need another third-party physics engine. Flying boxes and wood splinters do not make a better game.

Re:Maybe (2, Interesting)

Monkeedude1212 (1560403) | more than 4 years ago | (#31403222)

Flying boxes and wood splinters do not make a better game.

Well - it's the little things that make the difference, though. I mean, you wouldn't think that flying boxes and wood splinters make a game any more amazing, but those were basically THE core elements of the Force Unleashed, using the Havok engine. Not surprisingly though, Havok was strictly licensed to Lucasarts for all of 2009 - no one else could use it. It's only just recently become available. So - for most of 2009, PhysX was the best choice - not only subsidized for using it, but because its competitors weren't actually available.

Re:Maybe (1)

Improv (2467) | more than 4 years ago | (#31403382)

I thought Havok had been used by Second Life for years...

Re:Maybe (3, Funny)

EvilMonkeySlayer (826044) | more than 4 years ago | (#31403432)

No, you're thinking of the Furry engine. It offers ultra realistic fur simulation, I hear it's quite popular with the Second Life crowd.

Re:Maybe (1)

soupd (1099379) | more than 4 years ago | (#31403466)

Maybe there is some subtlety I'm missing, but Havok is part of every PS3 and Xbox 360 SDK, and plenty of Havok games were released on PS3 last year; check out the list from Havok themselves: http://www.havok.com/content/view/584/96 [havok.com]

I think you are confuzled (4, Informative)

Sycraft-fu (314770) | more than 4 years ago | (#31403540)

Intel has owned Havok since 2007 and licenses it out all over the place. There's a page that lists all the titles using it (http://www.havok.com/index.php?page=available-games) and it is not a small list. Havok also runs exclusively on the CPU (and will probably continue that way, since Intel wants to sell quad cores), so it works no matter what your graphics card is.

It's also not just physics anymore, there's Havok animation libraries and so on.

Re:I think you are confuzled (1)

Monkeedude1212 (1560403) | more than 4 years ago | (#31404074)

I was mistaken; it was the DMM from Pixelux that was licensed - which AMD is also giving out, according to the article.

Re:Maybe (1)

DeadboltX (751907) | more than 4 years ago | (#31403652)

It's only just recently become available.

I haven't been able to confirm your statement about Havok being exclusive to Lucasarts for 2009, but the Havok engine has been around since 2000 and has been used by over 100 different games, so it is by no means just recently available.

Re:Maybe (1)

Monkeedude1212 (1560403) | more than 4 years ago | (#31404004)

It appears I was mistaken: the DMM by Pixelux and Euphoria (for AI) were licensed exclusively to LucasArts. Havok is, and always was, available. My bad.

Re:Maybe (0, Redundant)

VGPowerlord (621254) | more than 4 years ago | (#31403696)

Not surprisingly though, Havok was strictly licensed to Lucasarts for all of 2009 - no one else could use it. It's only just recently become available. So - for most of 2009, PhysX was the best choice - not only subsidized for using it, but because its competitors weren't actually available.

I'm pretty sure the Source engine (HL2 and its derivatives) by Valve uses Havok. In fact, Havok is mentioned by name on the "Powered by Source" screen, near the bottom.

Re:Maybe (5, Insightful)

blahplusplus (757119) | more than 4 years ago | (#31403372)

"Flying boxes and wood splinters do not make a better game."

But dead guys lying stiff as a board, perpendicular over the edge of a cliff, make them awesome? Does no one here remember the good old days of early FPSes, where if you died on the edge of a ledge your body would lie flat over the edge? Does no one remember the time when you hit dead bodies with shots and they didn't move or flail around? What about Mass Effect 1, with the anti-gravity at the end and the geth/dead bodies floating and flailing around - not cool at all?

All of that is physics, and yes, they do make a better game WHEN they are applied to things that need them and not over-used - especially not using physics as a gimmick.

Re:Maybe (0, Offtopic)

Sir_Lewk (967686) | more than 4 years ago | (#31403606)

Not everyone includes "pretty" in their "good game" equation. Doom can still hold its own against modern games in terms of actual fun.

Re:Maybe (4, Funny)

H0p313ss (811249) | more than 4 years ago | (#31403720)

Does no one remember the time when you hit dead bodies with shots and they didn't move or flail around?

Not everyone includes "pretty" in their "good game" equation. Doom can still hold its own against modern games in terms of actual fun.

Clearly you don't get a kick out of shooting dead bodies and seeing them twitch.

What the hell's wrong with you?

Re:Maybe (1)

Opportunist (166417) | more than 4 years ago | (#31403826)

Ya know, when you've seen it happen, it looks so fake in a game, no matter how good the engine...

Re:Maybe (1)

blahplusplus (757119) | more than 4 years ago | (#31403778)

"Not everyone includes "pretty" in their "good game" equation"

No doubt, but most people move on if a game has better graphics; otherwise you would still be playing Wolf3D. Why did you move to Doom? Oh yes, that's right: that darn graphics thing adds atmosphere and awesomeness to games. Why did people move on from Super Mario 1 for the NES, etc.? Not all the games that come after are as fun, but if you can get the same old fun in a shiny new package, people will play it over the original.

Ever played the original MechWarrior 2 DOS games? No textures, practically all flat-shaded wireframe. Then 3D accelerators happened, and every early 3D accelerator came with a copy of MechWarrior 2 with REAL textures that made the game look so damn awesome and took it to a whole new level.

Re:Maybe (1)

Sir_Lewk (967686) | more than 4 years ago | (#31403972)

No doubt, but most people move on if a game has better graphics; otherwise you would still be playing Wolf3D. Why did you move to Doom? Oh yes, that's right: that darn graphics thing adds atmosphere and awesomeness to games.

You are assuming it was because of the improved graphics, and not because of the improved gameplay mechanics and pacing. Polished turds are still turds and great games will always be great.

Re:Maybe (1)

blahplusplus (757119) | more than 4 years ago | (#31404380)

"You are assuming it was because of the improved graphics, and not because of the improved gameplay mechanics and pacing. Polished turds are still turds and great games will always be great."

They will, but sadly great gamers like ourselves are in the minority. Assassin's Creed 1 was like a poor man's Prince of Persia, but it sold millions. Sadly, lots of people have junk gaming tastes, which means many aren't discerning enough to tell the difference between a mediocre game that missed the mark and a good game.

Re:Maybe (1)

FatSean (18753) | more than 4 years ago | (#31403968)

Newer games offer more colors and higher resolution; nothing wrong with that. I'm sure the latest God of War would be just as fun in 8-bit 320x200.

Re:Maybe (0)

Anonymous Coward | more than 4 years ago | (#31403824)

Can PhysX even be used for non-gimmicky stuff?

I say this seriously. Every implementation I've ever seen has been particle effects when you blow up trashcans, or papers flying around, or shrapnel littering the floor. Nothing is exactly repeatable and none of it is essential. You won't ever be unable to complete a mission because your trash didn't explode in the right direction.

This might be partially due to the fact that PhysX is a proprietary solution, so anything that's important to gameplay is done by more universal means. But they could just run PhysX on the CPU for gameplay elements, like what Half-Life 2 did.

Re:Maybe (1)

Krau Ming (1620473) | more than 4 years ago | (#31404280)

Most games still have body parts of other characters poke through doors/walls when they're standing as close to the door/wall as possible on the other side. Is that part of the physics engine? Obviously it's better than the days of GoldenEye on the N64, when one could kill half an army from inside a closed room, but come on!!!

Re:Maybe (4, Informative)

hedwards (940851) | more than 4 years ago | (#31403124)

If you noticed in the summary, AMD is advocating a similar technology that works on their hardware as well as on nVidia's; it seems like developers would prefer that for practical reasons.

Re:Maybe (1)

Hatta (162192) | more than 4 years ago | (#31403304)

Is PhysX just an API, or is there hardware underneath supporting it? If it's hardware, then I'd say PhysX would be the better option practically speaking; i.e., turning PhysX on would essentially be free in terms of resource usage. If it's just software, then turning it on would take resources away from the rest of the rendering.

In any case, I've played a few games with PhysX. It's pretty fucking cool. Not cool enough to make a shitty game worth playing, but it makes a good game that much better.

Re:Maybe (1)

Aladrin (926209) | more than 4 years ago | (#31403498)

Everyone seems to be glossing over a nice little fact:

PhysX works on -all- modern Windows computers, whether they have a graphics accelerator or not. So yes, only half the market can use the hardware-accelerated PhysX, but the other half isn't barred from the game. They get to play, too.

Re:Maybe (4, Funny)

MobyDisk (75490) | more than 4 years ago | (#31403528)

Open standards always win out over closed standards. Like OpenGL -vs- DirectX.... oh... wait... :-P

Re:Maybe (0)

Anonymous Coward | more than 4 years ago | (#31403656)

Nvidia's stuff deliberately doesn't work on non-Nvidia hardware.

Swapping it for something cross-platform that works equally on both will show people just how much Nvidia's PhysX was a bunch of crap. Basically, you will no longer see the Nvidia performance advantage in PhysX games that you see right now. This is not a new issue, but AMD's approach is.

Now, Nvidia will have no excuse as to why people should support PhysX.

However, Nvidia has moved on already. They're doing that bullshit 3D stuff - OMG 3D TV! Half the performance due to twice the refresh rate! Must buy!

etc.

Re:Maybe (5, Informative)

ASBands (1087159) | more than 4 years ago | (#31403946)

I've done some work with both PhysX and the things that AMD is pushing. I try to stick with the Physics Abstraction Layer [adrianboeing.com], which lets me plug in whatever physics engine I want as the backend and gives a pretty damn good apples-to-apples performance metric. Personally, my ultimate choice of physics engine is the one which exhibits the best performance. My experience may differ from others', but I generally get the best performance from PhysX with an nVidia GPU and from BulletPhysics with an AMD GPU. Sometimes the software version of PhysX outstrips the competition, but I have never seen anything beat PhysX in performance with GPU acceleration turned on. And with PAL, it is easy to check if there is GPU support on the machine and swap in the physics engine with the best performance (PAL is awesome).
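
For anyone wondering what that backend swap looks like in code, here is a rough C++ sketch of the pattern. The interface and names are hypothetical, invented for illustration; this is not PAL's actual API:

    #include <memory>

    // Hypothetical abstraction layer: each physics backend implements this.
    struct PhysicsEngine {
        virtual ~PhysicsEngine() {}
        virtual void step(float dt) = 0;   // advance the simulation by dt seconds
    };

    struct PhysXBackend : PhysicsEngine {
        void step(float dt) { /* GPU-accelerated path would go here */ }
    };
    struct BulletBackend : PhysicsEngine {
        void step(float dt) { /* CPU path would go here */ }
    };

    // Stub: real code would query the vendor's runtime for a capable GPU.
    bool has_gpu_acceleration() { return false; }

    // Pick the best available backend at startup, as described above.
    std::unique_ptr<PhysicsEngine> make_engine() {
        if (has_gpu_acceleration())
            return std::unique_ptr<PhysicsEngine>(new PhysXBackend());
        return std::unique_ptr<PhysicsEngine>(new BulletBackend());
    }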

Here's the thing: GPU-accelerated physics are just plain faster. Why? Because collision detection is a highly parallelizable problem. Guess what hardware we have that can help? The GPU. Another great part of using the GPU is that it frees the CPU to do more random crap (like AI or parsing the horribly slow scripting language).
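
To see why collision detection parallelizes so well, consider a toy broad-phase overlap test: one GPU thread per object, each testing against all the others, with no dependencies between threads. Here is a sketch, written as an OpenCL C kernel embedded in a C++ string; the kernel and variable names are made up for illustration:

    // Toy broad-phase test: one work-item per sphere. O(n^2) work overall,
    // but embarrassingly parallel, which is why it maps so well to a GPU.
    const char* kSphereOverlapKernel = R"(
    __kernel void sphere_overlaps(__global const float4* s, /* xyz = center, w = radius */
                                  __global int* hits,
                                  const int n)
    {
        int i = get_global_id(0);
        if (i >= n) return;
        float4 a = s[i];
        int count = 0;
        for (int j = 0; j < n; ++j) {
            if (j == i) continue;
            float4 b = s[j];
            float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
            float r = a.w + b.w;
            if (dx*dx + dy*dy + dz*dz <= r*r) /* overlap without a sqrt */
                ++count;
        }
        hits[i] = count;
    }
    )";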

AMD is working on both BulletPhysics and Havok so they can do GPU acceleration. But I have a feeling that PhysX performance will remain faster for a while: PhysX was designed to run natively on the GPU (technically, a GPU-like device), while these other libraries were not. Furthermore, nVidia has quite a head start in performance tuning, optimization and simple experience. In five years that shouldn't matter, but I'm just saying it will take a while.

So here is my message to AMD: If you want people to use your stuff, make something that works and let me test it out in my applications. You've released a demo of Havok with GPU acceleration. PhysX has been and continues to work with GPU acceleration on nVidia GPUs and will frequently outperform the software implementation. I'm all for open alternatives, but in this case, the open alternatives aren't good enough.

Re:Maybe (1)

jellomizer (103300) | more than 4 years ago | (#31403610)

Well, most devs would want to use their own version out of pride in their work.

Is anyone (0)

Anonymous Coward | more than 4 years ago | (#31403054)

surprised?

It's a new riff on the old joke (1)

idontgno (624372) | more than 4 years ago | (#31403062)

"You're so ugly the only way to get the dog to play with you is to tie a steak around your neck."

Says the kid the dog without a dog to play with.

Re:It's a new riff on the old joke (2, Informative)

idontgno (624372) | more than 4 years ago | (#31403154)

Wow, that's so badly edited it's surreal.

This is one of those days where even the "Preview" button doesn't help.

That should read "Says the kid that the dog isn't playing with."

Re:It's a new riff on the old joke (1)

MartinSchou (1360093) | more than 4 years ago | (#31403768)

That should read "Says the kid that the dog isn't playing with."

With whom the dog is not playing.

Re:It's a new riff on the old joke (0)

Anonymous Coward | more than 4 years ago | (#31403964)

Funny thing... it's perfectly acceptable to end a clause or sentence with a preposition in English, and always has been. Feel free to examine the works of any published author ever.

Re:It's a new riff on the old joke (1)

Anarki2004 (1652007) | more than 4 years ago | (#31403208)

"You're so ugly the only way to get the dog to play with you is to tie a steak around your neck."

Says the kid the dog without a dog to play with.

Try again please. That statement is a grammatical failure. I'm not even sure what you were trying to say.

Re:It's a new riff on the old joke (1)

Minwee (522556) | more than 4 years ago | (#31403702)

Try again please. That statement is a grammatical failure. I'm not even sure what you were trying to say.

If you use Google Translate to translate it back into Korean, then Portuguese, Russian, Welsh and then finally back into English everything becomes much clearer.

"Children play with a puppy."

Re:It's a new riff on the old joke (0)

Anonymous Coward | more than 4 years ago | (#31403844)

I got: "Say hello to your son without a dog"

What does PhysX do anyways? (0)

Anonymous Coward | more than 4 years ago | (#31403156)

I've never figured out what PhysX is supposed to do. More realistic physics I suppose? Well I can't say I've ever noticed any difference between a game that uses it and a game that doesn't. So, what, the corpses flop differently?

Re:What does PhysX do anyways? (3, Insightful)

binarylarry (1338699) | more than 4 years ago | (#31403216)

duh, it's got what gamers crave!

Re:What does PhysX do anyways? (0, Redundant)

Pojut (1027544) | more than 4 years ago | (#31403366)

Electrolytes?

Re:What does PhysX do anyways? (0)

Anonymous Coward | more than 4 years ago | (#31403714)

duh, it's got what gamers crave!

Electrobytes?

Re:What does PhysX do anyways? (1)

david_thornley (598059) | more than 4 years ago | (#31404358)

Girlfriends?

Re:What does PhysX do anyways? (0, Troll)

religious freak (1005821) | more than 4 years ago | (#31403384)

http://lmgtfy.com/?q=PhysX [lmgtfy.com]

I [heart] this site... makes me happy every time I provide a link :)

Re:What does PhysX do anyways? (1)

Jeng (926980) | more than 4 years ago | (#31403730)

I think it makes you look like a dipshit every time you provide a link.

You do not provide any sort of answer, and you assume that the person did not already Google it.

So let's assume that the person did Google this and did not find an answer that helped him understand the issue. Wouldn't a good next step be to ask others for help? Is Slashdot not a good place to ask the question?

Re:What does PhysX do anyways? (1)

postmortem (906676) | more than 4 years ago | (#31403428)

The ones with PhysX are more likely to crash, because Nvidia drivers aren't perfect.

Not a trolling attempt: I have played Mirror's Edge, where the flag waves realistically with PhysX, but it always crashes at the same spot. Just before the crash, the flag looks weird.

On a Radeon, the flag doesn't wave as naturally, but the game doesn't crash either.

Re:What does PhysX do anyways? (1)

Spatial (1235392) | more than 4 years ago | (#31404188)

It can be hardware accelerated on the GPU. That's it.

The benefit: Physics is one of those easily parallelised problems so a very large increase in complexity is possible.

The drawbacks: There's less GPU time available for drawing stuff so your framerate suffers. And of course, it's limited to Nvidia hardware only.

The latter leads to a drawback of its own: the technology can't be used to its full potential because many people who buy a game won't have the necessary hardware. So it can't be used in ways that would affect gameplay.

What else is new (1)

dvlhrns (1681218) | more than 4 years ago | (#31403218)

How is this any different than what Microsoft does?

Sour grapes? (0, Redundant)

cbope (130292) | more than 4 years ago | (#31403224)

Sounds to me like AMD is just taking pot shots at NVIDIA. They probably wish they had either invented it or bought up Ageia before NVIDIA. Are there any games out that use OpenCL for physics? Or DirectCompute?

Trust me, NVIDIA will flog the PhysX horse as long as it can. Eventually something will replace it anyway, so who gives a shit. Apart from software-based CPU physics, I haven't seen too many PhysX titles, and nothing for OpenCL or DC yet. I do have a 9600GT dedicated to PhysX in my gaming rig, for those few titles that support it.

Re:Sour grapes? (1)

jandrese (485) | more than 4 years ago | (#31403500)

Are you seeing any benefit from that 9600GT? My complaint about physics in games is that, because not everybody has the hardware for it, the developers have kept the number of physics objects down to a bare minimum (a crate here, a rock there, sometimes a ragdoll dead guy), so you really don't get the feeling that there is physics in the world; just sometimes bodies will flop around and boxes will bounce a bit. Because of this, dedicated physics hardware goes to waste almost all of the time.

It would be cool to have a game where nearly everything was physics-enabled, and you just spent most of your time applying forces to objects to get stuff done. Oh no! A bad guy! I'll stop time for a second, apply a push to a nearby door, and cause the door to slam open in his face. Or maybe a clock will fly off the wall and bonk him in the head. Maybe I'll knock a bookshelf over instead. And not in the pre-scripted "Hmm, I just got my gravity gun, and oh my, apparently a woodshop exploded over the town and littered the place with sawblades..." way.

Re:Sour grapes? (1)

Jeng (926980) | more than 4 years ago | (#31403596)

Actually, this sounds fairly familiar; there are strong parallels between this and AMD's issue with Intel.

Nvidia is using their marketshare to push software that can only run on their cards by paying companies to use it. If the developers are using Nvidia's solution, then they are not using the competitors'.

Re:Sour grapes? (1)

dvlhrns (1681218) | more than 4 years ago | (#31403694)

Exactly what M$ does...use their marketshare to push their software that can only run on their PCs, etc, etc, etc...

Not yet (1)

Sycraft-fu (314770) | more than 4 years ago | (#31403684)

Though the reason for that at this point is the newness of the APIs, not that they can't be used. We'll have to wait a couple of years to see if one or both of the technologies take off. Remember that the OpenCL API wasn't finalized until the end of 2008, and GPUs didn't implement it until several months later, so there has been less than a year in which one could develop on real hardware using it. DirectCompute was released with DirectX 11 in October of 2009; it also requires DirectX 10, 10.1, or 11, and as such requires Windows Vista or 7.

There hasn't been time yet to develop a physics engine using either of these technologies and implement it in a game. Also, the two big middleware engines have no interest in using them at this point. One of them is PhysX, which is nVidia's, of course; they want to use it to help sell hardware. The other is Havok, owned by Intel. Well, they too want to use it to help sell hardware, meaning CPUs in this case. As such, it'll probably stay all CPU-based.

What this means is that if we are to see an OpenCL/DirectCompute physics engine in a game, it'll either need to be custom developed for that game or come from a new middleware solution from someone else.

Is it actually allowed to also BE better? (2, Interesting)

DarkkOne (741046) | more than 4 years ago | (#31403268)

Even before hardware-accelerated PhysX was on CUDA, when you only got it with the standalone card, I always thought PhysX looked a bit nicer than Havok in action. I've been wishing more games used PhysX for a while, but it seems that if a game is going to be cross-targeted to the consoles as well, Havok is just a lot more likely. It may just be my own perception, but things seem to have a bit more consistent behaviour with regard to momentum and mass in PhysX, whereas Havok seems a bit "floaty" a lot of the time. This may just be a result of the constants designers pick, or something; I don't really know the details. But I personally just like PhysX better, from a player standpoint, hardware accelerated or not.

come on, AMD... (0, Offtopic)

toastliscio (1729734) | more than 4 years ago | (#31403352)

A few months ago I bought a new PC after years; it has an AMD CPU but an nVidia GPU (I assembled it myself). For hardware compatibility reasons it would seem obvious to buy an AMD/ATI GPU, but the problem is, I use Linux. And AMD graphics drivers on Linux still suck compared to nVidia's. Why don't they shut up and strive to make decent drivers? They would get new customers, including me.

Re:come on, AMD... (1, Insightful)

Anonymous Coward | more than 4 years ago | (#31403620)

Because, yet again, catering to the Linux desktop community would be like GMC catering to the double-amputee community: there are next to no users, and the few users that are there seem to revel in running their machines off of 80 dollars' worth of parts that they only upgrade after every other president.

Sorry, guy, but the Linux community is notoriously cheap, and when you have a niche market like that, you need them to be big spenders to make it worthwhile. How else do you think Bentley and Rolls-Royce get away with making fewer cars in a year than most manufacturers use in crash tests, yet still maintain a profitable business?

Re:come on, AMD... (0)

Anonymous Coward | more than 4 years ago | (#31404192)

...only upgrade after every other president.

I'm shopping for an AGP graphics card to go with some new RAM I just put into a dual Opteron machine I built, running Fedora Core 1, just after the start of the Iraq war. That's 12 Fedora releases, 1.5 Bush terms and a third of Obama.

It was a pretty sweet machine six or seven years ago, and I really splurged on it. Now it's pretty much exactly as you describe, and likely to continue serving me and my family through the current Obama administration.

Re:come on, AMD... (-1, Troll)

Anonymous Coward | more than 4 years ago | (#31403822)

AMD drivers on linux are fine, troll.

Is it true? (1)

Improv (2467) | more than 4 years ago | (#31403436)

It seems a lot of people are kvetching at AMD for this because they're criticising a competitor. I think it's really more relevant to consider if what AMD says is true - if nVidia is paying people to use their proprietary stuff and then claiming it has broad industry adoption (and therefore is good), that's pretty shady.

I'm not sure how we really can tell if the criticism is valid unless we're in the industry though.

clutching at straws (5, Insightful)

obarthelemy (160321) | more than 4 years ago | (#31403488)

GPU makers are in a bind:
- IGPs are now enough for 90% of users: office work (even w/ Aero), video, light gaming, dual-screen... all work fine with IGPs
- the remaining 10% (gamers, graphic artists) are dwindling for lack of outstanding games: game publishers are turned off by rampant piracy, and mainly online games bring in big money nowadays
- GPGPU is useless except in scientific computing: we already have more x86 cores than the devs know how to use, let alone use a different computing paradigm
- devs have to target the lowest common denominator, which means no GPGPU for games

I'm actually thinking of moving my home PC to one of the upcoming ARM-based smarttops. They look good enough for torrenting + video watching + web browsing, and consume 10 watts instead of 150...

Re:clutching at straws (0)

Anonymous Coward | more than 4 years ago | (#31403582)

I second that

Re:clutching at straws (5, Informative)

Ironhandx (1762146) | more than 4 years ago | (#31403766)

Tell that to AMD, who have sold 2 million DirectX 11 GPUs since release. (http://www.dailytech.com/ATI+Sells+Over+2+Million+DirectX+11+GPUs+Celebrates+With+Radeon+Cake/article17349.htm)

IGPs are sufficient for 90% of users... but that hasn't changed since back in the Pentium 1 days. Many PCs were equipped with an IGP, or something that amounted to the same thing but in card form, even then.

Also: GPGPU is NOT meant for graphics processing on the fly at all, so it has absolutely nothing to do with devs having to target the lowest common denominator. You even state that it's useless except for scientific purposes in your own comment. The entire GPGPU movement is aimed at scientific purposes where vast quantities of repeated calculations have to be done - something GPUs excel at.

At least get SOME of your facts straight before spouting FUD.

Re:clutching at straws (3, Interesting)

obarthelemy (160321) | more than 4 years ago | (#31404110)

There are about 25 million PCs sold per month. I guess ATI is happy to have sold 8% of that monthly amount over the several months their 5xxx series has been available; that's 3-4% of PC sales. Congrats to them, but still fairly marginal.

Discrete cards have always been better than IGPs; I don't really get your point. Only recently (definitely way after the Pentium 1) have IGPs become good enough to display all video files or handle Aero.

PhysX is about making physics computations, not directly putting pixels on screen, so it's a kind of specialized GPGPU.

Some truth to that. (1)

eddy (18759) | more than 4 years ago | (#31403934)

While I don't think it's super dire, it's certainly a concern. I can add another point. Steam confirmed for Mac [nyud.net] .

Problem? Macs don't take the latest and greatest off-the-shelf graphics cards, and generally are a fair bit behind the curve, way back in 'casual land'.

On the other hand, maybe if Apple opens up a bit, this is a way to sell more and better cards rather than another nail in the coffin.

re: Steam for Mac (1)

King_TJ (85913) | more than 4 years ago | (#31404210)

Yeah... this is more of a solution than a problem, any way you slice it. Why? Simple: many of the games they'll deliver to Mac users via Steam will offer cross-platform network play. So regardless of the specs they're constrained to for a native Mac version of the game, having more people playing a title will help keep it popular. They can always support higher-res graphics capabilities in the Windows version, if they so desire. And if they do? All the more incentive for Apple to start offering better graphics options for their own systems.

Re:clutching at straws (1)

LWATCDR (28044) | more than 4 years ago | (#31404042)

"GPGPU is useless except in scientific computing: we already have more x86 cores than the devs know how to use, let alone use a different computing paradigm"
Well, maybe for games, but GPGPU will mean a lot for transcoding.
Home HD video is going to be big soon, and it takes forever to transcode. You can even do it on an ARM, though: the Tegra 2 and the OMAP line have enough GPU power to use for transcoding.

Is 'Incentivizing' Anti-Competitive? (2, Interesting)

mpapet (761907) | more than 4 years ago | (#31403680)

This kind of incentive is anti-competitive.

1. It eliminates competition by feature/functionality.
2. It meaningfully constrains innovation. A novel product without the capitalization to participate is shut out. (That's the goal, anyway.)

That said, this kind of incentivizing is everywhere (game consoles, mega-retailers, mobile phones). No one seems to care about the increased costs consumers assume or the constraint on innovation.

I have my bias, what is yours?

Re:Is 'Incentivizing' Anti-Competitive? (4, Interesting)

KillShill (877105) | more than 4 years ago | (#31404032)

Nvidia is very anti-competitive and has been for a very long time.

The recent "making physx stop working when AMD gfx card is present" is just one of the more public outings of their unethical behavior.

I wish someone would expose all of their shenanigans and anti-competitive practices so people can realize how badly these things affect the industry and consumers (ugh, hate that word).

The most recent thing I read about their practices concerns the upcoming PC game Just Cause 2. There's a trailer showing off Nvidia-only effects... (something which is dead-standard DirectX code), artificially blocking AMD and others from getting the benefits. The Batman: Arkham Asylum scandal is one more people may recall. They claim (as do their users/shills) that TWIMTBP is just "marketing"... more like bribery and blocking out the competition. They've been caught on many occasions, but the public rarely sees anything negative about them.

Nvidia is the Intel/Microsoft of the video card industry but unlike them, isn't quite as dominant (thankfully for us) but they still do a hell of a lot of damage. (The Jupiter of the computer industry... too small to become a sun but still an 800 quadrillion ton gorilla).

I've stopped buying Nvidia cards since the Geforce 2. At that time for performance reasons but since then I vote with my wallet and let others know to support fair and legal competition.

Pure conjecture, but-- (0)

Anonymous Coward | more than 4 years ago | (#31404030)

What strikes me as the real issue here is that game devs don't want to invest time in a proprietary graphics API that only a portion of their potentially targeted demographic will have access to.

E.g., they don't want to use PhysX when some non-trivial percentage of their customers will have AMD/ATI video.

It doesn't matter if PhysX would allow them to do real-time particle simulations of shrapnel, and thus create a more immersive FPS - if some non-trivial percentage of their target audience cannot make use of that technology, then they need to be compensated for that loss.

To me, it isn't that PhysX is bad; it is that it is nVidia Only.

Take a short history lesson from the 1990s, when DOS games were the rage. Before the VESA people created industry standards for high-resolution video modes in DOS, certain games would only work on certain hardware; same situation with audio capabilities. You wanted wavetable audio? Too damn bad, unless you had a genuine MT-32 plugged into your soundcard or an actual AWE soundcard.

All that changed when Microsoft proposed DirectX for windows gaming.

Early versions of DirectX were indeed total shit. Now, however, it is a mature API, and it is the primary target for game developers because of its uniformity and ubiquity. It doesn't matter what random POS hardware is in there; as long as it has a DirectX driver, the game will at least start and display a picture.

What needs to happen, with "processing on the GPU" taking off, is for a standardized, hardware-agnostic implementation to be drafted and approved.

Otherwise, it's just a case of yet another patent trolling, market playing pissing match between two or more squabbling children.

"He has to BUY PEOPLE OFF! His stuff is OBVIOUSLY crap! (Use my free license!)"

What we actually need is a platform and hardware agnostic API for doing GPU processing tasks. Not vendor kickbacks, like paid incentives or free licenses.

I find both players equally culpable in this debacle.

Best example with the MMORPG UTOPIA (4, Interesting)

SharpFang (651121) | more than 4 years ago | (#31404038)

A friend told me about his experience with Utopia. It implemented GPU-accelerated physics in one of its recent patches. But try as he might, he failed to notice any difference over weeks of gameplay. Until he entered the central city, with the flags by the entrance fluttering smoothly in the wind instead of the old static animation.

Yep, that's it. Many megabytes of a patch, a game of hundreds of miles of terrain, hundreds of locations, battles, vehicles, all that stuff... and physics acceleration is used to flutter flags by the entrance.

My complaint (1)

w0mprat (1317953) | more than 4 years ago | (#31404072)

Nvidia has failed to engage the coding community in the right way. Any hardware-accelerated physics API needed to be openly available at the DirectX/OpenCL level from the beginning. AMD has kind of seen the light here.

The original intention of Ageia with their PhysX setup seemed to be just to sell the company, rather than to build a viable business of selling hardware. Ageia would have been more open with the API and code right from the start if they had intended to make a business of selling hardware.

Wha? (1)

rgviza (1303161) | more than 4 years ago | (#31404328)

"They're not doing it because they want it; they're doing it because they're paid to do it."

Doesn't this describe just about any paid project? Just sayin'
