
NVIDIA To Enable PhysX For Full Line of GPUs

Soulskill posted more than 6 years ago | from the new-toys dept.


MojoKid brings news from HotHardware that NVIDIA will be enabling PhysX for some of its newest graphics cards in an upcoming driver release. Support for the full GeForce 8/9 line will be added gradually. NVIDIA acquired PhysX creator AGEIA earlier this year.

140 comments

Linux Support (0)

Helmholtz (2715) | more than 6 years ago | (#23881011)

Hopefully they'll include their Linux drivers.

Re:Linux Support (4, Funny)

jandrese (485) | more than 6 years ago | (#23881039)

And hopefully some Linux game/app will come out that can use it.

Re:Linux Support (4, Funny)

gujo-odori (473191) | more than 6 years ago | (#23881175)

And hopefully when it does I'll get first post in the /. article about it.

Re:Linux Support (4, Funny)

carlmenezes (204187) | more than 6 years ago | (#23881693)

And hopefully the /. article won't be a dupe.

Re:Linux Support (4, Funny)

Arethan (223197) | more than 6 years ago | (#23881899)

And hopefully the comments in the article won't all be attempts at +5, Funny.

Re:Linux Support (1)

Anpheus (908711) | more than 6 years ago | (#23882979)

And hopefully everyone won't just be stuck at +4 Funny and some negative karma mods that make the whole thing feel worthless.

Re:Linux Support (0)

Anonymous Coward | more than 6 years ago | (#23881907)

http://tech.slashdot.org/article.pl?sid=08/05/26/0346228

Re:Linux Support (4, Funny)

3vi1 (544505) | more than 6 years ago | (#23881909)

And hopefully the story won't be posted 4/1/2009.

-J

Re:Linux Support (0)

Anonymous Coward | more than 6 years ago | (#23882565)

of a dupe

Re:Linux Support (1, Flamebait)

KingOfBLASH (620432) | more than 6 years ago | (#23881045)

Hopefully they'll include their Linux drivers.

If you're going to make a comment that useless, you might as well just say "frist p0st!" and be honest about your intentions.

Re:Linux Support (5, Funny)

Gewalt (1200451) | more than 6 years ago | (#23881473)

iduno, I'm inclined to believe his post was more useful than yours... or mine...

Re:Linux Support (0, Redundant)

mrbluze (1034940) | more than 6 years ago | (#23881507)

iduno, I'm inclined to believe his post was more useful than yours... or mine...
I bet my post is less useful than all of the above.

Re:Linux Support (4, Interesting)

keithjr (1091829) | more than 6 years ago | (#23881691)

That's not a useless comment at all, unless I'm missing something. UT3 hasn't been able to put out the long-promised Linux version because AGEIA has been so unwilling to release the license grapple-hold they have over the PhysX engine. This is a legitimate concern: unless their stance changes, Linux drivers will not be possible.

and the mac osx drivers (1)

Joe The Dragon (967727) | more than 6 years ago | (#23881395)

and the mac osx drivers

Imagine... (-1)

Anonymous Coward | more than 6 years ago | (#23881509)

Imagine what a beowulf cluster of these could d@&*#_(T)JMMMMMMM#@W%^W$@W^Z$$ NO CARRIER

Re:Linux Support (2, Informative)

GonzoPhysicist (1231558) | more than 6 years ago | (#23881625)

They might have some incentive to now that AMD is both working with Havok and releasing Linux drivers with the new ATI card.

Re:Linux Support (3, Interesting)

Zymergy (803632) | more than 6 years ago | (#23881887)

So ATI's new Linux drivers include Havok technology, and it works under Linux on the new ATI cards?
What Linux application/game uses Havok?

I didn't RTFA (0)

Anonymous Coward | more than 6 years ago | (#23881033)

So what's this PhysX?

Re:I didn't RTFA (3, Informative)

aliquis (678370) | more than 6 years ago | (#23881123)

Hardware-accelerated physics: acceleration, gravity, and particle stuff, if I remember correctly. At least the old demos used to involve throwing items around or exploding walls and such.

Re:I didn't RTFA (4, Funny)

somersault (912633) | more than 6 years ago | (#23881305)

Mmmmm.. hardware accelerated litter..

Re:I didn't RTFA (3, Interesting)

slaker (53818) | more than 6 years ago | (#23881641)

It makes City of Heroes look all awesome, particularly if you use Gravity, Storm, Kinetics or Assault Rifle power sets.

Having bullet casings, leaves, newspapers and the like drop and swirl around in response to player actions is actually pretty nifty from an immersion standpoint, particularly for a game that's essentially set in something that resembles the real, modern world.

Re:I didn't RTFA (4, Funny)

bmo (77928) | more than 6 years ago | (#23881675)

"Having bullet casings, leaves, newspapers and the like drop and swirl around in response to player actions is actually pretty nifty from an immersion standpoint"

That's it. I'm done with immersion games. I'm going outside to stand in the rain. Back later.

--
BM0

Re:I didn't RTFA (4, Funny)

amRadioHed (463061) | more than 6 years ago | (#23881935)

particularly for a game that's essentially set in something that resembles the real, modern world
Because leaves didn't drop and swirl before modern times?

Re:I didn't RTFA (1)

slaker (53818) | more than 6 years ago | (#23882091)

I'm not sure the bullet casings or newspapers did, and given that essentially every PC game that's not City of Heroes is either a D&D ripoff, a Doom clone or a WWII shooter, I didn't want there to be any confusion.

Re:I didn't RTFA (1)

TheThiefMaster (992038) | more than 6 years ago | (#23883557)

Don't forget futuristic shooters, futuristic RTSes and driving games.

Re:I didn't RTFA (1)

LilGuy (150110) | more than 6 years ago | (#23882483)

People still actually play that piece of crap?

I went out and bought that quite a few years ago, and my friends all did too so they could play with me, and many of them won't speak to me anymore.

I didn't realize people actually liked it though.

Re:I didn't RTFA (3, Interesting)

Vectronic (1221470) | more than 6 years ago | (#23881133)

Basically exactly what it sounds like... it's a real-time physics-calculating engine.

Used in games for things like shooting the limbs off of creatures, or even wind on trees, or water...

Likewise for other 3D applications. I'm not sure how extensive it is or what its limitations are, but I'm looking forward to it, mostly because calculating physics-type things in most 3D software takes a lot of CPU power; if the GPU can handle that, it takes a great load off the main CPU (or so I would assume).

Re:I didn't RTFA (2, Interesting)

trooperer (1305425) | more than 6 years ago | (#23881397)

I'm beginning to wonder what the use of a multi-core CPU is if the GPU will be doing all the hard work.

What's next? "Graphics" cards with hardware-accelerated AI support?

Re:I didn't RTFA (0, Offtopic)

Vectronic (1221470) | more than 6 years ago | (#23881503)

Yes... quite obviously; that's why AMD bought ATI, and Intel has its own graphics stuff, VIA, etc.

Given the power that can be crammed into millimeters of space, they may as well combine them; eventually your entire "PC" will be the size the average processor is now. We just aren't there yet, so there are naysayers used to older tech, and junkies waiting for the next big thing.

Which is both exciting and scary, welcomed and feared... but right now it's just interesting, like watching two planets collide.

Re:I didn't RTFA (2, Insightful)

Carbon016 (1129067) | more than 6 years ago | (#23881703)

There hasn't been much for a while; that's why buying a quad-core CPU is largely useless for gamers, and one of the best uses of a dual-core CPU is running a single-threaded application alongside Windows. Graphics cards are massively parallel multi-core systems and have much better real-world and theoretical performance in physics simulations. Physics and AI are all the GPU has left to conquer. I still see the CPU doing a lot of AI work, though, because those sorts of algorithms (hey, no recursion, neat) are naturally far from the linear-access sort of thing CUDA and related technologies are best at.
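
To make the contrast concrete, here's a minimal CUDA sketch (purely illustrative, not from any shipping engine): adjacent threads reading adjacent elements suit the GPU, while the dependent pointer-chasing typical of AI search does not.

    // Physics-style sweep: thread i touches element i, so memory loads
    // coalesce and thousands of threads stay busy.
    __global__ void scaleForces(float *force, int n, float k)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) force[i] *= k;
    }

    // AI-style pointer chasing: each load depends on the previous one,
    // so nothing coalesces and the memory latency can't be hidden.
    struct Node { int value; Node *next; };

    __global__ void sumList(const Node *head, int *out)
    {
        int sum = 0;
        for (const Node *p = head; p; p = p->next)
            sum += p->value;
        *out = sum;
    }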

Re:I didn't RTFA (4, Interesting)

ya really (1257084) | more than 6 years ago | (#23883097)

There hasn't been much for a while; that's why buying a quad-core CPU is largely useless for gamers, and one of the best uses of a dual-core CPU is running a single-threaded application alongside Windows.

Not exactly true: all of the Unreal Engine 3 games consistently use all four cores of my Intel Q6600, with over a dozen threads spread across the cores. The most notable examples would be UT3, BioShock and Mass Effect, three of the biggest games of 2007 and 2008. I can typically max out settings in UE3 games.

On the other hand, performance-demanding games like Crysis are total douchebags and peg just one core, sometimes using one more if it feels like it every now and then. It's not a very good comparison, since there are so many different factors involved, but I would venture to say that if Crysis had optimized better for dual- and quad-core CPUs, its publisher would have far fewer complaints about performance from gamers.

Re:I didn't RTFA (1)

mrchaotica (681592) | more than 6 years ago | (#23882089)

I'm beginning to wonder what the use of a multi-core CPU is if the GPU will be doing all the hard work.

And now you know why the people at Intel have been pushing raytracing so hard recently: they know this, and are trying to avoid becoming irrelevant.

Re:I didn't RTFA (1)

AHuxley (892839) | more than 6 years ago | (#23882503)

AI support would need 'plot'.
I think we will have to wait a few generations until game developers see profit in expensive 'text'.
Eye candy is what sells. Why waste time and cash on AI? :-)

Re:I didn't RTFA (1)

Bootarn (970788) | more than 6 years ago | (#23883193)

When it comes to games, very few take advantage of multi-core CPUs, unfortunately. The only games I can recall benefiting from multi-core technology that I've played are Doom 3, Quake 4 and Enemy Territory: Quake Wars. When it comes to general-purpose computing, multi-core systems are often the way to go if you use multiple applications simultaneously. When I got my first multi-core system, I was completely amazed by how fast large program packages would compile. And that was with only two cores. I can't even begin to imagine what a delight it would be to compile stuff on a quad-core system.

Re:I didn't RTFA (2, Insightful)

negRo_slim (636783) | more than 6 years ago | (#23881417)

I'm not sure how extensive it is or what its limitations are,
Me neither, but the PhysX demos certainly had a neat feel to them... I think I ended up with them from a UT3 install. I can't find a link to the originals, but I found this [kpnet.fi], which looks exactly the same.

But going from a little physics demo to a full-blown kick-ass 3D game with any meaningful results is a whole 'nother matter.

Re:I didn't RTFA (1)

Fecal Troll Matter (445929) | more than 6 years ago | (#23881515)

Newegg has the GTX 280 in stock now for the staggering price of $650-700 depending on the model... the 9800/9800+ will represent an amazing value in light of this.

Hentai (5, Funny)

jaguth (1067484) | more than 6 years ago | (#23881051)

Maybe we'll finally see some realistic physics with fantasy tentacle rape hentai games. Is it just me, or do the current tentacle rape game physics seem way off?

Re:Hentai (5, Funny)

FeepingCreature (1132265) | more than 6 years ago | (#23881157)

It's a problem with the underlying ragdoll representation.
They're having difficulties realistically modelling penetration. Close contact like that tends to lead to numerical instabilities in physics engines. There's not much PhysX can do to help, though.

Re:Hentai (5, Funny)

Minwee (522556) | more than 6 years ago | (#23881303)

That's why there are teams of researchers working night and day to improve the state of tentacle modeling.

If you have what it takes to advance the state of the art there could be a big government grant and a PhD in it for you.

Re:Hentai (3, Funny)

hairyfeet (841228) | more than 6 years ago | (#23882147)

Which is why they need to use motion capture! Of course, getting both the girl and the octopus to hold still while you stick all those little white balls in places that little white balls weren't meant to go won't be easy, but I'll be happy to take the girl if someone else wants to get the octopus.

Re:Hentai (0)

Anonymous Coward | more than 6 years ago | (#23882169)

Government grants for tentacle rape hentai games? I never knew I could make money doing what I loved!

Re:Hentai (2, Funny)

AHuxley (892839) | more than 6 years ago | (#23882521)

Ask DARPA to fund computer modeling a tight cavity IED searching arm?

Interesting Moderation (1, Funny)

JeremyBanks (1036532) | more than 6 years ago | (#23882199)

I love that this was modded insightful.

Re:Hentai (1, Funny)

Anonymous Coward | more than 6 years ago | (#23883509)

physx can't, but physex can

Re:Hentai (3, Funny)

Darlo888 (1235928) | more than 6 years ago | (#23881165)

lol?

Re:Hentai (4, Insightful)

maz2331 (1104901) | more than 6 years ago | (#23881265)

That's just disturbing.

Re:Hentai (3, Funny)

somersault (912633) | more than 6 years ago | (#23881317)

Not as disturbing as the Chronicles of Goatse.cx Part IV: Rick Astley's Revenge

Re:Hentai (0)

Anonymous Coward | more than 6 years ago | (#23882225)

If you think that's bad you should see Tubgirl Strikes Back and Return of the Zombie Goatse... Buuh.

Re:Hentai (0)

Anonymous Coward | more than 6 years ago | (#23883609)

http://alchemist.excessivelydangerousthing.com/ljcartoons/first-contact.png [excessivel...sthing.com]

Caption: "when first contact with an alien race came it turned out that japanese cartoons were wrong ..." (Don't want to spoil the punchline)

Re:Hentai (1)

TopSpin (753) | more than 6 years ago | (#23881279)

You won this story.

Happy Friday.

PhysX? (0)

Anonymous Coward | more than 6 years ago | (#23881079)

So what exactly is PhysX? Clicking on both the links didn't really get me anything concrete.

Re:PhysX? (5, Informative)

aliquis (678370) | more than 6 years ago | (#23881135)

http://en.wikipedia.org/wiki/PhysX [wikipedia.org]

Real-time hardware-accelerated physics. It used to require a separate, expensive board which few games supported, but NVIDIA is implementing it on CUDA so it can run on their graphics cards instead.
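
This sort of work is embarrassingly parallel, which is why it maps onto CUDA so well. A minimal sketch of the idea (assumed names and numbers, not AGEIA/NVIDIA code): one GPU thread integrates one particle per timestep.

    #include <cuda_runtime.h>

    // One thread per particle: apply gravity, then advance the position
    // (semi-implicit Euler). Illustrative of the technique only.
    __global__ void integrateParticles(float3 *pos, float3 *vel,
                                       int n, float dt)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;

        vel[i].y -= 9.81f * dt;      // gravity
        pos[i].x += vel[i].x * dt;
        pos[i].y += vel[i].y * dt;
        pos[i].z += vel[i].z * dt;
    }

    // Launch one thread per particle, e.g. for exploding-wall debris:
    //   integrateParticles<<<(n + 255) / 256, 256>>>(d_pos, d_vel, n, dt);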

Re:PhysX? (4, Insightful)

arbiter1 (1204146) | more than 6 years ago | (#23881173)

NVIDIA bought out the company, so they own it and can put it on their cards; games that decide to add support for it will benefit NVIDIA.

Re:PhysX? (1)

Barny (103770) | more than 6 years ago | (#23881995)

I did hear in an interview with an NV engineer recently that they are working on a CUDA environment that runs on a standard x86 CPU, just at reduced speed (since there are only 1-8 cores).

They designed this stuff specifically for their GeForce shader units (or vice versa), so why should they do the work to let AMD or anyone else in on it? When AMD built their own GPU processing API, do you think they offered to port it to NV cards?

The big question: how hard are NV going to push TWIMTBP (The Way It's Meant To Be Played) affiliated game producers to use it?

Re:PhysX? (1)

mrchaotica (681592) | more than 6 years ago | (#23882097)

The big question...

The real big question, for people in the Real World who need to be able to support Nvidia and ATI GPUs, is when there's going to be a standard GPGPU and/or physics API that works on both.

Until then, all this shit's entirely useless.

Re:PhysX? (1)

Barny (103770) | more than 6 years ago | (#23882141)

The number of people in this world you speak of is not large, nor any substantial subset of the end users of the devices in question.

As I said in another post: fire up all your games one at a time and look through the start credits; TWIMTBP shows up a lot now on the "big" titles.

And I am not even going near their new CUDA processing array servers, which stand on their own merits with this tech.

Re:PhysX? (1)

smaddox (928261) | more than 6 years ago | (#23882191)

It's not useless. There are a few engines out there that are capable of scaling to different capabilities for different cards. For example: id Tech 4, Havok, the Source engine, etc.

It does make development more difficult, though.

Re:PhysX? (2, Informative)

Tawnos (1030370) | more than 6 years ago | (#23883507)

The Source engine, while "capable" of scaling to multiple cores, does a very poor job on current x86 chips. The games become very unstable with mat_queue_mode 2 on, and there are problems with jerky motion under any sort of latency.

It's a shame, too, because the engine works with multicore on the various consoles, and it's a lot faster on the PC when it does work.

Works on just the one card? (3, Interesting)

neokushan (932374) | more than 6 years ago | (#23881177)

I read TFA, but it didn't really give many details as to how this works, just some benchmarks that don't really reveal much.
Will this work on a single card, or will it require an SLI system where one card does the PhysX and the other does the rendering?

Plus, how does handling PhysX affect framerates? Will a PhysX-enabled game's performance actually drop because the GPU is spending so much time calculating physics and not enough time rendering, or are they essentially independent because they're separate steps in the game's pipeline?

Re:Works on just the one card? (1, Flamebait)

Ant P. (974313) | more than 6 years ago | (#23881343)

The effect on framerate doesn't matter - the target audience for this will have at least one spare graphics card to run physics on.

Re:Works on just the one card? (1)

neokushan (932374) | more than 6 years ago | (#23881389)

Are you sure that's the target audience, though?
See, I've only got one card and I'd love hardware-accelerated physics, but I sure as hell wouldn't buy a separate card for it.

Re:Works on just the one card? (1, Insightful)

Anonymous Coward | more than 6 years ago | (#23881609)

Previously you had to buy a $200+ physics card from Ageia. I'm not sure how well a graphics card can do physics, but it'd be neat if I could take an older graphics card and repurpose it to do physics instead of throwing it away.

Re:Works on just the one card? (1)

QuantumLeaper (607189) | more than 6 years ago | (#23881637)

AGEIA isn't a hardware company, so you couldn't buy one from them, but they did license the hardware to others who made cards for $200. The PS3 uses a PhysX chip, if I remember correctly.

Re:Works on just the one card? (4, Informative)

lantastik (877247) | more than 6 years ago | (#23881437)

That's not true at all. It works in a single-card configuration as well. Modern GPUs have more than enough spare parallel processing power to chug away at some physics operations. People are already modifying the beta drivers to test it out on their GeForce 8 cards. The OP in this thread is using a single-card configuration:
http://forums.overclockers.com.au/showthread.php?t=689718 [overclockers.com.au]

Re:Works on just the one card? (1)

Goalie_Ca (584234) | more than 6 years ago | (#23882445)

While the card is off rendering, the CPU sits waiting. While the CPU is busy, the card sits waiting.

Re:Works on just the one card? (1)

TheThiefMaster (992038) | more than 6 years ago | (#23883615)

No, don't be stupid. Any half-decent game engine nowadays does everything with parallel threads.
They (CPU and GPU) still have to wait on each other if they finish early (to synchronise the frames), but they will spend at least 50% of their time both running at once. Ideally it would be 95%+, but games are often unbalanced in favour of graphics complexity these days.
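
In CUDA terms, the overlap looks something like the sketch below: kick off the GPU work asynchronously, do the CPU-side work while it runs, and synchronise only at the frame boundary. The engine function names here are hypothetical.

    #include <cuda_runtime.h>

    __global__ void stepPhysicsKernel(float *state, float dt);  // hypothetical
    void runGameLogicAndAI();                                   // hypothetical
    void render();                                              // hypothetical

    void frameLoop(float *d_state, float dt, bool &running)
    {
        cudaStream_t stream;
        cudaStreamCreate(&stream);
        while (running) {
            stepPhysicsKernel<<<256, 256, 0, stream>>>(d_state, dt);  // GPU busy
            runGameLogicAndAI();             // CPU busy at the same time
            cudaStreamSynchronize(stream);   // whoever finishes early waits here
            render();
        }
        cudaStreamDestroy(stream);
    }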

Re:Works on just the one card? (5, Informative)

Kazymyr (190114) | more than 6 years ago | (#23881769)

Yes, it works on one card. I enabled it on my 8800GT earlier today. The CUDA/PhysX layer gets time-sliced access to the card. Yes, it will drop framerates by about 10%.

OTOH, if you have two cards, you can dedicate one to CUDA and one to rendering so there won't be a hit. The cards must NOT be in SLI (if they're in SLI, the driver sees only one GPU, and it will time-slice it like it does a single card). This is actually the preferred configuration.
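
Programmatically, pinning CUDA work to the spare card is close to a one-liner. A hedged sketch using the CUDA runtime API (the device index depends on how the cards enumerate):

    #include <cuda_runtime.h>

    // If a second, non-SLI card is present, send all subsequent CUDA work
    // to it so the first card is free to render.
    void pickPhysicsDevice()
    {
        int count = 0;
        cudaGetDeviceCount(&count);
        if (count > 1)
            cudaSetDevice(1);  // assumption: device 1 is the spare card
    }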

Re:Works on just the one card? (1)

ya really (1257084) | more than 6 years ago | (#23883149)

I have an 8800GT and the latest drivers. How exactly are you enabling PhysX for it? I don't see any options listed in the driver config. Just curious, not calling you a liar, since there may be a way.

Re:Works on just the one card? (3, Informative)

Kazymyr (190114) | more than 6 years ago | (#23883701)

You need the latest (as-yet-unreleased) drivers for the GTX 2xx series, version 177.39. Then edit the nv4_disp.inf file and add an entry for device ID 0611 (= 8800GT). You will then be able to install the driver on the 8800GT. Next, install the new (also unreleased, but Google is your friend) 8.06 PhysX software. That's it.

Re:Works on just the one card? (2, Interesting)

nobodyman (90587) | more than 6 years ago | (#23881803)

According to the Maximum PC Podcast [maximumpc.com], they saw significant framerate hits with single-card setups, but it was much better under SLI. They did stress that they had beta drivers, so things may improve drastically once NVIDIA gets final drivers out the door.

Re:Works on just the one card? (0)

Anonymous Coward | more than 6 years ago | (#23881859)

As of right now the PhysX stuff doesn't support multiple GPUs, but supposedly that's in the works as well.

Kinda old news (1)

j-turkey (187775) | more than 6 years ago | (#23881183)

This was reported in February [techreport.com], shortly after NVIDIA purchased AGEIA. Of course, the GF9 series had not been released yet, so it was not mentioned in that news posting -- but future support sort of goes without saying. I'm fairly certain it was reported on /. with a nearly identical headline in February as well.

mo3 down (-1, Flamebait)

Anonymous Coward | more than 6 years ago | (#23881247)

If you 4ave of the founders of

Does anyone else remember... (0, Flamebait)

lantastik (877247) | more than 6 years ago | (#23881353)

...how much gamers used to shit all over PhysX cards? Now, they can't wait to get their hands all over it.

Re:Does anyone else remember... (2, Insightful)

urbanriot (924981) | more than 6 years ago | (#23881431)

Really? I don't know any gamers who are excited about this. Name more than one game (without Googling) that supports PhysX.

Re:Does anyone else remember... (2, Informative)

lantastik (877247) | more than 6 years ago | (#23881441)

I don't need to Google. Anything built on the Unreal 3 engine has PhysX support built in.

Re:Does anyone else remember... (1)

urbanriot (924981) | more than 6 years ago | (#23881483)

So... Unreal 3? That's one...

Re:Does anyone else remember... (3, Informative)

lantastik (877247) | more than 6 years ago | (#23881521)

Reading comprehension... anything built on the Unreal 3 engine.

Like one of these many licensees:
http://www.unrealtechnology.com/news.php [unrealtechnology.com]

Native PhysX Support:
http://www.theinquirer.net/en/inquirer/news/2007/05/30/unreal-3-thinks-threading [theinquirer.net]

Re:Does anyone else remember... (0, Redundant)

urbanriot (924981) | more than 6 years ago | (#23881581)

So... Unreal 3?

Re:Does anyone else remember... (4, Insightful)

neokushan (932374) | more than 6 years ago | (#23881589)

Unreal 3 is an engine that's used on LOTS of games - technically ALL of them have PhysX support, so no, not "just" Unreal 3, because there is no game called Unreal 3.

Re:Does anyone else remember... (3, Funny)

Anonymous Coward | more than 6 years ago | (#23881603)

So just Unreal Tournament 2007?

Re:Does anyone else remember... (1, Informative)

Anonymous Coward | more than 6 years ago | (#23881721)

No, there are a few games which use the Unreal 3 engine:
Clicky [unrealtechnology.com]

Re:Does anyone else remember... (1)

rsmith-mac (639075) | more than 6 years ago | (#23882947)

All of them can support PhysX. They don't have to offer hardware acceleration; they don't even have to use PhysX for physics if they don't want to. This is critical: a lot of UE3 games are not supporting PhysX hardware acceleration. UT3 is still the only game that does.

Re:Does anyone else remember... (2, Funny)

Anonymous Coward | more than 6 years ago | (#23881547)

Duke Nukem Forever.

Re:Does anyone else remember... (1, Interesting)

bluefoxlucid (723572) | more than 6 years ago | (#23881605)

Except modern physics engines (see: Quake 1 for MS DOS) use threads for each individual moving physics object, and the Render Thread that manages control of the graphics card uses one thread itself (hard to split that up...), so with new quad-core and 8- and 16-core systems you've got a much better physics processing engine running on your CPU.

Re:Does anyone else remember... (2, Informative)

SanityInAnarchy (655584) | more than 6 years ago | (#23882203)

Except modern physics engines (see: Quake 1 for MS DOS) use threads for each individual moving physics object
Name one engine that is that stupid.

When we're talking about game worlds in which there could easily be 50 or 100 objects on the screen at once, it makes much more sense to have maybe one physics thread (separate from the render thread and the AI thread) -- or maybe one per core. I very much doubt one real OS thread per object would work well at all.
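
A host-side sketch of the "one per core" layout (all names illustrative): a fixed pool of workers, each stepping a disjoint slice of the object list, instead of one OS thread per object.

    #include <algorithm>
    #include <thread>
    #include <vector>

    struct PhysObject { /* position, velocity, ... */ };
    void stepObject(PhysObject &o, float dt);  // hypothetical per-object update

    void stepWorld(std::vector<PhysObject> &objs, float dt)
    {
        unsigned cores = std::max(1u, std::thread::hardware_concurrency());
        size_t chunk = (objs.size() + cores - 1) / cores;
        std::vector<std::thread> pool;
        for (unsigned t = 0; t < cores; ++t)
            pool.emplace_back([&objs, dt, t, chunk] {
                size_t begin = t * chunk;
                size_t end = std::min(objs.size(), begin + chunk);
                for (size_t i = begin; i < end; ++i)
                    stepObject(objs[i], dt);  // each worker owns its slice
            });
        for (auto &th : pool) th.join();
    }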

Re:Does anyone else remember... (3, Informative)

bluefoxlucid (723572) | more than 6 years ago | (#23882379)

Um, except if you have exactly one physics thread you have to juggle complex scheduling considerations about who needs how much CPU, handle the prioritization against the render and AI threads, handle intermixing them, etc. You have to implement a task scheduler... which is exactly what Quake 1 did. Carmack wrote a userspace thread library and spawned multiple threads. Since DOS didn't have threads, this worked rather well.

An OS will give any thread a base priority, then raise that priority every time it passes the thread over in the queue while it wants CPU time, and lower it back to the base when it runs. If a task sleeps, it gets passed over and left at the lowest priority; if it wakes up and wants CPU, it climbs the priority tree. In this way, tasks which need a lot of CPU wind up getting run regularly -- as often as possible, actually -- and when multiple ones want CPU they're split up evenly.

If you make the physics engine a single thread, you have to implement this logic yourself. Further, the OS will see your thread as exactly one thread, and act accordingly. If you have 10,000 physics objects and 15 AIs, keeping both threads CPU-hungry, then the OS will give 1/3 of the CPU to the physics engine, 1/3 to the AI, and 1/3 to the render thread. This means your physics engine starves, and your physics start getting slow and choppy well before you reach the physical limits of the hardware. The game breaks down.

You obviously don't understand either game programming or operating systems.

I called it (3, Insightful)

glyph42 (315631) | more than 6 years ago | (#23881359)

I called this when the PhysX cards first came out. I told my excited coworkers, "these cards are going to be irrelevant pretty soon, because it will all move to the GPU". They looked at me funny.

Re:I called it (4, Funny)

ruiner13 (527499) | more than 6 years ago | (#23881539)

Awesome! Would you like a medal or a monument? What stocks should I buy next week? Who will become the next president, oh wise prophet?

Re:I called it (5, Funny)

neveragain4181 (800519) | more than 6 years ago | (#23881595)

Hi

We need an address for your 'Sarcastic Achievement - Level 3' certificate - you'll have to pay postage, but I'm sure you won't mind that, right?

Ned Again
COO - Sarcasm Society
Level 5 Sarcasm Ninja (certified)

Re:I called it (0, Troll)

slaker (53818) | more than 6 years ago | (#23881689)

That is kind of a non-sequitur when you work in a whorehouse.

Re:I called it (-1, Troll)

Anonymous Coward | more than 6 years ago | (#23881917)

Even more so, when you work in the whorehouse's morgue.

Re:I called it (1)

Barny (103770) | more than 6 years ago | (#23882053)

Of course, note that NV was packaging a GPU-accelerated Havok engine with TWIMTBP for developers (look at Company of Heroes for that kind of thing). Their plans with Havok fell through when Intel bought the engine tech, so NV secured AGEIA so that this time its tech can't be yanked out from under it.

A fun thing to do: load up all your favorite games and actually watch the intros. How many have TWIMTBP? How many of the new games from these makers will require an NV card for their physics to run well?

Re:I called it (-1, Troll)

Anonymous Coward | more than 6 years ago | (#23883659)

I called this when the PhysX cards first came out. I told my excited coworkers, "these cards are going to be irrelevant pretty soon, because it will all move to the GPU". They looked at me funny.

People in the food service industry aren't as up on computer equipment as you'd think.

heh (0)

Anonymous Coward | more than 6 years ago | (#23881755)

Oh gee, this sounds so nice for the 3 games that will use it, but I wonder if it will crash and burn the 700-series chipset motherboards like their video corruption issue. LOL

Why keep the specialized interface? (0)

Anonymous Coward | more than 6 years ago | (#23882935)

They got rid of the specialized hardware, so why keep the specialized interface?

If you used standard shaders for these pre-render calculations, you could avoid excluding half the graphics hardware on the market, and possibly a license fee as well.

How anti-open-source and un-Slashdot-like to support this attempt by NVIDIA to monopolize.
